Introduction
In the words of British economist and Nobel laureate Ronald Coase, “If you torture the data long enough, it will confess to anything.”*
For years, our fellow bar review companies have been torturing their data sets to arrive at remarkably high pass rates that they publish on their websites and use in marketing campaigns. How do they do it? In short, by putting so many caveats on the data (the fine print) that it becomes virtually meaningless.
Here is a partial list of the conditions they have used to get the U.S. bar exam pass rates they want to report:
- Percentage of course completions
- School size
- Jurisdiction and geography
- School accreditation
- Past exam performance
- Academic background of test-takers
- Self-reported, unverified exam results
- Inexplicable margins of error
Of course, no credible statistician would ask anyone to rely on such tortured data. At BARBRI, we measure bar review efficacy scientifically: We model expected results and compare the model’s predictions to actual results. This method proves that taking BARBRI increases the average student’s score by 19 points.
* R. H. Coase, Essays on Economics and Economists (University of Chicago Press, 1995)
No Universal Standards
The problem with using bar pass rates as a proxy for course quality is that there is no standardized way of calculating them. Somebody must make dozens of decisions about how exactly to “do the math,” and without a globally standardized, objective method, each bar prep company will make those decisions in whatever way produces the results most favorable to it.
Why Published U.S. Bar Pass Rates Can Be Misleading
That is why all our competitors show “pass rates” around 90 percent, while the National Conference of Bar Examiners (NCBE), which develops the exam, publishes national pass rates in the 70 percent range. And if BARBRI were to add the types of conditions that other companies use, our pass rate would also be above 90 percent. But how can every bar review company have a pass rate that high when the true national pass rate is below 80 percent? It’s simple: the caveats our fellow bar review companies use limit the data so much as to render it meaningless.
For example, when calculating the total number of students who took a course (and who should therefore be counted in the denominator of the passed-to-total fraction), if a student purchased the course but never logged in, should they be counted? What if they completed only 5% of the course? What about 20%? 50%? Whoever does the math has to set that threshold somewhere, and the choice is arbitrary. A company might say something like “we count all of our students,” but that is just a semantic trick: How are they defining “students”? Everyone who enrolls? Everyone who completes the course? Something else? There are many small decisions like this, and each has a major impact on how the final numbers turn out.
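To see how much that one decision matters, here is a minimal sketch in Python; the roster, completion percentages, and thresholds are invented purely for illustration and do not reflect any company’s real data:

```python
# Hypothetical roster: (fraction of course completed, passed the bar?)
# These numbers are invented purely to illustrate the arithmetic.
students = [
    (0.00, False),  # bought the course, never logged in
    (0.05, False),
    (0.20, False),
    (0.50, True),
    (0.80, True),
    (0.95, True),
    (1.00, True),
    (1.00, True),
]

def reported_pass_rate(roster, min_completion):
    """Count only students at or above the chosen completion threshold."""
    counted = [passed for done, passed in roster if done >= min_completion]
    return sum(counted) / len(counted)

# The same eight students produce very different "pass rates"
# depending on where the threshold is set.
for threshold in (0.0, 0.2, 0.5, 0.8):
    rate = reported_pass_rate(students, threshold)
    print(f"threshold {threshold:.0%}: pass rate {rate:.0%}")
```

The course and the students never change; only the threshold in the fine print does, yet the reported “pass rate” climbs from about 62% to 100%.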
[Figure: National Conference of Bar Examiners, July 2023 pass rates]
Fairness, Bias, + Inclusivity
Even if all the bar prep companies got together and developed a standardized, independently auditable way of calculating bar exam pass rates, the results would still be a terrible way to measure course quality. Why? Because it is harder for some students to pass than for others. According to NCBE data, students with lower undergraduate GPAs and LSAT scores tend to score lower on the MBE, and BARBRI’s internal data shows that students with lower law school GPAs pass bar exams at lower rates. Comparing raw pass rates without controlling for which students take which courses tells us nothing about the quality of the courses.
BARBRI is dedicated to making our programs effective for students of all abilities and from all backgrounds. That is why we attract the most law students from disadvantaged backgrounds and the most students who require accommodations. This commitment may affect our pass rate statistics, even though it is a marker of BARBRI’s superior quality and dedication to serving all students. Read more about BARBRI’s mission and values.
Ethically Measuring Bar Prep Quality
Here is how a professional statistician would measure the quality of a bar review course:
- Obtain a valid dataset from a law school, spanning multiple years, that tracks complete pass/fail/did-not-take data along with other relevant data about its students, such as LSAT scores and law school GPAs.
- Create a model that predicts bar exam pass rates from statistically significant variables (such as LSAT scores and law school GPAs), independent of which bar review course each student took.
- Divide the students by bar review company and use the statistical model to predict a pass rate for each company (for example: of the 100 students who took BARBRI, 72 should have passed based on the model above; of the 50 students who took Company X bar prep, 38 should have passed; and so on).
- Compare the actual results to the predicted results (e.g., 77 BARBRI students actually passed, compared to a prediction of 72, whereas 36 Company X students actually passed, compared to a prediction of 38; therefore, BARBRI is adding significantly more value).
This method is, in fact, exactly what we at BARBRI have done at more than 30 law schools. The result? Taking BARBRI increases a student’s likelihood of passing the bar exam by a statistically significant amount nearly every time. This is how and why we calculated and published that “students who follow BARBRI’s proven process score an average of 19 points higher on the bar in UBE jurisdictions than those who don’t use BARBRI.” This is a more accurate and meaningful statistic than pass rates. Will 19 points matter to you? It depends on how close to the pass/fail line you end up, but that cushion of extra points certainly buys you some insurance, and could make all the difference.
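For readers who want to see the shape of this analysis, here is a minimal sketch in Python. The file name, column names, and model choice (a simple logistic regression on LSAT score and law school GPA) are assumptions for illustration, not a description of BARBRI’s actual implementation:

```python
# Minimal sketch of the expected-vs-actual approach described above.
# The CSV file, column names, and model choice are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical per-student records: lsat, law_gpa, bar_review_company, passed (0/1)
df = pd.read_csv("law_school_outcomes.csv")

# Step 2: model pass probability from pre-existing credentials only,
# ignoring which bar review course each student took.
model = LogisticRegression()
model.fit(df[["lsat", "law_gpa"]], df["passed"])

# Step 3: expected passers per company, given who enrolled with each one.
df["expected_pass_prob"] = model.predict_proba(df[["lsat", "law_gpa"]])[:, 1]

# Step 4: compare actual passers to expected passers for each company.
summary = df.groupby("bar_review_company").agg(
    students=("passed", "size"),
    expected_passers=("expected_pass_prob", "sum"),
    actual_passers=("passed", "sum"),
)
summary["value_added"] = summary["actual_passers"] - summary["expected_passers"]
print(summary.round(1))
```

A positive value_added means a company’s students passed more often than their credentials alone would predict; a negative value means the opposite. The same expected-versus-actual logic, applied to scaled scores rather than pass/fail outcomes, is the kind of analysis behind a figure like the 19-point difference cited above.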
Conclusion
Comparing pass rates of different bar review companies is like comparing the finish times of runners who each got to choose their own starting and finishing lines. The results are meaningless because everyone ran a different distance. Without a standardized, auditable method of calculating pass rates, bar review company pass rates are just as meaningless.
About the Authors
Andrew Suszek holds a Master’s Degree in Data Science—Analytics and Modeling from Northwestern University and graduated cum laude with a J.D. from DePaul University College of Law.
Sam Farkas graduated with a J.D. from Florida State University College of Law and is licensed to practice law in Florida and Georgia.