Scholarly Impact Study

Scholarly Impact of Law School Faculties:
Extending the Leiter Rankings to the Top 70 (Sept. 2010)

Gregory Sisk, Valerie Aggerbeck, Debby Hackerson & Mary Wells
University of St. Thomas School of Law (Minnesota)


This study explores the scholarly impact of the faculties at all law schools accredited by the American Bar Association and then ranks the top seventy law faculties. Refined by Professor Brian Leiter, the “Scholarly Impact Score” for a law faculty is calculated from the mean and the median of total law journal citations over the past five years to the work of tenured members of that law faculty.  This study extends Professor Leiter’s ranking of the Top 25 law faculties to rank the Top 70 law faculties in order of scholarly impact (because of ties in ranking position, 71 schools are actually included).  Following the same methodology and search parameters, we also applied a discount rate to back-date Scholarly Impact Scores to January 15, 2010, so that the results for the additional law faculties could be integrated with Professor Leiter’s Scholarly Impact Ranking for the Top 25 law faculties reported earlier this year. See “Top 25 Law Faculties in Scholarly Impact, 2005-2009.” In addition to a school-by-school ranking, we report the mean, median, and weighted score for each law faculty, along with a listing of the tenured faculty members at each ranked law school with the highest individual citation counts.
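For readers who want to see the arithmetic, the short Python sketch below shows one way a weighted score could be computed from a faculty’s per-person citation counts. The 2-to-1 weighting of mean over median and the sample counts are illustrative assumptions only; the study’s actual weighting and data are set out in the full report.

    from statistics import mean, median

    def scholarly_impact_score(citation_counts, mean_weight=2, median_weight=1):
        """Illustrative weighted score for one faculty, combining the mean and
        median of its tenured members' five-year citation counts. The weights
        are assumptions for illustration, not the study's stated parameters."""
        return mean_weight * mean(citation_counts) + median_weight * median(citation_counts)

    # Hypothetical citation counts for one faculty's tenured members (not real data)
    faculty_citations = [410, 220, 180, 95, 60, 35]
    print(round(scholarly_impact_score(faculty_citations)))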

Representing about one-third of accredited American law schools, the law faculties ranked in this study have demonstrated concretely a strong collective commitment to legal scholarship.  As previously ranked by Professor Leiter, the law faculties at Yale, Harvard, Chicago, and Stanford stand out nationally in scholarly prominence, followed by several others that are traditionally ranked among the elite law schools.  The new law school at California-Irvine shows early signs of becoming a scholarly leader, while Florida State continues its upward movement by entering the Top 25.

When the ranking is extended to the Top 70, Cardozo and Ohio State fall just outside the Top 25, while George Mason, Hofstra, Case Western, the University of St. Thomas (Minnesota), Pittsburgh, Hawaii, Brooklyn, Nevada-Las Vegas, the University of San Diego, Chicago-Kent, and Missouri-Columbia achieve strong rankings well above those assigned by U.S. News.  Several law schools accredited within the past twelve years—the University of St. Thomas, Nevada-Las Vegas, Chapman, and Florida International—have already made a scholarly impact that dramatically outpaces their academic reputations.

For further discussion about the scholarly impact of law faculties, information about the nature and methodology of this study, comparisons with other rankings of law faculty quality and law schools, and commentary on the results of this scholarly impact study, please refer to the report, “Scholarly Impact of Law School Faculties:  Extending the Leiter Rankings to the Top 70 (Sept. 2010).”

UST Law Scholarly Impact Study Group
scholarlyimp@stthomas.edu

Updates Closed for 2010

We have completed our continuing review of the data and our responses to law school inquiries for 2010.  When Professor Brian Leiter next prepares a ranking of the Scholarly Impact Top 25, whether next year or the year after, we likely will conduct another study extending that ranking to the top 50, 60, or 70.  The methodology will likely remain much the same, and we will carefully track any changes Professor Leiter makes to it so that the extended ranking is calculated consistently.  We are, however, considering changes in our protocol for seeking information from the law schools to be studied, beyond the inquiries about tenure status that we made to each school this year.

If a full new ranking is not conducted in 2011, we may invite law schools in the summer of 2011 to submit their own updated citation counts, prepared in a manner consistent with the scholarly impact score methodology outlined in our full 2010 report (with the search period revised to the five-year period of 2006-2010).  We would then post those results along with a notation on where that law faculty would fall in the ranking based on the 2010 results (assuming all other things remained equal). We could not endorse the accuracy of the results submitted by law schools, but we would spot-check the submissions for apparent compliance with the methodology.  Thus, an interim quasi-ranking would be available for those schools that wish to demonstrate increased or consistent scholarly impact scores in 2011.
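As a rough illustration of how such an interim notation might be generated, the Python sketch below places a newly submitted weighted score among a set of 2010 weighted scores, holding all other schools constant; the scores shown are hypothetical, not the study’s data, and a score equal to an existing one would share that school’s position.

    def provisional_position(new_score, weighted_scores_2010):
        # The provisional position is one plus the number of 2010 weighted scores
        # strictly higher than the newly submitted score; equal scores tie.
        return 1 + sum(1 for s in weighted_scores_2010 if s > new_score)

    # Hypothetical 2010 weighted scores, highest first (not the study's data)
    scores_2010 = [1200, 980, 640, 328, 324, 242, 242, 190]
    print(provisional_position(330, scores_2010))   # prints 4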

Update (9/17/10):

Notre Dame Moves Up to # 38 (Weighted Score: 324):  Tied with Case Western, University of St. Thomas (Minnesota), and Pittsburgh

Case Western Moves Up to # 38 (Weighted Score: 328):  Tied with University of St. Thomas (Minnesota), Notre Dame, and Pittsburgh

Georgia Joins the Ranking at # 61 (Weighted Score: 242):  Tied with BYU, Boston College, Cincinnati, Tulane, and Florida International

In reporting the scholarly impact of law faculties and extending Professor Leiter’s important study, we have been committed to doing it right, fixing mistakes that we find, and learning from the experience of our first application of this methodology.  As described in more detail in the full report, when running citation counts on more than three thousand law professors, we necessarily relied on the faculty rosters submitted by each law school that appear in the AALS Directory.  Indeed, for most purposes, we are obliged to stand firm in our reliance on the rosters submitted by the schools, particularly in terms of faculty titles and status.

In two instances, a law school had sent a roster of faculty names to the AALS that included a name presented differently from the name under which a highly cited scholar at that school publishes.  The variation between the school’s roster listing and that publication name meant that our careful search parameters nonetheless missed most citations to that individual scholar.  We think that this inadvertent and substantial undercounting amounts to a significant departure from the most accurate results, even if the error was not of our making or in the methodology itself.  In each instance, an outstanding scholar within the scope of the study was not given full credit for citations.

In a third case, we made an error, and the responsibility is ours.  In seeking information from each school on the tenure status of associate professors, we mis-recorded a report from one school, which led to the accidental inclusion of more than one untenured faculty member in the tally for that school, resulting in a significantly understated mean, median, and weighted score.

The three updates noted above are included in the report and the tables as superimposed on the existing ranking framework.  Because the results for the other schools remain unaffected and every other school in the ranking continues to hold exactly the same place on the scaled scoring, no other adjustments in ordinal ranking have been made.