"Peer assessment score" – the opinion of deans and certain faculty about the overall quality of a law school – accounts for 25% of a school's score in the U.S. News ranking. It is the most heavily weighted item. Bar passage, for comparison, is just a bit over 2%. When told this my pre-law students almost inevitably say — "why would I care what deans and faculty at other schools think?"  

Below are the 25 schools with the lowest peer assessment relative to overall rank and the 25 schools with the highest peer assessment relative to overall rank. Tier 2 schools are not included because they do not have a specific overall rank. TaxProfBlog provided the data.
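
For readers who want the arithmetic: each gap can be read as a school's overall rank minus its rank when schools are ordered by peer assessment score, so a negative number means peers rate the school worse than its overall rank would suggest. The sketch below works through that calculation under this interpretation (the exact computation isn't spelled out in the source data), using made-up schools and ranks rather than the actual U.S. News figures.

```python
# Hypothetical example of the gap calculation: gap = overall rank minus
# rank by peer assessment score. These schools and ranks are placeholders,
# not actual U.S. News data.

overall_rank = {"School A": 100, "School B": 60, "School C": 85}
peer_rank = {"School A": 143, "School B": 38, "School C": 85}

# Negative gap: peers rank the school worse than its overall rank suggests.
# Positive gap: peers rank the school better than its overall rank suggests.
gaps = {school: overall_rank[school] - peer_rank[school] for school in overall_rank}

# Sorting ascending puts the biggest negative gaps (low peer assessment) first;
# sorting descending would give the "high peer assessment" list instead.
for school, gap in sorted(gaps.items(), key=lambda item: item[1]):
    print(f"{school}: {gap:+d}")
```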

I am not unbiased here. I teach in the business school at Belmont University, and our law school has the biggest negative gap between peer assessment and overall rank. There are some plausible explanations for this gap. For example, the school is young (the law school was founded in 2011, though the university was founded in 1890), and many deans and faculty may not know that the law school is doing well on incoming student credentials, bar passage, and employment. FIU, the #2 school, is also relatively young (founded in 2000). But it seems to me that the fact that Belmont University is a Christian school, and that Alberto Gonzales (former attorney general under George W. Bush) is our dean, is doing at least some of this work.

10 of the 25 biggest gaps belong to religious law schools (in bold below). George Mason also likely gets hit for being openly conservative. Granted, this cannot be the only driver of the gaps. And there are 6 religious schools on the list with high peer assessment relative to rank, so a religious affiliation doesn't seem disqualifying. That said, there are exactly 0 Protestant schools on the high-relative-peer-assessment list, and I am not sure any of the schools on that list have a significantly conservative reputation, so maybe the conservative reputation matters more than the religious one.

Anyway, I'm pretty interested in these gaps. Peer assessment is supposed to measure the overall quality of a school. What part of that "overall quality" is not already captured by the other measures? Faculty research? Faculty Twitter followers? Faculty SEALS/AALS attendance? Moot court national championships? Something else? Feel free to leave comments below.

Updated to correct confusion between FIU and Florida Coastal (H/T Matt Bodie); updated to show that San Diego and Seattle are religious.

Low Peer Assessment v. Overall Rank

  1. **Belmont** (-43)
  2. Florida Int'l (-31)
  3. New Hampshire (-31)
  4. Wayne State (-30)
  5. **Baylor** (-25)
  6. Drake (-25)
  7. Texas Tech (-25)
  8. Cleveland-Marshall (-25)
  9. **BYU** (-23)
  10. George Mason (-23)
  11. Missouri (Columbia) (-23)
  12. Penn State-Dickinson (-23)
  13. **St. John's** (-23)
  14. **Dayton** (-22)
  15. **Duquesne** (-22)
  16. **Villanova** (-20)
  17. **Samford** (-20)
  18. **Pepperdine Caruso** (-18)
  19. Washburn (-18)
  20. Tulsa (-16)
  21. South Dakota (-16)
  22. **St. Thomas (MN)** (-15)
  23. Cincinnati (-14)
  24. Drexel (-14)
  25. Penn State-University Park (-13)

High Peer Assessment v. Overall Rank

  1. Santa Clara (+53)
  2. Howard (+43)
  3. Seattle (+43)
  4. Loyola-New Orleans (+37)
  5. American (+33)
  6. San Diego (+30)
  7. Indiana (McKinney) (+28)
  8. Rutgers (+27)
  9. Hawaii (+25)
  10. Denver (+22)
  11. Georgia State (+22)
  12. Baltimore (+22)
  13. Gonzaga (+22)
  14. Arkansas-Little Rock (+22)
  15. Tulane (+20)
  16. Miami (+20)
  17. Idaho (+20)
  18. New Mexico (+19)
  19. Chicago-Kent (+18)
  20. Brooklyn (+17)
  21. Maine (+17)
  22. Memphis (+17)
  23. UC-Irvine (+16)
  24. Loyola-L.A. (+16)
  25. Oregon (+16)