Reminder: LTC Heffington’s letter and Supe’s letter
LTC Heffington says that
The Superintendent refuses to enforce admissions standards or the cadet Honor Code,
while the Academy responds
Our current Corps of Cadets is comprised of the finest young men and women we have ever gathered here at the Academy as evidenced by their performance in the classroom, in their athletic endeavors, and in their field training.
So which is it? Is the Academy failing to enforce high admissions standards, or is the Corps the “best ever”? We investigate both claims numerically using our dataset, obtained from USMA via a Freedom of Information Act (FOIA) request. (See the end of the post for notes on the data and analysis.)
In addition to West Point’s reputation for a high admissions bar, the CQPA (cumulative quality point average) trend tends to bear out USMA’s point (barring considerations such as grade inflation or class difficulty, for which we have no data). Standard deviation is included to give an idea of the distribution shapes.
A word on CQPA:
The Cadet Performance Score (CPS) was the weighted average of the cumulative Program Scores.
We don’t know what each of the weights was by year, but in the mid-2000s, the APS (academic program) contributed 55%, the MPS (military program) 30%, and the PPS (physical program) 15%. Program Scores are standardized before being combined with their weights. This may have changed over time, but the upshot is that the CQPA is the combined, weighted average of all aspects of the USMA program – or, in a single number, the best holistic indicator of a cadet’s performance, and therefore presumably of performance as an officer.
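As a sketch of that combination (the z-score standardization and the 55/30/15 weights are assumptions based on the mid-2000s scheme described above, not confirmed for every year):

```python
# Hypothetical sketch of the CPS combination: standardize each program score
# across the cohort, then take the weighted average. The weights and the
# standardization method are assumptions from the mid-2000s scheme.
from statistics import mean, pstdev

def standardize(scores):
    """Convert raw program scores to z-scores across the cohort."""
    mu, sigma = mean(scores), pstdev(scores)
    return [(s - mu) / sigma for s in scores]

def cadet_performance_scores(aps, mps, pps, weights=(0.55, 0.30, 0.15)):
    """Weighted average of standardized academic/military/physical scores."""
    w_a, w_m, w_p = weights
    return [w_a * a + w_m * m + w_p * p
            for a, m, p in zip(standardize(aps), standardize(mps), standardize(pps))]

# Three illustrative cadets; the cadet strongest across all programs ranks highest.
cohort_cps = cadet_performance_scores(
    aps=[2.0, 3.0, 4.0], mps=[2.5, 3.0, 3.5], pps=[2.0, 3.0, 4.0])
```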
Update 5/19/19: a commenter pointed out that we were incorrect in equating the Cadet Performance Score with the Cumulative Quality Point Average. CQPA is properly defined as “… an index of cumulative performance in all academic, military science, and physical education courses. It generally corresponds to grade point average (GPA) or grade point ratio (GPR) in other colleges and universities.” The CPS includes other parts of the program such as PFTs, MIADs, etc., but for the purposes of this post, we’ll consider CQPA the most directly measurable and dominant component of cadet performance. Evaluation of the true CPS will require additional analysis, since the dataset includes CQPA, CAPS, CMPS, and CPPS, but not the cadet’s CPS; and it’s unclear whether the performance scores provided have already been normalized.
But admissions excellence should not be judged just on the cadets in the class – we need to understand the opportunity that USMA had to build a class. If USMA is pursuing excellence, it ought to seek to build the class with the best outcomes as measured by its metrics, which include test scores, CQPA, and graduation rates.
The best leading indicators for performance are test scores.
A general outline of USMA admissions methodology can be found here, in a study conducted by RAND. For our purposes, it notes that test scores (ACTs, SATs) are predictive of cadet and later officer performance, and are weighted more heavily than other components in admissions scoring.
We will use the combined Math + Verbal (critical reading) SATs for the classes of 2010-2017 to provide a guide to what USMA could have done. We note that the SAT scoring methodology was consistent for these classes, as shown here: the three-subject test (math, critical reading, writing) was administered from 2006-2016, which covers when candidates for the classes of 2010-2017 would have taken it. However, three SAT section scores only start appearing in applicant data with the class of 2013, so we use Math + Verbal only for this analysis.
The SAT score distributions for the pool of applicants vs. those who attended look like this:
So far, so good! Just from looking at the relative curves, USMA is taking a higher distribution of test scores than the general applicant pool.
What about against the pool of those deemed “qualified,” that is, those who have applied and gone through the various tests, and that USMA admissions has considered as admit-ready, barring perhaps a congressional nomination?
We see that the “qualified-not-attended” pool actually has a slightly higher SAT M+V average (1287) than the attended pool (1252). So USMA is taking more candidates at the lower end of the score range than the numbers would tell us it needs to.
Let’s look at it another way and calculate percentage yield of qualified-not-attended vs attended by score range:
(We note here that we do not have data indicating who actually received an admissions offer from USMA; just categories for “Candidate IDs”, those “qualified”, and data indicating who actually started at school. So we cannot calculate true offer yield, which we would want in order to see USMA’s true opportunity to get talent.)
We see that USMA is taking all of the qualified folks at the low end of the scale, and only about 55-65% of those qualified at the higher end. The candidates at the lower end likely accept at higher rates due to having only worse options elsewhere, while the most-qualified folks likely have other options. But shouldn’t USMA be targeting a higher yield of the more qualified applicants?
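The yield-by-band calculation can be sketched as follows (the counts here are invented for illustration; the real numbers come from the FOIA dataset):

```python
# Illustrative yield calculation: for each SAT band, what fraction of the
# qualified pool actually attended? The counts below are made up.
bands = {
    # band: (attended, qualified_not_attended)
    "900-1000":  (120, 10),
    "1100-1200": (450, 300),
    "1300-1400": (250, 400),
}

yields = {band: 100 * att / (att + qna) for band, (att, qna) in bands.items()}
for band, y in yields.items():
    print(f"{band}: {y:.0f}% of qualified candidates attended")
```

The pattern in the real data looks like this sketch: near-total yield at the low end, much lower yield at the high end.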
Publicly Published Class Data
2022 update: A reader shared a compiled table of USMA’s publicly shared information on separations and attendance (thanks!). We have added the calculated shaded columns.
This does not include SATs for the Qualified & Nominated (Q&N)-but-not-offered group, which we analyze in depth in another post. We note that the Q&N-not-offered group is larger than the non-accepted group in all years except 2012, even after accounting for candidate non-acceptances.
We also note that “Nominated” on these tables is misleading because it is in passive voice and implies that nominations are a given – that West Point has to accept whoever was nominated. There are indeed nominations over which the Academy has no discretion in whom to accept. But there are other nominations available over which it has great discretion. In some cases USMA (the Superintendent or SecArmy) can even choose specifically who is nominated. See this post [link] for more information.
So to imply that “nominated” is outside the control of the Academy is misleading, when the Academy can in fact select qualified candidates from the pool to nominate. It might be more accurate to note the number of qualified candidates and leave “nominations” out of it, or to specify nomination source.
We see that even after nominations and accounting for acceptance yield, USMA has on annual average a significant pool of qualified candidates to choose from. We recall from the charts earlier in the post that the qualified candidate pool is more qualified than the low end of the group offered admissions.
SAT & Separation Rates
We see that SAT scores are, as in other environments, strongly correlated with outcomes – separation rates by SAT are shown below for the cadet pool for classes of ’10-’17. We go through only 2017 because those classes have graduated, including the handling of any turnbacks.
We look at all separation reasons excluding medical, death, personal, administrative, and religious. So these separations include academic, military, physical (including not meeting weight standards), conduct, and honor reasons, and include separations, resignations, and suspensions. In other words, this includes separations that can be influenced by Admissions.
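The filter described above might look like this in code (the reason labels are assumptions about how the dataset encodes separation categories, not its actual codes):

```python
# Keep only separation reasons plausibly influenced by admissions choices.
# Category labels are illustrative assumptions, not the dataset's codes.
EXCLUDED_REASONS = {"medical", "death", "personal", "administrative", "religious"}

def admissions_influenced(separations):
    """Drop separations whose reason admissions could not have influenced."""
    return [s for s in separations if s["reason"] not in EXCLUDED_REASONS]

records = [
    {"id": 1, "reason": "academic"},
    {"id": 2, "reason": "medical"},
    {"id": 3, "reason": "honor"},
]
kept = [s["id"] for s in admissions_influenced(records)]
print(kept)  # [1, 3]
```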
Even though this includes non-academic reasons, from the above, we see the expected probability of someone with a 920 SAT not graduating (in other words, failing) is 30%. The expected probability of someone with, say, a 1320 score failing is 17%. This means that the 920 individual is 1.8x as likely to fail as the cadet with the 1320 score, and someone with a 1040 score is 1.5x as likely to fail.
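The relative-risk arithmetic is just the ratio of the two failure probabilities quoted above:

```python
# Relative failure risk from the separation rates in the text.
p_fail_920 = 0.30    # expected probability of failing to graduate at a 920 SAT
p_fail_1320 = 0.17   # same at a 1320 SAT

relative_risk = p_fail_920 / p_fail_1320
print(f"A 920-scorer is {relative_risk:.1f}x as likely to fail as a 1320-scorer")
```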
Recall from the attended vs. qualified-not-attended chart earlier that there were many other qualified applicants available – 866 other “qualified” applicants in the 1240 score range, for example, with an 81% probability of graduating. So USMA is deliberately admitting individuals with lower scores than it could, knowing they’ll fail at much higher rates than more qualified and able individuals.
In the data we see some unexpected results in the 840, 880, and 1600 ranges. The data indicate that most of the 1600s who do not graduate leave the Academy very early, and with very low CQPAs, most likely due to pursuing opportunities elsewhere. The 840 and 880 results are likely artifacts of low sample sizes.
Does that 3.02 cumulative GPA that Academy leadership bragged about actually mean anything about West Point’s performance? If it does, they might be less eager to point it out.
USMA and advocates argue that it does not place exclusive value on test scores as a predictor of performance, or that performance at USMA is assessed more holistically than through academic prowess alone. This is true; the program scores cadets on Academic, Physical, and Military components. But there is a strong correlation and fit between test scores and outcomes.
The below charts use classes 2010-17, as later classes had not graduated yet. We do not attempt to adjust for any time-series grade inflation, and assume that inflation is constant across the score ranges.
The very low-end CQPA behavior is due to small sample sizes. We see that the 1600-scoring cohort had a higher tendency to leave West Point very early. But we note that there is a .97 correlation between combined SAT scores and CQPA, with an R-squared (fit) value of .94, and a slope of .0022.
This is for all cadets who attend USMA, even those who do not graduate.
So SAT points directly correlate to better – by USMA’s CQPA metric – outputs.
What about graduation yield by CQPA? We see above that as CQPA goes up, graduation rates trend up (separations go down) as well. The slope of the separations/SAT line is -.000278, indicating that for every additional SAT point, the probability of separating goes down by .03%. The separations/CQPA slope is -.15, indicating that for every additional CQPA point, probability of separating goes down by 15%.
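To make those slopes concrete (the slope values are from the fits above; the 100-point SAT and 0.5-point CQPA deltas are arbitrary illustrations):

```python
# Translating the fitted slopes into per-point effects on separation probability.
sep_per_sat_point = -0.000278   # slope of separation rate vs. combined SAT
sep_per_cqpa_point = -0.15      # slope of separation rate vs. CQPA

drop_per_100_sat = -sep_per_sat_point * 100     # ~2.8 percentage points
drop_per_half_cqpa = -sep_per_cqpa_point * 0.5  # ~7.5 percentage points
print(f"{drop_per_100_sat:.1%} lower separation probability per 100 SAT points")
print(f"{drop_per_half_cqpa:.1%} lower per 0.5 CQPA points")
```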
With all this together, we can draw the conclusion that higher test scores lead (on average) to higher CQPAs, and to lower separation rates. By extension, there is a strong correlation between test scores and likelihood of graduating, regardless of the proximate cause of separation.
So what is USMA doing to increase its yield of high-end performers, and therefore officer quality and graduation rates?
Apparently, not a whole lot. We review classes of 2010-2020 for SAT band consistency:
The table above can be read as, for example, 13% of the class of 2010 achieved between 1001-1100 combined Math + Verbal SATs. The 1100 score is approximately the 59th percentile of scores according to the college board. (Updated March 2022 after finding an error in bin calculations.)
The performance bands have remained mostly stable over the years, indicating either a lack of ability or of willingness to try to improve yield at, say, the 1400+ SAT range. Surprisingly, it looks like the Academy tolerates scores below roughly the 60th percentile for ~20% of the class.
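The band table itself is a simple binning exercise; a sketch (scores invented, band edges following the table's 1001-1100-style convention):

```python
from collections import Counter

def sat_band(score):
    """Map a combined M+V score to a 100-point band such as '1001-1100'."""
    idx = (score - 1) // 100
    return f"{idx * 100 + 1}-{(idx + 1) * 100}"

scores = [980, 1050, 1100, 1101, 1250, 1250, 1390]  # invented scores
counts = Counter(sat_band(s) for s in scores)
for band in sorted(counts):
    print(f"{band}: {100 * counts[band] / len(scores):.0f}% of class")
```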
We refer again to the Admitted vs Qualified-not-admitted (QNA) comparison:
Reminder, these are *qualified* candidates. Absolute numbers do not include everyone (ACT-only candidates/cadets) but give us enough data to ask why the Academy chooses to admit 900 cadets with scores at/under 1000 when there are 5,000 qualified candidates with scores over 1200.
Again, test variations and data variances over years may mean the exact numbers or percentiles differ. The message is that there is a lot of admissions “opportunity.”
Impacts of Taking Low Performers
The class separation rate has remained mostly stable over the years at about 22%, depending on which statuses we include in separations and at what times. When a class first graduates, for example, there are late grads and turnbacks who influence the separation rate. Also, this covers all-cause separations, including medical, death, religious, and all other separation categories.
But is there opportunity to improve this? What impact would that have?
From our earlier tables, we posit:
If the Academy moved up recruiting standards so that it no longer admitted applicants with SATs below 1160 or equivalent aptitude test scores around the 64th percentile (arbitrarily chosen to illustrate the point), with separation rates averaging 25%:
Then we could assume a drop in separation rates from the 25% average for the sub-1160 cohort to the ~18% rate for those above it. This would change the number separated from 971 cadets to 699, an improvement of 272 cadets, or about 34 per year.
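The arithmetic behind those numbers (the 25% and 18% rates and the 971 separations are from our tables; the implied sub-1160 cohort size of roughly 3,884 cadets across the eight classes is our back-calculation):

```python
# Back-of-envelope for the replacement scenario described above.
sub_1160_cohort = 971 / 0.25                   # cohort size implied by 971 separations at 25%
replaced_separations = 0.18 * sub_1160_cohort  # if that cohort separated at ~18% instead
saved = 971 - replaced_separations             # cadets retained across classes '10-'17
per_year = saved / 8                           # eight graduating classes
annual_waste = round(per_year) * 100_000       # at ~$100k per cadet, per the text
print(round(saved), round(per_year), annual_waste)  # 272 34 3400000
```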
And this is not to imply that all the ‘replacements’ would have to be 1580-level scorers. The 1200-1440 range alone would provide plenty of candidates (4,700+, from the data) who would meet retention requirements, if retention were the concern rather than maximizing the caliber of mental talent in the class.
You can do other analyses of potential yield for marginal gains in admissions quality. And there may always be a person for whom test scores do not tell the whole story. But regardless, this currently represents a systematic and huge amount of wasted effort.
To put it in context: These 34 additional cadets per year imply, at an average $100,000 per cadet to matriculate and train, $3.4M wasted annually in dollars alone. This doesn’t count the opportunity cost to the Army. It is wasted because the Academy knows the yield percentages, has the opportunity to select classes more likely to succeed and produce better future officers, and does not do so.
Implementing admissions standards would be a substantial improvement. The current situation shows a large opportunity cost from continuing to admit people with strong indications of being sub-par performers. Why is the Academy continuing to do that?
Back to the Academy:
This great institution continues to evolve to meet the needs of today’s Army, and, in doing so, we steadfastly uphold the highest academic, military, physical and disciplinary standards.
While it’s never stated that the goal is “continued improvement,” or exactly what “the highest… standards” are, we clearly see from all of the above that:
- There are known admissions metrics correlated to performance as measured by USMA
- USMA continues to admit known likely low performers
- These low performers are significantly more likely to separate than higher performers, incurring costs and obligations they would not otherwise have had.
- The opportunity to improve the cadet pool does exist
- The continued practice of admitting low performers means significant opportunity cost (and real cost) in terms of number and quality of officers produced
- There is no explicit rationale or acknowledgment of these practices (that we are aware of), though there is a slick marketing blog.
Lastly, this seems to be, as Heffington says, “not unintentional; it is a deliberate action that is being taken by the Academy’s senior leadership, though they refuse to acknowledge or explain it.” USMA should explain it.
Thoughtful criticism and factual corrections are welcome.
Notes on Data
- The dataset was obtained from USMA via FOIA.
- We have observed several discrepancies in the data. For example, the column “admitted” reads “N” for all candidate records, even those with GPAs and activities elsewhere in the data clearly indicating that the candidates were admitted and attended. We therefore have to use our own assumptions and dummy variables to determine who actually attended West Point. See Errata and Data Notes for more details.
- We have observed duplicate records in the data. These are identical records with different candidate IDs in the same or different class years. It is unclear which records are correct for each year. These confound calculating totals with perfect accuracy for each year. Update 4/7/22: We have updated tables and numbers to remove duplicates that we could identify in the data.
- Many of the data columns are self-explanatory (“pr_sex_cd” as M/F). However, several are not, such as usma_st_cd, which gives several codes (“S”, “C”, “G”, etc.) that we have to make assumptions about based on context.
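The attendance inference mentioned in the notes could be sketched like this (the column name `cqpa` and the rule itself are our assumptions about the FOIA extract, not documented fields):

```python
# Since the "admitted" column reads "N" for everyone, infer attendance from
# fields that only attendees would have, such as a populated CQPA. The column
# name and the rule are assumptions, not documented in the data.
def infer_attended(record):
    cqpa = record.get("cqpa")
    return cqpa is not None and cqpa > 0

candidates = [
    {"candidate_id": 1, "admitted": "N", "cqpa": 3.1},
    {"candidate_id": 2, "admitted": "N", "cqpa": None},   # never attended
    {"candidate_id": 3, "admitted": "N", "cqpa": 2.4},
]
attended_ids = [c["candidate_id"] for c in candidates if infer_attended(c)]
print(attended_ids)  # [1, 3]
```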
7 thoughts on “On Admissions Excellence”
I appreciate the analysis but just want to point out that your definition of CQPA is actually incorrect. According to the Academy’s Redbook, “The CQPA is an index of cumulative performance in all academic, military science and physical education courses. It generally corresponds to grade point average (GPA) or grade point ratio (GPR) in other colleges and universities.” This means the CQPA doesn’t capture grades for anything that isn’t tied to credit hours (so summer training performance, semester military grades, APFT and IOCT scores, along with a few others I think). The CPS calculations of 55/30/15% are still in effect today, so you could just create a new variable in your data for CPS and that would be the most holistic indicator of cadet performance. If you compare CPS results with CQPA results, this could let us know if diversity has any effect on outside-the-classroom performance relative to just classroom performance. (Please post an email address if you want a copy of the RedBook or any of the other class rank calculation information).
Thanks for the note. I’ll update it soon.
Updated with a link to the redbook. The data provided has the CQPA, CAPS, CMPS, and CPPS scores, so we can look at that. It’ll be in the next post.