“Wow, it must have been hard to get in there.” We’ve heard that a lot when acquaintances find out that we attended USMA. Other grads probably get it too. West Point is often described as “highly selective.” But how hard was it, really? How selective (or perhaps “elite” is the better word) is West Point?
We ran across an article by USMA law professor Tim Bakken entitled “Corruption In U.S. Military Academies Is Harming Our National Security.” In it, he raises the alarm about the integrity of Academy admissions practices. Bakken fills out the rest of his argument in his book “The Cost of Loyalty,” which is worth reading. While we agree in principle that the Academies, and West Point in particular, have not made war-winning their first priority in quite some time, we didn’t know how the Academy manipulates its admissions selectivity numbers. Bakken fleshes this out with detail on how unselective the Academies actually are:
A civilian English professor at the Naval Academy, Bruce Fleming, was first to describe that Academy’s false reporting of admissions statistics after serving on its admissions board. “Our military academies aren’t filled with the best and the brightest,” Fleming wrote in 2015. “They are a boondoggle, on your dime, and serve no one.”
The military academies today are simply not the selective institutions they pretend to be. They claim falsely some of the lowest college acceptance rates in the nation, from 9 to 11 percent in 2018, when the real acceptance rate is apparently over 50 percent each year. The Naval Academy once claimed 20,000 applicants in one year, when the actual number was under 5,000. The Academy inflated its acceptance rate by counting “all 7,500 applicants to a week-long summer program for 11th graders . . . as well as anybody who fills out enough information to create a candidate number,” …
This sleight of hand is achieved through a system that the academies have long used, despite no other colleges employing it. They count as an applicant anyone on whom they’ve created a “candidate number” (Navy) or “started a file” (West Point)—a pool of tens of thousands of high school students. Any teenager who has requested a packet of information, completed a form expressing interest, or attended a summer camp at one of the three academies can be counted as an applicant, creating a wildly inaccurate percentage of students who are ultimately accepted.
Bakken goes on to note misleading ways of counting applicants designed to make the Academies appear more selective than they actually are, likely to boost rankings such as US News & World Report’s, in which West Point is rated “More Selective,” though #21 in… national liberal arts schools. He further says that:
“the acceptance rate should not be tied to files opened since that really is a meaningless number—the number fully qualified is a better metric” (emphasis mine). (Some students are disqualified because of a medical or physical condition.) According to West Point’s own statistics, the actual acceptance rate ranged from 56 to 70 percent for “nominated and fully qualified” students (who number about 2,200 to 2,400 each year) for six classes graduating from 2008 to 2018. During that period, West Point publicly claimed acceptance rates ranging from 9 to 16 percent.
He points out that
West Point accepted students “who scored in the Category IV range on the test, the lowest allowable qualifying score.” This at a time when “cognitive ability emerged as the strongest predictor of academic and military grades,” according to a 2019 study for which West Point supplied the data. Even so, the all-military administration decided to reduce the weight of “mental ability or achievement (SAT scores and high school GPA) . . . [which] dropped from 60 to 55 percent while the weight attached to physical measures rose from 10 to 15 percent,” according to the scientists.
What does “selective” really look like?
Examining a truly selective school is helpful for context. Harvard College admitted about 5% of its 43,300-strong applicant pool for the class of 2024. A profile is available here. We have only third-party sources for the talent/aptitude profile of its applicant pool and its admitted class, and we look to testing for standardization purposes. Those third-party sources indicate that 30% of applicants had SATs over 1400, and that the average SAT score for admittees is 1520, which is in the 99.5th percentile of test takers. 30% of applicants is roughly 13,000 high-performing kids. So even against that slice of the pool that might reasonably be called “qualified,” Harvard is admitting only about one in six, and most admits (90%-ish) are in the very top decile of their high school classes. This is selective indeed!
Similarly, at Stanford [Edit 10/25/21: The page originally linked to was 404’d. We’ve updated the link to the current admissions profile for the class referenced], we see ~47,451 applicants (wow, that’s gone up in recent years!) and 1,706 enrolled, for a 3.6% admission rate. That group’s SATs average in the 1500-1600 range. The historical class profiles helpfully let us know that 60% of applicants had high school GPAs over 4.0, 78% were in the top 10% of their high school classes, 58% had SAT Math scores between 700-800, and 46% had SAT Critical Reading scores between 700-800. Again, that’s the applicant pool, not the admitted class.
Of the admitted class, 95% were in the top 10% of their high school classes, and 77% had Math SATs between 700-800. Very selective.
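The arithmetic behind these comparisons can be sketched in a few lines. The inputs are the class-profile figures quoted above; treating “SAT over 1400” as the “qualified” slice of Harvard’s pool is our own rough proxy, not an official definition.

```python
# Back-of-envelope selectivity math on the figures quoted above.
harvard_applicants = 43_300
harvard_admitted = round(harvard_applicants * 0.05)      # ~5% admit rate -> ~2,165
high_sat_applicants = round(harvard_applicants * 0.30)   # ~13,000 with SATs over 1400

# Even measured against only its high-SAT slice, Harvard admits about one in six.
rate_vs_high_sat = harvard_admitted / high_sat_applicants

stanford_rate = 1_706 / 47_451  # enrolled over applicants, per the profile

print(f"{rate_vs_high_sat:.1%}  {stanford_rate:.1%}")  # 16.7%  3.6%
```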
Here’s a nice summary view of the class of 2016:
It’d be nice if USMA put up similar stats. But compare that to the West Point class profile below.
Further, we do know that every one of the applicants at those schools actually applied, which is more than we can say about our USMA profiles.
What West Point Claims
Back to West Point.
At the USMA Admissions class profile page we see numbers indicating that:
Apart from the very different talent distribution compared to truly selective schools (72% of the class in the top quintile of HS class vs Stanford’s 95% in the top decile, and 10% of USMA’s class in the 3rd and 4th quintiles vs trace percentages in Stanford’s), there are some problems with this scheme.
First, we don’t know how many students actually applied. How can USMA claim any level of selectivity without reporting applications?
Perhaps an application is considered complete at the “qualified” stage, after the paperwork is submitted and the tests are complete. But then we’d see 50-70% admissions for qualified applicants, which hardly rates as “selective.” Those are the numbers you see at second- or third-tier residential state colleges. And Bakken’s comments about the predictive power of the aptitude testing are spot on.
But with “files started,” it looks like only 10% of applicants get in, comparing the “applicant files started” number to the “admitted” number. Very selective, on par with Caltech!
If we look at admissions versus the Nominated numbers, then it’s about 25%.
If we look at the fully qualified applicants, that’s more than 50% admissions, and more in line with, say, Cal Poly Pomona. CPP is a fine polytechnic, but it doesn’t promote itself as elite.
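Putting the three denominators side by side makes the distortion concrete. Below is a minimal sketch using the class-of-2018 figures cited later in this post (1,223 admitted, 15,060 files started); the nominated and fully-qualified counts are our assumptions for illustration, back-solved from the ~25% figure and taken from Bakken’s 2,200-2,400 range.

```python
# Acceptance rate for the class of 2018 under three denominators.
# "files started" is USMA's reported figure; the other two counts
# are our own estimates (see lead-in), not official numbers.
admitted = 1_223
denominators = {
    "files started": 15_060,
    "nominated (est.)": 4_900,
    "fully qualified (est.)": 2_300,
}

rates = {label: admitted / n for label, n in denominators.items()}
for label, rate in rates.items():
    print(f"{label:24s} {rate:5.1%}")
# files started ~8.1%, nominated ~25.0%, fully qualified ~53.2%
```

Same numerator, three very different stories: which denominator you pick decides whether West Point looks like Caltech or like a regional state school.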
This selectivity distortion only makes sense if you register applicants as “Any teenager who has requested a packet of information, completed a form expressing interest, or attended a summer camp at one of the three academies…”
None of this is news to those following this blog. We would like to see similar qualified-applicant statistics from other schools such as the Ivies and other higher-tier institutions, but they’re generally not publicly available (except Stanford’s, as shown above). And as Bakken noted above, “no other school” measures admissions selectivity the way the academies do, with “files started.” **See Updates at end of post.**
The Academy leadership might respond that they stipulated “files started” in the statistics, so what’s the beef? Applicants and taxpayers could quote the Honor Code guidance on “quibbling” back at them.
“using evasive statements or technicalities to conceal guilt“
Unless the candidate completes an application and sends it to the Academy, recording a “file started” from mailed literature or a checked “tell me more” box on a website is a misleading technicality. In the admissions funnel described on the web page, the number of applicants who actually applied appears nowhere.
How To Fix It
The Academies ought to report truthfully and transparently. This means giving numbers that align with how most people intuit their meaning.
The current steps for application, according to the Academy’s website, are:
- Apply Online – This link takes you to the admissions “apply now” page, which USMA uses to screen applicants. Is this the application, or just a “form expressing interest”? We note that this is where the Academy collects initial test scores, but “Start your Application Here! Click to open the Candidate Questionnaire and begin your application” doesn’t sound like “Submit your application here.”
- Contact field force rep “to assist you with completing the application process”, implying that the application is not yet complete.
- Attend the Summer Leaders experience. Is this really necessary for application?
- Apply for a Nomination. This is required for attendance, but can an application be “submitted” without it? We seem to recall getting a Letter of Acceptance which the Congressman’s office looked upon favorably in considering an appointment. But had we completed the application by then?
- Complete tests “to remain competitive for appointment”; does that mean the application is still not yet submitted?
- Schedule a visit
- Ensure file is complete. This implies that a file is an application. If the file isn’t complete, then the application isn’t complete, which moots the Academy selectivity reporting and supports Bakken’s assertions above.
It’s unclear at what point in all this an application is actually complete.
In contrast, another selective school – Harvard College – does it this way:
Fall (of the year you apply)
As early in the fall as possible, please submit:
Your application to Harvard, via the Common Application, the Universal Application, or the Coalition Application. This is needed to open your admissions file, track your documents, and set up an interview.
Final deadline for all Regular Decision application materials. You must send all application materials by this deadline for Regular Decision consideration.
Even with a pending interview, that’s pretty definite! Regardless of the exact definition of “an application,” a good way to measure selectivity would be (hold on to your seat) using the number of completed applications as the denominator of the “applied vs. admitted” metric. Using admitted vs. “nominated” cadets is informative but perhaps too specific to military academies for use as a comparable.
Perhaps the misleading stats are to make the Academy a scarcer good to help attract the most desirable candidates for the Army. Perhaps Grads want to maintain the perception that they attended an elite, highly selective institution. Maybe this accounts for the jargony “files opened” language that’s designed to give the impression of selectivity while hiding the truth.
So let’s not kid ourselves. If USMA wants to be an elite institution, it should act like one. It is painful for this grad to admit, after all the blood, sweat, tears, and time spent for and at USMA, but the Academy has increasingly disregarded the pursuit of excellence in favor of trendy photo-ops and news releases. And in juicing its selectivity numbers, it is currently acting like a second-rate school riding its historical halo.
To fix this, we first suggest that the step of submitting an application be reported in the funnel. None of this “if we mailed you something, we’ll call it a file” or “fill out the candidate questionnaire, and if we like you we’ll have you do more” stuff. Call the completed document an application and report it as such.
Next, analogous to other schools’ interviews after the application, the West Point nomination, medical, and fitness tests can be considered post-application screening. This would provide a solid metric and clear delineation of who had applied, and therefore what the true selectivity rates are.
Then we can figure out how selective West Point actually is.
Why This Matters
Isn’t all this just some arcane discussion that only gray hogs care about? No, it’s not. It matters for anyone who cares about West Point, the Army, and national security.
We have documented the decline in admissions standards, particularly vis-a-vis the opportunity USMA had to get better class compositions. We pointed out that LTC Heffington was correct, and that lowest-quartile students are regularly admitted. Not only are they regularly admitted, but they’re admitted at the opportunity cost of much higher-performing students. And they’re much more likely to fail out at their own and the country’s cost.
As Bakken says, “the fabrications are not just PR or marketing; they are serious business” when the products of the Academies will be in charge of our military. They’re no longer getting “the best of the best of the best, with honors.” Citizens, taxpayers, potential applicants, and–not least of all the cadets involved–are paying and will continue to pay the price for weak leadership and a refusal to stand for what’s right and important.
Factual corrections or thoughtful criticism are welcome.
We reviewed the admissions data files and have produced an estimate of the impact of reviewing actual applications vs “files opened.” We found that the files contained an average of 21% of ‘applicant IDs’ with no test scores, indicating that files were created but no applications submitted. We also found that some 10-12% of applicant IDs had test scores, but no high school GPA or sports. Since there was no “applied” flag, we have to make our own assumptions.
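The filtering logic just described can be sketched with pandas. The column names (sat_score, hs_gpa, hs_sports) and the toy rows below are our own inventions standing in for the fields in the data files; the real schema differs, and the real cut points are the ones described above.

```python
import pandas as pd

# Hypothetical records: None = field absent from the file.
records = pd.DataFrame({
    "cdt_id":    ["a1", "a2", "a3", "a4", "a5"],
    "sat_score": [1280, None, 1350, None, 1190],
    "hs_gpa":    [3.7,  None, None, None, 3.9],
    "hs_sports": [2,    0,    0,    0,    3],
})

# No test scores at all: a file was started but, we assume, no
# application was ever submitted (~21% of IDs in our files).
no_scores = records["sat_score"].isna()

# Scores but no GPA and no sports: ambiguous (~10-12% of IDs);
# we exclude these from "assumed applied" as well.
scores_only = (
    records["sat_score"].notna()
    & records["hs_gpa"].isna()
    & (records["hs_sports"] == 0)
)

assumed_applied = records[~no_scores & ~scores_only]
print(len(assumed_applied))  # 2 of the 5 toy records survive the filter
```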
We then reviewed the USMA class profiles to see what they said. The profiles gave information inconsistent with the files we have on record, indicating that USMA is following some other methodology and including some arbitrary portion of the “No Scores, Sports, or GPA” group as “Files Started.” We note that there were no flags for “homeschooled.” Perhaps home-schoolers make up 10% of applicants each year.
In a “best case” scenario, the actual selectivity of USMA is only about 3.7 percentage points “worse” than reported. For the class of 2018 in the table above, this is the difference between 1,223 admitted divided by 15,060 (USMA files started, ~8.1%) and 1,223 divided by 10,331 (our “assumed applied,” ~11.8%).
This would be roughly in line with the official story. But again we don’t know for sure how many applicants actually applied. Perhaps students sent test scores to USMA and were registered with a file opened. Applications received would be our preferred metric because most of our calculations in other posts rely on the totals of submitted scores to make judgments about the admissions pool.
In other posts, we foolishly assumed that test scores and information on file meant that an application was received; perhaps that was not the case. The class of 2020, for example, had (in our files) 1595 candidate IDs with test scores but no high school sports registered and no high school GPA. Did they apply and get rejected, or were their data obtained and files started anyway?
We have found another discrepancy in our files data. A significant number of cadets had the same CPS (as calculated by us). We looked at some of these and found that these CDT IDs frequently had the same entry test scores, high school GPAs, sports, and so on. We conclude that USMA registered multiple CDT_IDs to the same individuals.
This seems to coincide with class-year changes and a high level of separations, and may also reflect multiple application attempts. It resulted in our overcounting total files by approximately 1,840 across the class years of ’16, ’17, ’18, and ’19, or about 460 per year. This actually widens the gap between our “assumed applied” number and USMA’s reported files opened per class: with ~460 excess candidate IDs per year that do not represent distinct applicants, the “assumed applied” number decreases further.
And some of these candidate IDs spanned 3 or 4 years! How could this happen? We’ll write more on this in future updates.
For some examples, see below. Note the identical CPSs, candidate GPAs, and Race/Sex codes, but different class years. In the case of r_08660/s_08508/t_08176, one individual’s records spanned 3 years. Perhaps this candidate applied three different times, as indicated by the different SAT scores?
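The duplicate check can be sketched the same way: group records by the fields that should be unique to one person and flag keys shared by multiple candidate IDs. Column names and rows are again our own illustrative stand-ins, not the real schema.

```python
import pandas as pd

# Hypothetical records: three IDs across three class years belong
# to one person; the fourth is a genuinely distinct candidate.
records = pd.DataFrame({
    "cdt_id":     ["r_1", "s_1", "t_1", "u_1"],
    "class_year": [2016,  2017,  2018,  2016],
    "sat_score":  [1240,  1240,  1240,  1400],
    "hs_gpa":     [3.6,   3.6,   3.6,   3.9],
    "hs_sports":  [2,     2,     2,     1],
})

# Flag every row whose (score, GPA, sports) key appears more than once.
key = ["sat_score", "hs_gpa", "hs_sports"]
dupes = records[records.duplicated(subset=key, keep=False)]

# Same person, three candidate IDs across three class years --
# exactly the pattern that inflates the "files started" count.
print(dupes["cdt_id"].tolist())  # ['r_1', 's_1', 't_1']
```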