Mailbag: Are we doing this right?

Recently we got a comment on our post on the most Black women ever to graduate USMA:

Thank you for sharing your analysis. From one academic to another, I would recommend that you add a section discussing the strengths and weaknesses of your research method and design. There are a couple of weaknesses with your study that I would like to highlight. First, I am not sure if SAT scores are accurate predictors of a cadet’s future performance at the Academy. The popular book, Grit, seems to argue against this. Additionally, the fact that the admissions process looks at a range of factors beyond the SAT score would indicate the same. Second, your study heavily relies on the QPA as a predictor of success in the Army. Additionally, the study failed to define what Army success actually looks like. Is it OER ratings, number of promotions, years of service, or something else? Without clarifying this part of your research, your conclusions seem to be based more on your opinion and not data. Lastly, your quantitative research method overlooks the lived experience of cadets. As a graduate, my learning experience at the Academy went beyond the classroom, and those experiences impacted my military and professional performance. And when dealing with matters of race and gender, the lived experience of individuals is a significant factor. I would argue that women cadets “experience” the Academy differently than their male counterparts. Your analysis ignores this, and this is also a significant weakness in your study. Kudos for taking the time to do this work. GABN!

We appreciated the feedback and said we would address these points. And so:

SATs and Cadet Performance

First, I am not sure if SAT scores are accurate predictors of a cadet’s future performance at the Academy. The popular book, Grit, seems to argue against this.

SAT scores are excellent indicators of a cadet’s performance at the Academy – we observed a 0.925 correlation between the CQPA and SAT scores. The College Board found that HS GPAs and SATs are both good predictors, and that the SAT has better predictive power at the higher end of aptitude. We did not use HS GPAs because they weren’t standardized enough within our data set to support conclusions. While our statistics are relatively unsophisticated, and there may be a better predictor available, we did not find one. This is supported by a vast body of academic research elsewhere showing that test scores – if highly g-loaded – are the single most predictive factor of academic, and indeed life, outcomes available to researchers.
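For readers who want to check this kind of figure against the published data set, a Pearson correlation is a one-liner. This is a sketch only – the `sat` and `cqpa` arrays below are made-up stand-in numbers, not our actual cadet data:

```python
import numpy as np

# Stand-in arrays for illustration only -- NOT the actual cadet data set.
sat = np.array([1100, 1200, 1250, 1300, 1400, 1450, 1500, 1550])
cqpa = np.array([2.4, 2.7, 2.8, 3.0, 3.2, 3.4, 3.5, 3.7])

# Pearson correlation coefficient between the two columns.
r = np.corrcoef(sat, cqpa)[0, 1]
print(f"r = {r:.3f}")
```

Running the same calculation over the SAT and CQPA columns of the posted data set should reproduce the 0.925 figure.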

We had to read Grit to get up to speed on the argument. Grit is a pseudo-scientific Malcolm Gladwell knock-off, and argues that “passion” and “perseverance” matter more for success than aptitude. It uses multiple anecdotes, including quizzes of West Point students, to make its case. We won’t offer a full deconstruction of the work here (Duckworth didn’t manage to get through the third sentence – “Top scores on the ACT or SAT and outstanding high school grades are a must” – without making an obviously false statement), but will offer some other critiques of the concept and a meta-study showing that it’s a repackaging of known Big 5 personality traits. It’s not at all surprising that personality traits influence success, but that’s a very different contention than the claim that they’re more predictive than aptitude.

Relevant bits:

  • His analysis found an overall correlation of 0.18, looking at papers by Duckworth and others.
  • This compares to a much higher correlation of 0.50 between, say, SAT scores and performance in college.
  • Duckworth’s own numbers, in a paper published in 2007, are only slightly higher: 0.20.

SATs are clearly a superior predictor of performance.
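The gap is even starker in terms of variance explained: squaring a correlation coefficient gives the share of outcome variance the predictor accounts for. Simple arithmetic on the figures quoted above:

```python
# Squaring a correlation gives the proportion of variance explained (r^2).
r_grit = 0.18  # meta-analytic grit correlation cited above
r_sat = 0.50   # SAT-vs-college-performance correlation cited above

var_grit = r_grit ** 2  # 0.0324 -> roughly 3% of variance
var_sat = r_sat ** 2    # 0.25   -> 25% of variance
print(f"grit explains {var_grit:.1%}, SAT explains {var_sat:.1%}")
```

By this measure the SAT carries roughly seven to eight times the explanatory power of grit.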

We’d also observe that if Grit were such a large component of success, and were measurable and teachable, then West Point’s grading and evaluation system would be a failure. We’re not aware of personality tests at West Point (beyond the obvious four-year experience) determining graduation or post-grad placements.

Additionally, the fact that the admissions process looks at a range of factors beyond the SAT score would indicate the same.

We mostly disagree with this. “Looks at” is vague. While there are multiple characteristics that the Academy looks for – physical, leadership, etc. – those are weighted less in admissions than academics, which are again correlated heavily with cognitive ability, which the SAT measures well (or used to measure well, before the new adversity scores). And, again, once at West Point, academics make up the bulk of the evaluation. West Point clearly thinks academic performance is the single most important factor in graduation.

Success in the Army

Second, your study heavily relies on the QPA as a predictor of success in the Army. Additionally, the study failed to define what Army success actually looks like. Is it OER ratings, number of promotions, years of service, or something else?

We don’t think any of the usual definitions of “Army success” necessarily filter for the best officers either, and we have tried to avoid conflating “performance after graduation” with “success” in the Army. While aptitude tests are known to be generally predictive of a range of life outcomes, “success” in the Army is generally thought of as rank advancement, which can be influenced by “diversity goals”, sucking up to superiors, nepotism, or any number of other non-performance-related factors. So finding out who ended up with the most promotions, years of service, best OERs, etc., would not be conclusive in determining whether the best cadets ended up as the best officers.

We’re concerned that West Point is not selecting the best talent it can for the Army; retaining and managing that talent is a whole other issue. We’d hope that success in the Army comes from talent and achievement, but we have no data and no means of assessing what that actually looks like.

Without clarifying this part of your research, your conclusions seem to be based more on your opinion and not data.

This blog is a part-time effort by concerned grads. It presents more actual data to the public about what the Academy is doing than any other source we’re aware of, although some – the RAND and CEO studies – are much more statistically sophisticated. We’ve been transparent about which data are used and how they’re derived, and have presented the base data set for further analysis. Readers are welcome to do their own analysis and come up with different conclusions (and numbers that support them), or to present actual methodological criticism and refinements.

Lived Experience

Lastly, your quantitative research method overlooks the lived experience of cadets. As a graduate, my learning experience at the Academy went beyond the classroom, and those experiences impacted my military and professional performance. And when dealing with matters of race and gender, the lived experience of individuals is a significant factor. I would argue that women cadets “experience” the Academy differently than their male counterparts. Your analysis ignores this, and this is also a significant weakness in your study.

“Lived experience” is unmeasurable (again, unless you’re the College Board) and subjective. That “race,” “gender,” and “lived experience” are positive additions to the warfighting capability of the Army needs to be proven, not assumed. In the final analysis, we don’t care about having more “lived experience” with worse performance, unless it’s shown to improve our ability to win over having less “lived experience” and better performance.

Weaknesses

Our self-assessed weaknesses include:

  • Incomplete data – we didn’t get everything we’d want, including better visibility on the LOI / LOE process (who was offered admission, who accepted); and the extracurricular data set was presented in an inconvenient format that we’ve yet to rectify and analyze.
  • Statistical prowess – while we know enough to write what we’ve written, there are more powerful and cleaner ways of analyzing the data. So, while we believe what’s been written is accurate, we welcome refinements to the models and observations.

Factual corrections and thoughtful criticism are welcome.

1 thought on “Mailbag: Are we doing this right?”

  1. Keep up the great work. This needs to be discussed throughout the United States at the highest levels for this nation’s future. A recent Georgetown University study (I’m sure you have seen it) shows that approximately 159,000 of our highest-performing HS graduates are displaced each year by the very same practices you are exposing here. These have potentially dire consequences for our Nation in general, but even more so when it comes to national security. https://cew.georgetown.edu/cew-reports/satonly/

    As a founder of a large urban charter school in MA and a 20-year board member, with much diversity, we have seen HS students in the top 1 percent, by every academic measure, inconceivably be passed over by schools that should have accepted them. As an example, a young Asian man with a 1590 SAT score, third in his class, active in clubs and sports as a captain, was denied by half a dozen Ivy League schools, accepted to none of them, and his parents are immigrants from one of the poorest communities in the State.

    Our School would like to send more students on to West Point and the other U.S. Service Academies and your work is allowing us to understand the data to help students hopefully navigate the challenging process to an appointment with realistic expectations. We also wish only to send those who wish to be there for the right reason.
    [removed personal info]

