When Will We Learn?

A new report comparing the college performance of students who submitted standardized test scores for admission with those who didn’t finds no significant differences between them. In other words, whether or not testing was used for admission, the results were about the same. What’s nice about this report is that it looks at data from 33 test-optional colleges over eight years. What’s sad is that it reiterates what many of us have been saying for years: standardized testing is simply not needed to assess students’ ability to do college work.

We’ve known for a long time about the relationship between scores and zip codes (guess which ones have higher scores?) and how intensive preparation for the test crowds out real educational time in the classroom. We know that low-income and first-generation college students typically have a difficult time with standardized testing even though they may do well in their classes. We know that obsessive focus on SAT and ACT scores upends the educational process, becoming an end in itself instead of a means of measuring what students are learning and doing in the classroom. We know, we know, we know, and yet…

Standardized testing is more entwined with education than ever, sucking the life out of it for entirely dubious reasons. Bill Hiss of Bates, co-author of the new report, presented findings from his research at Bates at a session I moderated at the NACAC conference 10 years ago, and that study found the same thing. The Chronicle of Higher Education has published dozens of articles about standardized testing with similar themes going back many years, and colleges consistently report that high school grades are the most important element of a student’s application.

The testing cancer has metastasized into the lower grades as well, with some calling for testing of kindergarteners. We already have absurdly constant testing thanks to NCLB, although that mandate finally seems to be facing challenges. The testing industry continues to grow fat on all of it and has managed to convince people of its value despite both quantitative and qualitative evidence to the contrary.

But we are seduced by data, no matter what they show. Let’s face it, “education” is supremely difficult to do and even harder to quantify. There may be a Platonic ideal of it somewhere in the Cloud, but down here it may be the most analyzed and least understood human endeavor we have. Our reliance on testing is a sign of how little we understand: No test can isolate and determine absolutely why, for example, a lesson plan that goes well in one class bombs in another. No study can definitively answer the question of why one teacher clicks with students and another doesn’t, or why a teacher clicks with some students and not others in the same class. No data can truly explain why some students living in poverty outshine their peers while others are swept away by it.

But data seem to have the answers and seem to make sense of it all. With fewer teachers and resources than ever, it’s easier to make data the foundation of our behavior; we don’t have the time, energy, money, or commitment to foster the kind of classroom environment that might in turn encourage real learning. It’s easier to try to mechanize it all than to deal with complex human behaviors at every level. I suspect that the number crunchers who now dominate educational policy making would be ecstatic to have an algorithm that automatically selects the right courses and coursework for students the way Netflix or Amazon does. Something like that already exists in adaptive online testing, where the questions change based on the student’s answers. No teacher necessary!

Standardized testing is just one manifestation of the mechanization and commercialization of education. “Standardizing” attempts to make the ineffable somehow manageable, which requires paring away anything that doesn’t fit. Thinking that more and more data can somehow get us better educational results is delusional, because education exists between data points, not on them. But it’s of a piece with MOOCs and online education in general, commodifying education as a consumable rather than an ongoing enterprise. Subjecting students of any background to more and more of it brings us that much closer to a kind of plutocratic managers’ paradise where the labor supply is docile, well-trained, and incapable of anything but responding to “prompts” and blank ovals. It may be too late to stop, but surely we don’t need any more testing studies to remind ourselves of its ineffectiveness.

NEW as of 2/23/2014: As if that weren’t enough, see this article in Newsweek online about using data to help find the “perfect” college for you. An algorithm is all you need! Thanks to colleague and former Dean of Admission at Pomona College Bruce Poch for posting it on Facebook.

