
Want Students to Pass Standardized Tests? Curve Them!


One of the least understood elements of No Child Left Behind is the nature of the tests students are given to measure proficiency. While many believe the tests offer a straightforward comparison of student achievement across the nation, they are anything but. Instead, they are often nothing more than politicized nonsense that does nothing to improve student achievement.

The recent results of Florida’s testing demonstrate just how flawed the system can be. As the New York Times reports, the state used a writing scale of 1 to 6 to measure proficiency. To pass, students needed to score a 4, but with new, tougher standards in place, the percentage of proficient students dropped from 81% to 27% in one year, prompting an unusual response:

The high failure rate was based on measuring proficiency as a score of at least 4. First, the state considered lowering the cutoff to 3.5. That would have resulted in a passage rate of about 50 percent. People would probably still be angry. So on May 15, Florida’s education commissioner, Gerard Robinson, held an emergency conference call with State Education Board members, while 800 school administrators from all over Florida listened in. The board voted to lower the cutoff to 3. Presto! Problem solved. The proficiency rate for fourth graders was now exactly what it had been in the 2010-11 school year, 81 percent.

All those blaring headlines about student achievement and failure we’ve grown accustomed to every year don’t mean anything, because the underlying tests don’t really measure anything. The scoring and content change from year to year; the yardstick changes every time it’s used. Some states challenge their students with difficult tests, while others dumb them down to gin up artificial success.

And this is the system we’re using to punish schools and students for “failures”? No one, least of all me, argues that we don’t need higher standards for students in our schools, but relying on tests like these will never generate improvement. At best, they’ll generate headlines for politicians and privatization advocates to exploit.

About the author

Don Pogreba

Don Pogreba is a seventeen-year teacher of English, former debate coach, and loyal, if often sad, fan of the San Diego Padres and Portland Timbers. He spends far too many hours of his life working at school and on his small business, Big Sky Debate.

His work has appeared in Politico and Rewire.

In the past few years, travel has become a priority, whether it's a road trip to some little town in Montana or a museum of culture in Ísafjörður, Iceland.

2 Comments

  • As a former English teacher, I wonder how the writing test was scored. The teacher who prepped his students to take it apparently had them look at examples of what the evaluators considered good essays, then encouraged them to imitate those examples.

    My guess is that students were encouraged not to take chances. Write only simple or compound sentences — most errors occur when students try to write complex sentences. Make sure you use words you know how to spell. Have an introduction and a conclusion that basically restates the introduction. In between, have several obvious points that have a clear relationship with the introduction. Above all, don’t write what you really think. What you think is irrelevant and can only pull you away from the safe cookie-cutter essay the evaluators want to see.

    I suppose this is a reasonable way of getting students to please the test evaluators. But it sure isn’t a good way to teach writing.
