Minnesota students' science test scores take big jump
Minnesota's students made dramatic gains on state science tests this year...
Overall, 46 percent of students exceeded the expectations the state set out for them, up from 40 percent last year, when the Minnesota Comprehensive Assessments-II science exams were given for the first time. Of the three grade levels tested this spring, 45 percent of fifth-graders, 43 percent of eighth-graders and 50 percent of high school students succeeded.
… increased familiarity with the online test this year probably contributed to the increase.
The test results come seven months after the state's students were found to be near the top of the world in math and science, based on an international assessment …

Let me count the ways of wrongness.
Firstly, the headline on the dead-tree Strib read something like “Science scores absolutely stink” (OK, I’m paraphrasing). The Strib needs to get its spin straight.
Secondly, given that the online test was still fairly new last year, is a six-point gain (40 to 46 percent), selected from a range of possible comparisons (they could have chosen the high school grade alone, etc.), really meaningful? I doubt it. Neither journalists nor educators nor physicians nor, really, anyone seems to have a working grasp of the notion of statistical significance.
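For the record, here is roughly how you would check whether a jump like that even clears the bar of statistical significance. This is a sketch only: the articles don't report the number of test takers, so the sample sizes below are guesses, and even a "significant" result says nothing about the familiarity effect the excerpt itself admits.

```python
# Hedged sketch: a two-proportion z-test for a change in pass rate.
# The sample sizes are placeholders, not Minnesota's actual counts.
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Return (z, two-sided p-value) for H0: the two pass rates are equal."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

# Illustrative only: assume ~60,000 test takers each year (a guess, not a reported figure).
z, p = two_proportion_z(0.40, 60_000, 0.46, 60_000)
print(f"z = {z:.1f}, p = {p:.3g}")
```

With samples that large, almost any year-to-year wiggle comes out "significant," which is exactly why nominal significance alone doesn't make a result meaningful; it can't separate real learning from kids simply getting used to the online test.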
Most of all, the entire process is a cosmic waste of time, money, and student psyches. A test with a roughly 50% pass rate means the system has failed. Minnesota needs to stop, fire all the responsible executives, and restart.
Either the test is wrong, or the preparation is wrong, or Minnesota is testing the wrong group, or all of the above.
Minnesota needs to take a tiny fraction of the cost wasted on its testing programs and hire a man who now keeps a very low profile – Steve Yelon. Really, I’m sure he’d do the work for a million or two, even if he’s now a “professor emeritus”.
Steve taught me about curriculum and instructional design back around 1991 or so, when I was an OMERAD fellow at MSU’s College of Human Medicine (i.e., not the famed vet school [1]). He was a good enough teacher that the basics still stick in my head. They’re roughly like this, with a toy sketch in code after the list:
- Figure out what you want your learners to be able to do. This has to be something they can do.
- Design a test that measures achievement of the desired skills.
- Design a course that fits the test.
- Teach the course.
- If the results are bad, revise one or more of test, course, and tested group.
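Spelled out as a loop, it might look like the sketch below. Nothing in it comes from Yelon's actual materials; the names and numbers are invented, and a real version would score real learner performance instead of using a stub.

```python
# Toy sketch of the loop above; all names and numbers are made up for illustration.

def design_test(objectives):
    # One observable, measurable item per objective: the test defines "success".
    return [f"learner can {obj}" for obj in objectives]

def design_course(test_items):
    # Build the course to fit the test, so "teaching to the test" is exactly right.
    return [f"practice until: {item}" for item in test_items]

def teach_and_score(course):
    # Stand-in for actually teaching and scoring; returns the fraction who pass.
    # Pretend more practice per objective means more learners pass.
    return min(1.0, 0.25 * len(course))

objectives = ["design a controlled experiment", "read a simple data table"]

for attempt in range(1, 4):  # revise at most a few times
    course = design_course(design_test(objectives))
    pass_rate = teach_and_score(course)
    print(f"attempt {attempt}: pass rate {pass_rate:.0%}")
    if pass_rate >= 0.8:
        break
    # Bad results: revise the test, the course, or who is being tested.
    # In this toy, "revision" just adds a transfer objective and more practice.
    objectives.append("apply each skill in a new context")
```

The point of the stub is the shape of the process: the test is written against the objectives, the course is written against the test, and poor results send you back to revise rather than to spin the scores.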
Hire Professor Yelon, Minnesota, and stop wasting my money. There’s a lot less of it than there used to be.
[1] If anyone from OMERAD is link checking, John Gordon is a pseudonym.
Update 9/1/2015: Wow, I had a pretty angry writing style back then. I think I'm mellower now; must be getting old. Anyway, I came across a wallet handout he did in the '90s. Copy here ...
The bidirectional arrows aren't just for symmetry. You start with real-world goals, build objectives, and then build tests that demonstrate goal achievement. Each feeds back on the others. From what I remember of his teaching (mutated by time and experience), I'd add another set of arrows dropping from real-world performance to terminal objectives to the tests, illustrating the importance of designing the test so that "teaching to the test" is high praise.
Yelon's 1990s book is available used on Amazon. If copyright allows, it ought to be put online.