Failure of state school tests leaves few answers

Tuesday, October 10, 2017

Evans sees bigger challenge for Monett in brand-new 2018 test

Usually each fall, the Missouri Department of Elementary and Secondary Education (DESE) releases results from standardized tests given in the spring, then calculates the Annual Performance Report (APR), used to assess accreditation qualifications and to show school districts whether they have improved.

That won’t happen this year, leaving school leaders scratching their heads over where they stand with the state.

“It’s a bit of a mess,” said Mike Evans, Monett assistant superintendent.

Scores for standardized tests for grades three to eight are available. Evans said those numbers show Monett students continue to improve.

The big picture, however, comes in the APR, which combines all the scores into a single mathematical measure. The formula doesn’t work if one of the ingredients falls apart.

“Essentially we all give end-of-course exams in high school,” Evans said. “We give four of them. The English 2 and Algebra 1 tests are the issue. Districts get a raw score back with tiers identifying advanced and proficient performance [the two top levels]. Then, once everybody has taken the test, the test publisher and vendor gather all the data and plug it into a computer. By the middle to late summer, DESE sees all the data. They determined something was amiss.

“When you give a test over multiple years, you expect a certain amount of variability. There are limits to the ups and downs you’d expect to see. If it’s too far, there’s a problem. DESE went to their psychometricians and statisticians, nationally known people, and asked, ‘How can we make this work? Can you put it where it will work?’ They all said no, the variability is too much.”

Evans said DESE hasn’t given a straight answer as to whether the numbers are too high or too low. He talked to colleagues at other districts who track similar information. Some have told him they saw big rises in some districts and big drops in others.

“Our scores stayed very consistent,” Evans said. “We had the same percentages in proficient and advanced. In our raw data, nothing looked out of balance.”

The consequences of not having a state average for rankings and comparisons appear significant. The scores tie to the A+ program, Evans said, and some count toward final grades in the class. DESE has been running models on district scores to see if a solution could emerge by measuring without two of the four tests. No information has yet emerged suggesting a solution is in sight.

“From DESE’s point of view, the tests came back this way because of the vendor,” Evans said. “DESE doesn’t write the test. It’s all contracted out. When results come back and they’re not comparable, and DESE just hired the vendor, it’s up to the test manufacturer to be fair and reliable. DESE says, ‘The vendor didn’t give us a product that met our requirements.’

“Legal teams have had conversations. It’s likely to end up in legal proceedings. DESE has its hands tied. DESE invested a lot of money in the process, and the product didn’t deliver.”

From the meetings he attended, Evans said he feels DESE has handled the situation professionally and is doing everything possible to correct it.

As for the information that has arrived, Evans said he has only begun studying the report. He sees little value in comparing this year’s fourth graders to last year’s fourth graders, preferring instead to follow the same group of students year after year to see if their numbers continue to show growth, then compare that with state averages. Data from other districts for comparisons is not yet available.

“[Other districts’ numbers] give us a good picture of how our sample compares to the total population,” Evans said. “Do we see shifts? Yes. I’m all about improving. We’re seeing movement where we’re trying to improve. We want to make sure we are teaching the same things the state tests for while maintaining what we think is vital and important.”

Improving scores becomes trickier. Evans said the state tests ask questions here and there about specific standards. Educators try to identify which questions relate to which standards. However, the standards, from Common Core to Missouri Standards, have changed three times in four years. The number of questions per standard has also changed. The higher the grade level, the more complicated the standards become, making it harder to figure out what the state is testing for, what the intent of the standard is, and how the test measures it. Evans indicated he has to look for trends over a five- or six-year period to get any sense of the big picture.

“This school year, it’s another new test,” Evans said. “We don’t get to see the test. We don’t know what’s on it. We get a blueprint of what’s on the test, so we know what the ‘big rocks’ are, such as the items that will take up 35 percent of the test. They haven’t released the test questions, so we can’t find out how they are asking questions. They haven’t released those items to us for several years.

“I hate to say it, but we’re guessing. We try as best we can to make sure our assessments mirror what the state test looks like. Without those [sample] items, it’s challenging. They retire some questions from year to year, enabling us to see what questions they were asking. There are no questions to retire this year because the test is all new.”

From what he’s heard, Evans said the 2018 test will be much more of a Missouri test than in past years. Even though the effort has been going on for over a year, the process has to go through the psychometricians, the publisher and the vendor. DESE had used the same vendor for many years. Now, after the 2017 mess, it’s unclear if even that vendor will return for another round.

“September [2016], we were going through state standards for the third time in four years,” Evans said. “We looked at what each grade level has determined are its priorities, what we’re calling essential learning outcomes. Then we looked at them vertically, kindergarten through high school, to see if what we were teaching continues. If we saw a gap, we questioned it. Teaching some concepts maxes out over time. Other gaps we fixed. We’ve done that with English and math. We’re going to do that for science and social studies this year, at all grade levels. It will take a year to build the curriculum.

“Our job each year when we get the state data back is to see what questions were asked X number of times, see if it’s in the curriculum and ask how students are doing. If they’re not doing well, we redo the curriculum. The fun part of a curriculum is that it should always be an evolving process. We want to prepare kids for everything they face.”

The rising complexity that comes with teaching at higher grades, and with interpreting the test as it applies to instruction, makes this year’s glitch that much more frustrating. Evans said the lack of test results from last spring may have little impact in light of a brand-new test next spring. Scores he has seen show the gap closing on state averages. He counts on the district’s ongoing strategy to find answers, even if state test results fail to show the way.
