Report Fails to Allay Concerns about State Testing

A report commissioned by the Texas Education Agency to allay parent and educator concerns about the reliability and validity of standardized state tests may just have added fuel to the fire.

Under last year’s House Bill 743, the Legislature required TEA to arrange an independent, third-party analysis of the reliability and validity of the battery of STAAR exams (State of Texas Assessments of Academic Readiness) used in the state accountability system to gauge how well students, schools, and school districts are doing. Though the law required the independent analysis confirming the merit of the STAAR exams to be completed “before” the tests could be administered, the first installment of the third-party contractor’s report was not made publicly available until after this year’s STAAR testing began. Another installment is said to be forthcoming ahead of the next round of STAAR exams in May.

A summary of the independent analysis posted by TEA presents the third-party contractor’s conclusion that the STAAR exams reviewed meet the Legislature’s requirements of scientific reliability and validity. But a closer examination of the report’s details turned up troubling evidence of misalignment between STAAR test items and the state-mandated curriculum content (the Texas Essential Knowledge and Skills, or TEKS) that those items are meant to reflect.

For instance, State Board of Education member Thomas Ratliff, Republican of Mount Pleasant, noted that for one test relating to English composition, some 25 percent of the test items were found to be only partially aligned with the TEKS curriculum guidelines. Ratliff asked a parent from the advocacy group Texans Advocating for Meaningful Student Assessments (TAMSA) whether that sounded like a “fair” test of a student’s mastery of the state curriculum. “Absolutely not,” replied TAMSA’s Dineen Majcher.

Majcher cited other concerns as well. She criticized the last-minute publication of the third-party report just before the SBOE meeting as evidence of a “lack of transparency and accountability by the state to us as parents.” She and other witnesses also noted that this year’s STAAR tests in many instances exceeded HB 743’s limits on the average time required for students in the lower grades to complete the exams. One teacher said “exam exhaustion” becomes a real issue for third-graders and casts further doubt on the accuracy of test results as a measure of student achievement.

TAMSA President Theresa Trevino testified that many test items also appear to have been written at a reading level above the grade level being tested, again raising doubts about the accuracy of the tests. Educators who testified backed up her point, and SBOE members said they have been hearing the same thing from teachers and parents across the state.

A related complaint from parents and educators concerned the prevalence on STAAR exams of “tricky” rather than straightforward questions. For instance, students missing subtle cues in the wording of a math question might be perfectly capable of handling the math operation at issue yet still fail the test item. TAMSA’s Trevino and others pointed out that two school systems held up as models to emulate because of their students’ achievement results, Massachusetts and Singapore, use much more straightforward test items.

Though one TEA official suggested that there actually is “no evidence” of problems with inappropriate test items, SBOE member Ratliff said that the multiple defects identified in the STAAR exams indicate the state is basing high-stakes accountability decisions on “a questionable instrument.” Given all the questions raised and the last-minute arrival of the third-party report from TEA, the State Board voted to leave the matter pending and take it up again at its next meeting in July. In the meantime, you can expect further Hotline coverage of the new wave of concerns about state testing that has developed this spring.