Each spring, California public high school students sit for a battery of state-mandated exams. The results are machine-scored and tabulated. Performance levels are set and imposed. Schoolwide reports are generated by the California Department of Education (CDE) and posted on their website by August 15.
The CDE then passes districtwide data on to each school district. Each district is charged with generating a report for each teacher, showing how that teacher's students performed on the exams corresponding to the teacher's course. For example, I am to be given a report showing how my students performed on the physics exam. I confess that I am genuinely curious about how my students performed on the exam. I hope that they do well.
The most valuable aspect of the report is that it gives some detail regarding the specific areas of strength and weakness among my students. There are six areas of the physics test for which data is collected: Motion and Force, Energy and Momentum, Heat and Thermodynamics, Waves, Electric and Magnetic Phenomena, and Investigation and Experimentation.
In theory, I could see how my students performed in each of these areas and modify my instruction so as to fortify any areas of relative weakness. Indeed, I interpreted an early report as an indication that I needed to beef up my treatment of electricity and magnetism. In subsequent years, my students have performed better on the exam's electricity and magnetism questions.
The problem is that the teacher report generated by my school district is amazingly useless. It was designed by someone who thought I would find utility in knowing how the 10th-graders in my 1st period class performed on the test, then how the 11th-graders in my 1st period class did, then how the 10th-graders in 2nd period, etc.
To me, such reports are mind-bogglingly useless. Who would want to know any of that? How could it possibly be useful to a classroom teacher? "Next year I'm going to strengthen the treatment of heat and thermodynamics for my 4th period sophomores."
I spend a nontrivial amount of time and Excel energy melting down the data provided and reshaping it into a useful report. I'm pretty sure that I'm *the only one* who does this at my school. Other teachers have lives.
What I want to know is how my students--all of them--did in the various areas of the test. Not how the 1st period sophomores did and how the 2nd period juniors did and the 3rd period sophomores and 4th period juniors, etc., did. I want all my students pooled together. Maybe I'm just lazy, but I don't differentiate instruction between grade levels or periods.
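For what it's worth, the pooling itself is trivial once the data is out of the district's format. Here's a minimal sketch of the reshape I do by hand in Excel, written in Python with pandas. The column names and example numbers are my own assumptions, not the real district export:

```python
# Hypothetical sketch: collapsing per-period, per-grade rows into one
# pooled summary per test area. Column names are assumptions, not the
# actual district report format.
import pandas as pd

# One row per (period, grade, test area), the shape the district report implies.
raw = pd.DataFrame({
    "period": [1, 1, 2, 2, 4, 4],
    "grade":  [10, 11, 10, 11, 10, 11],
    "area":   ["Motion and Force", "Motion and Force",
               "Waves", "Waves",
               "Heat and Thermodynamics", "Heat and Thermodynamics"],
    "num_correct": [40, 35, 22, 30, 18, 25],
    "num_items":   [50, 50, 40, 40, 30, 30],
})

# Pool all students: sum correct answers and items per area, ignoring
# period and grade entirely, then compute one percent-correct per area.
pooled = raw.groupby("area", as_index=False)[["num_correct", "num_items"]].sum()
pooled["pct_correct"] = 100 * pooled["num_correct"] / pooled["num_items"]
print(pooled)
```

Six areas, six rows, one page. That's the whole report I'm asking for.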
The report that I want would be much simpler and would require less than one tenth the paper to print.
Is it just me? Does anyone else out there even care? Bueller?