Public Policy Updates: January Board of Education on school and district report cards, automated scoring

The Board of Elementary and Secondary Education met on Monday, January 14 for an evening meeting. While much of the attention focused on Commissioner Jeffrey Riley's announcement of a proposed deal on expanding Alma del Mar Charter School in New Bedford, two reports were the purpose of the special evening meeting: one on the complete overhaul of the school and district report cards, and one on the use of computers in scoring student writing on the MCAS.

School and district report cards that communicate to parents and guardians are among the requirements of the Every Student Succeeds Act, as they were under No Child Left Behind. The requirements for what must be included have changed, and what the Department has chosen to include has changed even further.
As Associate Commissioner Rob Curtin noted, "We speak a foreign language in this building." Ensuring that families receive the information they need and want about their school and district, in language that is meaningful to them, thus became the work; to do this, the Department partnered with Learning Heroes, an organization that researches the best methods of communicating with families.
Learning Heroes combined four different research projects in the work they did for the Department. Among their findings is the tendency of parents to believe their child is doing better than the child actually is; while many parents persist in this belief even when presented with further data, Massachusetts parents are more likely to change their view.

Learning Heroes found that the information most valued by parents is information about the school learning environment, teachers' credentials, disciplinary data, and school progress. Information parents felt was nice to have but less necessary included comparisons to national averages, test score information, the percentage of students enrolled in challenging classes, and money spent per student. Concern was expressed about how subgroup data is presented; it is important that subgroups not be lumped together in a way that gives the impression that the struggles of one are those of all. There was also less interest in attendance rates and in the numbers of students learning English who no longer needed instruction. Parents need context for information, and they need it presented without lingo. Parents also tend to go straight to graphs when graphs are presented; sometimes this means they interpret the data incorrectly.
All of this was incorporated in the redesigned report cards. The Department prioritized what parents felt was most important in the presentation of data; as Curtin noted, "these report cards are not for us [the Department]." The new report cards, which are designed to be accessed online but will be available as a PDF for printing, open with information about the report cards and directory information about the school and district. This is followed by a list of questions, each of which can be clicked on to open that information: 

  • Who are the students and teachers?
  • What does the student engagement look like in our school? (attendance and discipline)
  • What academic opportunities are available to our students?
  • How prepared are our students for success after high school?
  • How do our students perform on state tests?
  • How much does our school spend per student?
  • How is our school doing in the state accountability system?

"We have typically led with MCAS results," noted Curtin, "but when we talked to parents, they told us that wasn't what they were looking for" first. In many cases, those interested will be able to click through for further information, including civil rights documentation, as required by ESSA. The school-level spending section, which gives an average per-pupil spending amount broken down by federal or non-federal funding, is new this year, and it is anticipated that there will be further work on this and other sections. There are a few more fixes the Department is making, and then these report cards will be released. 
Several members of the Board expressed concern about the term "engagement" when the data being reported is limited to attendance and discipline; this is among the sections that will evolve over time. Member Mary Ann Stewart requested that districts' net school spending levels be considered for inclusion; Member Michael Moriarty asked about and received reassurance that the school and district profiles are not going away. Member Maya Mathews asked about incentivizing family engagement. Secretary James Peyser said that he was more interested in where the money was spent than in where it came from.

The second report the Board received was an update on the Department's continued investigation into automated scoring of student writing on the MCAS. Deputy Commissioner Jeffrey Wulfson again noted the state's deliberately slow pace in looking into this, in contrast with states that have rushed ahead. As staff move ahead in this research, the Department is not expecting perfection, as human scoring is also fallible, and is not expecting in-depth responsiveness, as that is the role of the teacher, Wulfson said.

Associate Commissioner Michol Stapel reviewed how MCAS written responses are currently scored by human scorers (60% of whom are teachers) who are trained on each individual item. Items are scored on idea development and on conventions. There is 100% double-blind scoring in grade ten, with the student receiving the higher of two adjacent scores if there is a discrepancy.

In two years of review, the Department has found a high rate of correlation between automated and human scoring (see the linked presentation for more specific data). In particular, Stapel said, it tends to show "high rates of agreement with scores assigned by expert [human] scorers."

Member Katherine Craven asked about possible monetary savings, but Wulfson was quick to note that any such savings are at this point not clear, given start-up costs and the like. He said that this was not why such scoring is being pursued. There would, however, be a significant savings in time, with the goal of returning results to districts within the same school year. The Commissioner further wishes to pursue adaptive testing.

At this time, the Department will use automated scoring as a double-blind check (a second score) on at least one question per grade; all grade 10 writing will continue to be scored on a double-blind basis by two human scorers. The Department will analyze the results of this work in the summer and will again present on the matter in the fall.

The Board will meet for their regular meeting on Tuesday, January 22.