Does student test data change public opinion about education policies and public leaders?
Americans’ political opinions have become increasingly polarized over the past 50 years, with voters’ views more often being determined by party ideology. One cause of that polarization is the public’s changing relationship with the media. A fragmented media environment allows people to choose the news coverage that supports their views. Meanwhile, social media lets them view news stories chosen by like-minded friends and connections.
Though the Internet may be contributing to the problem, it also has given voters access to more information than ever before. The deluge of data may not, however, change voter preferences. A 2015 study published in the Quarterly Journal of Political Science suggests that partisanship affects factual beliefs about politics. The study notes that people’s opinions about the condition of the economy sometimes depend more on the political party of the sitting president than on concrete economic data. Voters can seize on any factor in forming their opinions about an issue or person. For example, less-informed citizens tend to put more weight on how attractive a political candidate looks on TV, according to a 2011 study by political scientists at MIT.
But if voters are personally presented with objective information on a subject, will it cause them to reevaluate their mistaken beliefs about that issue? A December 2015 study published in the Journal of Public Policy, “Public Information, Public Learning and Public Opinion: Democratic Accountability in Education Policy,” examines the reactions of residents who learned that their beliefs about the performance of their public education system were wrong. Researchers Joshua Clinton and Jason Grissom of Vanderbilt University surveyed 1,500 people in Tennessee to determine how presenting people with student testing data affects their evaluations of the state public school system as well as their evaluations of local school boards and education policy reforms. Survey participants were asked questions related to student achievement on Tennessee’s end-of-year math tests. They also were asked about gaps in achievement for black students and white students who took the tests.
The study’s findings include:
- Before being presented with data, survey participants generally knew little about Tennessee students’ math test performance. Only 20 percent of the Tennessee residents surveyed gave the correct answer when asked to identify the approximate percentage of students who scored at grade level or better on math exams. Only 8 percent were able to correctly identify the true size of the achievement gap separating black students and white students.
- Tennesseans tended to overestimate student performance. More than half (54 percent) thought student performance was higher than it actually was.
- Many participants also overestimated the size of the achievement gap between black and white students. More than one-third of respondents (36 percent) thought the racial performance gap was larger than it actually was.
- Respondents who overestimated student testing achievement tended to assign high letter grades to Tennessee schools and the state Department of Education but not necessarily to their local school boards.
- After receiving testing data, residents’ opinions changed. The authors wrote that “the average effect of receiving the informational update containing the true student performance level is negative” with regard to opinions about Tennessee schools, the state Department of Education and local school boards.
- Tennesseans who thought there was no achievement difference between black students and white students and Tennesseans who estimated the performance gap to be larger than 35 percent gave the same letter grade to educational institutions.
- Receiving information about student test scores had no impact on the probability that residents would support any of the six policies presented to them, each of which was designed to improve student performance.
The study suggests mixed implications. Survey participants were able to reevaluate their views about Tennessee schools, the state Department of Education and local school boards after receiving objective information about statewide student performance. This indicates that those assessments were driven by student achievement rather than by ideological leanings. However, the fact that Tennesseans were not more supportive of education reforms after learning that student performance was lower than they had believed is “potentially sobering for the prospects of citizen-led policy change,” the authors state. Residents’ opinions about education policies, in this case, appear to be driven mostly by ideological and partisan affiliations. It also appears that opinions related to education policy might not be influenced by concerns about achievement gaps between students of different racial groups.
Related research: A 2013 study in the American Journal of Political Science, “Informing the Electorate? How Party Cues and Policy Information Affect Public Opinion about Initiatives,” examines the impact of political endorsements and policy information on voter decisions. A 2014 study in Political Psychology, “Political Parties, Motivated Reasoning, and Public Opinion Formation,” considers the influence of political parties on the public’s political opinions. A 2012 study published in Political Psychology, “Who Deserves Help? Evolutionary Psychology, Social Emotions, and Public Opinion about Welfare,” looks at how culture and perceptions drive public opinions about public welfare programs.
Keywords: education, education reform, school boards, test scores, public opinion, public support, media, accountability