Against the ‘Against Disclosure’ column in the New York Times

In this week’s New York Times Gray Matter column, political scientist David M. Primo has penned a piece with a provocative title: “Against Disclosure.” In it, he highlights his own survey research, in which respondents who were given a hypothetical ballot measure and exposed to news reports that included campaign finance disclosure data did no better at identifying the positions of different interest groups than those who merely read news accounts and saw a voter guide.

Both the Times article and Primo’s underlying research are misleading.

The problem with the Times article

Let’s start with the Times article. First, there is the problem of the headline: “Against Disclosure.” Actually, what the article says is that, in an experiment that was almost certainly designed to show no effect (more on that shortly), a very narrowly defined effect of campaign finance disclosure failed to materialize.

Primo concludes: “For too long, disclosure has been viewed as the policy equivalent of a ‘free lunch’ — all benefit and no cost. Disclosure is not always a bad thing. But it is not always a good thing, either.”

While his article argues that shareholder disclosure (which he also looked at) may be a wash, it does not demonstrate a single cost of campaign finance disclosure. The worst you can say about campaign finance disclosure, based on his article, is that it didn’t help voters identify interest group positions on a hypothetical ballot measure as part of an online survey. That says nothing about the many other benefits of disclosure that Primo did not bother to evaluate.

Second, there is the problem of Primo’s own disclosure. Primo is listed as “an associate professor of political science and business administration at the University of Rochester.” It isn’t until one follows the link to his article in the scholarly publication Election Law Journal (and either makes it through the paywall the article sits behind, or is clever enough to find it elsewhere) that one can read that the survey was “funded by the Institute for Justice, a group that litigates cases challenging disclosure laws.” To be more specific, the Institute for Justice opposes disclosure requirements and campaign finance laws.

The Institute is also funded by the Koch Brothers. In fact, they are “cornerstone supporters.”

Primo got money from a group opposed to disclosure. The Times has a responsibility to disclose this.

The problem with the research

Then there is the problem of the research itself.

Primo’s survey was conducted on a Harris Poll Online panel of registered Florida voters between October 14 and 25, 2010. Participants were given a hypothetical ballot issue and exposed to one of three treatments.

Here’s how Primo summarizes it in the Times:

A control group saw the wording that appeared on the ballot; a second group saw the wording and was also given access to news accounts, advertisements and a voter guide summarizing the pro and con positions; and a third group could view all of this information as well as campaign-finance disclosure data integrated into news stories. The material provided to the second and third groups explicitly mentioned the positions of various interests, including corporations.

Primo then found that the third group did no better than the second at identifying the positions of the various groups.

While surveys tend to be good for measuring public opinion, it’s a lot to ask people to remember the positions of various groups on a hypothetical ballot measure, especially if they are exposed to only one or two news articles. Primo doesn’t say in his Election Law Journal article how many other questions were on the survey, or whether other questions were asked between the original exposure and his recall test.

Nor does he describe what additional information was in the two newspaper articles that supposedly contained disclosure-related information. He does note that respondents were not required to view that information, and that it was not always “up-front-and-center” in the news articles.

If these survey respondents were typical, they were probably trying to get through the survey as quickly as possible. There was no reward for correctly identifying the groups.

In other words, as far as I can tell from his write-up, the survey appears to have been set up in a way that would generate a null result.

There is also the problem of what social scientists call “external validity.” In other words, are these results valid in the real world?

In the real world, a ballot campaign goes on for several months. Voters will likely see multiple news reports and be exposed to the information many times. They will also know that this is a real ballot measure in which they have an actual stake. They will be more motivated to care and pay attention, because they will actually have to vote.

An even more fundamental problem with this experiment is that it does not attempt to measure the outcome that we actually care about: the vote. What we really want to know is: does disclosure change how voters evaluate a ballot proposition?

Primo didn’t bother to ask this question. Instead, he evaluated only how many interest group positions respondents could identify.

Even this ignores the real likelihood that all a voter needs to know is the position of one really important interest group. For example, if you learn that the tobacco industry is behind a ballot initiative to cut public health education funding, do you really need to know what a dozen other groups think and do?

Finally, Primo’s research design ignores the fact that campaign finance disclosure enables a community of watchdogs and journalists to investigate the politics around a ballot initiative. Primo treats newspaper articles as interchangeable, assuming that any two disclosure-related articles can stand in for any other two. This is simply not the case. Disclosure empowers the kind of investigative reporting that can truly inform citizens about their choices. All it takes is one really good piece of investigative reporting to reverberate through a state or community.

Certainly, we at Sunlight welcome a conversation about the impacts of disclosure. But that conversation needs to ask the right questions in a responsible and open manner, not put forward poorly designed research funded by anti-disclosure advocates under the guise of objective science.