Nature – Researchers are still debating what academia can learn from last month’s political science scandal: a now-retracted paper in Science reported that gay canvassers could sway voter opinions on same-sex marriage (Science http://doi.org/4zt; 2015).
Michael LaCour, a graduate student in political science at the University of California, Los Angeles, has admitted misrepresenting some aspects of the work, and there is now little evidence left that he ever conducted the survey he detailed in the paper. An investigation is ongoing at LaCour’s institution, while Princeton University in New Jersey, where he was due to start a new job in the coming academic year, rescinded its employment offer.
The LaCour case is symbolic of a larger problem in science, says Trisha Phillips, a research ethicist at West Virginia University in Morgantown who studies the factors that lead people into research misconduct. She spoke to Nature about her research, which suggests that, in some ways, political science seems to be less committed to research ethics than other disciplines.
Can we learn anything from the LaCour case?
The details are fascinating and complex, but a common response is to explain away the problem with the bad-apple hypothesis. That is: there will always be fraud in science — as with anything else — because there will always be a few bad apples. So this isn’t a problem that we need to worry about on a regular basis. But if you tease out all the features of the LaCour case that trouble people, you will find that they are a lot more common in academia than anyone wants to admit.
How common are these problems in science?
If LaCour’s data are fabricated, that is obviously the major problem. Surveys over the past decade have shown that up to 2% of scientists admit to fabricating, falsifying or modifying data at least once. When scientists are asked how often they have witnessed such behaviour in colleagues, that figure jumps to 14%.
There are also questionable research practices, which fall short of fabrication, falsification or plagiarism (FFP). Surveys that ask scientists whether they have engaged in questionable research practices show a self-reported incidence of 33%, which jumps to 72% when scientists are asked how often they have witnessed such behaviour in colleagues.
People are making a big deal out of how LaCour lied on his CV, for example. We’ve done a meta-analysis, to be submitted for publication, of studies looking at the incidence of CV falsification, and it is surprisingly high. Across eight studies, most of which examined applications for academic medical training programmes, around 22% of applicants claiming publications had falsified one or more of those publications on their CV.
Then there are the problems of LaCour lacking institutional review board (IRB) approval for human-subjects research, which his study required; LaCour lying about how the study was funded; his co-author [Donald Green at Columbia University in New York] not spotting the lies; peer review not catching the problems before the paper was published; and publishers and readers being eager to accept unbelievable results. Put together, these make the LaCour case an extreme and complex example of FFP and other questionable research practices, but again, the data indicate that some of these behaviours are more common than we would like to admit.
Is there a particular problem with political science?
There are no data on the incidence of such problems in political science as a discipline. But an ongoing research project that I am conducting with graduate student Franchesca Nestor clearly shows that political science seems less committed to research ethics than related disciplines, such as psychology.
We judged this by examining indicators such as whether US graduate programmes in these disciplines mention ethics in their curriculum requirements or course descriptions, and whether a discipline’s journals require researchers to note IRB approval, or to state in their manuscripts how they recruited and compensated subjects. Our preliminary results show that political science is well behind other disciplines.
Why is political science so far behind?
Scott Desposato, a political scientist at the University of California, San Diego, has done important work on ethical issues in political science experiments. In the introduction to a forthcoming book, Ethics and Experiments [Routledge, 2016], he points out that only recently has there been a spike in field experiments [such as LaCour’s] in political science, compared with lab experiments and work on pre-existing datasets. And commitments to regulation and training in research ethics seem to follow the ‘fire-alarm model’, Desposato says. We tend to think that researchers are good people who don’t need regulation and training, and it takes scandals, or fire alarms, to show that this is a bigger problem than people want to admit. Because field work has only recently become popular, political science has had relatively few public fire alarms like LaCour’s, whereas the health sciences had the Tuskegee study back in the 1970s, so their discipline-wide response came sooner.
What can we do about this?
We should pay attention to research on the factors associated with research misconduct. I think people engage in questionable research practices largely because of systemic issues, such as the pressure to secure funding and the ‘publish or perish’ culture. These issues are obvious, but they are difficult to fix.
Although there is no magic bullet, some aspects of the wider research environment are easier to address. Some political scientists are already doing great things, such as promoting the pre-registration of experiments. More commitment to research integrity at the institutional level is needed, including ethics training for graduate students and more direct supervision of researchers’ data.
Our question should not just be how we catch people like Michael LaCour, but what kind of commitment to ethics and integrity political scientists want to make, and how we can prevent such cases from happening in the first place.
This interview has been edited for length and clarity.
- Nature doi:10.1038/nature.2015.17866