Dutch Psychologist Faked Data In At Least 30 Scientific Papers 254
Attila Dimedici writes "A professor at Tilburg University has been caught using fake data in over 30 scientific papers. Diederik Stapel's latest paper claimed that eating meat made people anti-social and selfish. Other academics were skeptical of his findings and raised doubts about his research. Upon investigation, it was discovered that he had invented the data he used in many of his papers, and there is a question as to whether he used faked data in all of his published work."
Sokal Affair (Score:5, Informative)
Obligatory reference to the Sokal Affair [wikipedia.org].
The Sokal affair, also known as the Sokal hoax,[1] was a publishing hoax perpetrated by Alan Sokal, a physics professor at New York University. In 1996, Sokal submitted an article to Social Text, an academic journal of postmodern cultural studies. The submission was an experiment to test the publication's intellectual rigor and, specifically, to learn if such a journal would "publish an article liberally salted with nonsense if it (a) sounded good and (b) flattered the editors' ideological preconceptions."
Re:Published in Science (Score:5, Informative)
Reviewers make sure that the experiment is described clearly and completely enough for it to be replicated, which is the best way to verify the data's authenticity/accuracy. They also strive to make sure that the methodology was sound, that the conclusions don't overreach what the data can support, and that the discussion was complete with regard to the pre-existing relevant literature. Those checks can sometimes catch fabricated data, but they aren't specifically designed to.
Journals have no way to verify that you actually ran a trial, never mind that the data wasn't massaged or flat out replaced with fabricated numbers. That part is taken on faith, because it is the author's reputation that is on the line.
Re:Sokal Affair (Score:2, Informative)
What makes this news troubling is that the researcher succeeded in being published in Science which was supposed to have a rigorous and effective peer-review process
Not really. The peer review process isn't designed to look for fraud. It is designed to look for bad data, poor experimental setups, poor interpretation of experiments, etc. The system assumes that the submitters are acting in good faith. And this is a pretty good assumption: the vast majority of the time they are. The occasions where a problem occurs are few and far between. It would be a massive waste of resources, and exhausting for all involved, for peer review to actively look for signs of fraud.
Re:Sokal Affair (Score:2, Informative)
What makes this news troubling is that the researcher succeeded in being published in Science which was supposed to have a rigorous and effective peer-review process.
Peer review can't detect faked data, only bogus methodology.