Among the glaring issues with the Higher Education Quarterly paper is that the authors are not affiliated with the University of California at Los Angeles, as they are listed. . .
The editors of the journal did not respond to a request for comment. Wiley, the journal’s publisher, also did not respond to a request for comment.
But one of the purported authors of the paper did respond to an email from The Chronicle, writing that the journal “ought to be embarrassed” for accepting such obviously shoddy work. “No referee asked to see our data,” wrote the alleged author, using the name Sage Owens, from the email address [email protected]. The writer declined to provide any other identifying details.
“No referee examined whether the list of universities was real,” the author said in their email. “No referee noticed the Forbes ratings cannot be correct. Every page has some glaring errors, but the central error is that the regression model is all wrong. Peer review does not protect against fraud,” the person wrote. “It should protect against nonsense and bullshit. In this case and in others, it did not.”
As of this writing, the article is still up at the website for Higher Education Quarterly. I'll be checking back regularly to see when the editors finally cop to the academic malfeasance and take the article down. I'll also be interested in finding out if there will be any consequences for the editors, and whether they will be forthcoming about their "peer review" process for this article.
While I'm on the subject: every time I looked over the article, I spotted another incredible claim or assertion. Here are three of my favorites:
Our response rate was high. 1000 faculty (in political science, economics, and philosophy) and 1000 administrators were contacted. Of these, 832 faculty (83.2%) and 672 (67.2%) administrative staff fully participated. 1500 were selected for a treatment group (750 faculty and 750 administrators), while 500 were selected for control groups (250 faculty and administrators).
As the Chronicle of Higher Education pointed out, this is an astounding response rate, and simply not to be believed. Most faculty surveys like this are lucky to get a 5 percent response rate.
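And the implausible response rate isn't even the only problem with those numbers. Here's a quick back-of-the-envelope check I ran myself (this is my arithmetic, not anything from the article):

```python
# Sanity check of the figures quoted above (my own arithmetic, not the paper's).
contacted = 1000 + 1000         # faculty + administrators contacted
participated = 832 + 672        # reported full participants
assigned = 1500 + 500           # treatment group + control group sizes

print(participated)             # 1504 people reportedly participated
print(assigned)                 # 2000 people reportedly assigned to groups

# More subjects assigned to treatment/control than ever responded:
assert assigned > participated
```

By the paper's own figures, 1,504 people participated, yet 2,000 were "selected" for treatment and control groups. The numbers don't even hang together internally.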
To facilitate blindness and ensure participant safety, faculty and staff lists were downloaded from university websites. Subject recruitment took place anonymously. Our survey program randomly emailed a subset of faculty and staff from each list, but we, the authors, do not have record of who was emailed or who responded. Thus, our survey responses cannot be traced back to any particular faculty or staff member, and so survey respondents were at no risk of harm by participating in our study.
The claim that the authors kept no record of who was emailed or who responded means the study's response data cannot be verified or replicated. Which is ridiculous. How conveeeenient!
Finally, who would seriously believe this claim from supposedly anonymous responses (or even non-anonymous responses):
We included an anonymous form to indicate reasons for declining our invitation. Only 37 invitees who declined filled out this form, but of those respondents, 30 indicated that they believed responding to the survey could endanger their position in some way despite its anonymity. Since these responses are non-standardized, we cannot and do not include a statistical evaluation of their content, but we note this reported fear is consistent with our findings. A striking feature of our survey results is that faculty and especially administrators believe they are in danger if they speak out publicly against politically motivated donations.
Memo to “Nigerian princes” and other con artists out there: I think I’ve found an easy mark for swampland sales, multi-million dollar bank transfers, leftist talking points, etc.