So apparently a new hashtag, #overlyhonestmethods, is burning up the twitterverse.  It appears to be driven by students, technicians, and post-docs in science labs blowing off steam about the challenges of doing research.  It’s funny and probably a good thing in the overall sociology of science — I think.  It is a good thing if it helps people approach science reporting with the appropriate level of “healthy skepticism.”  Sometimes people are a bit too quick to accept statements like “Science found X to be true” without considering how the conclusion was drawn.  On the other hand, it isn’t a good thing if it further emboldens the anti-science crowd (e.g., climate-change deniers).

Fans of neuroimaging may find this blog post on that topic particularly entertaining:

http://blog.ketyov.com/2013/01/overlyhonestmethods-for-neuroimaging.html

An amusing excerpt:

Data were preprocessed 8 different times, tweaking lots of settings because Author One kept screwing up or the data kept “looking weird” with significant activation in the ventricles and stuff, which can’t be right.

And one that we hope won’t hit too close to home:

The methods section is difficult to write because the data on which this paper is based were collected 4 years ago by a roton (Author Two) and dumped on Author One because he didn’t have any other idea what to do for his PhD and this is, quite frankly, his last hope.