Sexed-up statistics, misleading headlines, and plainly inaccurate stories: the pressure to lure clicks to news sites can result in some pretty bad science journalism.

Some time-poor journalists, told to file a story on a bulky research paper, cut huge corners and jump to conclusions, which can lead to some surprisingly basic misunderstandings of the material.

So, to help journalists improve their research coverage, Poynter, an American school that aims to foster responsible journalism, has published a list of five ways to avoid dropping science clangers into their stories.

These include fairly basic tips, such as remembering that correlation is not causation – the kind of thing you would hope was obvious to most people but, sadly, is not.
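For the statistically curious, here's a minimal Python sketch (using entirely made-up numbers) of how easily two unrelated quantities can look strongly correlated simply because both happen to trend in the same direction – a strong correlation, with no causation anywhere in sight:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two made-up quantities with no causal link,
# both of which happen to drift upwards over ten years
years = np.arange(2010, 2020)
ice_cream_sales = 100 + 5 * (years - 2010) + rng.normal(0, 3, size=10)
shark_attacks = 20 + 2 * (years - 2010) + rng.normal(0, 2, size=10)

# The shared upward trend alone produces a strong correlation
r = np.corrcoef(ice_cream_sales, shark_attacks)[0, 1]
print(f"Pearson r = {r:.2f}")  # typically well above 0.9
```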

Other suggestions include considering how true-to-life the trials are (as Poynter points out, measuring whether computer games make people more aggressive by how much hot sauce they give someone isn't exactly an accurate portrayal of everyday behaviour – at least not where I come from), and remembering that people tend to behave differently when they know they are being studied.

It’s a useful list of tips worth bearing in mind. As I mentioned, you’d hope some of them would be obvious, but judging from some of the stories out there, they often aren’t.