When Too Much Information Is A Bad Thing
We’re inundated with information all the time, from every source imaginable – traditional media like newspapers, television and radio; new media like blogs, forums and podcasts; conventional in-person interactions and a host of other forms. That’s a fantastic thing. To think that now I can type “vaccination” into Google and get more than 15.2 million results in less than one-fifth of a second is phenomenal. Twenty years ago, we could only dream of such a huge volume of information. It was amazing back then, when a complete and searchable encyclopedia could fit on a compact disc. Now, of course, the English version of Wikipedia alone (only 3.37 million out of a total 16 million articles for all languages) is over 230.3 gigabytes – or 337 compact discs. This, as The Wire’s Marlowe Stanfield would say, “sounds like one of them good problems”.
But the problem isn’t that there’s so much information; it’s that the quality doesn’t match the quantity. Of those 15.2 million vaccination results, some will be from blog posts saying “today I took Billy in for his vaccination, he was very brave” while others will be useful, factual information from peer-reviewed medical journals. Sure, search engines do an incredible job of finding and sorting relevant information. That blog post isn’t going to get nearly as many links as the Wikipedia page or the website of the Australian Vaccination Network – the top two search results – so it will be buried further down in the results. And right there is the problem – that’s relevancy, not authority. Yes, those sites are more relevant to most people, but are they the most informative, authoritative sites? There’s no way for a search engine to know whether the Australian Vaccination Network gives accurate, scientific information or not.
And guess what, it doesn’t.
After investigating the group, the NSW Healthcare Complaints Commission (HCCC) has released a damning report that claims “the AVN provides information that is inaccurate and misleading”. The report reveals that the group “provides information that is solely anti-vaccination” and that it “quotes selectively from research to suggest that vaccination may be dangerous”.
The story is best covered by Walkley Award-winning journalist Steve Cannane on Lateline:
[youtube http://www.youtube.com/watch?v=29tciApImhI&hl=en_US&fs=1&color1=0x2b405b&color2=0x6b8ab6]
Authority is obviously a problem not just on the internet, but in real life as well. And just as finding relevant information online was a challenge before Google came along, I think finding authoritative information is our current – and much harder – problem. But at least on the internet it’s easy to reference the sources of information and determine its accuracy. That’s perhaps what the quest for authority demonstrates – the awesome power of the link. By showing sources, by linking to the facts, a site demonstrates its authority. It’s self-regulation, and clearly not particularly effective, but for now it’s the best we can do.
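The link-counting intuition above – that a page earns standing from the pages pointing at it – is the core of PageRank. Here is a minimal sketch of the idea in Python, using a hypothetical toy web graph (the page names and numbers are invented for illustration; Google’s real system is vastly more elaborate):

```python
# Minimal PageRank sketch over a hypothetical toy graph.
# Each page spreads its score evenly across its outgoing links; the
# damping factor models a reader who occasionally jumps to a random page.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)
                for q in outs:
                    new[q] += damping * share
            else:
                # dangling page with no outlinks: spread its score evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Invented example: several small blogs link to "wikipedia",
# so it ends up with the highest score.
graph = {
    "wikipedia": ["avn"],
    "avn": ["wikipedia"],
    "blog1": ["wikipedia"],
    "blog2": ["wikipedia", "avn"],
}
ranks = pagerank(graph)
```

Note what the sketch measures: how heavily a page is linked, not whether what it says is true – which is exactly the relevancy-versus-authority gap the post describes.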
Looks like there are two problems here: what makes a source an ‘expert’ or ‘authority’, and what happens when that expert is wrong?

On an internet level, Google’s algorithm aims to sort the wheat from the chaff for the first point – although this will become harder as niche commercial sites target specific areas, pushing down the listings of more reputable sites.

In terms of what happens when an authority gets something wrong, well, we can only hope that it’s rectified quickly and communicated to as many readers as possible. Unfortunately our system is not good at issuing corrective statements, as they are either boring or not as newsworthy as other stories. To overcome this, cross-checking facts is the way to go.
Very good points, David. I agree that those are essentially the two main issues.

In terms of determining expertise or authority – there are already rudimentary ranking systems for scientists. I forget the names of them, but basically the more articles you have in peer-reviewed scientific journals, and the more they are cited in other journals, the higher your ranking. As I said, it’s very rudimentary – what defines an objective peer-reviewed journal, for example? The Chiropractic Journal of Australia is peer-reviewed – but if it’s reviewed by peers within the organisation it’s not truly authoritative. If anyone can figure out a way to determine authority, and has the computing power to do so, it’s Google.

As for correcting errors and mistakes – well, that kind of does happen anyway. If there’s a credible study done showing a vaccine is unsafe, for example, that gets published in authoritative peer-reviewed journals and then filters down the chain.

I guess there’s no solution, though, to sites like the AVN when they cherry-pick data from different studies, or misrepresent them completely. The lesson, then, is always check the source, and read the studies and conclusions yourself. Not an easy thing to do with so many vaccinations on the market, and so many tests done on them.
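One well-known example of the rudimentary citation-based rankings mentioned above is the h-index: a researcher has index h if h of their papers have each been cited at least h times. A short sketch (the citation counts below are invented for illustration):

```python
# Sketch of the h-index: the largest h such that h papers
# each have at least h citations.

def h_index(citations):
    """citations: list of citation counts, one per paper."""
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i  # the i-th most-cited paper still has >= i citations
        else:
            break
    return h

# Hypothetical researcher with five papers cited 10, 8, 5, 4 and 3 times:
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

As the comment notes, a metric like this is only as good as the journals behind the citations – it counts publications and citations, not whether the peer review was genuinely independent.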