The Future of Publishing and the Accuracy of Information

August 4, 2009

One of the continuing controversies we all face as the Web becomes our primary tool for news, knowledge, reference…indeed for all forms of information…is how to judge the quality of the information published on the Web. The topic arises frequently in articles and blog entries, and much of the controversy has been associated with Wikipedia. Wikipedia has certainly suffered a number of embarrassing gaffes, but a study released by the journal Nature in late 2005 claimed that Wikipedia is essentially as accurate as the Encyclopedia Britannica. In a review of 42 articles on the same topics from each source, it found an average of 2.92 mistakes per article for Britannica and 3.86 for Wikipedia. (The original article in Nature is available only to subscribers or for a $32 purchase; however, the link to Nature provided here does offer some free material on the subsequent controversy.) The respected technology writer Nicholas Carr later criticized the study, concluding his detailed analysis with:

If you were to state the conclusion of the Nature survey accurately, then, the most you could say is something like this: “If you only look at scientific topics, if you ignore the structure and clarity of the writing, and if you treat all inaccuracies as equivalent, then you would still find that Wikipedia has about 32% more errors and omissions than Encyclopedia Britannica.” That’s hardly a ringing endorsement….The open source model is not a democratic model. It is the combination of community and hierarchy that makes it work. Community without hierarchy means mediocrity.
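
A quick back-of-the-envelope check against the Nature figures quoted above shows where Carr’s 32% appears to come from: dividing Wikipedia’s average error count by Britannica’s,

\[
\frac{3.86}{2.92} \approx 1.32,
\]

or roughly 32% more errors and omissions per article.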

It’s worth noting here that Wikipedia continues to take new steps and implement new procedures to improve the accuracy of its content. This blog entry is not intended to suggest that Wikipedia is fatally flawed: I think it’s a great resource and a publishing miracle. I raised the Wikipedia versus Britannica story because it neatly illustrates the main point of this entry.

The broader controversy over information quality on the Web has myriad ramifications. Without exploring them all in this entry, I stand by the statement that the proliferation of new and original voices on the Web loses much of its value if the accuracy of what is represented as fact remains suspect and undependable.

This leads me to recommend a very fine site I discovered the other day while researching this issue. The Virtual Chase: Teaching Legal Professionals How To Do Research has an excellent section called “Evaluating the Quality of Information on the Internet.” Chapters include “Why Information Quality Matters,” “How to Evaluate Information,” and my favorite feature, a checklist for evaluating the quality of information found on the Web.

While the site is targeted at the legal profession, and therefore not all of it will be relevant to each of us, what better profession to turn to than one where a false fact can mean, in some circumstances, the death of a client or, in others, the loss of a $5 million lawsuit?

I’ll be continuing to cover this topic. I hope this entry provides a suitable introduction.


Wikipedia and the Meaning of Truth

October 24, 2008

In the November/December issue of the marvelous MIT Technology Review is a very fine article by the respected author and professor of computer science Simson Garfinkel on the ever-controversial subject of what we can expect and trust from Wikipedia.

The topic has been a challenge for some time, mainly pitting the Encyclopedia Britannica, that most respected of sources, authored by experts in their respective fields, against Wikipedia, the most anarchic of resources, which, with some 7 million contributors, nevertheless manages to be, as Mr. Garfinkel points out, “remarkably accurate.”

What makes this article a special pleasure is that Garfinkel acknowledges Wikipedia’s success, but delves below the surface and notes that “with little notice from the outside world, the community-written encyclopedia Wikipedia has redefined the commonly accepted use of the word ‘truth’.” The topic is an important one.

The article is fascinating for many other reasons, but here’s a tidbit:

“Wikipedia considers the ‘most reliable sources’ to be ‘peer-reviewed journals and books published in university presses,’ followed by ‘university-level textbooks,’ then magazines, journals, ‘books published by respected publishing houses,’ and finally ‘mainstream newspapers’ (but not the opinion pages of newspapers).”

Do you think these are the best sources for verifying information? They certainly conform to standard publishing beliefs, but do they suit this new medium?

The article is worth careful reading.
