Illustration, Comix & the Future of Publishing

August 6, 2010

Let’s start with the fun stuff. Then the dry commentary. Check out two visual cornua copiae (or, as you might have it [and as I had it until I looked it up], “cornucopias”):

1. http://christophniemann.com/   (take some time in his galleries…A LOL experience guaranteed, or your time cheerfully refunded.)

Source: Christoph Niemann

2. http://www.asofterworld.com/index.php?id=579
(That’s just one example of A Softer World’s comix. More to choose from in the archive.)

“now are you going to take your shirt off or not?” (Source: A Softer World)

I put it to you that Christoph Niemann’s delightful and brilliant illustrations are unique to the digital age, in style, content and often in form. A Softer World, meanwhile, represents a new format for comics for the digital age, combining photography, illustration, and an edgy contemporary wit.

Amidst all of the debate about rethinking books, newspapers and magazines for the web, here are two related media that have quietly reinvented themselves while everyone stared blankly into their respective screens.

Wikipedia won’t define “comix” per se. The editors insist that there are three different kinds of comic(x)s:

  1. Comics: i.e. “mainstream comics”
  2. Underground Comix: “depict content forbidden to mainstream publications by the Comics Code Authority, including explicit drug use, sexuality and violence”
  3. Alternative Comics: “a range of American comics that have appeared since the 1980s, following the underground comix movement of the late 1960s and early 1970s”

The wiki differentiates between “underground comix” and “alternative comics” (with a “c”, with a “c”!) strictly by method of distribution: “The distribution of underground comix changed through the emergence of specialty stores,” i.e. once comiX found commercial acceptance they were no longer underground. Now they were “Alternative” and had their “X” replaced with a “C”.

Can you think of another publishing example where the product had to be reclassified when only the distribution method changed (while the content remained the same)?

A banner headline above the “alternative comics” article notes: “This article does not cite any references or sources.” Well, I guess we can rely on it then. (Please see the New York Times: “a student reprimanded for copying from Wikipedia in a paper on the Great Depression said he thought its entries — unsigned and collectively written — did not need to be credited since they counted, essentially, as common knowledge”.)


By coincidence I heard today that Digital Book World will be offering a free webinar next Tuesday that’s right on topic: Digital Strategies, Learning from Comics Publishers. More info and registration link here.


The Future of Publishing and the Accuracy of Information

August 4, 2009

One of the continuing controversies we all face as the Web becomes our primary tool for news, knowledge, reference…indeed for all forms of information, is determining the quality of the information published on the Web. The topic arises frequently in articles and blog entries. Much of the controversy has been associated with Wikipedia. Wikipedia has certainly suffered a number of embarrassing gaffes, but a study released by the journal Nature in late 2005 claimed that Wikipedia is essentially as accurate as the Encyclopedia Britannica. In reviewing 42 articles on the same topics from each source, it found an average of 2.92 mistakes per article for Britannica and 3.86 for Wikipedia. (The original article in Nature is available only to subscribers or for a $32 purchase; however, the link to Nature provided here does offer some free material on the subsequent controversy.) The respected technology writer Nicholas Carr later criticized the study, concluding his detailed analysis with:

If you were to state the conclusion of the Nature survey accurately, then, the most you could say is something like this: “If you only look at scientific topics, if you ignore the structure and clarity of the writing, and if you treat all inaccuracies as equivalent, then you would still find that Wikipedia has about 32% more errors and omissions than Encyclopedia Britannica.” That’s hardly a ringing endorsement….The open source model is not a democratic model. It is the combination of community and hierarchy that makes it work. Community without hierarchy means mediocrity.
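For the curious, Carr’s 32% figure follows directly from the per-article averages quoted above. A quick back-of-the-envelope check (a minimal sketch only, using the reported averages rather than the study’s raw error counts):

    # Average errors per article, as reported from the Nature comparison
    britannica_errors = 2.92
    wikipedia_errors = 3.86

    # Relative difference: how many more errors per article Wikipedia averaged
    relative_increase = (wikipedia_errors - britannica_errors) / britannica_errors
    print(f"{relative_increase:.0%}")  # prints "32%"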

It’s worth noting here that Wikipedia continues to take new steps and implement new procedures to improve the accuracy of its content. This blog entry is not intended to suggest that Wikipedia is fatally flawed: I think it’s a great resource and a publishing miracle. I raised the Wikipedia versus Britannica story because it neatly illustrates the main point of this entry.

The broader controversy over information quality on the Web has myriad ramifications. Without exploring them all in this entry, I stand by the statement that the value of the proliferation of new and original voices on the Web is seriously marred if the accuracy of what is represented as fact remains suspect and undependable.

This leads me to recommend a very fine site I discovered the other day while researching this issue. The Virtual Chase: Teaching Legal Professionals How To Do Research has an excellent section called “Evaluating the Quality of Information on the Internet.” Chapters include “Why Information Quality Matters,” “How to Evaluate Information,” and my favorite feature, a checklist for evaluating the quality of information found on the Web.

While the site is targeted at the legal profession, and therefore not relevant in all of its aspects to each of us, what better profession to turn to than one where a false fact can mean, in some circumstances, the death of a client or, in others, the loss of a $5 million lawsuit!

I’ll be continuing to cover this topic. I hope this entry provides a suitable introduction.


Wikipedia and the Meaning of Truth

October 24, 2008

In the November/December issue of the marvelous MIT Technology Review is a very fine article by the respected author and professor of computer science Simson Garfinkel, on the ever-controversial subject of what we can expect and trust from Wikipedia.

The topic has been a challenge for some time, mainly pitting the Encyclopedia Britannica, that most respected source, authored by experts in their respective fields, against Wikipedia, the most anarchic of resources, which nonetheless, with some 7 million contributors, manages, as Mr. Garfinkel points out, to be “remarkably accurate.”

What makes this article a special pleasure is that Garfinkel acknowledges Wikipedia’s success, but delves below the surface and notes that “with little notice from the outside world, the community-written encyclopedia Wikipedia has redefined the commonly accepted use of the word ‘truth’.” The topic is an important one.

The article is fascinating for many other reasons, but here’s a tidbit:

“Wikipedia considers the ‘most reliable sources’ to be ‘peer-reviewed journals and books published in university presses,’ followed by ‘university-level textbooks,’ then magazines, journals, ‘books published by respected publishing houses,’ and finally ‘mainstream newspapers’ (but not the opinion pages of newspapers).”

Do you think these are the best sources to verify information? They certainly conform to standard publishing beliefs, but do they conform to this new medium?

The article is worth careful reading.
