July 2008

Has the day of the e-book finally arrived?  There are signs that the answer is a definitive yes, though what the long-term impact of the e-book will be is still not certain.  The good news has come from several quarters.

Amazon Kindle

Amazon’s launch of the Kindle has been successful.  Though Amazon has not released unit sales of the Kindle, the Silicon Alley Insider reports that some analysts estimate monthly volumes are running at 48,000 units and could go as high as 72,000 per month by year-end.  As one measure of the uptake of e-books on the Kindle, Amazon has indicated that for titles available in both print and e-book form, e-book sales are running at about 12% of print sales.

In another piece of positive e-book news, Sony has announced it will switch its Reader to the International Digital Publishing Forum’s (IDPF) EPUB standard.  This gives a significant boost to the beleaguered format and could eventually put pressure on Amazon to abandon the proprietary format of the Kindle.

wholesale e-book sales US

E-book sales figures are also growing smartly, though they still represent a small fraction of print sales.  The latest e-book wholesale revenue figures for the US, reported by the IDPF, show sales on track to reach about $40 million in 2008.  Worldwide e-book sales are also growing rapidly: Ebooks Corp. expects the world market to reach $220 million this year and believes it could hit $3-5 billion within the next 5 years.  Macsimum News has speculated that Apple may be coming out with an e-book reader, perhaps based on an enhanced iPod with a larger screen or a revived version of the iBook.

Clearly the buzz around e-books is increasing.  So what might it mean for publishers?  Here are a few thoughts on how the e-book may take its place in publishing:

  • Serve as a testing platform for a new title to determine whether a publisher should invest in a print version.  This is already occurring in some areas of the publishing world – e.g. romance and science fiction.  If sales of the e-book version pass a designated trigger point, a print version of the book is offered. 
  • Provide an inexpensive launching platform for new authors who would otherwise not be able to get their book published in a print format. 
  • Become the format of choice for publishers in fields where content changes rapidly – e.g. books related to computer technology.
  • Act as a viral marketing teaser.  E-books may serve as an excellent way to build interest in the print edition. 

Not everyone in the industry will welcome a robust market for the e-book.  In the long term, perhaps the group most susceptible to the growing market for e-books will be distributors and wholesalers of print books.  Digital distribution of more works will reduce their revenues and importance unless they are able to reinvent themselves.

I love my (print) book!

Though it’s been a happier time for e-books, it is still too early to tell how significant a role they will play in book publishing.  In a recent post, Chris Webb argues that the book publishing industry might not follow the same path as the music industry with regard to digital content.  The book, after all, has been portable and mobile for years.  The experience of reading has changed little, and the new crop of readers may not enhance that experience enough to pry customers away from the printed book.  He sums up his case thus:

So, I guess the point I’m trying to make is, unlike the music business the relationship customers have with the printed book is still quite strong and have been meeting the experience expectation for hundreds of years. And I don’t say this because I’m overly sentimental traditionalist. But I do think it’s an important time to remind ourselves that the printed book still provides an excellent user experience. And this is a real strength that only enhances our position in a digital age.

But for now, the creators and purveyors of e-books can just be present in the moment and enjoy their summer of love.





In an age focused on all things global, it’s instructive to see who the most global authors are in terms of their work being translated into other languages.  UNESCO’s 2008 index of translations, published annually as the Index Translationum, provides an interesting look at the leaders. 

First, a little background.  The Index Translationum is a list of books translated in the world, i.e. an international bibliography of translations.  The Index Translationum was created in 1932 and recently celebrated its 75th anniversary.

The database contains cumulative bibliographical information on books translated and published in about one hundred UNESCO Member States since 1979 and totals more than 1,800,000 entries in all disciplines: literature, social and human sciences, natural and exact sciences, art, history and so forth.  It is planned to update the work every four months.



You can view the Global 50 on the UNESCO site.  The Old and New Testaments and Walt Disney are at the top of the list.  No surprise there.  Among individual authors, Agatha Christie leads the pack, ahead of such notables as Shakespeare and at almost double the next closest author.  Despite the immense popularity of the Harry Potter series, J. K. Rowling hasn’t yet made the top 100, showing she is still primarily an English-language phenomenon.

English was by far the top original language from which translations were made.  Approximately 50 percent of all translations are from English into another language.  The top 10 target languages for translation were, in order:  German, Spanish, French, English, Japanese, Dutch, Portuguese, Russian, Polish and Italian.



Harry Bingham, of the Financial Times of London, provides a good analysis of the index in his post Strong Language.  In particular, he notes some of its flaws.  For example:

  • It doesn’t take into account sales volume and generally underweights the importance of authors who write only a few books that sell many copies – e.g. J.K. Rowling.
  • The index also pays little heed to the stature of a writer – e.g. on the list, Barbara Cartland ranks ahead of Shakespeare, whatever one thinks of the relative quality of their writing.
  • The index was created during a period of American hegemony in the book publishing world.  Hence the strong showing of American authors and works translated from English to other languages.
  • Some institutions are included in the list that may not deserve a ranking based on UNESCO’s methodology – e.g. the Roman Catholic church.

However, these issues aside, the index is an interesting read and provides one proxy for how global book publishing and reading is evolving.



lake wobegon

If we lived in the Prairie Home Companion’s village of Lake Wobegon, all of our book titles would have above average sales.  Want to know whether your book’s sales are above average?  The staff at the Southern Review of Books (SRB) have come up with a rough benchmark based on some recent industry sales data.  They have computed that the average book sold 7,608 copies in 2007.  How did they determine this? 

First, they started with data from R.R. Bowker showing the number of books published in the U.S. in 2007.  On May 28, 2008, Bowker released its latest statistics on book production in the U.S. and estimated that 411,422 new titles were published in 2007.  Of those, 276,649 were traditionally manufactured titles (ink on paper), and 134,773 were digitally manufactured titles (toner on paper), also referred to as print-on-demand (POD) titles.  Bowker derives its book production figures from year-to-date data from over 72,000 U.S. publishers; the figures include traditional print as well as on-demand titles.  Audiobooks and e-books are excluded.  

Next, the Southern Review of Books staff considered unit sales from the Book Industry Study Group (BISG), which shows that U.S. publishers produced 3.13 billion copies of their books in 2007.  Dividing the BISG unit sales by the Bowker figure of 411,422 new titles, one gets a mean of 7,608 copies per title.  Thus, any title selling more than 7,608 copies would be defined as “above average,” and any book with fewer copies sold would be “below average.”
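The arithmetic behind SRB’s benchmark can be sketched in a few lines of Python, using the BISG and Bowker figures quoted above:

```python
# SRB's back-of-the-envelope "average book" calculation,
# using the BISG and Bowker figures cited above.
total_copies_sold = 3_130_000_000  # BISG: copies produced by U.S. publishers in 2007
new_titles = 411_422               # Bowker: new titles published in 2007

mean_copies_per_title = total_copies_sold / new_titles
print(round(mean_copies_per_title))  # → 7608
```

A title selling more copies than that mean counts as “above average” under SRB’s definition.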

normal distribution

Here are some caveats from SRB about the average they computed:

  • The mean is an average of low-run POD titles combined with high-run traditional titles.  POD runs typically don’t exceed 1,000 copies, since larger POD runs are not economical.  This means POD titles are generally below average from the start. 
  • Similarly, offset runs for ink-on-paper titles typically become economical above 1,000 copies, so traditionally published books are more likely to be found in the above-average category.
  • The average above applies only to titles published in 2007. 
  • The numbers from BISG and Bowker are based on reporting systems that use samples and don’t necessarily produce absolutely accurate information.
  • It’s unlikely that unit sales of books published in 2007 would follow an unskewed, normal bell-shaped distribution.  That’s because roughly two thirds of new titles were conventional ink-on-paper titles with large press runs, while the remaining third were toner-on-paper print-on-demand books that likely had small press runs well below 1,000 copies on average.  That would produce a distribution skewed to the right, with traditionally published titles out in the long positive tail.

With regard to the last point, here is what we would expect from an unskewed normal distribution.

# of std. dev. from avg.      % of titles
within 1                      68.3%
between 1 and 2               27.2%
between 2 and 3               4.3%
more than 3                   0.3%


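Those band percentages are just the textbook properties of a normal distribution, and can be checked with Python’s standard library (no publishing data involved):

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal distribution

# Fraction of a normal population within k standard deviations of the mean
within = lambda k: nd.cdf(k) - nd.cdf(-k)

print(f"within 1 sd:  {within(1):.1%}")             # ~68.3%
print(f"1 to 2 sd:    {within(2) - within(1):.1%}") # ~27.2%
print(f"2 to 3 sd:    {within(3) - within(2):.1%}") # ~4.3%
print(f"beyond 3 sd:  {1 - within(3):.1%}")         # ~0.3%
```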

Unfortunately, we live in a world where such distributions are skewed.  But the 7,608 average is at least a starting point.  So if you want a title to be above average, figure out what it will take to sell 7,608 copies (plus one).  Averages are useful, but here are some suggestions for the future:

  • A more interesting average in today’s multimedia, multi-format world might be average IP revenue units sold, which would include copies sold in any format – print, electronic, audio, etc. 
  • Compute averages based on a more comprehensive and detailed sales tracking database like BookScan.
  • Compute averages for categories.

Perhaps using multiple averages will allow all of our titles to be above average.




The July 2008 issue of Wired has a thought-provoking article by Chris Anderson entitled The Petabyte Age.  A petabyte is an unimaginably large amount of data – 1,000 terabytes, or a quadrillion bytes.  The article catalogs a number of important applications using datasets measured in petabytes, everything from agriculture to politics.  Anderson asserts that the availability of these huge datasets is lessening our reliance on the predictive value of theory and mathematical/statistical models.  Models have an elegant and convenient compactness, but often limited predictive ability.  “Big data” closes the predictive gaps if you have the storage and processing power to manipulate and make sense of it.

The Petabyte Age is the natural outcome of three “laws” (ahem, recall those compact models).  These are Moore’s Law, which governs the growth in computer processing power; Kryder’s Law, which predicts hard disk storage cost per unit of information; and Butters’ Law, which measures the capacity of the fiber optic network underpinning the Internet.  These laws are synergistic – processing power can be greatly amplified by hooking servers and PCs together in computing networks; storage can be extended via disk arrays; and huge datasets can be accessed over high speed, high capacity fiber optic networks. 
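The compounding behind these laws is easy to illustrate.  The doubling periods below are commonly cited rules of thumb, my own assumptions rather than figures from the Wired article:

```python
# Exponential growth implied by a doubling period.  The doubling periods
# are commonly cited rules of thumb (assumptions, not from the article):
#   Moore's Law   - processing power doubles roughly every 2 years
#   Kryder's Law  - disk storage density doubles roughly every 2 years
#   Butters' Law  - fiber network capacity doubles roughly every 9 months

def growth_factor(years, doubling_years):
    """Multiplicative growth over `years` given a doubling period."""
    return 2 ** (years / doubling_years)

for name, doubling in [("processing (Moore)", 2.0),
                       ("storage (Kryder)", 2.0),
                       ("bandwidth (Butters)", 0.75)]:
    print(f"{name}: ~{growth_factor(10, doubling):,.0f}x over 10 years")
```

Even under these rough assumptions, a decade compounds into a 32-fold gain for anything doubling every two years, which is why petabyte-scale datasets went from exotic to routine.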

So all this discussion of “Big Data” got me thinking: The whole bibliosphere could be radically changed.

  • Authors could tap into gigantic databases to do incredibly detailed research on people and places.  Novelists could scan the entire body of literature to see where “story gaps” might exist to be exploited. 
  • Publishers could track readership trends based on accumulated book sales data and accurately predict the success or failure of any book prior to its publication.   
  • Readers could go to their favorite online bookstore and get pinpoint recommendations based upon analyses of buying histories, correlated with behavioral, demographic and psychographic profiles. 

But the one thing that probably won’t change is the way we package all the new knowledge that “peta processing” delivers.  We will likely use the same book-sized packets – whether in print or electronic form – that we use today.  Why?



The book is the anti-petabyte.  It is perfectly tuned to the human mind.  Stories are how we make sense of things.  Our brains are confronted by petabytes of raw data during our lives; yet the memories we create out of that torrent can be squeezed into a terabyte or two.  The stories we tell – whether of fact or fiction – represent the imprecise model, the compact and convenient approximations that leave us wanting more.  Biologists tell us that this filtering is at the core of our success as a species. 

I can marvel at the power of Big Data and Cloud Computing.  But being human, I will always believe the real power lies in the Little Story.
