By Edward Nawotka
“Discovery” is one of the hot buzzwords of 2011. Every few months over the past year, a new online platform has promised to make it “easier than ever to discover the books you’ll love.”
It’s true that readers face a daunting task in finding their next read among the millions of titles on offer. Bowker reports that several million books were published in the US market in 2010, and globally the number of new titles seems to grow exponentially each year. In the face of this, most books don’t stand a fighting chance of finding a reader unless certain steps are taken to help them along.
Traditionally, that means marketing, review coverage and prominent placement in bookstores. Increasingly, social media buzz, metadata and search engine optimization are being thrown into the mix. All these tools are used to plant a seed in the reader’s mind. Yet the efficacy of such strategies has been very much in question. The extent of the struggle a book faces to find a reader was illustrated at the Frankfurt Book Fair by Aaron Stanton, CEO of BookLamp, who referenced a study of GoodReads.com, the United States’ top social networking site for book lovers. The study revealed that of the 918,000 book lists created by users, 69% of the books listed were referenced just a single time. (The book appearing most often? Twilight.)
Given the broad range of interests of the average reader, do you think it is truly possible for an algorithm to replace one’s own instincts? And as book publishers and self-motivated authors become ever more savvy about using these online discovery tools, do you really trust them?
Let us know what you think in the comments.