Search engines, RSS feeds and content aggregators make a reader’s life easier by providing new ways to scan for articles and to discover news. One result of this is that readers may no longer feel the need to regularly visit their local paper’s website in order to stay informed about the goings-on around town.

Following this logic, publishers work hard to make their content as searchable as possible and accessible beyond their own websites. Conventional wisdom dictates that websites should be optimized for search engines.

But what if your content is very specific in nature? What if you have a respected brand, and people in your community look to you for information that is relevant to them? When newspapers give their readers alternative ways to access their information, they are gambling that the a la carte traffic coming back from search engines will more than make up for the direct traffic they lose.

The theory goes that the easier your content is to find, the more traffic your site will receive. But a recent experiment by a few newspapers in Northern California suggests there's value in keeping some content away from search engines and aggregators.

Papers Prevent Search Engines, Aggregators

During the first quarter of 2008, three small Northern California newspapers with website pay walls edited their robots.txt files to disallow search engines and aggregators from indexing any content on their sites. I am vice president of digital media for the newspapers in question, running web strategy, sales and operations for dailyrepublic.com, davisenterprise.com, and mtdemocrat.com. We made the change when local advertisers started buying Google AdWords instead of ads on our websites. Realtors, for example, would buy keywords such as "Fairfield Real Estate News" and advertise against our content through Google, which is not good for us.
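For context on the mechanics: the Robots Exclusion Protocol is just a plain-text file served from the root of a site (for example, dailyrepublic.com/robots.txt) that compliant crawlers check before indexing. A blanket block looks roughly like the sketch below; this is an illustrative example rather than the exact files our sites use, which would name the specific bots and paths being excluded.

    # Tell all compliant crawlers (Googlebot, Slurp, msnbot, etc.)
    # not to index any page on the site.
    User-agent: *
    Disallow: /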

As a result, management at the papers decided to cut off search engines and aggregators. You can view some of the results here:

[Figure 1: Traffic charts for the three newspaper websites]

As the charts above illustrate, website traffic has grown steadily in each of the four key metrics we studied. What was most surprising, however, was the impact this change had on our ad-serving effectiveness. The click-through rate for ads rose from a modest 0.29 percent in 2008 to an average of 2.87 percent today on paid access pages. (You can also view some related data here. It compares paid and free websites of similar size.)

It appears that for these papers, traffic volume alone does not drive click-through rates. What I'm suggesting is a positive correlation between reader frequency and click-through rate: frequency is key to generating advertising response. Simply put, newspapers that give their readers too many ways to read their content may be inadvertently destroying the advertising effectiveness that sustains their business.

I am not trying to convince you that every website should block search engines, or that newspapers should all try pay walls. But I implore the news community to consider that it is plausible for a news organization to thrive without search engine traffic.

It’s a concept that stirs up emotional responses from many in the news industry — but it deserves more logical contemplation.
