Patch and the Sonoma paywall

While writing about the Patch business model, I noticed in Romenesko that the Sonoma Index-Tribune dropped its $5/month paywall after Patch opened up in its market.

That action pretty well crystallizes the challenges news organizations face as they move toward paid-content models in competitive information markets. And it suggests that many, or even most, will ultimately fail.

Although incumbent print media may have good reasons to try to get value for their content online, even if only to slow the erosion of their print products, other actors may not be so compelled. Companies like Patch are operating purely under the Axiom of the Audience Imperative, where they rationally will always seek any additional reader they can attract for less than the marginal cost of production and delivery. Since the marginal cost of serving one additional reader online is essentially zero, they will always tend toward free. After all, they have no interest in limiting their attractiveness as an advertising platform by charging for content and restricting their audience.

The only way to escape this imperative is to carve out a non-competitive niche. News outlets could, in theory at least, so distinguish themselves on the basis of quality or depth or timeliness or perspective or ease of use or some other dimension that they can effectively lift themselves out of the competitive fray and charge for access. The Wall Street Journal is probably the best example of this, with its unique value in the business world; the New York Times may have a shot, too, because of its high quality and depth.

And countless publishers are now counting on the unique value of their local news reports to pull off the same trick in their markets.

As someone who hopes they succeed, at least for a while, and who has pushed down the same path myself, I find the discouraging part of the Sonoma reaction not that they caved, but that they caved so quickly and to such a weak competitor.

I’m not familiar with the Index-Tribune, but it appears to be a pretty typical small weekly, or twice-weekly in its case, with a paid circulation of about 8,000 and a sister lifestyle magazine. (It actually sounds similar to the Monadnock Ledger-Transcript, where I was publisher for a few years, before the paper absorbed the competing Transcript and added a second day of publication.) It has an editorial team of about six. It’s got a TownNews-built site; nothing flashy but serviceable.

In September, the Index-Tribune started its paid content experiment. The numbers on Compete appear to show a big hit to their traffic, cutting it by maybe a third to a half, but it’s hard to tell precisely, because of seasonal variation and the short time frame. In any case, Patch comes in less than three months into the experiment, presumably with its typical single-reporter staffing plus stringers, and the Index-Tribune throws in the towel.

The inescapable conclusion is that the Index-Tribune’s site and content were not sufficiently differentiated and irreplaceable in the eyes of readers.

Now it may well be that the Index-Tribune is not a very good paper and had no chance of succeeding with paid content, with or without Patch. Maybe all of its 8,000 print readers are just traditionalists who need to feel a paper in their hands, no matter how bad it is. Certainly their website is not as slick as the Patch site, which can’t be helping them in this battle.

But I suspect the Index-Tribune’s experience points to a more discouraging truth, one that will dismay publishers across the country in the months and years ahead, as they try to charge for content while facing an ever-widening array of free, community-based competitors: Our readers don’t love us as much as we think.

For most people, I suspect, general community news is pretty much a commodity. Journalists may see a huge gulf between established players and thinly reported news sites that rely on community contributions, but for many people the competitors are good enough. Readers may still respect and turn to the traditional outlets. But if there’s a free, “good-enough” alternative online, they are unlikely to pony up the subscription fee to support the old standby.

Sigh.

Even so, it’s probably still a good idea for print publishers to push ahead on paid content models. (As long as they do it wisely. And as long as they are prepared to act quickly if conditions in their markets change.) It certainly won’t help their web operations. But it may help to preserve their flagging print operations long enough for them to step smartly into the electronic future.

Patching together a hyperlocal business model

In the forthcoming AJR, Barb Palser argues that journalists should “give Patch a chance.” After all, she says, they’re hiring reporters and making a good effort at doing hyperlocal news.

For all the debate about the work environment and expectations (see the Business Insider and the Chicago Reader), at least they’re not just ripping off teasers from local newspapers and using that to sell directory listings, classifieds, deals, online display ads and whatever other revenue streams they can conjure. So kudos to them for that.

But I don’t envy them their business challenge.

Several people have weighed in with skeptical looks at the business model. (See this quick list of published critiques.) For the most part, these are prospective estimates, developed before there was much of a track record to evaluate.

Now, with some of Patch’s early sites entering toddlerhood, we’re starting to get a better sense of the business realities behind hyperlocal publishing Patch-style.

And it’s daunting.

Slim pickings on traffic

Compete.com gives an approximation of the traffic coming to various Patch sites. Compete is not perfect, as anyone with access to detailed site logs will know, but it’s a reasonable enough outside view, and at least it gives us a little more to chew on in trying to deconstruct the Patch business model. So far, according to Compete, Patch sites don’t seem to be generating anywhere near enough traffic to sustain any substantive, dedicated editorial effort — even the more established sites.

Last month, according to Compete, the top Patch subdomain attracted fewer than 32,000 unique users. The next subdomain on the list had just over 13,000 users.

Interestingly, that top Patch subdomain was for Ashburn, Va., a site that just launched in October. (As a point of reference, the town has a population of about 88,000.)

The second site on the Compete list, for the far smaller town of Maplewood, N.J. (population around 24,000), has been around for more than a year. It had its best month of the year in October, when it topped 13,000 users, but mostly it has bounced around between about 2,000 and 10,000 UUs for the last 12 months.

I don’t have the high-dollar Compete membership, so I don’t know how many page views any of that translates into. But Patch overall gets about two visits per user per month, and if you grant them five PVs/visit, you can estimate the October PV volume at about 320,000 for Ashburn and about 130,000 for Maplewood.
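
Since only the unique-user counts come from Compete, the page-view figures are just the product of three numbers. A quick sketch, using the two-visits-per-user and five-PVs-per-visit guesses from above:

```python
# Rough monthly page-view estimate:
# unique users x visits per user per month x pages per visit.
# Both multipliers are the guesses described in the text, not measured values.
VISITS_PER_USER = 2
PVS_PER_VISIT = 5

def monthly_pvs(unique_users):
    return unique_users * VISITS_PER_USER * PVS_PER_VISIT

print(monthly_pvs(32_000))  # Ashburn: 320000
print(monthly_pvs(13_000))  # Maplewood: 130000
```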

It’s pretty tough to develop a business plan that supports ad sales, ad and web production and editorial on top of 320K PVs/month, let alone on 130K PVs/month.

But let’s try.

Hyperlocal costs

The costs are relatively easy to estimate. Patch is apparently paying around $40,000 for reporters, and they have committed to hiring a dedicated editorial staffer for each community, with editors overseeing clusters of Patch sites. (See Ken Doctor’s description of the structure.) They’re also paying for freelance contributions, at least initially. With benefits and the rest, you’re looking at $55,000 or $60,000 per site for editorial, ignoring the editorial structure at the cluster level and the freelance budget.

On the ad side, the basic structure as described by Ken Doctor suggests between one and two salesreps per town at final build-out. If they pay their local salesreps no more than their editorial staff, they’d be burning through another $55,000 to $120,000 on local ad sales alone. Call it $80,000 on average for local ad sales, and you’re up to about $140,000 for ad sales and editorial alone.

Then you’ve got production and technology and corporate overhead. That certainly will be centralized. Leaving site development aside as a sunk cost, let’s lowball all of this at $20,000 a year. And let’s assume the operations are entirely virtual, with no local storefront presence.

So we’re looking at a nut of about $160,000 a year per site.
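
Rolled up, the cost sketch above looks like this (all figures are the rough estimates just described, not reported numbers):

```python
# Back-of-envelope annual cost per Patch site, per the estimates in the text.
editorial = 60_000    # one dedicated staffer, loaded with benefits
local_sales = 80_000  # midpoint call on the $55K-$120K salesrep range
overhead = 20_000     # lowballed production/technology/corporate share

nut = editorial + local_sales + overhead
print(nut)  # 160000
```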

A rough look at revenue

On the revenue side, we can try to build up an estimate by looking at each of the potential revenue streams, such as display ads, marketplaces, directory products, Groupon-like deals and so on.

But first, let me propose a very crude rule of thumb approach that works reasonably well for modestly sized news sites. My experience has been that a very well-run news operation would be happy to get, on an annual basis from all web-related revenue sources, about 40 cents for every monthly PV. So a site with 5 million monthly PVs could gross $2 million annually. A site with 1 million monthly PVs might get about $400,000. More typically, I’d suggest the number is closer to 20 or 30 cents per monthly PV.

But maybe Patch, with all of the accumulated wisdom of AOL and its serious technology and sales chops, will do much better than 20 or even 40 cents per monthly PV. They’re certainly in a much better position to get national ads. Maybe they’ll get 50 cents.

For Maplewood, that suggests annual revenue of about $65,000 at the 130,000 PV level. You’re certainly not going to be able to afford a dedicated salesrep and dedicated editorial staff on that kind of a revenue base. And remember, that’s their second-best site, and it has been up for more than a year.

For Ashburn, at 50 cents per monthly PV, you’d be earning $160,000. Not bad — they could be breaking even.

But that 50 cents per monthly PV is a tough bar to hit. It depends on flawless local sales execution, great national ad sales support and significant penetration through self-service windows. Certainly, as is clear from even a cursory view of their sites, they’re nowhere near that now.

So what happens if they hit more typical revenue levels? What kind of traffic do they need to support the sites?

Given a more realistic but still generous estimate of 40 cents per monthly page view, the break-even rises to 400,000 PVs. At 30 cents per monthly page view, they’ve got to hit more than 530,000. If they get “just” 25 cents — a level many community news publishers would be thrilled to reach — they’d need 640,000 PVs/month.
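
The break-even figures all come from dividing the estimated $160,000 nut by the revenue rate. A quick check:

```python
# Monthly PVs needed to cover the estimated $160,000 annual nut at various
# revenue rates (annual cents earned per monthly page view).
NUT = 160_000

for cents in (50, 40, 30, 25):
    pvs_needed = NUT * 100 / cents
    print(f"{cents} cents/PV -> {pvs_needed:,.0f} PVs/month")
# 50 -> 320,000; 40 -> 400,000; 30 -> 533,333; 25 -> 640,000
```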

I’ll give them some credit for executional skill. Even though we’ve probably low-balled the expenses, this crude approach suggests the Patch model is doable at about 500,000 PVs a month, but it depends on really good sales execution and a very tight rein on expenses.

A build-up view of revenue

I also tried building revenue up more atomically, estimating the potential from online display sales and other sources. That process is a whole ‘nother thesis in itself. I won’t go into the grinding details, but here’s an overview.

Assume the revenue pillars for sites like this are online display ads (including video, standard IAB units and so on), marketplaces (classifieds and related verticals) and directories (including coupon micro sites and Groupon-like deal programs). There are lots of ancillary revenue possibilities (archives, photos, events, auctions, mobile apps, what have you), but it’s probably safe to assume the Patch sites need to cover their costs with the tried-and-true revenue pillars. That said:

  • The potential for online display revenue is easy enough to estimate; make your own guesses based on average sell rate, spots per page and average CPM. With my most generous estimates, I see a chance to cover about half the $160,000 nut through display sales at a site with 500,000 PVs.
  • The potential for marketplace revenue is also pretty easy to estimate — it’s basically nil. With craigslist, eBay, job sites, car sites and real estate sites killing long-established newspaper marketplace franchises, the odds of a new player gaining a toehold here are tiny. And a look at Patch sites today shows minuscule use of their marketplaces. I checked out the four sites with the most traffic that had been open for a year and counted 14, four, six and six classifieds.
  • The key to Patch’s success, then, seems to be in their business directories and in their ability to extend revenue from their directory infrastructure through deal programs, business micro sites and blogs, niche interest sites and whatever else they can cook up. And the directories are pretty darn impressive, with tons of photos, nice layout and good execution on all the things you’d expect. (Check out TechCrunch for more on their plans here.)

So can they get $75,000 or $100,000 a year out of local businesses? Maybe. Certainly other small-market online publishers have. Selling directory listings is a tough job, though, with lots of competition. If they can get close to $100 a month or $1,000 a year from each advertiser, the price goal cited by TechCrunch, they only need to sell 75 or so to hit the nut. But if they get more like $25 a month, then they have to sell closer to 400 businesses on the listings. In a town like Maplewood, that has to be a pretty high percentage of the potential advertisers.
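
The arithmetic behind those advertiser counts is simple division. Here it is with the price points cited above; the “closer to 400” figure presumably builds in some cushion over the raw quotient:

```python
# Advertisers needed to hit an annual directory-revenue target at a given
# annual price. The $1,000/year and $25/month (~$300/year) price points
# come from the text.
def advertisers_needed(annual_target, annual_price):
    return annual_target / annual_price

print(advertisers_needed(75_000, 1_000))        # 75.0
print(round(advertisers_needed(100_000, 300)))  # 333, rounded up to ~400
```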

Ultimately, their ability to sell deals and directories and all the rest (or at least to retain the advertisers they do get) will depend on whether they can deliver real value to advertisers. And that depends at least in part on simple volume. At 500,000 PVs per month, or even a million, precious few trickle down to the listing for any individual business.

Climbing the mountain

Now maybe 500,000 or a million PVs per month doesn’t sound like much to an AOL exec based in New York. And maybe there are lots of underserved towns and suburbs with desirable demographics and commercial centers, all of them just waiting to flock to Patch.

But 500,000 PVs is 12,500 loyal users coming to the site twice a week and looking at five pages each time, for a total of 40 pages a month.
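
That reader-count arithmetic, spelled out (the four-week month is the round-number assumption behind the 40-pages figure):

```python
# How many loyal readers does 500,000 monthly PVs imply?
visits_per_week = 2
pvs_per_visit = 5
weeks_per_month = 4  # round-number month

pages_per_reader = visits_per_week * weeks_per_month * pvs_per_visit
readers_needed = 500_000 // pages_per_reader

print(pages_per_reader)  # 40 pages a month per reader
print(readers_needed)    # 12500 loyal readers
```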

Certainly, Patch sites today don’t seem to be generating that kind of reader engagement. Check out the Compete charts for Maplewood, Ridgewood, Darien and Garden City — all of which have been open for more than a year. They’re all growing decently, but outside of Garden City, which had a spike mid-way through the year, the charts don’t show them on a path to hit 500,000 PVs any time soon. Most importantly, as I noted above, the top-level Patch stats show an average of only two visits per user.

To develop deep reader engagement, with all of the competition and alternatives out there in the real and virtual worlds, you’d better have some useful and lively content resources. No one editorial person, short of Clark Kent, could do that. The only hope is that community members will step up and make the site their own, turning to it as the focal point for community dialogue.

And maybe that will happen, at least in some Patch communities. The sites aren’t bad, and they’ll get better. I wish them luck with that. But how many of Patch’s now-planned 500 sites will strike gold like that? And can they carry the rest?

I can’t help thinking that when long-established media companies with deep roots and journalistic expertise struggle to develop the traffic and the revenue models that support meaningful journalism online, maybe it isn’t so easy for a tech company to waltz in, hire a tiny editorial staff, stuff a community site with reader comments and rake in the dough.

More reading on Patch’s model

See main post, “Patching together a hyperlocal business model.”

One of the best looks at Patch’s business model comes from Ken Doctor on Seeking Alpha.

The Atlantic Wire raises some key questions about the Patch model and links out to some good resources.

Business Insider’s early take, “AOL’s Patch Revenue Model Makes No Sense,” has been much quoted, but it doesn’t delve far beyond the revenue from display ads.

And check out TechCrunch’s look at the Patch directories.

Trends: Newspaper advertising revenue from 1979

Thanks to Alan Mutter for his update on recent newspaper advertising revenue trends reported this month by the Newspaper Association of America, where he observes that the ad revenue rebound experienced by other media has largely bypassed the newspaper industry.

Smaller newspapers, especially those in relatively isolated markets, probably have done better than the industry as a whole. But the much-hoped-for rebound for newspapers does not seem to have materialized.

For some perspective, I charted trends from 1979 to the present, using data from the NAA site and adjusting that for inflation. One thing that’s especially striking is the steep decline in dollars per reader over the past few years, even as the number of readers has fallen substantially, providing a smaller denominator, which you would expect to flatten the line a bit more.

And consider this: You have to go all the way back to 1984 to find a time when newspaper print revenue was lower than 2009, in inflation-adjusted terms.

Chart of print ad revenue from 1979

The blue line shows total print advertising revenue, as compiled by the NAA and adjusted for inflation. The red line shows revenue per annual sold copy.



Chart of daily print sales from 1979

The blue bars show morning papers, the red bars show afternoon papers. Data from NAA.

Everything old is new again

Remember CueCat?

Somewhere, I’m sure I still have one of those little proprietary bar code scanners that got mailed out to millions of people in the hope that we would all use them to follow bar-coded links in printed products to online resources and offers.

I loved CueCat, partly because it was a bold, even visionary idea, and even more because it was such a blatantly bad business and product plan. It earned a place of honor in my own personal pantheon of spectacular and foreseeable dotcom failures, right up there with my all-time favorite, the iSmell Personal Scent Synthesizer from DigiScents.

(Both of these, I see, made it to a fun list of “The 25 Worst Tech Products of All Time” published by PCWorld back in 2006.)

Now the idea behind CueCat is back again. But this time it might just take off.

QR (quick response) codes have their roots in manufacturing, but now they are being used to attempt the same magic CueCat was intended to perform — turn print products (and signs, billboards, displays or anything else) into interactive media.

There are a couple of key differences, though. First, the QR code inventors are apparently not exercising their patent rights, so the technology is effectively open. Even more importantly, the rise of smartphones has replaced the need for a proprietary and cumbersome device to read the codes. Plus, QR codes seem to fit right in with some of the other scan-reading technologies being used on iPhones and Android devices, so it’s not much of a stretch to imagine people taking shots of QR codes, too.

Today, Sean Burke, publisher for Gatehouse Media New England in Fall River and Taunton, Mass., announced that The Herald News will be experimenting with QR codes to provide “hardlinks” back to editorial and advertising resources.

That’ll be interesting to watch. My guess is that use of QR codes for additional news content will be minimal, but if they can get advertisers to send coupons back to readers’ phones they’ll get some traction. It might also be useful for iCal links, map links and other things that tap into smartphone functions.

Want to play along?

You can generate QR codes at bit.ly or goo.gl or lots of other ways you can find with a simple search of “QR code generator.”

At mashable.com, you’ll find some creative examples of QR code use that should get your juices flowing. One place to start is with the story “5 Unique Uses for QR Codes.” A more recent article there called “Why the Best Online Marketing May Be Headed Offline” updates things a bit. And you can scoot over to memeburn for some more ideas in “9 creative uses of QR Codes for your business.”

Re-engaging readers

The stats I’ve seen about online reader engagement are pretty discouraging, as I discussed in “Disengagement.”

Leaving aside, for now, the challenges related to the medium itself, which are difficult to control, I think there are some basic approaches and principles that news sites can keep in mind to engage their readers more deeply.

Beyond the basic requirement that you offer compelling and distinctive information and services, I think it’s useful to think about three important characteristics of what you could think of as web-positive publishing. I call them the Three I’s: immediacy, interactivity and interneticity.

Immediacy is about recognizing the voracious desire among our readers for good old-fashioned news, with the emphasis on new. Having spent many years poring over news site traffic logs, I can guarantee that the big preview of coming legislative battles and priorities, however important and well-written, will drop like a stone to the bottom of the most-read list, while the short newsy item about the fire that consumed a meth lab will rise to the top. Also: Quirky news wins, short wins, timeliness and urgency win, images help and feature stories can be either big hits or traffic flops and there’s no way of knowing beforehand. And, yes, controversy loves a crowd.

That’s not to say that a steady diet of meth lab fires and short quirky blurbs is the solution for every news site. (That’s what local TV news is for.) And that’s not to say that you shouldn’t do the big takeout on the upcoming legislative session, or the heart-warming profile or what have you. All of those may be essential to your mission and equally essential to maintaining your brand and personality, both in print and online.

But the content strategy for news sites has to begin with the understanding that online readers are looking for information fixes. You need to meet that need with a steady flow of timely, newsy, relevant information.

Pew offers some insight into news readership habits. According to Pew, online news junkies typically patronize a slowly evolving core list of trusted sites that they visit frequently. For most internet users, the core list includes just a handful of sites. Only 11 percent regularly visit more than five news sites on a typical day. Critically, news readers are not terribly attached to the sites they visit — about two thirds do not have a favorite site.

One key aspect of online news readership appears to be missing from the Pew study, though — the frequency of visits from regular users. As far as I can tell, the Pew study only went as far as measuring daily use. My belief — and it is a key assumption — is that a news site’s best users will come to it habitually, perhaps several times over the course of a day.

Or at least they will if the site rewards repeat visits. A site that doesn’t change quickly kills that impulse to check in. A site that has a sense of urgency and recency rewards repeat visits — and stays on that reader’s short list of core news sources.

So how do you engineer immediacy? It doesn’t have to be a huge effort. News blogs, alerts (even automated alerts) on stories that are hot, most-read and most-commented lists, “coming tomorrow” promos, integrated wire tickers, quick mid-day updates and quick page remakes to emphasize popular stories all help. You can look beyond the main news stream, too, to things like featured reader comments, Twitter feeds, reader queries, rotating classified displays, featured reader business reviews, reader-submitted content and anything else that’s relevant and timely.

Interactivity is the second pillar of engaging readers, and it should include both content interactivity and institutional interactivity. Make it possible for readers to grab hold of your offerings and make them in some sense their own, through comments, social media integration, personal clipboards, ratings and so on. And embrace the notion of dialog with your readers, making it easy for them to submit tips or photos or news, offering chats or other interactive events, giving them opportunities to shape presentation, coverage and priorities.

(Update: Robert Niles has written a post at OJR: The Online Journalism Review on baby-stepping your way toward interactivity.)

The third “I,” interneticity, is my own horrible neologism, for which I apologize to anyone of linguistic sensibility. By interneticity, I mean taking advantage of the possibilities of the medium for smart, useful and engaging information presentation. Some of the best examples of what I mean by interneticity can be found at the New York Times, where their team of data-smart designers regularly produces stunning information graphics. Fortunately for the rest of us, there are lots of tools out there for integrating maps, timelines, charts that can be manipulated by readers, interactive Wordles, tag clouds, slide shows, etc.

There’s no one path to re-engaging readers online. But keeping immediacy, interactivity and interneticity as touchstones for news site content strategies will reward and promote more frequent, deeper site use.

Disengagement

I spent a couple of days recently modeling the impact of paid content on a representative small daily newspaper.

I hope to post a little more later on my approach and results. (Spoiler alert: The near-term loss of ad revenue can be minimized fairly easily, to the point that it’s offset by even microscopic subscription uptake, when you have a lot of unsold ad inventory.)

But right now what I’m really wrestling with are some of the implications of the reader segmentation I did to prepare for the modeling.

Much as Jonathan Stray did several months ago with his paywall calculator, I started with a look at who’s coming to the site.

The site I was looking at does not have great data on its users. It relies mostly on Google Analytics and gets some corroborating information from the ad-serving platform. Registration currently is not required, so there’s little useful information there, and there has been no real effort to mine that.

Google Analytics is not a great tool for doing user segmentation, I found. Like Omniture and other stat services, it seems to be mostly focused on generating visit-based data. But I was able to back my way into a rough picture of site users, breaking them into four groups: fly-by, casual, moderate and loyal users.

The results were sobering.

Here is an open news site in a market that doesn’t have a lot of direct, local online news competition. The paper has a good reputation and enviable print penetration. It’s been online for years, and it has a really full news report with all the standard folderol — yellow pages, classifieds and verticals, photo galleries, reader comments and so on. Yet very few online readers demonstrated loyalty or use approaching anything like the readership of the daily print paper.

Consider: For every thousand Sunday print readers (calculated using the well-established 2.3-times-sales factor), this paper had only 55 heavy website users and another 54 moderate users.

There are data complications, of course. Without tying back to registrations, I can’t tell if some of the casual UUs are home visits from people who are moderate users from their work computer, or vice versa. If I could track that duplication, I would likely see more moderate and heavy users.

On the other hand, it’s anyone’s guess how many of those heavy users simply have the site loaded as their browser’s home page, which would boost their apparent usage without reflecting actual readership of the site. And, more importantly, I set a pretty low threshold on the segment definitions. I defined moderate users as those who came to the site more than 10 times in a month, meaning they looked at between 40 and 50 pages, because I wanted their use to correspond roughly to the likely metered threshold.
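
For what it’s worth, the bucketing logic is trivial once you’ve pulled per-user monthly visit counts out of your analytics. A minimal sketch, in which only the more-than-10-visits “moderate” cutoff comes from my actual definitions; the other thresholds are illustrative assumptions:

```python
# Segment users by visits per month. Only the >10-visit "moderate" cutoff
# comes from the text; the fly-by/casual and loyal boundaries are
# illustrative placeholders.
def segment(monthly_visits):
    if monthly_visits <= 1:
        return "fly-by"
    if monthly_visits <= 10:
        return "casual"
    if monthly_visits <= 30:
        return "moderate"
    return "loyal"

print([segment(v) for v in (1, 5, 12, 40)])
# ['fly-by', 'casual', 'moderate', 'loyal']
```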

There’s plenty of room to argue about the absolute precision of the numbers, but to me the real takeaway is clear: This site’s online readers are nowhere near as engaged with the product as are its print readers.

That’s a problem. And I don’t think the problem is specific to the site I looked at.

The engagement problem is obvious to anyone who spends time looking at news site stats and compares them to research on print readership. Visits typically range from four to six pages, which means maybe three or four actual stories read. Visit times are just a few minutes, compared to the 20 minutes or so the typical print reader spends with the paper. And precious few online readers have established the kind of daily reading habit that comes with a home subscription.

To a certain extent, much of this may just be the nature of the medium. I seem to recall once seeing data that suggested traditional news readership made up about 7 percent of the average online user’s time, but I was not able to find good recent data on that. Beyond any measure of time devoted to news readership online, it may also be that web-based news sites serve and reward quick-hit news grazing, as opposed to the kind of immersive experience you get with print. (Perhaps tablets will help to change that.)

And, of course, you have to look at what traditional news sites are publishing. With most news sites still shoveling print content online, leavened with a few breaking updates and some extra photos (if you’re lucky), it may be that news sites are still doing “radio on TV” and haven’t adapted quickly enough to the online medium.

Whatever the cause or causes, news sites simply have to focus on engagement if they are to thrive. If what we’re doing is not captivating readers, we have to change. Especially if we want to charge.

For 2011, my early resolution: Ignore PVs. Focus on metrics of engagement like PVs/visit, visits/UU/month, time on site and the raw count of heavily engaged users.
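
Those engagement metrics are easy to derive from the monthly totals any stats package reports. A small sketch, using illustrative totals in the Maplewood ballpark:

```python
# Engagement ratios from monthly totals. The input numbers here are
# illustrative, chosen to match the ~5 PVs/visit and ~2 visits/UU
# figures discussed earlier.
def engagement(pvs, visits, unique_users):
    return {
        "pvs_per_visit": pvs / visits,
        "visits_per_uu": visits / unique_users,
    }

print(engagement(130_000, 26_000, 13_000))
# {'pvs_per_visit': 5.0, 'visits_per_uu': 2.0}
```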

Cyber Monday question

So how many people swarmed their local newspaper websites last week looking for all the Black Friday deals?

Right. Unless your local newspaper offers online versions of the circulars, nobody did.

That’s a problem. Online news sites are becoming increasingly irrelevant to people looking for focused shopping information, the very thing, arguably, that gave newspapers a 200-year run as the dominant mass-market medium.

Online circulars might help.

None of the papers I’ve worked with have used them, partly out of concerns that they would undermine print sales, and partly because some of the early versions were so clunky. Today, though, the user experience has improved, and the movement toward paid content suggests a tactic that minimizes cannibalization: Use the online circulars as a value-add benefit for print or online subscribers.

But that’s at best a partial solution.

Just tacking online circulars on to a site creates another information silo, a potentially useful resource that languishes until and unless someone makes an active decision to dive into it.

It seems to me the key to making the most out of online circulars — and all the other commercial information embedded (and isolated) in banners, sponsorships, linked advertiser landing pages, directory listings, business blogs and so on — is to capture and integrate the metadata behind them.

In the meantime, who’s got the best online circulars? Shoplocal? And if you work at a paper with online circulars, what was your Black Friday experience?

On muddling through

There’s nothing more seductive than the next big thing. Especially the next big sexy web thing, the clever idea like Twitter or failblog or Groupon that takes almost nothing to launch, turns 20-somethings into multi-millionaires and seems so retrospectively obvious that anyone could have done it.

News companies, in particular, seem eternally desperate for the next big thing. As they see their traditional business models founder, they bound, ever hopefully, after verticals and micro sites and user-generated content and multimedia content and paywalls and hyperlocal and business blogs and whatever else flashes before them.

Having done my own share of bounding after mirages, I’ve come painfully to realize that a great idea and a buck will get you a medium coffee at Dunkin’ Donuts, and that’s about it.

That’s not to say that paywalls, or pay curtains, aren’t a good thing. We’ll see. The same goes for hyperlocal and business blogs and all the rest. But no brilliant idea is going to save us. What just might save us, though, is what I’ve come to think of affectionately as the modest virtue of muddling through.

Which is to say that every brilliant idea is flawed in some unexpected way, and even the best ideas are quickly challenged by changing technology and evolutions in online habits and mores. What distinguishes successful online innovators is not the brilliance of their ideas, but rather how well they muddle through partial successes and even outright failures to arrive, eventually, at something that works for their audience and their organization.

As far as insights go, mine is not terribly original. After all, Edison said a century ago that success is 1 percent inspiration and 99 percent perspiration. We’re not talking relativity here.

But it’s something of a relief to realize that brilliance is overrated. You don’t have to wait for some earth-changing inspiration to get cracking.

You do, however, still have to put in that other 99 percent.

Among media companies I consider to be successful innovators, there are some common features and lessons that can be drawn.

First and foremost, I think, is to expect and plan for failure, at least implicitly. I’ve been told the mantra of one Harvard B School professor is, “Fail early and fail cheap.” That deceptively simple prescription suggests that you won’t blow your whole budget on one Hail Mary project, it suggests that you will start by thinking carefully about what success would look like so you can change course rapidly, if necessary, and it suggests that you will understand the keys to success so you can learn from your experience.

I also think some work cultures are more conducive to successful new media innovation. A broadly shared understanding of the strategy and roadmap is critical, as is meaningful involvement from top management and a relatively flat and collaborative workgroup structure.

So there you have it. With these ideas and a buck, you can go coffee up.

Seeing a site-less web

Jim Boulton has me thinking more broadly about the challenges that aggregators and social media links present to news companies.

News companies certainly have been focused for some time on the risks presented by news aggregators, but mostly in a narrow context — we’ve been concerned primarily about the loss of revenue and traffic when readers go no further than the blurb presented by the aggregator. So far, it seems, the aggregators haven’t replaced on-site news reading — Google News and other aggregators now drive substantial traffic our way. So we’ve adapted, and we’ve learned to treat every page as a landing page, providing plenty of hooks to keep people interested.

And, oddly enough, despite all of the concern about Google News and so on, news companies seem to have jumped quite willingly into the social media game, providing all kinds of ways for people to post links and blurbs through Facebook, Digg and what have you. That’s curious, because social media linkages essentially turn all of us into aggregators, with all the attendant risks to traffic-based revenue.

Boulton comes at the issue from a different perspective. He’s not a media guy, first of all; he appears to have morphed from web designer to web marketing consultant at Story Worldwide, which calls itself a digital content agency. Most recently, he was behind an exhibition called “Digital Archaeology,” a display of old websites celebrating the web’s 20th anniversary. For that show he was interviewed on the BBC’s “Digital Planet” program, where he made a bold prediction that has deep implications for news organizations:

“In five years time, websites won’t exist,” Boulton said. “So I think that means it’ll be a 20-year time when websites were the most important form of brand communication, and they’ll have gone from not existing at all to being the most important form of brand communication to disappearing again. We’re already seeing it now, the convergence of the browser and the desktop. Online experiences increasingly revolve around the individual rather than around the brand. It’s what we call the semantic web. So I think the notion in maybe five years time that someone will go to an online destination and experience a brand on its terms in its space will just be crazy.”

I have no idea whether he’s right about the demise of websites. But the trend toward Web 3.0-style integration is real enough, and that stands to fundamentally change the relationship our readers and viewers have with our sites.

In one sense, you could look at this as the same old aggregation challenge, where we struggle to lure people into our domains for some gentle monetizing while they, in turn, push for greater control over how and where they get information.

But Boulton’s focus on brand is interesting. Brand management, I think it’s fair to say, is not something most news organizations pay a lot of attention to. At a time when traditional media companies are struggling against the perception that they are old and tired and on their way out, that’s probably foolish. I’m no brand management expert, and I’ll confess that I think a lot of what passes for brand management is awfully close to witchcraft. But I have no doubt that thoughtful attention to the signals you are sending is essential, insofar as it presupposes some real thought about who your audience is, what your value proposition is and how well your strategy and product position you.

But how do you control those signals when your readers and viewers are getting their information outside of your carefully designed site? In a world of mobile news blurbs and feeds and readers and social news, our stories and content become indistinguishable and commodified. Especially text-based content.

One approach I suspect we’ll see more of is more use of embedded messages, where news organizations start to pay more attention to everything from attribution lines and kickers to in-line references and even quasi-promotional features within the narrative.

There’s a third issue raised by Boulton’s vision of a site-less web, beyond the threats to traditional ad revenue and branding control, and that’s the loss of user information.

When readers and viewers consume our news in their own world or a world of someone else’s making, we don’t necessarily know anything about them and how they’re using our content. Small and mid-size news companies have been slow to take advantage of user data as it is; a site-less web could make it even harder.

For that, I have no prescription. Certainly, we’ll all be getting smarter about using social media alongside traditional web informatics, but beyond that much depends on what sorts of tools emerge for people to consume news in a site-less web.