Syndication and Reuse

  • Creative Commons License
    This work is licensed under a Creative Commons Attribution 2.5 License.


April 11, 2006

BSEC Day Two Keynote - YS Chi

Buying and Selling eContent Day Two opened with a keynote by Elsevier Vice Chairman Y.S. Chi.  Y.S., formerly Chairman of Random House Asia and President and CEO of Ingram Book Group, joined Elsevier a year ago to head global academic and customer relations.

Y.S. spoke of how it’s very difficult to be a publisher today.  He described how we need to redefine publishing, where the primary function is to “quickly get right information to the right person in the right context” and where the new role of the publisher is as market maker (“to know and honor user needs”).

As a frame of reference, Y.S. compared the impact of the Internet to the invention of the printing press.  As with the printing press, the web has led to a content explosion, new communities and the democratization of knowledge.  Today’s “uber publishers” can adapt faster, provide deeper communities and do a better job of filtering content.

The opportunities for publishers are threefold:
1. Filter for quality: incumbents can leverage their brands for trusted content, but that won’t last, so publishers need to do a better job of filtering.  Filtering must be matched to user needs: for medical content a top-down editorial model is probably best, while the wisdom of crowds is great for movie reviews.
2. Enhance productivity: the productivity of knowledge workers continues to fall; publishers must embed content into users’ workflows to make them more productive (as Elsevier has done with ScienceDirect, Scirus and Scopus).
3. Nurture networks to develop compelling communities.  But communities must be the “gotta go” type, creating a visceral response from users when they visit.

In closing, the “uber publisher” must stay attuned to user needs and experiment and learn.  There are many user needs, which translate to many business models.

BSEC Day Two

As the sun rises over Camelback, we get set to kick off Day 2 of Buying and Selling eContent.

This morning kicks off with a keynote by Y.S. Chi, Vice Chairman of Elsevier.

Following that is a panel entitled "The Subscription Dilemma: Is it Time to Move On - or Not?" featuring McGraw-Hill's Andrea Broadbent, Adam Bernacki of Leadership Directories and Jonathan Lewis of eMeta, moderated by Joe Bremner of Kennedy Information.  It should (hopefully) raise issues that make some of the traditional publishers a bit uncomfortable with the long-term prospects for their model.

The morning continues with a look at how premium content can play in the world of SEO and SEM, featuring Congoo's Rafael Cosentino, Matthew Hong of Thomson Gale, David Scott and Pam Springer of ECNext, again moderated by Joe Bremner.

April 10, 2006

BSEC Tim O'Reilly keynote

Buying and Selling eContent is so packed with content that they needed two keynotes the first day.  The afternoon keynote speaker, Tim O’Reilly of O’Reilly Media, was the highlight of the day.

O’Reilly, credited with coining the term “Web 2.0,” provided a set of six key rules for successful Web 2.0 applications:
1. Users add value
2. Network effects by default: Tim talked of the “architecture of participation” where successful apps default to aggregating user data as a side benefit to usage (e.g. Napster defaulted to sharing=on)
3. The Perpetual Beta: key in the software as a service model is continual improvement and rolling feature enhancements.
4. Services have to be above the level of a single device: the PC is not the only access device for Internet applications and those that are limited to such are less valuable than those that reside on the Internet.
5. Data is the next “Intel Inside”: the BSEC audience was pleased to hear that their content could be the differentiator for applications.  Examples included Navteq, the source of the street data behind Google Maps and virtually every other mapping application, and Gracenote, whose “CDDB” database supplies album and track titles within all the major music applications.
6. A platform beats an application every time.  And, unlike the MS Office model, the Web 2.0 world allows small pieces loosely joined together to add new value to the platform.

Tim also spoke of Asymmetric Competition, where a new competitor with a different business model may kill your business.  His example of Craigslist killing the newspaper classified business is now a classic, but there are many others out there.  For those who’ve read Outsell’s Neighborhoods of the Information Industry, this theme should ring familiar.

One of Tim’s key points came out of the Google Maps experience, where they had not intended for the application to serve as the basis for what would become mashups.  His takeaway is that “if your users aren’t surprising you by the ways that they build on your (Web 2.0) product, then you’re doing something wrong.”

To catch up on the rest of the Buying and Selling eContent conference, take a look at the following blogs.  All told, there were close to a dozen attendees blogging in some capacity today.
  • Rafat Ali will be speaking on a panel Tuesday and is live-blogging throughout.
  • John Blossom at Shore is posting on his special events weblog.
  • Ross Mayfield has an interesting recap of Tim O'Reilly's keynote.
  • Larry Schwartz of Newstex points out his win, then loss, of the $20 cliche bet for first use of "Long Tail."
  • David Meerman Scott leveraged the WebInkNow blog to field questions during his panel.
  • Dale Wolf adds his thoughts on his Context Rules Marketing blog.
  • Shannon Holman of ALM Research compares ALM's progress to the comments of some of the speakers (still no sign of data dog in Scottsdale yet).
More to come in day 2...

BSEC Content Technology Meets Web 2.0

One of the more interesting sessions at Buying and Selling eContent was the afternoon session entitled “The Next Wave: Content Technology Meets Web 2.0”.
Ross Mayfield, SocialText CEO, led off with a discussion of wikis and collective knowledge.  First, he enumerated the two types of collectives that drive social networking applications:
  • Collective Intelligence (for example, Digg or Memeorandum), where there is a low threshold of user engagement; and
  • Constructive Intelligence (for example, Wikipedia), where a very high level of user engagement is required.  Clearly, the enthusiasts and other users who are willing to contribute at the Constructive Intelligence level are extremely valuable.

The key takeaway from Ross’ talk is that “Sharing Control Creates Value”.  It’s extremely difficult to share control, particularly for premium content providers, but virtually every successful Web 2.0 application has it at the heart of its business.

BSEC User-Generated Content

Buying and Selling eContent got off to a quick start.  After Esther Dyson's keynote, the first panel discussion, “When Everyone’s a Publisher: The impact of user-generated content” was moderated by David Meerman Scott.

Rusty Williams, Co-Founder of Prospero, led off with a view of how professional publishers should look at user-generated content.  In Rusty’s view, premium publishers need to be orchestrators of content.  Publishers should embrace this – build a blog, subscribe to RSS feeds, etc.

The model is changing from “inside-out” to “outside-in”, collecting content from all sources and places.

Bloggers shouldn’t be smug; it’s not a religion, it’s just a technology.  Only blog if that’s something you feel comfortable doing.  As Richard Branson put it, “business is about being true to yourself” – use the tools and see how they express you.

Larry Schwartz, President of Newstex, discussed their inclusion of blogs within their news feeds.  According to Larry, Newstex views blogs as commentary; they carry several hundred blogs today.  They have found that bloggers choose to syndicate their content for a few reasons:
1. To generate traffic for their blog
2. To get publicity (particularly consultants)
3. Some professional journalists want the chance to post their opinions without having to be reviewed by an editor.

Newstex users, largely traders and other financial professionals, see blogs as a way to find information before the mainstream media.  He cited the example of last week’s announcement by Apple Computer of its intention to support Windows.  Gizmodo picked up the story at 9:05am, while the AP and other wires were 30-40 minutes behind.  The time advantage allowed traders to get in and out of the stock while others were just reading the news.

Cyndi Schoenbrun, a researcher for Consumers Union, uses blogs as a research tool.  They serve as an early warning system, as blog editors post several times per day, a more frequent news cycle than the MSM’s.  Also, blog editors often have good access to CEOs at conferences and provide podcasts and transcripts.

For Cyndi, blogs must have proven authority before she will rely upon them.  While authority is hard to determine, she looks for transparency (does the blogger disclose their professional identity?) and for sites where advertising is not intertwined with the content.

Some of the sources Cyndi relies upon are PaidContent, I Want Media, Mediabistro and Rexblog.

In response to an audience question about whether Consumer Reports would look to blend user-generated content with their authoritative research, Cyndi indicated that they were already doing a modest amount of that today by having subscribers post comments on automobiles.  They are currently looking to expand that to consumer electronics.  At the same time, they tend to act cautiously and are concerned that a manufacturer might stack the deck with favorable ratings and reviews.

Live blogging from Buying and Selling eContent

Buying and Selling eContent kicks off today at Camelback.

This morning's keynote speaker is Esther Dyson.  Following that could be one of the more interesting panels of the conference, focusing on user-generated content.  David Meerman Scott will moderate a panel featuring Larry Schwartz of Newstex, Cyndi Schoenbrun of Consumers Union and Rusty Williams of Prospero.  It's sure to generate some interesting debate over "trusted content" vs. user-generated information.  Other panels today will focus on content aggregation and Web 2.0 trends.

Access permitting, I will share feedback throughout.


April 09, 2006

Blogburst to launch Monday

Blogburst, the blog syndication service, which I've posted about previously, will officially launch on Monday.
It will be interesting to see how the various newspapers choose to integrate the blogs into their services.  Interestingly, this comes the same day that the NY Times Public Editor announced that the Gray Lady has expanded its use of blogs written by Times reporters.
Whether Blogburst can successfully monetize blogs or not, it's clear that the mainstream media is taking a more serious look at how they can blend authoritative content with opinion.

April 06, 2006

Off to Camelback

Next week, I'll be at Camelback in Scottsdale for InfoToday's Buying & Selling eContent.

This is consistently one of the best conferences for the content industry and this year has a strong lineup of speakers led by Esther Dyson and Tim O'Reilly.  There's a good mix of established providers and upstarts.  Of course, the gorgeous venue always attracts a top-notch audience making this one of the best networking events of the year.

I will, of course, blog from the conference.  If you're planning to be there, get in touch.

April 05, 2006

Chevy's Viral Marketing Gaffe

Chevy recently launched an advertising campaign designed to become fodder for viral marketing, as detailed in yesterday's New York Times.  Instead, the campaign is quickly becoming fodder for late-night hosts.

In the campaign, tied to television’s The Apprentice, users are provided a set of tools to storyboard a pseudo commercial for the Chevy Tahoe.  Users can string together choices from a selection of short video segments, adding text and background music.  The resulting “commercial” is saved, and links can be emailed to friends or colleagues.

Clearly, Chevy did not understand the dynamics of the market most likely to use the application – teenagers and young adults.  While Chevy may have envisioned users creating ads touting the Tahoe’s spacious interior, ample seating or rugged looks, that’s not what most users had in mind.  Instead, they leveraged the desert scenes, ice-capped mountains and waterfall imagery to drive satirical snippets focused on the war in Iraq and global warming.  For an example of how this works, take a look at the example that I created.

The good (or bad) news is that, as designed, these commercials have spread virally, as users share their creativity with friends.  Unfortunately, most of them do not contain the mom, apple pie and Chevrolet message that might have been originally intended.  This should not have come as a surprise to Chevy.  All it would take is a 15-minute tour of YouTube to see that satire and parodies (along with way too much lip-synching) are the core elements of this genre.

Perhaps Chevy will claim that all publicity is good publicity, but I disagree.  While YouTube parodies like Brokeback to the Future may have provided benign publicity to the underlying film, the Tahoe commercial parodies will not cast a positive light on Chevy.

The clear message here is that it’s critical for business professionals to use and understand new innovations in technology.  Spend an hour watching YouTube, create a page on MySpace, set up an RSS reader and begin to read blogs.  It’s not enough to simply understand the capabilities of these tools.  It’s critical that you understand how they are being used, the culture of their users and the potential implications of your efforts.  I’m guessing that few within Chevy’s advertising and marketing team had done so.

P.S. If you'd like to create your own Tahoe ad (or parody), you have until Monday to do so at the special Chevy website.

April 03, 2006

Britannica vs. Wikipedia

There's been an interesting debate going on over the past two weeks focusing on the quality of Encyclopedia Britannica vs. that of Wikipedia.

The debate was prompted by an article in the scientific journal Nature last December.  Nature set out to use peer review to compare the accuracy of Wikipedia vs. that of Encyclopedia Britannica.   To conduct the test, 42 domain experts each analyzed a single topic from both encyclopedias, covering a wide range of scientific disciplines.  Reviewers were asked to review the articles for three types of errors: factual errors, critical omissions and misleading statements.  The tests were blind (i.e. reviewers did not know the source of the listing they were reviewing).

The results were quite interesting.  Not surprisingly, Britannica had fewer errors in the overall survey, but not by much.  For the 42 topics, there were 162 errors uncovered in the Wikipedia entries vs. 123 for Britannica.  Interestingly, of eight "serious" errors, four each were found in Wikipedia and Britannica.

Last week, Britannica struck back with a response to the Nature article.  Britannica did not focus its attack on the results, but instead on the methodology of the study, complaining, for example, that Nature used supplemental Britannica content from outside the Encyclopedia Britannica.  In addition, it spent quite a bit of the response on wording – arguing that where Nature said "Wikipedia comes close to Britannica," one-third more inaccuracies (162 vs. 123) were not that close.
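For readers who want to check the arithmetic behind that "one-third more" claim, here is a quick back-of-the-envelope calculation (the error counts and topic count are the figures from the Nature study cited above):

```python
# Figures reported by the Nature study: 42 blind-reviewed topics
wikipedia_errors = 162
britannica_errors = 123
topics = 42

# Average errors per reviewed entry
wp_per_article = wikipedia_errors / topics   # roughly 3.9
eb_per_article = britannica_errors / topics  # roughly 2.9

# Wikipedia's excess error count relative to Britannica
excess = wikipedia_errors / britannica_errors - 1  # roughly 0.32, i.e. about 1/3 more

print(f"Wikipedia: {wp_per_article:.1f} errors per entry")
print(f"Britannica: {eb_per_article:.1f} errors per entry")
print(f"Wikipedia had {excess:.0%} more errors overall")
```

So both sides have a point: in absolute terms the gap is about one extra error per entry, but in relative terms Wikipedia did have roughly a third more errors than Britannica.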

This week, Nature published a brief response to the Britannica article on its blog, defending its methodology and results.

What Britannica seems to be missing is that this is a public relations battle that it cannot win.  For an organization selling premium content compiled by experts, splitting hairs over whether its content is slightly better than a free source compiled by the unwashed masses is a losing battle.  Rather than hiding behind definitions of what "is" is, the team at Britannica should take this as a siren call to look at their products and value proposition.  Long-term, for Britannica to remain a viable business, it needs to better understand the needs of its users and develop products that uniquely address those needs.

Perhaps the most interesting aspect is that it took Britannica more than three months to respond to the Nature article.  Considering that Nature published the specific findings for each of the 42 topics in the original article, it shouldn't have taken Britannica long to fact-check them and put together a response.  (In contrast, the team at South Park rewrote an entire script and produced a new episode in less than a week after Isaac Hayes quit as Chef due to pressure from Scientology.)  With Wikipedia's ability to respond to changing information within seconds or minutes, Britannica's slow-footed response seems telling.

Many content providers continue to live in denial, hiding behind claims that "our quality will win out" over Internet upstarts.  But, the quality differential is rapidly diminishing.  Whether comparing U.S.-based editors to outsourcing ("they'll never understand our market"), domain experts to the wisdom of crowds ("our PhDs have knowledge no one else can match"), or manual vs. automated tagging ("a computer can't understand the nuances of our taxonomy"), the gap is disappearing.

Traditional content providers hoping to still be around 5-10 years from now need to rethink their strategy.  Rather than relying upon domain knowledge in the compilation of information, they should focus that knowledge on understanding how information is consumed.  That will enable them to build the vertical market workflow-based applications that will continue to command premium value, while their basic content becomes commoditized.
