My Photo

Email me


Made with ImageChef

Google Content Search


Powered by Rollyo

Content Matters Community

Content Links

  • AlacraBlog
  • AlacraWiki
  • billtrippe.com
  • ContentBiz
  • E-Media Tidbits
  • eContent Magazine
  • InfoCommerce
  • ONLINE Insider
  • PaidContent.org
  • Research Recap
  • Seth Godin Blog
  • Shore Communications
  • That We Know
  • The Content Wrangler
  • Web Ink Now
Powered by TypePad

Content Industry Jobroll

  • Jobs from Indeed

Syndication and Reuse

  • Creative Commons License
    This work is licensed under a Creative Commons Attribution 2.5 License.

« March 2006 | Main | May 2006 »

April 26, 2006

The 50 Content Companies that Matter: Nature

Nature_logo_3 For various reasons, much of my focus tends to be in the area of business information.  But while the corporate and financial markets often drive innovation in the business info segment, there is quite a bit of innovation in the STM market as well.

Recently, a lot of that innovation seems to be coming from the team at scientific journal Nature.  Nature is part of the Nature Publishing Group (“NPG”), a division of Macmillan Publishers Ltd.  While many of the companies profiled on this blog are early stage, Nature dates back to 1869.  With 400 employees, NPG publishes sixteen journals and four clinical practice titles.

Recently, the New Technology Team at NPG launched Connotea, a social bookmarking site for the scientific community.  Connotea is loosely based on del.icio.us, though they have integrated functionality specific to the scientific community.  For example, bibliographic information from a number of content sources (Nature, Pubmed, Science, Amazon and others) is automatically fetched when these pages are saved to the site.  Connotea also supports the RIS file format, favored by many information professionals, so that users can upload entire collections of references.

Connotea Connotea also includes a wiki-based set of Community Pages, which allow users to write and edit content about the Connotea service.  I first found out about Connotea a few weeks ago, when I noticed a number of visitors to Content Matters coming from a link on the Connotea site (in response to a post I did on the Nature vs. Britannica “feud”).

But, Connotea is hardly the only forward-thinking solution from Nature.  They have been early adopters of RSS and podcasting, and have even launched a mashup of Avian flu reports with Google Earth.  A number of Nature writers and management have blogs, as does even the CEO of parent company Macmillan.

Nature has also taken some steps towards support of the Open Access movement.  For those unfamiliar with Open Access, there has been a major movement to get scientific journals to allow their writers to submit their manuscripts to public archives.  Since much of the scientific research done today leverages public funding, the OA community argues that the resulting scientific findings should be shared with the wider scientific community.  Public Library of Science, another of the “50 Content Companies that Matter” has been a leader in this space.  While Nature has not fully embraced OA (it has proposed authors only release their content following a six-month embargo), they have been more supportive of it than many of their peer journal publishers.

In a market where a few large companies control access to much of the critical information, Nature is a shining star for their flexibility, their willingness to test new technologies and their efforts to keep the “community” in scientific community.  Nature and NPG are clearly one of the 50 Content Companies that Matter.

April 24, 2006

Netvibes: A user configurable, Ajax-based RSS reader

Netvibes_logo As RSS feeds proliferate, it becomes more and more difficult to keep up with critical blogs and news feeds.  While mainstream tools like my.yahoo will probably become the dominant platform for reading RSS in the near-term, there are a number of more innovative and interesting viewers out there.

One of the more compelling offerings is Netvibes.

Netvibes is an ajax-based RSS home page.  At first glance, it doesn’t look that different than other portals.  Once you begin to use it, though, you can see how Netvibes has created a powerful and flexible interface that it simple to use.

Netvibes_screen Users can drag and drop content panels to any spot on the page.  Each panel can be customized with a click, changing the color or number of posts to include.  In addition to text-based feeds, you can point to image feeds such as flickr, a websearch box or links to any other site such as delicious.

As the site has been developed using Ajax, all of this customization can be done without redrawing the full page. 

What’s the impact for publishers?
RSS will soon be changing the content delivery game in a big way.  If you haven’t begun developing your RSS strategy, the clock is ticking.  RSS will be at the heart of Microsoft’s new IE7 and Windows Vista.  The ubiquitous orange feed icon will be prominent in the applications, both of which will deploy a common feeds list and a feeds API.

What this means is that the user experience will change.  With feeds, the publisher will no longer be in full control of how their content is being consumed.  Users may use a simple interface, like my.yahoo, to browse articles or they may wish to configure their own custom interface through tools like Netvibes.  Users may prefer online-only readers such as Netvibes, existing portals such as my.yahoo, or may prefer to read their content offline using apps like Feedreader.  Solutions providers may mash your feed with other content, developing applications that you'd never considered.

Publishers will soon have to make decisions about what content they will include in their feeds and whether they want those feeds to be easily syndicated by others.  Those decisions will have critical branding implications.  More importantly, content providers will have to decide whether to make their feeds customer-centric or advertiser-centric.  Today, many publishers (NY Times, WSJ Online) limit their RSS feed to just a headline or a few words from an article, forcing the reader to click through to the publisher’s website for authentication or to drive advertising page views.  But this is not a very customer-friendly experience and may send your readers looking for similar content elsewhere. 

In the meantime, if you aren’t using RSS yet, take a few moments to configure Netvibes and make it your browser Start page, so you can begin to learn how RSS may impact your 2006-2007 strategy.

April 20, 2006

SocialText: Communities that Work

Socialtext_logo One of the key characteristics of “Web 2.0” is social software, used to bring together and facilitate community interaction. 

This week, I add one of the emerging leaders in social software, enterprise wiki provider SocialText, to my list of Emerging Content Technologies.  SocialText develops and hosts wikis for companies for both internal and external use.  Wikis are rapidly beginning to replace intranets and portals for managing the communications around projects.

Unlike earlier knowledge management tools, wikis can rapidly increase productivity.  According to SocialText CEO Ross Mayfield, “a typical wiki can eliminate 30% of the email around a project.” 

Rossmayfield_1 One key to the success of SocialText, and wikis in general, is their decentralized structure.  Rather than a centralized, IT-focused solution, a SocialText wiki can be set up by non-technical staff in a short time.  Typical projects involve a handful of initial users, then once a critical mass of information is developed they expand to a wider audience.  By sharing control with their user community, sponsors of wikis can drive participation and a feeling of ownership.

As a Web 2.0 company, SocialText is committed to an open environment.  They will soon release an Open Source version and are involved in various open source initiatives today.

What’s the opportunity for publishers?
According to Elsevier Vice Chair Y.S. Chi, as content becomes commoditized, “the role of the publisher is beginning to shift from creator of content to manager of markets”.  Publishers have the ability to leverage their brand to develop communities focused around a given subject area.  Using wiki technology to cultivate a strong user community can provide you with a platform to sell various solutions to that market.  The difficult part for publishers will be ceding the absolute control that publishers are accustomed to.  But for those who do, the rewards may be significant.

April 18, 2006

Return Path launches "Credit Score" for direct emailers

Return_path_logoEmail performance management provider Return Path has just launched a new service called Sender Score Reputation Monitor.  Return Path's focus is on helping companies ensure deliverability of their email to customers and subscribers. 

Return Path has positioned Sender Score as similar to a Credit Score - it's a quantitative measure of a sender's reputation for legitimacy.  This reputation data can be used by ISP's to drive email blocking and filtering.  Companies with a "good" Sender Score should see their emails delivered successfully to users, while those with bad scores could be suppressed.

Email spam continues to be a large problem.  Most of the proposed solutions (from charging emailers to have their mail delivered to requiring senders to authenticate each addressee) are fraught with problems.  Return Path's quantitative approach seems well-thought through and balances the needs of both senders and recipients.

For more information on Sender Score, take a look at Fred Wilson's post.

April 13, 2006

Google launches ajax Calendar

Google_calendar
Google has unveiled Google Calendar (as a beta of course).
It's an ajax-based calendar, allowing users to share schedules with others.  More importantly, Google has developed this as a platform, so look to see some interesting timeline-based mashups in the near future.  Content providers with date-specific material should explore mashing it up with the Google Calendar.



BSEC Wrap Up

Desert_sunset The past few days at the Buying and Selling eContent conference were a mix of good and bad. 

BSEC, known for good networking and an inconsistent program, lived up to its reputation.  The turnout was strong and the climate encouraged networking.  The program itself was mixed, with a few strong speakers and panels combined with some that just didn’t make much sense or big-name speakers like Esther Dyson, who either didn't know the market or simply was disinterested.

The audience, too, was mixed.  Like the speakers, there were a handful of attendees who seemed closely attuned to the convergence of technology and content, and were actually putting to use some of the new tools and approaches.  On the other end of the spectrum, there were some speakers and attendees who seemed to have ignored all the advances of the past three to four years.  The majority of the attendees were somewhere in the middle.  The good news is that, unlike past years, the absolute fear of Google seemed to be diminishing.  Instead of “will Google take away my business”, they were asking “how do I play well with Google” and “what happens if Google doesn’t play nicely with me”?

Tim_oreilly For most, Tim O’Reilly’s keynote was the high point of the program.  Unlike analysts and pundits, O’Reilly is one of the few visionaries in this industry who has put his concepts into practice.  Other speakers from the vendor community, such as Ross Mayfield at SocialText and R.J. Pittman of Groxis, clearly were showing Web 2.0 capabilities, but their showcase clients were mostly in the consumer not b2b community.

The second day keynoter, Y.S. Chi, Vice Chair at Elsevier, showed there is hope for this industry.  While the model of publisher that he described, serving as market maker (“to know and honor customer needs”) might not sound earth shattering, it’s a message that you’d never have heard from a company like Elsevier in the past.  It’s clear that Y.S.’s vision for Elsevier is for a company attuned to its customer needs and one that comprehends that it needs to add value to its content, rather than simply hide it behind an exclusive wall. 

The final panel of the conference was a wrap-up, with a mix of panelists including Rafat Ali, Michele Manafy, David Seuss, Marydee Ojala and others.  Rafat seemed disappointed by what he’d heard during the two-day conference.  As compared to the media markets he normally follows, this segment remains risk-averse and unaware.  David Seuss attempted to provoke controversy, suggesting that all the attention on Google made no sense, and that intertwining premium content with free Google results would simply devalue the premium content.  While David was being intentionally provocative, his statements could have begun an interesting debate had there been more than 60-70 attendees left at that time.  In my opinion, premium content providers must have a Google strategy, but that strategy should be more than simply good SEO and SEM.  We all need to work well with Google (Yahoo and MSN as well), to gain exposure beyond those core users who know us.  At the same time, premium content providers need to add greater value through metadata, tools, information presentation and analysis.

In closing, BSEC is a good conference, but it could be a great one.  The venue draws a strong group of attendees, so it’s a very good business development and networking opportunity.  There are some terrific speakers and panelists, but the inconsistency of the program is frustrating.  I think that there are three things that could be done to improve next year’s conference:
1. Set a few themes up front, then make sure that those themes run through all the panels
2. Vet the speakers more closely to make sure that their experiences are relevant to those themes.  The bigger names don’t always make for the best panelists.
3. Provide some introductory material to level-set the audience.  Some of the questions that came up (“what’s the long tail”?) could have been addressed early on, so that everyone was on the same page.

Were you at BSEC?  What do you think?  Please click Comments to add your thoughts or
For further thoughts on BSEC, check the following blogs:

John Blossom’s live blogging
David Scott’s WebInkNow
Steve Goldstein’s AlacraBlog
Shannon Holman’s If You See Something, Say Something
PaidContent
Larry Schwartz at Newstex
Ross Mayfield’s weblog
Dale Wolf’s Context Rules Marketing


April 11, 2006

Jigsaw raises $12M

Jigsaw_logoJigsaw, an innovator in CRM and lead identification, and one of our 50 Content Companies that Matter, has raised $12m in venture capital.
Austin Ventures led the round, along with prior investors Norwest Capital and El Dorado Ventures.

Despite the fact that TechCrunch believes Jigsaw to be one of the most evil companies in the world, IMO their model is nothing but a modern day version of the way that directory publishers have compiled information for years.

I am sensitive to Michael Arrington's belief that users should be able to opt-out of the Jigsaw database, but having spent many years in the compiled information business think he's overhyping the risk. 

Jigsaw is hardly giving away personal information - it's not home addresses, names of children or anything else that might lead to evil use.  It's simply allowing users to upload business contact information for business professionals.  If you've ever subscribed to a controlled circulation magazine, you've already exposed yourself to worse.

With its low cost base, it will be interesting to see how Jigsaw uses the funds for business development.

BSEC: Search engine panel

Camelback7After lunch, Jeff Cutler of Answers.com led an interesting panel bringing together three leaders from the search market. Jim Gerber, Director of Content Partnerships at Google, Cliff Hawk, biz dev head for Microsoft Windows Live and Ryan Massie who heads news search and local initiatives at Ask.com were together on stage.

It was an interesting discussion as these three competitors were also faced with defending their business models to an audience of skeptical publishers.

Google

Jim Gerber provided an overview of four related businesses within Google – Google News, Google Scholar, Google Video and Google Book Search. He stressed how in all four models the content owner controls pricing and access.

Windows_live Cliff Hawk walked through the new organizational structure at Microsoft, with the new Windows Live. According to Cliff, Windows Live is about tools, utilities and web services, user choice and selection and customization of content. Windows Live, which will leverage many of the tools of MSN, will incorporate RSS, gadgets, mail and messenger. According to Cliff, the imminent launch of Microsoft’s Ad Center Ads (the Microsoft version of AdWords), may provide some opportunities for early adopters to buy keywords while auction prices are low.

Ask_1 Ryan Massie, of Ask, provided a brief history of Ask.com and how they finally had to kill off Jeeves the Butler, as users didn’t perceive that Ask was a true search engine.

A clear theme throughout the discussion was that these search engines could threaten traditional content aggregators. As more and more premium content is exposed through the search engines, content aggregators will have to prove that they are adding value. Most aggregators have begun to move up the value chain, through development of workflow-based applications, through the provision of ancillary applications that help corporations manage their content purchasing or by adding significant metadata to improve the user experience. Those who view themselves as simply a pipe may find it harder and harder to hang on.

BSEC Panel: The Quest to Know Your Customers

Camelback9BSEC Panel: The Quest to Know Your Customers

Depending upon how you look at it, Hal Espo had either the easiest or most difficult moderator role of the conference.
The last panel before lunch included three speakers, each with a strong combination of personality and insights.  John Blossom began the levity with the comment that "It's good to be in the position that I can't ask any questions".

Blossom John Blossom led off with a summary of a custom study that Shore had recently performed for Hoovers, looking at purchasing patterns of premium business information.   The study showed that, even for large companies, a significant amount of content is being purchased at the department and workgroup level.  In fact, 49% of purchases at large organizations were made via credit card. 
The study asked which types of content were most frequently used; web, news and other free content came out on top.  When the question was flipped to ask which were the most important sources of business information, market research and subscription databases were at the top.
John’s takeaways from the study are that there remains a strong demand for premium business information, that ease of use and access is key, and that publishers must sell to individuals, departments and institutions.

Lou Celi, of the Economist Intelligence Unit, walked us through a quick history of electronic content at the EIU.  He broke it into three phases:
During the 1980’s, it was selling “old wine in new bottles”, repackaging existing Economist content for delivery through LexisNexis, Dialog, MAID and others.
In the mid-90’s, that changed to “new wine in new bottles”, where content was developed specifically for the e-user.
More recently, Lou indicates that the business had moved “outside the bottle”, where they are a full e-business, integrating content directly within customer portals and applications, doing custom research and putting on conferences.  It has changed their entire business model.  While they still sell subscriptions and pay-per-view documents, they also offer flexible enterprise licensing as well as sponsorship for “thought leadership” events.

BSEC Changing Business Models

Camelback8The morning at BSEC continued with a 2-part panel focused on models in the content industry.

The first panel, “the Subscription Dilemma” included Jonathan Lewin of eMeta, Andrea Broadbent of McGraw Hill and Adam Bernacki of Leadership Directories, focusing on paid content models.  Andrea described how they converted  ENR.com from a free site to a paid site, leveraging critical construction economic data as the “crown jewels” of the site.  They have converted 71,000 paid subs. 

Adam spoke of the differences between demand-publishing at D&B and selling canned content such as that of a directory publisher like Leadership Directories and how a balance can reach audiences outside of the core customer base.  While transactional (pay-per-view) sales are less than 1% of LDI’s revenues today, Adam sees that as a strong growth opportunity.

The second part of the panel was focused on the role of SEO and SEM for premium content providers.  The panel consisted of Rafael Cosentino of Congoo, Matt Hong of Thomson Gale, Pam Springer of ECNext and David Scott of Freshspot Marketing.

Rafael led off with a discussion of how premium content providers may be able to help themselves by developing co-branded networks.  During his previous experience at Healthology, Rafael developed numerous co-branded networks.  As a result, when he did a Yahoo search on “cardio health”, 90% of the results on the first page came from Healthology and its partners.  Rafael walked through a list of top tips for successful co-branded networks.  He also suggested that premium content providers could take a “the enemy of my enemy is my friend” approach, where premium content providers, by linking to one another, could improve each other’s traffics, giving all of them stronger positioning vis-à-vis open web content.

David_m_scott David Scott provided four key observations for those involved in developing search engine strategies:
1. First, he indicated how search is remarkable in that it’s the only form of marketing that does not require interruption of the user from what they are doing.  You reach people at the exact moment they are seeking you.
2. Good search engine strategies look for sites that aggregate audiences.  As an example, he indicated how MarketingSherpa, with its focused audience, generates much higher quality leads than general search like Google.
3. Don’t be egotistical.  Understand your buyers first, then develop SEO/SEM programs based upon what they are looking for.  Too many publishers select keywords based upon what they have to offer, not what their customers are looking for.
4. Pay very close attention to the landing pages that you drive people to.  It’s amazing how many companies invest tremendous budgets on SEM, then drive that traffic to their home page or a weakly designed landing page.

BSEC Day Two Keynote - YS Chi

Camelback5 Buying and Selling eContent Day Two opened with a keynote by Elsevier Vice Chair Y.S. Chi.  Y.S., formerly Chairman of Random House Asia and President and CEO of Ingram Book holdings, joined  Elsevier a year ago, to head global academic and customer relations.

Ys_chi
Y.S. spoke of how it’s very difficult to be a publisher today.  He described how we need to redefine publishing, where the primary function is to “quickly get right information to the right person in the right context” and where the new role of the publisher is as market maker (“to know and honor user needs”).

As a frame of reference, Y.S. compared the impact of the Internet to the invention of the printing press.  As with the printing press, the web has led to content explosion, new communities and the democratization of knowledge.  Today’s “uber publishers” can adapt faster, providing deeper communities and do a better job of filtering content.

The opportunities for publishers are threefold:
1. Filter for quality: while incumbents can leverage your brand for trusted content, that won’t last, so publishers need to do a better job filtering.  Filtering must be matched to the user needs: for medical content a top-down editorial model is probably best, while the wisdom of crowds is great for movie reviews.
2. Enhance productivity: the productivity of knowledge workers continues to fall; publishers must embed content into users workflow to make them more productive (as ES has done with Science Direct, Scirus and Scopus)
3. Nurture networks to develop compelling communities.  But, communities must be the “gotta go” type, creating a visceral response from users when they visit. 

In closing, the “uber publisher” must stay attuned to user needs and experiment and learn.  There are many user needs, which translate to many business models.

BSEC Day Two

Camelback_sunriseAs the sun rises over Camelback, we get set to kick off Day 2 of Buying and Selling eContent.

This morning kicks off with a keynote by Y.S. Chi, Vice Chairman of Elsevier.

Following that is a panel entitled "The Subscription Dilemma: Is it Time to Move On - or Not?" featuring McGraw Hill's Andrea Broadbent, Adam Bernacki of Leadership Directories and Jonathan Lewis of eMeta, moderated by Joe Bremner of Kennedy Information.  It should (hopefully) raise issues that make some of the traditional publishers a bit uncomfortable with the long-term prospects for their model.

The morning continues with a look at how premium content can play in the world of SEO and SEM, featuring Congoo's Rafael Cosentino, Matthew Hong of Thomson Gale, David Scott and Pam Springer of ECNext, again moderated by Joe Bremner.

April 10, 2006

BSEC Tim O'Reilly keynote

Camelback6 Buying and Selling eContent is so packed with content that they needed two keynotes the first day.  The afternoon keynote speaker, Tim O’Reilly of O’Reilly Media, was the highlight of the day. 

O’Reilly, credited with coining the name “Web 2.0” provided a set of six key rules for successful Web 2.0 applications:
1. Users add value
2. Network effects by default: Tim talked of the “architecture of participation” where successful apps default to aggregating user data as a side benefit to usage (e.g. Napster defaulted to sharing=on)
3. The Perpetual Beta: key in the software as a service model is continual improvement and rolling feature enhancements.
4. Services have to be above the level of a single device: the PC is not the only access device for Internet applications and those that are limited to such are less valuable than those that reside on the Internet.
5. Data is the next “intel inside”: the BSEC audience was pleased to hear that their content could be the differentiator for applications.  Examples included Navteq, the source of Google maps and virtually every other street mapping application, and Gracenote, the “CDDB” database that matches titles to track numbers within all the major music applications.
6. A platform beats an application every time.  And, unlike the MS Office model, the Web 2.0 world allows small pieces loosely joined together to add new value to the platform.

Tim also spoke of Asymmetric Competition, where a new competitor with a different business model may kill your business.  His example of Craigslist killing the newspaper classified business is now a classic, but there are many others out there.  For those who’ve read Outsell’s Neighborhoods of the Information Industry, this theme should ring familiar.

One of Tim’s key points came out of the Google Maps experience, where they had not intended for the application to serve as the basis for what would become mashups.  His takeaway is that “if your users aren’t surprising you by the ways that they build on your (Web 2.0) product, then you’re doing something wrong.”

To catch up on the rest of the Buying and Selling eContent conference, take a look at the following blogs.  All told, there were close to a dozen attendees blogging in some capacity today.
Rafat Ali will be speaking on a panel Tuesday and is live-blogging throughout.
John Blossom at Shore is posting on his special events weblog.
Ross Mayfield has an interesting recap on Tim O'Reilly's keynote.
Larry Schwartz, of Newstex, points out his win then loss of the $20 cliche bet for first use of "Long Tail"
David Meerman Scott leveraged the WebInkNow blog to field questions during his panel.
Dale Wolf adds his thoughts on his Context Rules Marketing blog
Shannon Holman of ALM Research compares ALM's progress to the comments of some of the speakers (still no sign of data dog in Scottsdale yet)
More to come in day 2...

BSEC Content Technology Meets Web 2.0

Camelback1 One of the more interesting sessions at Buying and Selling eContent was the afternoon session entitled “The Next Wave: Content Technology Meets Web 2.0”.
Ross Mayfield, SocialText CEO, led off with a discussion of wikis and collective knowledge.  First, he enumerated the two types of Collectives that drive social networking applications:
Collective Intelligence (for example, Digg or Memeorandum), where there is a low threshold of user engagement; and
Constructive Intelligence (for example, Wikipedia), where there is a very high level of user engagement required. Clearly, the enthusiasts and other users who are willing to contribute at the Constructive Intelligence level are extremely valuable.

The key takeaway from Ross’ talk is that “Sharing Control Creates Value”.  It’s extremely difficult to share control, particularly for premium content providers, but virtually every successful Web 2.0 application has that at the heart of their business.

BSEC User-Generated Content

Camelback2
Buying and Selling eContent got off to a quick start.  After Esther Dyson's keynote, the first panel discussion, “When Everyone’s a Publisher: The impact of user-generated content” was moderated by David Meerman Scott.

Rusty Williams, Co-Founder of Prospero, led off with a view of how professional publishers should look at user-generated comment.  In Rusty’s view, premium publishers need to be orchestrators of content.   Publishers should embrace this – build a blog, subscribe to RSS, etc.

The model is changing from “inside-out” to “outside-in”, collecting content from all sources and places.

Bloggers shouldn’t be smug; it’s not a religion, it’s just a technology.  Only blog if that’s something you feel comfortable doing.  Richard Branson: “business is about being true to yourself” – use the tools, feel how it expresses you, etc.

Larry Schwartz, President of Newstex, discussed their inclusion of blogs within their news feeds.  According to Larry, Newstex views blogs as commentary; they have several hundred blogs today.  In terms of why bloggers would syndicate their blog, they have found bloggers choose to syndicate their content for a few reasons:
1. To generate traffic for their blog
2. To get publicity (particularly consultants)
3. Some professional journalists want the chance to post their opinions without having to be reviewed by an editor.

Newstex users, largely traders and other financial professionals, see blogs as a way to find information before the mainstream media.  He cited the example of last week’s announcement by Apple Computer of their intention to support Windows.  Gizmodo picked up the story at 9:05am, while AP and other wires were 30-40 minutes behind that.  The time advantage allowed traders to get in and out of the stock while others were just reading the news.

Cyndi Schoenbrun, researcher for Consumer’s Union, uses blogs as a research tool.  They serve as an early warning system as blog editors post to their blog several times per day, a more frequent news cycle than the MSM.  Also, blog editors often have good access to CEO’s at conferences and provide podcasts and transcripts.

For Cyndi, blogs must be proven to have authority, for her to rely upon them.  While authority is hard to determine, she looks for transparency (does the blogger identify their professional identity?) and also looks where advertising is not intertwined with the content.

Some of the sources Cyndi relies upon are PaidContent, I Want Media, Mediabistro and Rexblog.

In response to an audience question about whether Consumer Reports would look to blend user-generated content with their authoritative research, Cyndi indicated that they were already doing a modest amount of that today by having subscribers post comments on automobiles.  They are currently looking to expand that to consumer electronics.  At the same time, they tend to act cautiously and are concerned that a manufacturer might stack the deck with favorable ratings and reviews.

Live blogging from Buying and Selling eContent

Camelback4Buying and Selling eContent kicks off today at Camelback.

This morning's keynote speaker is Esther Dyson.  Following that could be one of the more interesting panels of the conference, focusing on user-generated content.  David Scott will moderate a panel featuring Larry Schwartz of Newstex, Cyndi Shoenbrun of Consumers Union and Rusty Williams of Prospero.  It's sure to generate some interesting debate over "trusted content" vs. user generated information.  Other panels today will focus on content aggregation and web 2.0 trends. 

Access permitting, I will share feedback throughout.


April 09, 2006

Blogburst to launch Monday

Blogburst, the blog syndication service, which I've posted about previously, will officially launch on Monday.
It will be interesting to see how the various newspapers choose to integrate the blogs into their services.  Interestingly, this comes the same day that the NY Times Public Editor announced that the Gray Lady has expanded its use of blogs written by Times reporters.
Whether Blogburst can successfully monetize blogs or not, it's clear that the mainstream media is taking a more serious look at how they can blend authoritative content with opinion.

April 06, 2006

Off to Camelback

Buying_and_selling_econtent Next week, I'll be at Camelback in Scottsdale for InfoToday's Buying & Selling eContent. 

This is consistently one of the best conferences for the content industry and this year has a strong lineup of speakers led by Esther Dyson and Tim O'Reilly.  There's a good mix of established providers and upstarts.  Of course, the gorgeous venue always attracts a top-notch audience making this one of the best networking events of the year.

I will, of course, blog from the conference.  If you're planning to be there, .

April 05, 2006

Chevy's Viral Marketing Gaffe

Chevy_tahoe1Chevy recently launched an advertising campaign designed to become fodder for viral marketing, as detailed in yesterday's New York Times.  Instead, the campaign is quickly becoming fodder for late night hosts.

In the campaign, tied to television’s The Apprentice, users are provided a set of tools to storyboard a pseudo commercial for the Chevy Tahoe.  Users can string together choices from a selection of short video segments, adding text and background music.  The resulting “commercial” is saved, and links can be emailed to friends or colleagues.

Tahoe_desert Clearly, Chevy did not understand the dynamics of the market most likely to use the application – teenagers and young adults.  While Chevy may have envisioned users creating ads touting the Tahoe’s spacious interior, ample seating or rugged looks, that’s not what most of the users were thinking about.  Instead, they leveraged the desert scenes, ice capped mountains and waterfall imagery to drive satirical snippets focused on the war in Iraq and global warming.  For an example of how this works, take a look at the example that I created.

The good (or bad) news is that, as designed, these commercials have spread virally, as users share their creativity with friends.  Unfortunately, most of them do not contain the mom, apple pie and Chevrolet message that might have been originally intended.  This should not have come as a surprise to Chevy.  All it would take is a 15-minute tour of You Tube to see that satire and parodies (along with way too much lip-synching) are the core elements of this genre.

Perhaps Chevy will claim that all publicity is good publicity, but I disagree.  While You Tube parodies like Brokeback to the Future may have provided benign publicity to the underlying film, the Tahoe commercial parodies will not cast a positive light on Chevy.

The clear message here is that it’s critical for business professionals to use and understandnew innovations in technology.  Spend an hour watching You Tube, create a page on MySpace, set up an RSS reader and begin to read blogs.  It’s not enough to simply understand the capabilities of these tools.  It’s critical that you understand how they are being used, the culture of the users and the potential implications of your efforts.  I’m guessing that few within Chevy’s advertising and marketing team had done so.

P.S. If you'd like to create your own Tahoe ad (or parody), you have until Monday to do so at the special  Chevy website.

April 03, 2006

Britannica vs. Wikipedia

Nature_logo There's been an interesting debate going on the past two weeks focusing on the quality of Encyclopedia Britannica vs. that of Wikipedia.

The debate was prompted by an article in the scientific journal Nature last December.  Nature set out to use peer review to compare the accuracy of Wikipedia vs. that of Encyclopedia Britannica.   To conduct the test, 42 domain experts each analyzed a single topic from both encyclopedias, covering a wide range of scientific disciplines.  Reviewers were asked to review the articles for three types of errors: factual errors, critical omissions and misleading statements.  The tests were blind (i.e. reviewers did not know the source of the listing they were reviewing).

Wikipedia_cambrian_explosion The results were quite interesting.  Not surprisingly, Britannica had fewer errors in the overall survey, but not by much.  For the 42 topics, there were 162 errors uncovered in the Wikipedia entries, vs. 123 for Britannica.  Interestingly, of eight "serious" errors, four each were found in Wikipedia and Britannica.

Last week, Britannica struck back with a response to the Nature article.  Britannica did not focus their attack on the results, but instead on the methodology of the study, for example complaining that Nature used supplemental Britannica content outside of the Encyclopedia Britannica.  In addition, they spent quite a bit of their response focusing on the wording - stating that where Nature said "Wikipedia comes close to Britannica", that the 1/3 more inaccuracies (162 to 123) were not that close. 

This week, Nature published a brief response to the Britannica article on its blog, defending its methodology and results.

What Britannica seems to be missing is that this is a public relations battle that it cannot win.  For an organization selling premium content compiled by experts, to split hairs over whether its content is slightly better than a free source compiled by the unwashed masses, is a losing battle.  Rather than hiding behind definitions of what "is" is, the team at Britannica should take this as a siren call to look at their products and value proposition.  Long-term, for Britannica to remain a viable business, they need to better understand the needs of their users and develop products that uniquely address those needs. 

Perhaps the most interesting aspect is how it took Britannica more than three months to respond to the Nature article.  Considering that in the original article, Nature published the specific findings for each of the 42 topics, it shouldn't have taken them long to fact-check them and put together their response.  (In contrast, the team at South Park rewrote an entire script and produced a new episode in less than a week after Isaac Hayes quit as Chef due to pressure from Scientology).  With Wikipedia's ability to respond to changing information within seconds or minutes, Britannica's slow-footed response seems telling.

Many content providers continue to live in denial, hiding behind claims that "our quality will win out" over Internet upstarts.  But, the quality differential is rapidly diminishing.  Whether comparing U.S.-based editors to outsourcing ("they'll never understand our market"), domain experts to the wisdom of crowds ("our PhDs have knowledge no one else can match"), or manual vs. automated tagging ("a computer can't understand the nuances of our taxonomy"), the gap is disappearing.

Traditional content providers hoping to still be around 5-10 years from now need to rethink their strategy.  Rather than relying upon domain knowledge in the compilation of information, they should focus that knowledge on understanding how information is consumed.  That will enable them to build the vertical market workflow-based applications that will continue to command premium value, while their basic content becomes commoditized.

RSS Feed

  • Subscribe in NewsGator Online
  • Subscribe in Bloglines

  • Add to netvibes

  • Add to Google

Subscribe to Content Matters via email


  • Enter your Email


    Powered by FeedBlitz

Barry's Twitter Updates

    follow me on Twitter

    Premium Content

    • Premium Content from Alacrastore.com

    Facebook

    Research Topics

    Search the Alacra Store

    • Search For Business Information