July 22, 2007

Can Business Users Get Value from Facebook?

Can business users get value from Facebook?

That's a question that I've seen and heard from a number of sources in recent weeks.  As Fred Wilson twittered this morning, it's the same question he had about LinkedIn in its early days, yet now he gets value from that site.

So, what does Facebook have to do to create value for business users?  Here are a few thoughts:

First, it needs to provide the tools to support business relationships.  Today, the only business relationship it supports is "worked together" (or I guess "hooked up" may apply to the workplace, but I'm not going to go there).  Business relationships can be much more complex.  We should be able to reflect relationships like client:vendor, investor:portfolio company, biz dev partners and more.

Next, Facebook should allow its users to set a sharing threshold of "friendship".  There may be details that I'd share with "friends" but not with business colleagues.  I'd like the ability, when I add a friend, to categorize them as a friend, a colleague or an acquaintance.  Then, when I add apps to my profile, I'd like to flag whether each app will be shared with each category of friend.  That helps me separate business apps from personal apps, while still keeping them all in one place.
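The friend/colleague/acquaintance idea above amounts to an access-control list keyed on relationship category.  Here's a minimal sketch of the data model; every class, category and app name below is my own invention for illustration, not anything in Facebook's actual API:

```python
from enum import Enum

class Tier(Enum):
    FRIEND = 1
    COLLEAGUE = 2
    ACQUAINTANCE = 3

class Profile:
    """Tracks contacts by relationship tier and per-app visibility."""

    def __init__(self):
        self.contacts = {}        # contact name -> Tier
        self.app_visibility = {}  # app name -> set of Tiers that may see it

    def add_contact(self, name, tier):
        self.contacts[name] = tier

    def add_app(self, app, visible_to):
        self.app_visibility[app] = set(visible_to)

    def apps_visible_to(self, name):
        # A contact sees only the apps flagged for their category
        tier = self.contacts.get(name)
        return [app for app, tiers in self.app_visibility.items() if tier in tiers]

profile = Profile()
profile.add_contact("Alice", Tier.FRIEND)
profile.add_contact("Bob", Tier.COLLEAGUE)
profile.add_app("Photo Albums", [Tier.FRIEND])
profile.add_app("RSS Reader", [Tier.FRIEND, Tier.COLLEAGUE])
print(profile.apps_visible_to("Bob"))  # → ['RSS Reader']
```

In this sketch, Bob the colleague sees my RSS reader but not my photo albums, which is exactly the separation of business from personal that I'm asking for.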

Once a framework like this is put in place, it will support business applications.  It would be simple, for example, to provide "degree of separation" relationships like LinkedIn offers. 

So, why do we need to build that in Facebook, if LinkedIn already offers it?

Facebook provides things that LinkedIn cannot easily match.  With the open platform, it's reasonable to expect industry-specific networking applications to emerge.  There are attributes of relationships which differ from industry to industry.  For example, a lobbyist might want to have a political party attached to each of her contacts.  Horizontal apps could also be easily built for recruitment, reference checking, business development and more.

As a platform, Facebook is well-suited to replace the "home pages" most people use today, such as my.yahoo.  I can pop an RSS reader into Facebook, so I can read all of my RSS feeds off that page.  I can integrate a calendar and address book (which should be able to synch with my corporate Exchange files and my BlackBerry).  I can also access it all from a mobile device, without having to fumble through sites that don't work well on mobile.

So, eight weeks after the platform was announced, it's clear that Facebook has not yet transformed itself into a business networking application.  But with a few enhancements to the core platform, and a bunch of creative developers using the platform, I'd bet it's there by next spring or summer.

July 20, 2007

Casting Call: Socialtext Seeks CEO

Ross Mayfield posts on his blog that Socialtext is seeking to hire a new CEO (or, CEO 2.0, as he calls it).  Mayfield intends to stay on as Chairman and President.

It's an open call and Ross has posted details on his blog.  If you're a driven business leader with experience in the software market, you might want to check it out.

July 16, 2007

Content Industry Group on Facebook

This week I created a Content Industry group on Facebook.  My goal is to provide a forum for business networking as well as a venue for sharing information.

Facebook users, just click here to join.  If you don't have a Facebook account, take a minute to create one then join.

July 13, 2007

Monetizing Facebook Apps

In the past week I’ve gotten a handful of questions (both in the comments here and face-to-face) about how Facebook applications will be monetized.  I’ll provide some initial thoughts here and welcome your comments.

First, we need to acknowledge that it’s early in the process.  The Facebook API was launched May 25 – exactly seven weeks ago today.  So, while there are some fairly obvious methods of monetizing Facebook traffic today, we can be sure that more interesting models will emerge in the months to come.

The second acknowledgement is that Facebook is a platform; it does not come with a built-in monetization mechanism.  Let’s remember how long it took before Google monetized its traffic, and then again before it provided the mechanisms (via AdSense) for others to do the same.

What the Facebook platform does provide is a distribution vehicle, enabling you access to a rapidly growing addressable market that already exceeds 24 million users.  The platform also lends itself well to two well-established Internet monetization methods: advertising and eCommerce.

  • Advertising: While other social networks have restricted the ability of third-party developers to run ads on their “widgets”, Facebook places no such restrictions.  So the most obvious method of monetizing a Facebook app is to build one that generates a lot of traffic and can serve ads.  This model should work for traditional media businesses.
  • ECommerce: The second method for generating revenue is through eCommerce.  Today’s top Facebook app is iLike, a simple application that allows users to indicate their favorite music, positioned as a music discovery service in the same vein as Last.fm or Pandora.  The iLike app for Facebook includes a link next to each song title, allowing users to buy that song, generating eCommerce transactions.  They also have a concert tour module where you can see which performers are on tour, then purchase tickets.  Developers could easily leverage existing affiliate marketing platforms (such as Amazon's) to monetize traffic for books, movies or any other products.
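As a concrete illustration of the affiliate approach, here's a sketch of building an Amazon-style affiliate link.  The ASIN and associate tag below are made up, and the exact URL format should be verified against Amazon's own Associates documentation before relying on it:

```python
from urllib.parse import urlencode

def amazon_affiliate_link(asin, associate_tag):
    """Build a product link that credits the given Amazon Associates tag.

    The URL pattern here is illustrative; check the Amazon Associates
    documentation for the currently required format.
    """
    return f"https://www.amazon.com/dp/{asin}?{urlencode({'tag': associate_tag})}"

# A hypothetical ASIN and associate tag:
print(amazon_affiliate_link("B000EXAMPLE", "mysite-20"))
# → https://www.amazon.com/dp/B000EXAMPLE?tag=mysite-20
```

Drop links like these next to each item your app displays (the way iLike does with songs) and every purchase that follows earns a referral commission.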

So, is Facebook a cash machine for developers?  Hardly.  Building a business on the Facebook platform requires development of a compelling application that remains useful to the user over time.  It will require tapping into viral markets, but also providing underlying marketing and PR to drive usage.  The Field of Dreams approach will not work in this competitive landscape.

It’s also important to understand that generating traffic on Facebook is slightly different than on other social networks.  Most social network apps go viral by having users post a widget to their page, then having other users interact with it.  As Lance Tokuda, CEO of RockYou, points out in this VentureBeat article,

“...the viral loops for Facebook (there are several) revolve around the news feed, the mini feed and the invite request.  Not around people coming to your page and interacting with it.”

Meanwhile, Lightspeed Venture Partners' Jeremy Liew provides a better sense of how Facebook apps are valued.  To Jeremy, there are four metrics that will determine the value of a Facebook app:
1. RPM: revenue per thousand page views (technically, iframe views).  This metric captures the revenue, whether it’s driven by a CPM, CPC or CPA (cost-per-action) model.
2. Page views per user per month: many apps don’t generate many page views, as they simply display in a single iframe (such as a daily horoscope).  Those that generate interaction with multiple pages will have higher value.
3. Monthly churn: the stickier the app, the more likely it will be retained over time.
4. Virality of the app (as described in the Lance Tokuda interview).
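Jeremy's first three metrics combine into a simple back-of-the-envelope value per user (virality, the fourth, multiplies the user count rather than the per-user value).  The figures below are invented purely for illustration:

```python
def monthly_revenue_per_user(rpm, pageviews_per_month):
    # RPM is revenue per thousand page views, so divide by 1,000
    return rpm * pageviews_per_month / 1000.0

def lifetime_value_per_user(rpm, pageviews_per_month, monthly_churn):
    # With monthly churn c, a user's expected lifetime is 1/c months,
    # so lifetime value is monthly revenue divided by churn
    return monthly_revenue_per_user(rpm, pageviews_per_month) / monthly_churn

# Hypothetical app: $0.50 RPM, 60 page views per user per month, 20% monthly churn
print(lifetime_value_per_user(0.50, 60, 0.20))  # roughly $0.15 of lifetime revenue per user
```

Run the numbers this way and it's clear why single-iframe apps like a daily horoscope struggle: cut page views per user by a factor of ten and you cut the value of every user by the same factor.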

I also see opportunities to make money in serving the emerging Facebook ecosystem.  In the first few weeks since the platform was announced, more than a thousand apps have been launched and tens of thousands of developer keys have been distributed by Facebook.  As more and more apps emerge, there’s a need for a more comprehensive directory for users to find the apps they need.  The existing “Top 10” list and broad categories are OK for starters, but they will always be dominated by large consumer apps.  There’s a tremendous opportunity for someone to launch a more navigable app directory that connects users with the more specialized apps they require.

Of course, there will be companies that specialize in app development (as already exist for MySpace widgets) and in app marketing, just as the SEO/SEM market has emerged around Google.

So what’s the best way to monetize Facebook?  I think that Charlie O’Donnell says it best.

“It shouldn't be up to Facebook to figure out your business model.”

The Facebook platform can dramatically shorten time-to-market of new applications, and help drive traffic.  But, it’s up to us to develop the applications and supporting business models to turn it into a business.

July 11, 2007

Finding Recruiters and Consultants in the Content Industry

Alacra launched the Alacra Wiki about two years ago.  The wiki provides detailed information on the companies, people and products that make up the business information and content industry.

Recently, Alacra added two new categories of content to the wiki: Recruiters and Consultants.  I get asked regularly for referrals in both of these areas.  Both segments are led by boutique firms, often sole practitioners, who have unique industry experience.  By adding these to the Alacra Wiki, it will make it easier to identify these niche providers (though I'm still happy to provide referrals to those who ask).

The People section of the Alacra Wiki is a "who's who" of the content industry, profiling key leaders from all aspects of the media and publishing community.  If you're not listed on the Wiki (check here), all it takes is a minute to register and add your information.  Or, if you'd rather have Alacra add your details, just drop a note to customer_service@alacra.com and they'll post the info to the Wiki for you.

The Alacra Blog has more details on the new categories and the Alacra Wiki in general.

July 10, 2007

Seeding the Facebook Ecosystem

Via TechCrunch comes news of the next step in the emergence of Facebook: a VC fund offering seed investments only in companies developing Facebook apps.

Valley VC Bay Partners has launched a new program called AppFactory, targeting developers of Facebook apps.  The investments will be small – typically in the $50k - $250k range, according to Bay partner Salil Deshpande – and the program is designed as a "fast-track" with few investment hurdles.  Interested parties can find all the details in the AppFactory FAQ.

While some will question the idea of betting on a platform that’s less than two months old, the concept has merit in my opinion.  By leveraging the Facebook platform, the time (and cost) to get to market is minimized.  And while it’s possible that Facebook is a fad, more and more serious players in the industry see it as the new dominant social media platform. 

The Facebook apps that have been launched to date are little more than eye candy, allowing users to share their preferences (whether music, movies or political causes).  Of course, as Marc Andreessen points out in his “Five f__ weeks” post, the Facebook platform has been out for less than two months.  That’s not a lot of time to conceive and launch meaningful software.  He predicts that by September, when students go back to school and business people finish vacations, we’ll see strong adoption of a compelling group of Facebook apps.  Betting against Marc has not been a rewarding endeavor in the past, so I’m going along with him on this call.

June 28, 2007

Facebook Ecosystem Emerging

ReadWriteWeb has an interesting post on the acquisition of Facebook applications.

The latest this week was the acquisition of Favorite Peeps by slideshow creator Slide.  Favorite Peeps is a fairly simple app that lets users display their favorite friends, providing their own details about them.  It's a nice complement to Slide's TopFriends app.

With Facebook's new platform, thousands of applications are being developed.  There will be a bit of a land rush as people strive to gain exposure for their apps.

The confusion over which apps are worth installing creates opportunities, particularly for companies that can package together bundles of apps.  Going forward, I think there will be a need for a site to emerge (think Tucows or the CWS Apps List) to help users navigate and select Facebook apps.

What's most interesting to me is the speed at which this new Facebook ecosystem is emerging.

June 25, 2007

LinkedIn to Open Platform to Developers

Via ZDNet comes word that LinkedIn plans to open up its platform to developers, seemingly in response to the recently launched Facebook platform.  According to LinkedIn CEO Reid Hoffman, the site will enable third-party apps in about nine months.  That's quite a long gestation period in the world of Internet development, so this clearly sounds like a defensive counter to Facebook.

It's apparent that Facebook is positioning itself to become a social networking platform for the business community.  LinkedIn has held that position to date, but has not been able to leverage it to become part of its users' everyday workflow.  Only a third of LinkedIn's users log in once per quarter, while 60% of Facebook users log in at least once per day.

The new Facebook platform is generating many new applications.  That's both a blessing and a curse, as it's incredibly difficult to assess what's worth installing and which of your friends have what apps.  However, I expect a small ecosystem to emerge surrounding the Facebook platform.  Third party tools will be developed which will organize, rate and manage the new Facebook apps.  And, of course, there will be blogs to provide commentary, much as they do today for TypePad widgets.

In order to become relevant to business users, Facebook will also have to make minor UI modifications to become more businesslike.  When defining a relationship between two business associates, you need to have better choices than "lived together", "took a course together" or "we hooked up".  While "worked together" is a start, it's much too broad to define the many business relationships like "client-vendor", "biz dev partner" or others.

Yet those seem like modest changes to make, whereas LinkedIn will need to move beyond its current connections-lookup functionality to get users to visit regularly.  I wouldn't rule LinkedIn out yet, though.  While I find myself turning more and more to Facebook, I still use LinkedIn for business-critical functions like recruiting.  I have many more contacts on LinkedIn and find it better for recruitment and biz dev.  However, that gap could easily be closed in less than the nine months it will take before LinkedIn's new platform launches.

June 19, 2007

BandsInTown Mashes up Concert Schedules

Music continues to be one of the leading categories for emerging technologies.

This week marked the launch of mashup BandsInTown.

BandsInTown is an online community that lets users track concerts for their favorite bands.  BandsInTown imports your music preferences from last.fm, then matches them to concert schedules from your geographic region.  The resulting concert cloud helps you find upcoming shows you might not be aware of.

While the cloud can include acts you don't listen to, the larger items match the ones that show up higher in your last.fm rankings.  You can mouse over a band to see its upcoming shows in your area.

Clicking any of the bands brings you to a wiki-styled landing page which shows the upcoming show(s), but also has links to an artist page, maps to the venue and space for additional images or band info.  All of these pages are designed for user-generated content.

The event awareness space is attracting attention with sites like Eventful, Oodle's Bandtracker and others.  What separates BandsInTown is its ability to import your last.fm preferences, its easy-to-navigate tag cloud and its focus on wiki-like features.  I'd like to see them add a Facebook plugin to make it easier to share events with friends.

Thanks to Frank Gruber for the heads up on BandsInTown.

June 13, 2007

More on the Facebook Platform

More interesting posts this week on the potential and implications of the Facebook platform.

Marc Andreessen describes how, despite the failures of the walled gardens of AOL, Compuserve and Prodigy, few web properties have focused on the need to develop platforms.  Yes, some have APIs, enabling interaction from the outside, while others support basic widgets, but not the full development platform needed to foster a development ecosystem.

Andreessen also delves deeply into the pros and cons of the Facebook architecture.  Interestingly, he points to what may soon be known as the Facebook Syndrome (a variation of the TechCrunch Effect): when your application goes up on Facebook, you are very happy because you have lots of users, and very sad because your servers blow up.

Meanwhile, in advance of her Web 2.0 presentation, Esther Dyson explores the non-technical implications of the growth of social networks.  Esther points out that the new Facebook model:

  • Mirrors the social relationships of the real world, which will require tools and applications to manage interruptions and to provide gradations of levels between "friend" and "stranger".
  • Changes the business model online, allowing businesses to tap into existing social networks, rather than simply opening up a website in a vacuum.
  • Supports the attention economy, where you establish a presence and gain attention in return.

It seems that at all the recent conferences, there's been discussion of the need for businesses to establish a presence in Second Life.  For those in the content and technology space, I'd put Second Life on the back burner and begin to focus on a Facebook strategy.

June 11, 2007

The 50 Content Companies that Matter: Facebook

As social networks take hold, the site that gets the most attention (and the most traffic) is MySpace.  For the business community, however, the social networking site to watch is Facebook.

Facebook, founded in 2004 by Harvard sophomore Mark Zuckerberg and a few classmates, was initially oriented towards college students.  The basic premise was to automate the printed "facebooks", or directories, used by all students.  Last year Facebook opened up its network to non-students.  You can join a network for your company or simply for a city or geographic area.

Recently, Facebook became an open platform, allowing applications to be created to add functionality to your Facebook page.  Applications for Facebook today include a Flickr interface, a last.fm widget and Flixster, for movie reviews.  Rather than simply allowing embedded scripts (in the way that a blog or MySpace permits), Facebook has provided an open API, which should enable more sophisticated applications than typical script-based widgets could support.

Facebook today has more than 21 million registered users and is the 17th most visited site in the world (according to Alexa).  While that number is impressive, what's even more impressive is that more than 90% of those users are "active" users who visit Facebook at least once per month, and 60% log in on a daily basis.  For comparison, LinkedIn reports that roughly a third of its 10 million users log on at least once per quarter.  Even more compelling is that the 21 million users currently generate more than 1.5 billion page views per month.  That translates to more than 70 page views per month per registered user.  Meanwhile, more than eight million photos are uploaded to Facebook each day.  Talk about sticky!!

Facebook seems well-positioned to transition from the college market to a broad and diverse user base.  As more plugins are added, the service will increase its utility and provide compelling reasons for business professionals and others to join.  A more diversified user base will also give Facebook greater staying power, while MySpace faces the risk of the fickle teen market falling in love with a new entry.

Today there are only a handful of business-oriented applications for Facebook, but that should change, particularly with the opening up of the platform.  Traditional content providers should certainly be exploring ways to integrate with the Facebook platform.  To date, the only one who's done that is Forbes, but I'd expect others to follow soon.  Facebook also needs to tweak its app to recognize the needs of business users.  Today, linking to friends is very school-oriented and doesn't support relationships like client-vendor, business partners, etc.  Even the concept of "friend" doesn't really capture typical business relationships.  Those are pretty easy changes for Facebook to make, though.

It was well-reported that Yahoo last year attempted to acquire Facebook for $1 - $1.6 billion.  Every few weeks the rumors of a Yahoo acquisition heat up again.  But in the meantime, Facebook has demonstrated spectacular growth and a strong vision.  If Yahoo does succeed with an offer, you can be sure it will be for more than $1.6 billion.

Social networking is still in its infancy and the current applications have yet to deliver strong value for the business community.  But that's beginning to change, and Facebook seems well positioned to become the leader in social networking for business.  They are clearly one of the Fifty Content Companies that Matter.

You can visit my Facebook page here.

June 02, 2007

Google Acquires Feedburner

I wrote about this deal and why I thought it was great for RSS last week when it was still a rumor.  It's now official; Google has acquired Feedburner.

More details are available from Union Square Ventures and the Feedburner blog.

For anyone involved in content delivery, this deal should be a good thing.  Google's business is all about monetization and if anyone can figure out how to monetize RSS, they'll be the ones to do it.  Congratulations to Dick Costolo and the Feedburner team.

May 30, 2007

CBS Buys Last.fm

CBS Corporation has acquired social media music site Last.fm for $280 million.

Last.fm today has roughly 15 million users of the largely free service, which takes a "wisdom of crowds" approach to music recommendations.  While other services, such as Pandora, are compelling, what makes Last.fm so successful is the fact that it's not intrusive.  Users simply download an iTunes plugin which monitors the music they play.  For example, you can see the music I listen to on my last.fm page.

According to the press release, "The Last.fm management team will work with all relevant CBS divisions to apply their community-building and technology expertise to extend CBS businesses online and within the mobile space."

This acquisition, coming on the tail of the smaller Wallstrip deal, makes it clear that CBS is taking a more serious look at new technologies.  It will be interesting to see whether they can leverage these new capabilities across their traditional businesses.


May 24, 2007

Google to Acquire Feedburner

TechCrunch reports that Google will acquire RSS feed management company Feedburner in a deal reportedly worth $100 million.

This deal could be big for a number of reasons:

First, it suggests that Google will put its force behind RSS adoption.
Second, it means that Google believes that it's feasible to monetize RSS feeds. 

Feedburner offers a feed and blog advertising network.  To-date, publishers have struggled to figure out a way to monetize the reading of their RSS feeds.  While most people think of Google as being in the search business, they're really in the traffic monetization business. The acquisition of Feedburner suggests that Google believes that RSS can be monetized through advertising.

Another way to look at this, as suggested by WebProNews, is that while Microsoft, through Vista, will be pushing user adoption of RSS, Google will be pushing publishers into creating more RSS.

Feedburner was founded in 2003 and has raised $10 million to date, led by Mobius Venture Capital, Union Square Ventures and Portage Ventures.

Many people (including me) have touted the "year of RSS" for the past few years.  With Microsoft and Google pushing it, it looks as though the time for RSS may finally have arrived.

April 12, 2007

Webkinz: Second Life for the elementary school set

If you have a child between the ages of 6 and 12, you are probably familiar with Webkinz.  Webkinz are basically “Beanie Babies meets Tamagotchi meets Second Life.”

Manufactured by Canadian gift wholesaler Ganz and available largely through card and specialty stores, Webkinz are small stuffed animals that sell for $12.95 each.  They’re rather unremarkable, except that each Webkinz comes with a unique security code that allows you to register it, providing access to the “Webkinz World” portal.  The portal itself includes a number of games, quizzes and related areas, each of which earns you points (“KinzCash”).  KinzCash can be used to buy virtual items for use by your virtual Webkinz in this virtual world.

Part of the user’s “job” is to keep each of their Webkinz satisfied in three areas: happiness, health and hunger.  By “playing” with your virtual Webkinz, feeding them (using KinzCash to buy food and drinks) and providing exercise, you keep their scores high on all three counts.

There’s a modest social networking aspect to Webkinz, where users can “friend” each other or IM one another to invite them to compete at one of the games.  All of this happens between anonymous users and there is no identifying information shared for safety reasons.  You don't have to worry about running into naked avatars in Webkinz World.

My eight-year old has five Webkinz, and colleagues tell me their children have ten or even twenty.  The manufacturer has leveraged the high demand for Webkinz by requiring that Webkinz sellers also buy inventory of their other products.  Much like the Beanie Baby craze of a few years ago, supply is kept fairly low, increasing demand. 

Ganz has employed no advertising, relying solely on word-of-mouth marketing for promotion.  Since their April 2005 launch, more than 1.5 million Webkinz pets have been sold, with over 700,000 registered users.  And a quick search of eBay found more than 10,000 Webkinz for sale, with 93 of them (rare or “retired”) selling for $250 or more each.

What’s interesting to me is how easily even the youngest Webkinz users pick up the system.  While I’m still not a big believer in the concept of Second Life for the business community, it’s clear that virtual worlds are a big play in the gaming world.  And Webkinz, along with sites like ClubPenguin and Tweenland, have shown that the market entry point for these types of products has quickly moved to the “tween” and elementary school market.

March 26, 2007

Buying and Selling eContent - John Blossom Think Piece

At BSeC, John Blossom closed the morning with a 15-minute “Think Piece” entitled “Chasing the Mammoth”.  The talk focused on the redefinition of publishing in a social media ecology.

Using the analogy of global climate change and how, in the time of the ice age, people were nomads and owning land was unimportant, John talked about how we are now in an unstable business climate with shifting resources and global trade.  In this market, owning IP is no longer important and the walled gardens and licensing deals of yesterday’s stable market are no longer relevant.

John described how “Social Media powers nomadism in publishing”, moving from corporate production to enabling individual and institutional production in shifting contexts.

John showed examples of how user-defined contexts are the new “mammoth”, including:
* Using Yahoo Pipes to create a custom publication of hedge fund news.
* Using LinkedIn Answers (with a question I had posted a month or so ago) to show how, by leveraging a network of peers, I had gathered feedback on books about competitive strategy.
* Voxant’s Newsroom, where users can grab licensed video content via an automatically generated snippet of code which they can place on their blog or website.
* ASP community builders like Near-time and Ning, which allow you to create a private “MySpace” community.

In closing, he described the “Mammoth culture” as leaner and meaner, tribal, collaborative and mobile.  Publishers need to “let their content graze where it needs to” and understand user behavior in ways most do not today.

John showed why he remains one of a handful of people in this industry who “get” the Web 2.0 world, coherently sharing relevant examples of what publishers could easily be doing today.

March 04, 2007

Cisco Acquires Tribe.net to Bring Social Networking to Corporate Clients

The NY Times has reported that Cisco will be acquiring the technology assets of nearly dormant social networking site Tribe.net.  You may recall that Tribe.net was one of the early players in the social networking arena.  Perhaps best known for the controversy when it banned sexually explicit content, Tribe had largely faded away as mainstream sites like Facebook and MySpace grew in popularity.

Cisco’s reported acquisition of Tribe, following its recent acquisition of social network tools provider Five Across, gives the company an interesting set of capabilities.  The plan, as outlined in the Times, is for Cisco to leverage these tools to help its corporate clients develop and manage community sites for their users.

Just as users are moving away from destination sites (as described in Fred Wilson’s deportalization post), the bet is that we will see similar changes in the social networking arena.  Destination sites like MySpace and YouTube may still be strong gathering places for teens, but corporations looking to communicate with clients will want to bring those capabilities into their own environments.

At first blush, the acquisitions seem odd for Cisco.  Other than their linguistic pairing, social networking and networking have little in common.  That said, social networks can obviously drive quite a bit of Internet traffic, and Cisco knows how to monetize the bits and bytes of traffic.  On the other hand, the white-label social networking environment is pretty crowded already, as Jeremiah Owyang points out.

It’s clear that social networking is making its way to the corporate world.  On Friday, Reuters announced its intention to build a “financial MySpace” for its Reuters Messaging clients.  Meanwhile, at the recent SIIA Information Industry Summit, Ben Edwards presented some of the ways that IBM has already been leveraging these technologies for both internal and external communications.

It will be interesting to see how Cisco plans to leverage these technologies.  For more on the acquisition, see posts from Rafat Ali, Mashable and TechCrunch.

February 01, 2007

28% of Online Users Have Tagged Content

...according to a December 2006 study just released by Lee Rainie and the team at the Pew Internet & American Life Project.

Effective tagging has long been considered the "holy grail" for improving information retrieval.  While tagging can dramatically improve search results for text, it's even more critical for graphics, audio and video content.

Publishers have long known this and have employed editors (originally as employees, but now largely outsourced) or technology to categorize and tag their content.  Then a few years ago, the idea of folksonomies, or community-based tagging, popped up, with the advent of delicious, RawSugar, flickr and other applications.  But, until now, most of us have assumed that it was just a sliver of users doing the tagging.  Delicious, the most renowned of the tagging platforms, has only 1 million users, and of those, only a modest percentage are active taggers.

According to the new Pew study, 28% of all online users surveyed have tagged a page, a photograph, video or blog post.  Overall, 7% of the 2,400 users in the survey tag content on a daily basis.  Pew phrased the question as "Please tell me if you ever use the internet to categorize or tag online content like a photo, news story, or a blog post.”  As such, they are capturing an audience of users who probably don't even think of what they are doing as tagging.

What would be interesting is to see how much tagging is being done to other people's content vs. their own.  When I browse through flickr or youtube, I see most content is tagged reasonably well.  That makes sense as users are tagging it either to make it easier for them to find their own stuff or to help it be discovered by others.  While you'd expect all content creators to follow that approach, a random look at blog posts shows that most bloggers do not bother to apply Technorati tags to their posts.  Delicious, on the other hand, is more of a bookmarking service, so you gain the benefit of a community tagging other people's content.  But, as indicated, participation is low.

The Pew study points out that demographics come into play here.  Most taggers are under 40, well educated and have higher incomes.  Over time, as tagging applications provide greater benefits to users, tagging should continue to increase.  And information retrieval should continue to improve.

The study may be downloaded free of charge at the Pew site.

January 31, 2007

The Charity SuperBowl Wiki

You know how those pub Super Bowl "box" pools help keep the game interesting?  I can recall years ago during Super Bowl XVIII, when the Raiders were beating the Redskins 21-3 at halftime.  For most of the world there was little left to watch.  But, with 30 seconds left and the Skins in field goal range, I was a kick away from a $500 prize.  Of course, the Skins went for it (and failed) on 4th down and went on to lose 38-9.  But, those of us with boxes continued to watch the game and root.

That excitement is now available without the smoky pub or hangover.

Charlie O'Donnell has created the SuperBowl XLI Wiki, with the proceeds going to the charity of your choice.  All it takes is 2 minutes to go to his site, sign up and send $10 to his Paypal account.  So, sign up now and you'll have something to keep your interest besides the commercials.

P.S. For those interested in the technology, Charlie set this up with the free pbwiki.com.

January 30, 2007

SIIA: Traditional vs. User-Generated Content

John Blossom served as moderator of the first panel of the day, focusing on Traditional Content vs. User-Generated Content.
Panelists included Dan Morrison, CEO of ITToolbox; Jigsaw CEO Jim Fowler; and Jeff Guillot, EVP, Product & Technology at Hoovers.

John Blossom began with an overview of where social media plays in the area of mission-critical business information.  Today, social media plays in the areas of Expert Insights, Business Intelligence, Business Development, Hiring, Public Relations, Collaboration and Research.

Social media is now gaining the attention of professionals; for example, LinkedIn has moved into the top 200 websites this year, according to Alexa.
Social media enables new types of structured insights:

  • Tagging
  • Linking
  • Endorsing (voting)
  • Six Degrees
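The "Six Degrees" insight above is, under the hood, just shortest-path search over a connection graph.  A minimal sketch (the names and network data here are hypothetical):

```python
from collections import deque

# Undirected connection graph: person -> set of direct contacts.
network = {
    "alice": {"bob"},
    "bob": {"alice", "carol"},
    "carol": {"bob", "dave"},
    "dave": {"carol"},
}

def degrees_of_separation(start, target):
    """Breadth-first search; returns the hop count, or None if unreachable."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        person, dist = queue.popleft()
        if person == target:
            return dist
        for friend in network.get(person, ()):
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None
```

Services like LinkedIn layer privacy rules and introductions on top, but the underlying "you are 3 degrees from this contact" computation is this simple.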

Dan Morrison, CEO of ITToolbox (competitor to KnowledgeStorm, TechTrends) discussed user-generated content as a model for business information.

ITToolbox is an online community for IT knowledge sharing:

  • 1.2M pages of user generated content
  • 1.5M unique monthly visitors
  • 740 advertising customers
  • Up to $120 CPM

Their premise:
Media & Technology industries are merging.  New media models are coming from technology - user-generated content (community-powered models) will lead some segments.

Is this a disruptive opportunity?
- How will media consumption be distributed?
- How will ad performance compare?

User-generated content is quite different from traditional content.  They see community-powered information as a utility:
- Share knowledge, discuss trends, solve problems, review products and services - all to increase productivity and efficiency.
- Online communities live on the desktop throughout the day.
- Communications and content merge - communication as media.

Advertising performance:
Performance improves as targeting and context improve.
- Communities = high-volume content on granular topics.
- Participation = active interest in a topic.
- Ad performance improves by targeting this granular level and active context.

They believe a disruptive opportunity exists if they can deliver a higher consumption of information and better performance for customers.

Next up was Jim Fowler, founder of Jigsaw Data.
Jigsaw, one of our "50 Content Companies that Matter," is an online business directory allowing users to buy and trade business cards.  Their users build and maintain the database. 
Today, they have more than 5.2 million contacts, with, for example, 49,000 contacts at Deloitte.  Their goal is to map (and keep updated) every business organization on the planet, leveraging the user community to do so.  Their position: to be to business contact information what Wikipedia is to Encyclopaedia Britannica.

Jim talked about the risks of user-generated content:
1. It's the wild west.  Wikipedia has constant battles between users trying to manipulate the data.
2. Managing a community: must make all product decisions by involving the community.
3. It's a new business model; for Jigsaw, the individual users are the ones who build and maintain the database, but their revenues come largely from large corporate sales.  Interestingly, Jigsaw is now seeing revenues from data cleaning (for clients' CRM systems) starting to rival their revenues on the sales and marketing side.

The third panelist was Jeff Guillot, EVP, Product & Technology at Hoovers.
Jeff discussed how a traditional content provider sees user-generated content affecting their business.

Hoovers has launched four products in the user-generated content space:

  • Bizmology blog: daily postings by editors
  • Hoovers business tool for publishers: improved tools to allow publishers to integrate Hoovers content into their products
  • Integration of Hoovers Insight into the Salesforce.com platform
  • Relationship Mining (Hoovers Connect), in limited beta: a free, visual tool to link your existing network to Hoovers content (a la LinkedIn)

Today, Hoovers uses these tools to drive traffic in a brand-relevant way by combining Hoovers editorial expertise with user-generated content.

While user-generated content is still in its early stages, it's clear that companies like ITToolbox, Jigsaw and LinkedIn are seeing real revenues in this space.

January 24, 2007

IBM brings Many Eyes to Visualization

Thanks to Tim O'Reilly for the heads up on this one.

The Visual Communications Lab at IBM has launched an alpha of its social visualization platform, Many Eyes, led by data visualization gurus Martin Wattenberg and Fernanda Viégas.

The site falls into a new category, described by its designers as "social data analysis".  Similar in ways to Swivel, profiled here a few weeks ago, Many Eyes allows users to upload data sets, which then can be used by other users to create visual analyses.  While Swivel's visual tools are more rudimentary graphs and charts, Many Eyes uses some of the more advanced forms of data visualization including Treemaps and Bubble Charts.

Like Swivel, Many Eyes has been seeded with various data sets, many from government data sources.  Users can upload their own data and use that posted by others.

Data visualization is starting to take hold, after years of fits and starts.  One of the strengths of the Many Eyes site is the diversity of visual maps which it offers.  No single map is appropriate for all data or for all users.  Many users still prefer simple rows and columns for viewing data, while others take more quickly to visualization.  New tools like Many Eyes and Swivel, along with established visualization tools like Grokker, offer compelling ways to navigate and mine data.
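For readers curious what a treemap actually does: each value gets a rectangle whose area is proportional to its weight.  Here is a minimal "slice-and-dice" layout, the simplest treemap algorithm (a sketch for illustration; Many Eyes' own rendering is far more sophisticated):

```python
def slice_and_dice(values, x, y, w, h, vertical=True):
    """Return (value, rect) pairs; each rect's area is proportional
    to its value within the bounding box (x, y, w, h)."""
    total = sum(values)
    rects, offset = [], 0.0
    for v in values:
        frac = v / total
        if vertical:   # split along the x-axis
            rects.append((v, (x + offset, y, w * frac, h)))
            offset += w * frac
        else:          # split along the y-axis
            rects.append((v, (x, y + offset, w, h * frac)))
            offset += h * frac
    return rects
```

Applied recursively to nested categories, alternating the split direction at each level, this packs an entire hierarchy into a single screen, which is what makes treemaps so effective for budget and market-share data.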

January 21, 2007

Riding MySpace to the Top of the Charts

While the Arctic Monkeys have been credited (on this blog and elsewhere) with leveraging the web and social sites to launch their music career, they're not the only ones to have done so.

Next week, one of my favorite new performers, Lily Allen, will have her CD released in the United States.  Lily Allen's ska-influenced alt-hip-hop has gained her a strong audience in the UK and I expect her to see similar success in the U.S. markets.  Her music is tinged with sarcasm, humor and a bit of pretentiousness as might come from a 21-year old who seems wise beyond her years.  While the song Smile was her first big UK hit, I prefer the darker LDN and humorous Knock em Out.

What makes the Lily Allen story unique is how she leveraged social networks not only for publicity, but as an online focus group to shape her first CD, Alright Still.  According to this Billboard article, Allen posted new tunes to her MySpace page, then gauged the feedback from her fans.  In fact, that process helped her validate some of the songs against the advice of her record label.  To reach that success, Allen apparently spent hours each night online, chatting with fans and reading their comments.  While much was made last week of the new blog launched by 73-year old Marriott CEO Bill Marriott, that seems like just an online press release by someone who admittedly cannot type. 

To really engage your audience using social software requires a significant commitment.  Clearly, Lily Allen is an early success story of that process.  For those who haven't heard her music yet, I'd suggest you give a listen at her website.  For those looking at how to leverage technology to launch a new brand, you might learn a few things by following her path.


January 10, 2007

Yahoo Buys MyBlogLog

Yahoo acquired MyBlogLog yesterday in a $10M deal that's been rumored since late last year.
The acquisition adds to Yahoo's presence in the social software space, while also providing some new capabilities for advertising on blogs.

At its most basic, MyBlogLog is a social networking application that allows users to create a profile and take that profile with them to various blogs, "joining" them.  You've probably seen photos of recent readers on Content Matters and other blogs.  These are readers who have active MyBlogLog profiles.  Once you join a blog community, you can view the profiles of other members and see what other communities they are participating in.  It's a pretty compelling way to discover relevant content you might not be aware of.

The other side of MyBlogLog is the statistics they provide to the participating blog sites.  You can easily see how many visitors you've had, how they got to your site and what they've clicked on once they got there.  It's not the full metrics you'd get with HBX or even Performancing, but it's a nice snapshot.  You can even get a view of what other pages (on other participating blogs) your community members have visited.  It's sort of like a private Digg just for people with like interests.

As they've done with Flickr, Delicious and others, the initial plan is for Yahoo to keep MyBlogLog a standalone entity.  On day one the only change you'll see is the ability to login with your Yahoo ID, rather than creating a separate ID.  Over time, they will further integrate it with their other social networking applications.

In the meantime, you can join the Content Matters community to test it out.

For more on MyBlogLog, see earlier reviews from TechCrunch, ContentMatters and A VC.
For more on the acquisition, see posts by Om Malik, Search Engine Land and Rafat.




December 27, 2006

Public Library of Science Launches PLoS ONE

While Nature may be going back to the drawing board on their wiki-based peer review test, the Public Library of Science officially unveiled the beta of PLoS ONE last week.

PLoS ONE, which was first announced in June, is a peer review platform that uses both traditional and web-based comments.  Each article is peer-reviewed with oversight by an academic board, much like other scientific journals.  After publication, however, articles published on PLoS ONE are then opened up for reader annotation, discussion and rating, creating a dialog between author and reader.

As a Public Library of Science offering, the product is completely open access and published under a Creative Commons license.  Rather than charging users for the content, PLoS charges authors a nominal fee ($1,250) to have their articles reviewed. 

For various reasons, this blog focuses largely on business and financial content.  However, there is quite a bit more innovation going on in the STM market today.  Both Nature and Public Library of Science have previously been named to the "50 Content Companies that Matter" list.  I have no doubt that these two organizations will continue to push the envelope in 2007.


December 19, 2006

The Year in Search: Dweebs, horndogs and geezers

Nick Carr, at Rough Type, has posted the top 10 search terms for the year from Google, Yahoo and AOL.
The amazing part is the lack of overlap among the three.  The only term that appears on more than one list is "American Idol", which rates 6th on Yahoo and 4th on AOL.

I'm kind of surprised that Bebo is #1 on Google.  Not so much that it outranked MySpace, whose traffic dwarfs that of Bebo, but that Google users would bother to use a search engine to locate a 4-letter URL.  Also interesting is that video provider metacafe is on the list, but not YouTube.  Is it the opposite of the Bebo issue?  Has everyone already bookmarked YouTube or figured out how to type the domain name directly?  Or is the fact that Time Magazine has taken notice an indication that YouTube has jumped the shark?

Nick's comments hit it pretty much on the head.  Yahoo searches indicate 13-year old boys looking for half-naked photos of celebrities; Google is led by geeks searching for videos and social networking sites, while AOL's users are searching for categories of information, like horoscopes or dogs.

The full lists:
Google:
1. Bebo
2. Myspace
3. World Cup
4. Metacafe
5. Radioblog
6. Wikipedia
7. Video
8. Rebelde
9. Mininova
10. Wiki

Yahoo
1. Britney Spears
2. WWE
3. Shakira
4. Jessica Simpson
5. Paris Hilton
6. American Idol
7. Beyonce Knowles
8. Chris Brown
9. Pamela Anderson
10. Lindsay Lohan

AOL
1. Weather
2. Dictionary
3. Dogs
4. American Idol
5. Maps
6. Cars
7. Games
8. Tattoo
9. Horoscopes
10. Lyrics

December 05, 2006

Can Web 2.0 penetrate the Intelligence Community?

Clive Thompson’s article, Open Source Spying, in this Sunday’s New York Times Magazine was a fascinating look at how Web 2.0 is creeping its way into the least-open IT environment imaginable – the US Intelligence community.

The article starts by describing problems well-known to anyone who has tried to provide technical solutions to the major TLA agencies: outmoded technologies, reinforced by technical and policy walls, prevent any meaningful sharing of information. 

I spent time in 2003-2004 delivering analytic technology solutions to the Intelligence community.  While advanced technologies had a few advocates in some high places (among others former Navy Admiral and “Total Information Awareness” sponsor John Poindexter was a big fan of our technology), there were hurdles in place that were too high to clear. 

Probably the greatest hurdle was the lack of sophistication of the actual agency employees combined with the greed and arrogance of the systems integrators who “served” them.  The defense and intelligence communities have been outsourced to private contractors for the past 25 years.  This was started by Reagan, who believed you could downsize government by cutting a $40k per year civil servant and replacing them with an $80k per year private contractor (often the same person but now on a different payroll).  Over time, the government lost its ability to recruit strong IT minds, and those who did join would quickly shift to the private sector where they could get a huge raise for performing the same job.

Since the agencies were left with little IT expertise, they became highly dependent upon contractors, large systems integrators like Lockheed, Northrop Grumman, SAIC and others.  Unfortunately, efficiency and public safety are lower priorities than revenue growth for these firms.  During my time in Washington, I found that the systems integrators put up walls to keep “off-the-shelf” software out, even while Corporate America had embraced COTS solutions as opposed to custom software.  COTS software is less expensive, easier to support, easy to integrate and has a much lower level of implementation risk than a custom solution, but systems integrators can make more money having a team of 50 people spend two years building something than having a team of 10 implement something that already exists.

The Times article describes recent efforts to change the system, led by Dale Meyerrose, CIO of the new Director of National Intelligence.  Meyerrose instituted a mandate of using COTS solutions (mainly for compatibility reasons) and has also begun to tackle some of the cultural obstacles.

At the same time, the DNI took over a CIA program that explored new methods of gathering and sharing intelligence.  Among the first methods considered were use of wikis and blogs.  Could decentralized tools like wikis and blogs survive in the most centralized of all IT environments?

Tests are still underway, but early results are compelling.  As an example of how this could work, the article describes how a wiki was used to capture and update information about the crash of a private commuter plane into a Manhattan apartment building.  Over the course of about two hours, that page was updated 80 times by analysts from nine different agencies.  They were quickly able to reach the conclusion that this was not a terrorist-related incident.  How long might that have taken in the traditional model, where analysts at different agencies were unable to share information?

Will Web 2.0 applications solve the many woes of the intelligence community?  I think that it will take more than a few innovative programs to weaken the multi-billion dollar grip the contractor community has over intelligence.  But, with the right leadership (and perhaps some committee hearings on war profiteering), it’s possible that more user-generated content initiatives will displace the failed knowledge management projects of the past.  And if these efforts can begin to take hold in the command-and-control centric world of the intelligence community, just think what impact they might have in a more flexible organization like your own.

December 01, 2006

TechCrunch Insights on the Answer Wars

TechCrunch’s Michael Arrington has written a great post about the victory of the Yahoo Answers model over the failed Google Answers.  The post uses this case to show how Web 2.0 and the idea of community are more than simply marketing hype.

Google Answers was launched in 2002, when ad revenues were down, and used a more traditional business model – users pay for answers, with Google taking a cut.  During its entire run, only 800 “experts” answered questions.  Yahoo launched its solution last year, at a time when social media was beginning to take hold.  In the Yahoo model, questions are posted for free and anyone can respond with an answer.  Readers vote on the answers, enabling the “wisdom of crowds” to determine if the answer is compelling.
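The voting mechanic at the heart of the Yahoo model is simple enough to sketch.  This is a generic illustration of vote-based ranking (the answers and scores are invented, not Yahoo's actual code): each answer accumulates reader votes, and the community's top pick surfaces automatically.

```python
def best_answer(answers):
    """Pick the answer with the highest net vote score.

    `answers` is a list of (text, upvotes, downvotes) tuples.
    """
    return max(answers, key=lambda a: a[1] - a[2])[0]

answers = [
    ("Reboot it.", 12, 30),
    ("Check the cable first.", 95, 4),
    ("Buy a new one.", 40, 22),
]
```

The point is that no paid expert has to certify the winner; with enough voters, a net score like this is often a good proxy for answer quality, which is the "wisdom of crowds" effect in action.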

Under Yahoo’s approach, as Michael describes it, the “network effect kicked in big time” and Yahoo Answers gets a tremendous number of page views.

It’s still very early in the social networking space and models are still evolving.  Yahoo Answers makes it clear that community participation, when leveraged correctly, can generate big results.

November 13, 2006

Zimbra Adds Offline Access

Zimbra, a provider of hosted collaborative applications, announced that it will be adding some offline capabilities to its Ajax-based office suite. 

Zimbra's applications are in the same category as Google Docs & Spreadsheets and Google Calendar, Zoho, Preezo and a host of other online would-be Microsoft Office competitors.  The SaaS concept and the collaborative capabilities of an online office suite are appealing.  But the downside is that most of us still are not connected 24x7. 

Many of the hosted applications seem to believe that offline access is anathema to their business model.  When I began to use SalesForce.com, my greatest frustration was that I had no offline access.  A plane trip following a week's worth of sales calls is the perfect time to enter all those detailed contact notes.  But not for Salesforce users; not with their "no software" logo.

Zimbra, along with soon-to-launch Scrybe (see TechCrunch post from October), seem to understand that (at least limited) offline capabilities are a must-have in order to gain any meaningful level of adoption.  Zimbra's goal is for users to have the same experience whether online or offline, supported by synchronization of any work performed offline.  That will include use of calendars and contacts lists, as well as documents.  The announcement first came at O'Reilly's Web 2.0 conference last week.

Down the road, we will reach a point when connectivity can be taken for granted, even in planes, trains and automobiles.  In the near-term, adoption of hosted solutions will be higher for those providers who provide a bridge to the offline world.

For more on Zimbra's announcement, take a look at the Zimbra blog and O'Reilly Radar.

October 31, 2006

Google Acquires Jotspot

Google today announced that it has acquired hosted wiki provider JotSpot.
Under the acquisition, the service will become completely free.

Previous rumors this summer had Jot being acquired by Yahoo, as part of their efforts to bulk up their social software platform.  I wouldn't be surprised to see Yahoo respond with an offer for WetPaint or SocialText.

In the near term, the deal provides Google with many more page views for serving ads.  Longer-term, it will be interesting to see whether they integrate Jot with Google Docs & Spreadsheets and other collaborative applications.

October 16, 2006

Linking Out?

Sitting at a conference last week, I had a discussion with a few colleagues about the value of business social networking sites, particularly LinkedIn.
The general consensus was that LinkedIn was moderately useful, though none of us were raving fans of the service.  Then today, I stumbled across a post by Jeff Atwood, called Opting Out.

Jeff has some pretty strong feelings about the lack of value he has gotten from LinkedIn, and the fact that LinkedIn gets inherent value from its 7 million members.  Seeing no value from the service, Jeff is right to opt out (and is rightly frustrated that there's no easy "remove me" button).

That being said, I think that services like LinkedIn do add value, though the value is perhaps a bit more nuanced than LinkedIn might want you to think.

First, we should all understand that social networking in the business world is very different from social networking for teenagers.  When you're a teenager, gaining a new "friend of a friend" might be compelling in itself.  When you are measured by the size and status of your social network, almost any addition has value.

In the business world, the value of a distant online friend is minimal and could even be a negative.  I use LinkedIn as an online tool to help me understand the network of my offline contacts.  In other words, if I am trying to find a contact at CompanyX, I'll use LinkedIn to see who in my network knows someone at CompanyX.  Then, I'll typically call that contact to discuss a possible introduction.  My goal is not simply to increase the size of my network, but rather to identify the potential reach of my existing network.  And, LinkedIn helps me do that.  Without it, I'd end up sending emails to all of my peers asking "Do you know anyone at CompanyX?"   

It's true that I get about a request a month from someone who doesn't know me to link to them.  Many of them are simply link collectors (you can see them listed with 500+ connections in their LinkedIn Profile).  In most of these cases, I politely decline, letting them know that I only share access to my network with people whom I've met.  I also have my profile configured so others cannot simply browse my connections - they only see the link when it's part of a relevant search.

But for me, LinkedIn does create value, certainly enough value to make it worth keeping my profile up to date.  If it doesn't create that level of value for you then, like Jeff (and Robert), you should opt out.

October 13, 2006

GooTube

I've refrained from posting about the Google-YouTube marriage, knowing that thousands of bloggers and media, most (all?) much smarter than I, were covering it like FoxNews covered the Natalee Holloway story.

That said, I'm surprised that the debate continues as to whether Google overpaid.

The basic premise seems to be that valuations of web 2.0 sites were getting astronomical and that the acquirers could never achieve a return on their investment.  I'd buy that argument if it were a traditional media company buying it, but it doesn't hold water in this case.  There are three reasons why Google was uniquely well-suited to make this acquisition:

  1. Google is spending about 2% of its already frothy market cap to buy this with stock. That's different from when News Corp spent $580M in cash to buy MySpace (which, by the way, seems to be working quite well for them).  Google knows better than almost anyone how to monetize the YouTube traffic, so giving up 2% of its shares for the million or so unique daily visitors to YouTube looks pretty good.  And while uniques are important, a key metric is the amount of time spent on a site, and YouTube will dramatically increase the average time spent per user.
  2. The biggest argument against the acquisition, as highlighted by Mark Cuban's Blog Maverick, is that lawsuits will bog the company down and, if copyrighted content were removed, there'd be little left of value.  Fair enough.  But, Google's gained quite a bit of experience in negotiating with the publishing community and I think they'll get most of the companies to play ball.  After all, while some overprotective companies like Disney might sue, others like Viacom's Comedy Central know that a 3-minute clip from The Daily Show is incredibly valuable free advertising.
  3. The other challenge with YouTube is that delivering all that rich media content takes a lot of servers and bandwidth.  There aren't many companies that understand web infrastructure better than Google, so I doubt they'll have many challenges in this area.

Yes, there are many reasons why a traditional media company might have been afraid of the risks involved in acquiring a property like YouTube.  But for Google, it seems like a no-brainer.

October 04, 2006

Social Software 101

SEM provider Spannerworks has released a free eBook entitled What Is Social Media?

The eBook, while brief, is a pretty good introduction to social media for novices.  It provides quick overviews of five major categories of social media:

  • Blogs
  • Social Networks
  • Content Communities
  • Wikis
  • Podcasts

The descriptions are high level - this is not aimed at the TechCrunch community.  But, if you're new to social software, or if you're trying to get management buy-in to some of these concepts, it's a quick and useful read.

The eBook was written by Antony Mayfield, Head of Content and Media for Spannerworks.  Antony is listed as a contributor to the Spannerworks blog, SearchSense.  Surprisingly, the blog is fairly sparse and doesn't show off much in the way of social software.  But, this is a new practice for Spannerworks, so we'll have to give it some time to see how they grow it.  And, any eBook that links to the Joy Division/Missy Elliott mashup Love Will Freak Us can't be all bad.

You can download the eBook for free at Spannerworks.

September 27, 2006

Delicious reaches 1 million users

Courtesy of Heather Green's Blogspotting comes the news that Delicious has reached 1 million users, up from about 300,000 when Yahoo bought them less than a year ago.

Now, that's not MySpace or YouTube growth, but it's pretty solid, particularly considering the fact that Yahoo has not done a lot to promote the social bookmarking application.  Most of its growth is from viral marketing.

One question I would have is how active these million users are.  Are these people who have tagged 5-10 pages or 100?  Have these users all bookmarked new pages in the past 30 days?  60 or 90?   
My second question is how many of these users are using it truly as a social bookmarking tool, i.e. sharing their bookmarks and adding others to their network (and looking at their bookmarks)?

I'm a delicious user, but I use it more as a portable bookmarking tool so that I can access my bookmarks from my desktop or laptop (or last year, from a PC on a cruise ship).  I rarely take advantage of the social capabilities of delicious.

This is why I think the rumored Yahoo acquisition of Facebook would be a big boon to the company.  Yahoo has some great social networking components in Flickr and Delicious, but they remain somewhat niche products.  While I commend Yahoo for letting its acquired companies remain independently managed (I've been involved in too many acquisitions where integration was a nightmare), having a strong platform like Facebook, with tight integration of these other social networking tools, could drive usage.

September 21, 2006

Yahoo to Acquire Facebook?

The WSJ is reporting that Yahoo will acquire social networking site Facebook for roughly $1 billion.

Facebook has been the subject of a number of acquisition rumors this year, most notably involving Microsoft and Viacom, among others.  Following Murdoch's acquisition of MySpace, Facebook has been among the most attractive sites out there (with far fewer legal headaches than YouTube).

If the deal happens, it will be interesting to see what Yahoo does with it.  One of the attractions for users of Facebook has been its exclusivity.  Until recently, it was only available to students with a college email address.  Then it was rolled out to high school students and now to the public.  Just as Groucho Marx didn't want to belong to any club that would have him as a member, you could see users revolt as the velvet rope is opened to the masses.

On the positive side, with Flickr and Delicious in the fold, Yahoo is clearly pursuing the social networking arena.  These tools could be very useful to the Facebook community, encouraging users to share information with their network. 

This proposed deal is interesting, particularly in light of Yahoo's announcement this week that it expects to be at the low range of its revenue projections, due to slowdowns in advertising.  On one hand, diversifying their revenue base could be a positive.  Conversely, Yahoo management probably should be focusing on fixing the problems with the Panama search release and on their core PPM business.

For more thoughts on Yahoo, read Michael Parekh, Fred Wilson and Rafat Ali, who reports that CEO Mark Zuckerberg has no interest in selling right now.

August 14, 2006

Web 2.0 leads Gartner 2006 Hype Cycle

Gartner today released its 2006 Emerging Technologies Hype Cycle Report.  The annual report, which shows where Gartner analysts (if not their clients) will be focused for the coming year, listed Web 2.0, Real World Web and Applications Architecture as the three big emerging technologies.

Within Web 2.0, Gartner focuses on four trends and technologies:

  • Social Network Analysis, basically the enterprise business intelligence view into all the nuggets of information generated through social networks.  There are clear opportunities here for market research, competitive intelligence and trend identification.
  • Ajax: the techniques used to develop rich, functional applications within the browser
  • Collective Intelligence: the development of content, metadata, software and other services by a large group of people without any centralized authority.
  • Mashups: the integration of multiple web services to create new services. 

Of these four, Gartner views Collective Intelligence as having the highest potential impact, a “transformational technology” in Gartner terminology, but with mainstream adoption 5-10 years down the road.  Social Network Analysis and Ajax are both considered high impact, with a much shorter time horizon of less than two years.  Mashups are rated as having only moderate impact (which, IMO, they have already achieved), again with a less-than-two-year horizon.

The challenge for many organizations is that to fully embrace Web 2.0 often means a major change in your business model.  While it’s easy to begin using Ajax, it’s much harder to shift from a centralized, controlled environment to an open source or collective intelligence approach.  While the early success stories for Collective Intelligence have occurred in the consumer space (think MySpace or YouTube), innovative companies will identify ways to harness this in the enterprise environment. 
The key for content providers today is to begin to use these tools and make them a part of their daily routine.  Create a development sandbox and see how your users begin to interact with these technologies.  Over time, as these technologies move towards mainstream adoption, you’ll be well-positioned to take advantage of them.

For more insights on the Gartner report and Web 2.0, read posts by Dion Hinchcliffe, Paul Kedrosky and Peter Rip.

July 11, 2006

MyBlogLog Puts the Community in Blogging

Mybloglog_logo MyBlogLog launched a set of analytics tools about a year ago to help bloggers track activity on their blog.  You may have noticed text such as "nth most popular outgoing link" pop up when you mouse over a link on Content Matters.  That's MyBlogLog at work.  It gives bloggers a better sense of what their readers are reading, where they're coming from and what external links they click on.

Mybloglog_community_1 Last month, the team at MyBlogLog took that one large step further, adding a community aspect to blogs.  On the left pane of Content Matters, you'll see a new section, Content Matters Community, including photos and screen names of the five most recent visitors to the site.

Readers are encouraged to "join" the community for the blogs which they read.  These communities then can be used to generate recommendations for other communities to join (blogs to read).  Even cooler, MyBlogLog can automatically set you to "join" a community after you've visited that blog X times (default is 10).  In much the same way that last.fm can introduce you to music that you may like, MyBlogLog can introduce you to new blog content which is of interest to others in your community.
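Under the hood, that auto-join feature is presumably just a per-reader visit counter with a configurable threshold. Here's a minimal sketch of the idea in Python (all class and method names are hypothetical; MyBlogLog hasn't published its implementation):

```python
# Hypothetical sketch of MyBlogLog-style auto-join: count each
# reader's visits to a blog and add them to its community once
# they cross a configurable threshold (MyBlogLog's default is 10).
from collections import defaultdict

class CommunityTracker:
    def __init__(self, auto_join_threshold=10):
        self.threshold = auto_join_threshold
        self.visits = defaultdict(int)    # (reader, blog) -> visit count
        self.members = defaultdict(set)   # blog -> set of community members

    def record_visit(self, reader, blog):
        self.visits[(reader, blog)] += 1
        if self.visits[(reader, blog)] >= self.threshold:
            self.members[blog].add(reader)

    def is_member(self, reader, blog):
        return reader in self.members[blog]

tracker = CommunityTracker(auto_join_threshold=3)
for _ in range(3):
    tracker.record_visit("alice", "contentmatters")
print(tracker.is_member("alice", "contentmatters"))  # True
print(tracker.is_member("bob", "contentmatters"))    # False
```

The interesting design question is the threshold itself: set it too low and communities fill with drive-by visitors; too high and only the most devoted readers ever appear.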

To learn more, take a moment to set up a free MyBlogLog account and join the Content Matters community.

What's the impact for publishers?
Trade magazine publishers and other content providers who dominate vertical markets should be looking for ways to create a community. There are many ways to increase user interaction; simply having your readers' faces appear on your blog can be an easy yet appealing piece of the community experience.  You might showcase community members elsewhere on your site, or even use them as a sounding board for editorial and features.

Eric Marcoullier, Todd Sampson and their team have created an interesting environment catering to the blog community.  Right now, it's fairly bare-bones, but it will be interesting to see where users take these communities.

In the meantime, I'd encourage you to sign up and begin to play.

Some of the notable blogs which have adopted MyBlogLog Communities already include:

Fred Wilson's A VC
Brad Feld's Feld Thoughts
Ben Barren's RSS'ing Down Under
Steve Rubel's MicroPersuasion
Valleywag
and hundreds more

June 21, 2006

Sphere adds relevancy to blog search

Sphere_logo While many consider the search market to be locked up by GYM (Google, Yahoo and Microsoft), there are still many niche areas of search where the big three do not dominate.

One of these areas is blog search.

Blog search is different from web search for a few reasons.  First, the time-sensitivity of blog posts makes them more like news than like web pages.  Second, Google's PageRank approach doesn't hold up very well for blogs, where the reputation of the blogger matters more than the number of links to a given post (particularly since links won't yet point to a post that's just been published).  Third, blog search results have typically been displayed in reverse chronological order (newest posts at the top), simply because it is hard to otherwise rank the relevance of the results.

While Technorati, Feedster, IceRocket and others have made inroads into this space, there have recently been a few new entrants.

Sphere_results One of the more interesting recent entrants is Sphere.  Sphere aims to improve the relevance of blog search by applying improved algorithms so that results are ordered by relevancy, not simply chronology.  Sphere uses a combination of inbound and outbound links, metadata for the post and the blog and semantic text analysis to gain insights into what the blog is focused on.  The combination of semantic analysis and analysis of inbound links also helps push blog spam to the bottom of the results (since no reputable blogs are likely to link to spam posts).

Technorati_results In my testing, Sphere's relevance seems to hold up well.  While results for some searches are similar to those of Technorati or Google, in others I found Sphere's results to be clearly superior.  Particularly when searching in areas where spammers are active, Sphere's ability to suppress those results provides a better search experience.  For those who prefer the Technorati-like chronological view, Sphere allows you to order the results that way as well.  You can also do a custom date search, entering start and end dates, particularly useful for looking back in time, but also for seeing the most relevant posts of the past week, for example.

Where Sphere really shows its stuff, however, is in its tools.  In particular, it offers a bookmarklet called "Sphere It".  A user looking at any piece of content on the web can click "Sphere It" and the system will retrieve blog posts that are similar to that content.

While "Technorati This!" is a similar offering, the approach is very different.  Technorati This! uses links to find posts that point to the article you are searching.  Sphere It uses semantic analysis (most likely a Bayesian-like categorization engine) to find posts that are similar in content to your article, not simply those which link to it. 

Sphereit_topix2_1 For an example, take a look at what happens when I apply Sphere It to an A.P. article on the political parties' fund-raising for the upcoming congressional races.

The results are 17 blog posts from the past day talking about the fund-raising efforts of the two parties, none of which seem to link back to the underlying AP article.  The results are what you might expect to find using an enterprise classification application like Autonomy, as opposed to a simple link-based search.

Sphere_time Content providers looking to blend user generated content with their own editorial may also find Sphere It a useful tool.  Time Magazine has begun to use Sphere It on their Time.com site.  When reading an article, you'll see a box marked "related blogs", which uses Sphere It to find blog posts with similar content.

Sphere It is a compelling way to bring user generated content into your site with a simple "more like this" function.

Today, I find that I use a combination of Sphere and Technorati in my daily searches.  More and more I am turning to Sphere first, but Technorati still covers a wider universe and I check them when I don't find what I need in Sphere. 

Blog search remains a wide open market, and I won't be surprised if the GYM crowd improves its performance over time (through acquisition or otherwise).  In the interim, users will continue to reap the benefits of continued innovation.  I recommend you give Sphere and Sphere It a spin.  You'll be pleased with the results.

June 08, 2006

Newsgator Brings RSS to the Enterprise

Newsgatorlogo_md Though most users still don't have any idea what RSS means, under the covers RSS is taking hold within many applications they use.  My Yahoo is now RSS-based, and the new IE7 and Windows Vista will leverage RSS as the primary means of pushing content to users.

Whether or not the term RSS gains prominence, its ubiquity is not really in question.  RSS is the protocol that will be used more and more for distribution of information inside the enterprise and among publishers and their users.

Rss_logo_1 One company seems well-positioned to reap the benefits of widespread RSS adoption.  While there are many web and software-based clients for RSS, most of them are geared to the individual desktop.  The one company which is focusing its efforts on RSS for the enterprise is Newsgator.

Newsgator has an RSS Platform and suite of products geared to three audiences:
1. End-users: products like FeedDemon, a software-based reader, NetNewsWire, a Mac-based client, and Newsgator Inbox, which lets you use Microsoft Outlook as your RSS client.
2. Enterprise customers, with the NewsGator Enterprise Server.
3. Private label clients seeking to use RSS to distribute content to customers.

While RSS readers are becoming a commodity, Newsgator offers the advantage that all of its readers - for the Web, the desktop and mobile devices - are integrated, so if you've read something on one device, it will be marked as read in another.

Ngenterprise1sm Newsgator Enterprise Server is an infrastructure product geared towards helping the enterprise manage vast numbers of feeds coming in from the outside and being distributed within an organization.  Part of the value proposition is simply helping organizations manage bandwidth, to avoid the types of problems that early push applications like PointCast created.  If you have 1,000 users who subscribe to a specific Wall Street Journal feed, for example, you can bring that feed in once to the server, rather than having a thousand users pulling it directly to their desktops.  As companies continue to move away from installed desktop applications, a server-based model works well.
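The bandwidth math behind that value proposition can be sketched as a simple shared cache: the server fetches each feed from upstream once per refresh interval, no matter how many internal subscribers read it. A hypothetical illustration (my own sketch, not Newsgator's actual design):

```python
# Sketch of server-side feed caching: one upstream fetch per feed
# per refresh interval, regardless of how many internal users
# subscribe.  The fetch function and URL below are placeholders.
import time

class FeedCache:
    def __init__(self, fetch, ttl_seconds=900):
        self.fetch = fetch            # function: url -> feed content
        self.ttl = ttl_seconds        # refresh interval
        self.cache = {}               # url -> (fetched_at, content)
        self.upstream_fetches = 0     # counts actual outbound requests

    def get(self, url):
        now = time.time()
        entry = self.cache.get(url)
        if entry is None or now - entry[0] > self.ttl:
            self.upstream_fetches += 1
            entry = (now, self.fetch(url))
            self.cache[url] = entry
        return entry[1]

cache = FeedCache(fetch=lambda url: f"<rss>items from {url}</rss>")
for _ in range(1000):                 # 1,000 users pull the same feed
    cache.get("https://example.com/wsj.rss")
print(cache.upstream_fetches)  # 1 — a single upstream request served everyone
```

That single upstream request, versus a thousand desktop readers polling the publisher directly, is the whole bandwidth argument in miniature.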

The Enterprise Server can also be used to centrally manage blacklists and whitelists, to manage access credentials, to block undesirable URLs and to restrict access outside the firewall.  Version 1.3 of the Newsgator server, just released, adds new features such as "clippings", the ability to drop articles into a folder which users may subscribe to, the ability to forward any email to an RSS feed, "smart feeds" (persistent keyword searches) and more.

The early adopters of Newsgator Enterprise have come from a number of industries.  Firms like Edelman Public Relations and executive search firm Spencer Stuart have been among the first on board.  According to Newsgator Director of Product Marketing Todd Berkowitz, they are seeing strong interest from pharmaceutical companies, law firms and others.  Applications include use of external content for sales, marketing and competitive intelligence, as well as internal efforts from corporate communications, human resources and product development.

Myusatoday Newsgator has also signed up some impressive clients for its private label services.  A beta version of My USAToday, driven by Newsgator, allows users to construct a home page combining content from USAToday and external sources.  Similarly, My Newsweek is a syndicated service hosted by Newsgator, featuring feeds from columnists as well as feeds for hot topics and the latest buzz.


What's coming next for Newsgator?  According to Berkowitz, some of the items on the development roadmap include support for tagging, improved filtering and relevance and integration with other desktop clients.

What's the impact for publishers?
As I've posted previously, it's pretty clear that RSS is coming and coming fast.  It may get a cooler name (Microsoft and Feedburner seem to have adopted the term WebFeed), but that little orange logo will be driving virtually all applications that push content.  If you haven't begun to establish an XML strategy, it's time to start now.  While content providers with strong technology resources may wish to develop their own solutions, those who prefer to outsource may want to look at private label solutions from companies like Newsgator.

June 05, 2006

Google launches Google Spreadsheets

Google_labs
Google has announced a beta of a web-based spreadsheet.

The new application, to be launched under the Google Labs platform, will be released Tuesday to a select group of users who sign up here.  The spreadsheet is an outgrowth of an Excel conversion tool which Google acquired last year in its acquisition of 2Web Technologies. 

This follows on the heels of Google's acquisition of web-based word processor Writely and the release of Google Calendar.  All Google has to do next is acquire S5 or Thumbstacks, as a hosted competitor to PowerPoint, and they'll have an ASP competitor to Microsoft Office.

According to Google Product Manager Jonathan Rochelle, the product is being positioned as a workgroup application, enabling users to more easily share data in a collaborative environment.  Rochelle indicated that they were also exploring the interest level in using Google Spreadsheets as a front-end for Googlebase.

While this product clearly won't be a threat to Excel for true number-crunchers or enterprise installations, there's definitely room for a new entrant in the spreadsheet market.  Spreadsheets are used for many purposes beyond financials - everything from the office phone list to simple project task lists.  Those types of tasks, really simple database projects, would be a natural fit for a collaborative spreadsheet application.

UPDATE: Dan Farber at ZDNet does a better job than I did at describing the different tasks that Google Spreadsheets is trying to solve, as compared to Excel, and how StarOffice and other web-based spreadsheet tools may feel the heat first.

According to Christina Quarles, analyst at Thomas Weisel Partners, Google Spreadsheets will help Google expand in markets outside the U.S., where the concept of a bundled copy of MS Office is not as entrenched as it is in the U.S.

June 03, 2006

New Visualization Tools Map Blog Networks

Thanks to Guillaume Du Gardier (via Steve Rubel) for noting a few new visualization technologies focused on mapping the social networks of blogs.

I'm a huge fan of visualization tools, particularly for social networking and similar applications.  Tools like i2 Analyst Notebook and ClearForest's ClearResearch provide insights into huge volumes of information for the professional user.  Grokker and Vivisimo provide similar benefits in the web search world.  The tools Rubel shows are not at that level, but each does an interesting job of showing the network of links around a given blog or website.

Touchgraph Of the two, Touchgraph is the more useful one.  Links between sites are color-coded to make it easy to see inbound or outbound links.

Aharef_map More visually appealing, but with less functionality, is this map from Sala Aharef's Websites as Graphs.  It helps you see the density of a network, with color-coded indications of links, images and more, but is not very navigable.

As users struggle to deal with information overload, data visualization will start to take hold among mainstream users.   In the meantime, we can admire the creativity of the early offerings.

May 26, 2006

Keeping Ratings Trustworthy

Startsamazon Ratings and reviews, such as those offered by Amazon, TripAdvisor, Zagat and ePinions, are one of the most compelling forms of user-generated content.  When done well, ratings give readers a way to make better purchase decisions at a time when unlimited choices create confusion.

But, what happens when the ratings cannot be trusted?

This week, the Dixie Chicks released a new CD, Taking the Long Way.  The Dixie Chicks, originally a popular country band, caught the attention of many of us three years ago, just before the start of the Iraq War, when they told a UK audience that “we’re ashamed that the President is from (our home state of) Texas”.  At that time, when questioning the administration was tantamount to active support of terrorists, they became immediate pariahs.  Country music stations refused to play their music and they received numerous death threats.

Dixie_chicks_cd Three years later, the Dixie Chicks have released their new CD.  The music style itself has changed.  Traditional country twang has been replaced with a more folk-pop sound, reminiscent of Rosanne Cash.  That, in itself, might result in some unusual ratings, as their traditional listeners might be disappointed, while listeners who might not have given them a chance in the past might find themselves fans.  But what’s driving the Amazon ratings of this CD is less the music than the politics.  A one-star review on Amazon starts with “Lets get up and show your back side and talk trash, just to keep your self in the spot light, the CD sucks and I wouldn't buy it, I don't have anything by them and turn them off every time they come on the radio.”  Well, if you don’t listen, how can you review it?

The Dixie Chicks are just the latest example of this.  Take a look at the ratings for books by Al Franken or Ann Coulter.  I would guess that three-quarters of the reviewers have not read the books.  For those authors, five stars means “I love your viewpoint, even if I've never read this book”, while one star means “I hate your politics”.  Meanwhile, e-tailers have to contend with suppliers trying to game the system, giving their own products and services high reviews while bashing the competition.

What can be done to address this?  First, we can ask whether the “star system” is the best way to show ratings.  Sure, it’s easy for users to view the stars or sort by them, but statisticians have long known that the mean is one of the weakest summary statistics.  A simple distribution of ratings might tell a better story.

Ratings_chart_1 Rating_chart_2 For example, let's look at the distribution of stars for two books that each have an average rating of 3.5 stars.  The green chart, on the left, has most of its reviews in the 3-4 star range.  The blue chart, on the right, has very little in the mid-range, but a lot of 1-star and 5-star ratings.  By reading a few reviews, you’d get the context for why the love-hate relationship exists (is it the product, or something else?).
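The underlying point is easy to demonstrate with made-up numbers: two sets of ratings can share the same 3.5-star mean while describing completely different reader reactions.

```python
# Two invented rating distributions with the same 3.5-star mean:
# one clustered in the middle, one polarized at the extremes.
from collections import Counter

consensus = [3, 3, 4, 4, 3, 4, 3, 4]        # mostly 3s and 4s
polarized = [1, 1, 1, 5, 5, 5, 5, 5]        # love-it-or-hate-it

for name, ratings in [("consensus", consensus), ("polarized", polarized)]:
    mean = sum(ratings) / len(ratings)
    dist = dict(sorted(Counter(ratings).items()))
    print(f"{name}: mean={mean:.1f} distribution={dist}")
```

The star average alone is identical; only the distribution reveals that the second product is controversial rather than mediocre.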

Another option is to force users to provide their real names, or at least validate their registration by email.  This will reduce the number of fake or duplicate entries, although it may also reduce the overall participation level.

What ideas do you have for making ratings and reviews more trustworthy?  Please post your comments or send me your thoughts.

May 22, 2006

Waxxi - First Interactive Podcast

Waxxi_logo_1 A few weeks ago, I posted about Waxxi, a startup offering interactive podcasts.

Their first such podcast was held live this past Saturday and featured Robert Scoble and Shel Israel.  Unfortunately, due to scheduling conflicts, I couldn't participate.

The initial reviews were pretty positive, though.  Jeremiah Owyang has a very detailed summary of the podcast content on his blog, while Frank Gruber gives a thumbs up to the process itself. 

I think Waxxi has hit on an interesting format for encouraging dialog within a community.  I will be sure to sign up for their next one and would be interested in hearing from readers who are considering using Waxxi for their own use.

UPDATE: Waxxi has posted the podcast to their site.  You can download a copy or listen interactively on the site.

May 18, 2006

Google Notebook - Test Drive

Google_notebook_logo
Now that Google Notebook is live, I thought I'd take it for a quick spin.

My initial feedback: it's rather underwhelming.

Google Notebook is designed as a bookmarking replacement - in essence, it's Google's response to Delicious, which of course was acquired by Yahoo last December.  Like Delicious, Google Notebook allows the user to select web pages, annotate them and store them for later retrieval.  Also like Delicious, Google Notebook can be private or shared with others.

Google_notebook_plugin Google Notebook is a browser plugin.  From any web page you can simply right-click, then select "note this" to create an entry for that page.  From there, you simply type in your comments about that page.

True to its name, Google Notebook uses a notebook metaphor (the kind you write in, not the kind you type into) for storing these annotated sites.

Unlike Delicious, Google Notebook does not use tags.  Rather than assigning tags to pages, you enter text about them; you can then create section headings for various topics and organize the notes into those sections.

One benefit is that if you highlight text passages, Google Notebook will automatically grab them and paste them into your note.  This makes it more of a research aid than strictly a bookmarking tool.  In that sense, one might look at Google Notebook as a competitor to NetSnippets rather than Delicious.

Google_notebook_full_screen Here, in full screen mode, you can see how the notebook page looks.
The interface is clean (as you'd expect from Google) and they've got a nice ajax-based drag and drop capability to move your notes around.  The page is search-driven, so you can search your own notebooks or public notebooks for whatever you're seeking.

As with most Google Labs offerings, Google Notebook at first blush is uninspiring.  While it's a potentially useful research assistant, the lack of tagging limits its usefulness from a social software perspective.  True annotation seems to be a very niche product, and it would seem that Google would much prefer to go after the wide audience that Delicious has established.  Perhaps it will get there in the next phase, as it seems pretty easy to add tags to what they've already delivered.  In the meantime, Google Notebook is a simple offering that users might find useful but is unlikely to inspire raving fans the way Delicious has.

For more feedback on Google Notebook, take a look at TechCrunch, SearchEngineWatch and GoogleTutor.

Yahoo Analyst Day

Yahoo_logo Yesterday was Analyst Day for Yahoo.  They have posted the PowerPoint (188 pages worth) on the Yahoo website.  It's a lot to go through, but has some interesting slides.

Yahoo_fuse_1 Jeff Weiner, SVP of Yahoo! Search and MarketPlace, did an interesting presentation on their view of SocialSearch.  Yahoo's vision uses the acronym FUSE - for Find, Use, Share, Expand, as shown in this diagram.

Yahoo_social_monetization Yahoo! really seems to have thought through their strategy, leveraging social search (Yahoo Answers and Yahoo Groups), social bookmarking (Delicious) and social media (Flickr and a TBD video service) to drive revenue.  The revenue model is shown in the graphic below.  I think this vision is a real differentiator from Google, whose initial efforts (like Google Notebook) have been uninspired.


For more commentary on Yahoo! Analyst Day, check out Rafat Ali on how Yahoo is trying to catch up to Google in revenue per search, or take a look at ...not much else.  Surprisingly, Yahoo Analyst Day got hardly any coverage on the main search engine and tech blogs, compared to pretty extensive coverage of Google Press Day.

Update: John Battelle has added a few brief thoughts and will be interviewing Jeff Weiner tomorrow.

May 17, 2006

Social software solutions to knowledge management problems

Since the mid-'90s, KM companies have been trying to solve the “expert finder” problem, that is, finding the experts within your organization for a given topic or problem.

While finding the internal expert is critical for all large companies, it’s most acute at large consulting firms and investment banks.

There have been various KM attempts to solve this problem, some relying simply on resumes (which list skills and experience but never get updated), while others tried to get users to update proprietary databases with skills and interests, generally with low participation rates.

Rod Boothby, at E&Y, posts about how they are using people blogs to capture and maintain the critical information about skills and projects, to quickly find the experts for a project team.

This is another example of how inexpensive and easy-to-use hosted social software applications are solving KM problems where larger, more complex applications failed.  The ease of creating blog pages (in this case, using templates for consistency), of linking project pages to people pages, and using standard web search tools, encourages adoption and makes maintenance easy.

May 16, 2006

The 50 Content Companies that Matter: TripAdvisor

Trip_advisor_logo Since the early days of the Web, travel has been one of the fastest growing segments on the Internet.  So much so that it has changed the business model of the airlines and nearly decimated the travel agent industry.

While much of the attention has been on the transactional sites that book travel, there have been many other pockets of innovation in this space.  One of the most innovative has been TripAdvisor, a subsidiary of Expedia, which was spun off last year from InterActiveCorp (IAC).  Expedia itself has always been a technology leader, perhaps due to its origins within Microsoft.  TripAdvisor was acquired by IAC (which then owned Expedia) in the spring of 2004, and is a lead generation site for the travel industry.

TripAdvisor brings together two of my favorite themes in the content industry – lead generation and ratings and reviews.  TripAdvisor provides users with travel information and recommendations for hotels, restaurants and activities in various destinations.  The most valuable part of the TripAdvisor content is the ratings.  I find their hotel ratings invaluable as a means of finding quality hotels at a reasonable price.

Tripadvisor_inside_palo_alto TripAdvisor has recently taken another step up the user-generated content chain, launching wiki-based content branded as TripAdvisor Inside.  These special content sections, each focused on a city or state (see example for Inside Palo Alto), are collaborative, user-created editorial pages with sections for history, things to do, dining scene and more.  With its recent launch, the content right now is a bit sparse (only nine restaurants in Palo Alto) but that’s sure to grow in time.

Just last week, TripAdvisor launched what they call goLists, where users can contribute lists of places to go, things to see, sample itineraries, etc.  Similar in concept to an Amazon Favorites List, this is another example of leveraging the community to create valuable content.

TripAdvisor has done a terrific job in building valuable content, which in turn generates significant leads for their travel partners.  Their star ratings and reviews have created a barrier to entry for new competitors, and their new wiki pages show that they understand how to leverage their community to develop more compelling content.  In creating a balance that addresses the needs of their advertisers and users, while continuing to embrace new technologies, TripAdvisor is clearly one of the 50 Content Companies that Matter.

May 13, 2006

Can You Spell D-O-P-E-S?

Us_capitol CNET's Declan McCullagh broke the story that Rep. Michael Fitzpatrick (R-PA) has introduced a bill that would ban access to social networking sites (and virtually any other site where users contribute content) in public access locations such as libraries and schools. 

The bill, called DOPA for the Deleting Online Predators Act, has been endorsed by House Speaker Hastert.  McCullagh has posted a copy of the bill here.  The bill is clearly a response to alarmist news stories about predators using MySpace, Facebook and other social networking sites to identify and engage potential victims.  The definition of social networking is quite broad: "...commercially operated Internet website that allows users to create web pages or profiles that provide information about themselves and are available to other users and offers a mechanism of communication with other users, such as a forum, chat room, e-mail, or instant messenger."  The way I read that, it could ban users from reading blogs that allow comments, or even from using Flickr or Wikipedia.

While I am not surprised at this knee-jerk reaction, it's disappointing to see Congress jump into things that it clearly does not understand.  You don't protect children from online predators by reducing their access to the Internet, any more than you would protect them from offline predators by banning children from parks and schools.

If the United States is to remain competitive economically, it's critical that we, as a society, embrace technology.  While the Government challenges the science of Global Warming and Evolution, the one bright area for the U.S. in the sciences has been our continued innovation in the technology field.  Between this DOPA bill and the COPE Act - telecom/cable efforts to charge a premium for Internet access, Congress is on the verge of mucking up the entrepreneurial innovation engine of the Internet.

I would like to propose a new Act; I call it No Congressmen Left Behind.  It would require all members of Congress and their staffs to demonstrate proficiency in the basics of the Internet before they would be allowed to vote on any issue that impacts its use.  I urge you to contact your Representatives and tell them to oppose both COPE and DOPA.

In the meantime, if you have children, take a few minutes to look at their MySpace page.  Ask them to show you their FaceBook listing.  Have a mature and intelligent dialog with them about the risks of sharing too much information on the Internet.  Help them understand the boundaries.  MySpace and FaceBook are not the problem.  And ignorance is clearly not the solution.

May 11, 2006

Jotspot acquired by Yahoo?

Jotspot_logo TechCrunch is reporting a rumor that Yahoo has acquired ASP wiki provider JotSpot.

Jot_alacra_wiki We've been using JotSpot at Alacra for a few months.  Similar to Wetpaint and SocialText, JotSpot has positioned the hosted wiki as the Web 2.0 version of knowledge management.

With Yahoo potentially getting into the mix, wikis could quickly proliferate into mainstream use, which would help players like SocialText who are better suited to the enterprise.

This is still just a rumor, as TechCrunch comments indicate current Jot employees were not aware of any acquisition.


May 09, 2006

The Future of Magazines

Flipping through the May issue of a trade pub for the content industry, I landed on the “executive changes” page.  My reaction, after glancing at the first few was “that’s ancient news”.  These were based upon press releases that went out two months earlier.

It made me give some thought to the role of magazines in today’s media framework.  I understand the editorial and production cycles of a monthly magazine, but I think that it’s important that publishers adapt their editorial content to what is relevant for a monthly cycle.  If you’re publishing monthly (or even weekly), your mission should be analysis and context, not news delivery.

With direct-to-consumer press releases, readers don’t depend upon monthly trade rags for news; instead, they want perspective and insights.

Many suggest that magazines are dying as a medium.  I disagree.  Magazines still have appeal, but the context in which they are read has changed.  Their format makes them easy to slip into your briefcase for a train or plane ride.  For certain consumer magazines, the ability to archive a collection has appeal.  Magazines like the Economist or the New Yorker leverage their form to provide in-depth articles on a specific topic.  (When’s the last time you read a 20-page article online?)

But many of today’s trade publications run the risk of becoming irrelevant, as a new generation of users turns to other sources for the information traditionally within the trade publication domain.  In order to remain significant, trade pubs need to put the needs of their users first.

Those willing to rethink their role will realize that they have many strengths which they can leverage.  The greatest strength is their community of readers and advertisers.  For most magazines, their online presence is simply an extension of the offline edition, perhaps updated with greater frequency.  Rather than simply duplicating the content of the print version, magazines should begin to take advantage of the many web 2.0 tools out there to develop their community, for example using wikis and blogs focused on special topics.  New applications such as interactive podcaster Waxxi could be used for participatory events.

Returning to the “executive changes” section, magazines could take advantage of their breadth of coverage to show trends in the changes (who’s hiring and who’s losing talent) providing rankings or tables, rather than simply reprinting old press releases and labeling it news.

Trade publications have the brand, reach and subscriber base today to enable them to create strong communities which would, in turn, allow those brands to thrive for years to come.  Those strong brands would continue to command top dollar from advertisers looking to reach that community.  Those who insist on keeping to the editorial calendars of the past will find their impact and their brands diminishing.

May 08, 2006

Waxxi - Social Podcasting

Waxxi_logo There’s a lot of experimentation going on right now, applying a “social spin” to new and existing technologies to generate new uses.  Frank Gruber has an interesting post on a new company, Waxxi, which is launching an interactive podcasting capability. 

The basic concept is that Waxxi will host a scheduled interactive podcasting event, where participants can call in and participate.  The audio portion will be augmented by an IM stream as well.

Waxxi’s first interactive podcast will occur May 20, 2006 at 10:30 am PST / 1:30 pm EST.  The podcast will feature Robert Scoble (Microsoft’s best-known blogger) and consultant Shel Israel, co-authors of the business blogging book Naked Conversations.  You can register for the podcast at the Waxxi site.

As Michael Arrington points out, there could be a few hiccups in this podcast, depending upon the number of participants and the level and methods of moderation.

That being said, this could be an interesting vehicle for content providers and conference companies.  Today’s webinars tend to be uni-directional – just a talking head clicking through PowerPoint slides.  If you can leverage your community of users to create an interactive dialog, that could be very compelling.  And, since the end results are captured as a podcast, you can develop a collection of interesting audio content for your users.

May 07, 2006

Share your OPML

Share_your_opml Dave Winer has launched a new application called Share Your OPML.
An OPML file stores the list of RSS feeds that you read in your feedreader. 

Share Your OPML is sort of like a Last.fm for RSS feeds.  The goal is to compile a database of what people are reading, as well as the patterns within the individual taxonomies, then use pattern matching to suggest other feeds.  Like Last.fm, you can easily see other users who have similar interests to yours - just click on the "Subscriptions Like Mine" link.
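
Share Your OPML hasn't published its matching algorithm, but one simple way to rank "Subscriptions Like Mine" is overlap (Jaccard) similarity between two users' feed lists.  Here's an illustrative sketch, not the site's actual method:

```python
def jaccard(subs_a, subs_b):
    """Similarity of two feed lists: shared feeds divided by all feeds."""
    a, b = set(subs_a), set(subs_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def subscriptions_like_mine(my_subs, other_users):
    """Rank other users by how much their reading overlaps with mine."""
    scores = {user: jaccard(my_subs, subs) for user, subs in other_users.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

Two readers who share most of their feeds score far higher than a pair with a single feed in common, which is exactly the behavior you want from a recommendation neighbor.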

There's also a top 100 ranking (right now, TechCrunch is #1 by a fairly wide margin).  Share Your OPML is built under a Creative Commons license, so you can expect others to build onto the framework with new functionality.

To participate, just upload your OPML file.  Most RSS feedreaders have a menu item that lets you export your OPML file with a single click.  So, take a minute, upload your file and discover some new and interesting feeds.
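
For the curious, OPML is just XML: each subscription is an outline element whose xmlUrl attribute holds the feed address.  A minimal sketch of pulling the feed URLs out of an exported file (the TechCrunch feed URL is only an example):

```python
import xml.etree.ElementTree as ET

def feed_urls(opml_text):
    """Return every feed URL in an OPML document, including nested folders."""
    root = ET.fromstring(opml_text)
    return [o.get("xmlUrl") for o in root.iter("outline") if o.get("xmlUrl")]

sample = """<opml version="1.1">
  <head><title>My subscriptions</title></head>
  <body>
    <outline text="News">
      <outline text="TechCrunch" type="rss"
               xmlUrl="http://feeds.feedburner.com/Techcrunch"/>
    </outline>
  </body>
</opml>"""

print(feed_urls(sample))  # ['http://feeds.feedburner.com/Techcrunch']
```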


May 05, 2006

Social software adoption patterns

Steve Goldstein has an interesting post today on the AlacraBlog: You Are What You Publish.
In it, he postulates that students, being early adopters of tools like MySpace and Facebook, will soon utilize other social networking apps such as Wetpaint, JotSpot or SocialText. 
These students, as they move into the workforce, will drive increasing usage of community-based social software in the workplace.
I echo Steve's conclusion that if you're looking for ways to spur viral adoption of these tools, the education market is a good place to focus. 
Meanwhile, if you have kids today, take a few minutes to learn more about what they're doing on these social software sites.  It might just help you understand where your own business will be headed in the next few years.

April 20, 2006

SocialText: Communities that Work

Socialtext_logo One of the key characteristics of “Web 2.0” is social software, used to bring together and facilitate community interaction. 

This week, I add one of the emerging leaders in social software, enterprise wiki provider SocialText, to my list of Emerging Content Technologies.  SocialText develops and hosts wikis for companies for both internal and external use.  Wikis are rapidly beginning to replace intranets and portals for managing the communications around projects.

Unlike earlier knowledge management tools, wikis can rapidly increase productivity.  According to SocialText CEO Ross Mayfield, “a typical wiki can eliminate 30% of the email around a project.” 

Rossmayfield_1 One key to the success of SocialText, and wikis in general, is their decentralized structure.  Rather than a centralized, IT-focused solution, a SocialText wiki can be set up by non-technical staff in a short time.  Typical projects involve a handful of initial users, then once a critical mass of information is developed they expand to a wider audience.  By sharing control with their user community, sponsors of wikis can drive participation and a feeling of ownership.

As a Web 2.0 company, SocialText is committed to an open environment.  They will soon release an Open Source version and are involved in various open source initiatives today.

What’s the opportunity for publishers?
According to Elsevier Vice Chair Y.S. Chi, as content becomes commoditized, “the role of the publisher is beginning to shift from creator of content to manager of markets”.  Publishers have the ability to leverage their brand to develop communities focused around a given subject area.  Using wiki technology to cultivate a strong user community can provide you with a platform to sell various solutions to that market.  The difficult part for publishers will be ceding the absolute control that publishers are accustomed to.  But for those who do, the rewards may be significant.

April 13, 2006

Google launches Ajax Calendar

Google_calendar
Google has unveiled Google Calendar (as a beta of course).
It's an Ajax-based calendar, allowing users to share schedules with others.  More importantly, Google has developed this as a platform, so look for some interesting timeline-based mashups in the near future.  Content providers with date-specific material should explore mashing it up with Google Calendar.



April 11, 2006

Jigsaw raises $12M

Jigsaw_logo Jigsaw, an innovator in CRM and lead identification, and one of our 50 Content Companies that Matter, has raised $12M in venture capital.
Austin Ventures led the round, along with prior investors Norwest Capital and El Dorado Ventures.

Despite the fact that TechCrunch believes Jigsaw to be one of the most evil companies in the world, IMO their model is nothing but a modern-day version of the way directory publishers have compiled information for years.

I am sensitive to Michael Arrington's belief that users should be able to opt out of the Jigsaw database, but, having spent many years in the compiled information business, I think he's overhyping the risk. 

Jigsaw is hardly giving away personal information - it's not home addresses, names of children or anything else that might lead to evil use.  It's simply allowing users to upload business contact information for business professionals.  If you've ever subscribed to a controlled circulation magazine, you've already exposed yourself to worse.

With its low cost base, it will be interesting to see how Jigsaw uses the funds for business development.

April 10, 2006

BSEC Tim O'Reilly keynote

Camelback6 Buying and Selling eContent is so packed with content that they needed two keynotes the first day.  The afternoon keynote speaker, Tim O’Reilly of O’Reilly Media, was the highlight of the day. 

O’Reilly, credited with coining the name “Web 2.0” provided a set of six key rules for successful Web 2.0 applications:
1. Users add value
2. Network effects by default: Tim talked of the “architecture of participation” where successful apps default to aggregating user data as a side benefit to usage (e.g. Napster defaulted to sharing=on)
3. The Perpetual Beta: key in the software as a service model is continual improvement and rolling feature enhancements.
4. Services have to be above the level of a single device: the PC is not the only access device for Internet applications and those that are limited to such are less valuable than those that reside on the Internet.
5. Data is the next “intel inside”: the BSEC audience was pleased to hear that their content could be the differentiator for applications.  Examples included Navteq, the source of Google Maps and virtually every other street-mapping application, and Gracenote, the “CDDB” database that supplies track titles within all the major music applications.
6. A platform beats an application every time.  And, unlike the MS Office model, the Web 2.0 world allows small pieces loosely joined together to add new value to the platform.

Tim also spoke of Asymmetric Competition, where a new competitor with a different business model may kill your business.  His example of Craigslist killing the newspaper classified business is now a classic, but there are many others out there.  For those who’ve read Outsell’s Neighborhoods of the Information Industry, this theme should ring familiar.

One of Tim’s key points came out of the Google Maps experience, where they had not intended for the application to serve as the basis for what would become mashups.  His takeaway is that “if your users aren’t surprising you by the ways that they build on your (Web 2.0) product, then you’re doing something wrong.”

To catch up on the rest of the Buying and Selling eContent conference, take a look at the following blogs.  All told, there were close to a dozen attendees blogging in some capacity today.
Rafat Ali will be speaking on a panel Tuesday and is live-blogging throughout.
John Blossom at Shore is posting on his special events weblog.
Ross Mayfield has an interesting recap on Tim O'Reilly's keynote.
Larry Schwartz, of Newstex, points out his win, then loss, of the $20 cliche bet for first use of "Long Tail".
David Meerman Scott leveraged the WebInkNow blog to field questions during his panel.
Dale Wolf adds his thoughts on his Context Rules Marketing blog.
Shannon Holman of ALM Research compares ALM's progress to the comments of some of the speakers (still no sign of data dog in Scottsdale yet)
More to come in day 2...

BSEC Content Technology Meets Web 2.0

Camelback1 One of the more interesting sessions at Buying and Selling eContent was the afternoon session entitled “The Next Wave: Content Technology Meets Web 2.0”.
Ross Mayfield, SocialText CEO, led off with a discussion of wikis and collective knowledge.  First, he enumerated the two types of Collectives that drive social networking applications:
Collective Intelligence (for example, Digg or Memeorandum), where there is a low threshold of user engagement; and
Constructive Intelligence (for example, Wikipedia), where there is a very high level of user engagement required. Clearly, the enthusiasts and other users who are willing to contribute at the Constructive Intelligence level are extremely valuable.

The key takeaway from Ross’ talk is that “Sharing Control Creates Value”.  It’s extremely difficult to share control, particularly for premium content providers, but virtually every successful Web 2.0 application has that at the heart of their business.

April 03, 2006

Britannica vs. Wikipedia

Nature_logo There's been an interesting debate going on the past two weeks focusing on the quality of Encyclopedia Britannica vs. that of Wikipedia.

The debate was prompted by an article in the scientific journal Nature last December.  Nature set out to use peer review to compare the accuracy of Wikipedia vs. that of Encyclopedia Britannica.   To conduct the test, 42 domain experts each analyzed a single topic from both encyclopedias, covering a wide range of scientific disciplines.  Reviewers were asked to review the articles for three types of errors: factual errors, critical omissions and misleading statements.  The tests were blind (i.e. reviewers did not know the source of the listing they were reviewing).

Wikipedia_cambrian_explosion The results were quite interesting.  Not surprisingly, Britannica had fewer errors in the overall survey, but not by much.  For the 42 topics, there were 162 errors uncovered in the Wikipedia entries, vs. 123 for Britannica.  Interestingly, of eight "serious" errors, four each were found in Wikipedia and Britannica.

Last week, Britannica struck back with a response to the Nature article.  Britannica did not focus their attack on the results, but instead on the methodology of the study, for example complaining that Nature used supplemental Britannica content outside of the Encyclopedia Britannica.  In addition, they spent quite a bit of their response focusing on the wording - stating that where Nature said "Wikipedia comes close to Britannica", that the 1/3 more inaccuracies (162 to 123) were not that close. 

This week, Nature published a brief response to the Britannica article on its blog, defending its methodology and results.

What Britannica seems to be missing is that this is a public relations battle that it cannot win.  For an organization selling premium content compiled by experts, to split hairs over whether its content is slightly better than a free source compiled by the unwashed masses, is a losing battle.  Rather than hiding behind definitions of what "is" is, the team at Britannica should take this as a siren call to look at their products and value proposition.  Long-term, for Britannica to remain a viable business, they need to better understand the needs of their users and develop products that uniquely address those needs. 

Perhaps the most interesting aspect is how it took Britannica more than three months to respond to the Nature article.  Considering that in the original article, Nature published the specific findings for each of the 42 topics, it shouldn't have taken them long to fact-check them and put together their response.  (In contrast, the team at South Park rewrote an entire script and produced a new episode in less than a week after Isaac Hayes quit as Chef due to pressure from Scientology).  With Wikipedia's ability to respond to changing information within seconds or minutes, Britannica's slow-footed response seems telling.

Many content providers continue to live in denial, hiding behind claims that "our quality will win out" over Internet upstarts.  But, the quality differential is rapidly diminishing.  Whether comparing U.S.-based editors to outsourcing ("they'll never understand our market"), domain experts to the wisdom of crowds ("our PhDs have knowledge no one else can match"), or manual vs. automated tagging ("a computer can't understand the nuances of our taxonomy"), the gap is disappearing.

Traditional content providers hoping to still be around 5-10 years from now need to rethink their strategy.  Rather than relying upon domain knowledge in the compilation of information, they should focus that knowledge on understanding how information is consumed.  That will enable them to build the vertical market workflow-based applications that will continue to command premium value, while their basic content becomes commoditized.

March 14, 2006

Roll Your Own Search Engine with Rollyo

Rollyo_logo

The concept of vertical search engines has been around for quite a few years. In fact, Alacra launched its Portal B vertical search engine more than five years ago. Of course, timing is everything in technology, and it seems that now the timing is right for vertical search.

One of the emerging leaders in this space is Rollyo. Named for “roll your own” search engine, Rollyo allows you to select up to twenty-five websites and create a custom search engine indexing just those sites. 

Users can create multiple “searchrolls” for different topics; you could, for example, create a searchroll containing news from your favorite news sites on a given subject. And, as with any good Web 2.0 app, you can easily share your searchrolls with others, providing a social component to search indexes. 

So, who needs a personal search engine? Most of us do, actually. While there are times when you want to search the entire Internet for some obscure piece of information, there are many other times when you want to search “trusted sources” that can provide more relevant results. There are numerous technologies out there that use statistical or linguistic approaches to disambiguate terms with multiple meanings. But if your search index contains only sites related to island-hopping, there’s no confusion over which “Java” you’re searching for. 
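
Under the hood, a searchroll is essentially a domain whitelist applied on top of a search index.  A simplified sketch of the idea (the travel-site names are hypothetical, and Rollyo's actual implementation is certainly more involved):

```python
from urllib.parse import urlparse

# An editorially chosen "searchroll" of trusted sites (hypothetical names)
SEARCHROLL = {"java-island-travel.example", "islandhopper.example"}

def in_searchroll(url, roll=SEARCHROLL):
    """True if the URL belongs to one of the trusted sites."""
    host = urlparse(url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in roll)

def vertical_search(results, roll=SEARCHROLL):
    """Drop every hit that isn't from the searchroll; whatever survives is
    already about the right 'Java', no disambiguation required."""
    return [r for r in results if in_searchroll(r["url"], roll)]
```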

For a quick test drive of a personal search engine, try out the Content Industry Searchroll which I created in about three minutes using Rollyo.  Try terms like tagging, workflow or blog and see how the results are more relevant than in a full search engine.  Want to learn how Pfizer uses content-related technologies?  Just enter "Pfizer" in the search box and browse the results.  That would be an impossible search to do using Google on the full web.

What’s the implication for publishers?

Publishers, particularly trade press, enthusiast publications or industry-specific information publishers are well-suited to provide vertical search to their users. Whether it’s indexing surfer sites for Surfer Magazine, building and construction sites for McGraw Hill’s CIG or political coverage for CQ, enabling users to search an editorially-chosen set of trusted content is a tremendous value. 

I have heard many database publishers complain that the Internet is eating away at their business and that they are having difficulty retaining customers. Creating a vertical search engine is an effective and manageable way to add significant value to your content beyond what Google, Yahoo and Microsoft can deliver. After all, you’ve been saying for years that you add editorial value beyond simple compilation. Well, roll your own and prove it. 

March 06, 2006

Last.fm: Big brother is listening

Lastfm_logo “Big brother is listening.”

In light of recent news of NSA wiretaps and FBI “raids” on libraries that may sound ominous. But, what if you asked “big brother” to listen in?

That’s the idea behind last.fm. Last.fm, formerly AudioScrobbler, is an iTunes plugin that drives a music recommendation engine. The concept is pretty simple. The plugin monitors all the music you listen to on iTunes, generating lists of your favorite performers and songs, then compares those to others in their database.

Lastfm_similar Using various matching algorithms, Last.fm is able to create music neighbors, people whose listening habits are similar. The more interesting information comes when you click on a performer you like. The Last.fm engine quickly shows you other “similar” bands. I have found this a great way to discover new bands. For example, if you look at the Arctic Monkeys, you’ll see Maximo Park, the Libertines, the Subways, the Rakes and other UK indie bands. Another useful feature is that you’ll see the songs for each band or performer ranked by the number of times they are listened to. Want to sample a few songs from the Subways? With just a click you’ll see their most popular songs on Last.fm are Oh Yeah and Rock and Roll Queen.
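
Last.fm hasn't disclosed its matching algorithms, but a common way to build "neighbors" like these is cosine similarity over play counts, so an artist in heavy rotation counts for more than a single listen. A hedged sketch with made-up numbers:

```python
import math

def cosine(plays_a, plays_b):
    """Similarity of two listeners' play-count profiles (1.0 = identical taste)."""
    dot = sum(n * plays_b[artist] for artist, n in plays_a.items() if artist in plays_b)
    norm_a = math.sqrt(sum(n * n for n in plays_a.values()))
    norm_b = math.sqrt(sum(n * n for n in plays_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

me = {"Arctic Monkeys": 40, "The Subways": 12}
neighbor = {"Arctic Monkeys": 35, "Maximo Park": 20}
print(round(cosine(me, neighbor), 2))  # 0.83 - a close musical neighbor
```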

Lastfm_subways Last.fm also supports tagging of songs and artists, as well as the ability to recommend an artist to another last.fm user. Outside of the “neighbor recommendations”, the community aspects of Last.fm are modest today, but there’s certainly opportunity for growth.

Last.fm has also launched the last.fm player, a personalized online radio station. With the Last.fm player you can listen to your own favorites or choose to listen to what they call neighbor radio. And now, my biggest complaint about last.fm has been addressed. Most of us listen to a lot more music on our iPod than on our PC. Yet Last.fm only picked up what was played on your PC. Now third parties have developed free plugins that will take your iPod usage and include that in your Last.fm results.

Last.fm also makes your listening habits accessible via an RSS feed. The feed itself doesn’t serve any great purpose today (I'm not sure anyone out there needs real-time reporting of what I’m listening to), but it could lend itself to an interesting mashup. What if VNU’s Billboard were to layer their sales rankings on top of Last.fm play lists, then cross-reference those by country? It might make for some interesting predictive capabilities for the music industry. Fred Wilson, in the Music section of his blog, includes his weekly and all-time top 10 playlists from Last.fm.

So, what’s the message for content providers?

First, monitoring customer behavior is not a bad thing. In fact, if you are upfront about what you’re doing and offer clear and compelling benefits to your users, they will welcome the monitoring.

Second, recommendation engines are not simply for e-tailers. If you can leverage community usage behavior to cluster similar content together, it can create a powerful and compelling recommendation engine. Combine that with simple distribution tools such as RSS and you have a low-cost, high value tool to promote new content to existing users. 

P.S. For those interested in knowing what I am listening to, you can visit my last.fm profile.

Emerging Content Technologies

I am starting a new list on Content Matters, of Emerging Content Technologies.

There are so many compelling new things happening in the content technology space that I felt it would be helpful to catalog and share them. Many of these companies are what are commonly referred to as “Web 2.0” products such as tagging, vertical search, social software, RSS, mashups and related technologies. 

Why not simply include these in my existing list of the “50 Content Companies that Matter”? The main reason is that the characteristics of these companies are very different from most of the companies on that list. Most of these are early stage companies. While their concepts are intriguing, as innovators, they may not be the ones who eventually emerge as market leaders when these technologies are adopted. Some of these companies will not make it; others will be acquired by Google, Yahoo or others along the way.

Many in “traditional” content companies may wish to overlook these companies; after all, few of these companies have made inroads in the b2b market. It’s true that teenagers are bigger users of mashups and social software today than business users. But, traditional publishers should ignore them at their peril. These companies and their offspring are the ones who will be taking your market share in the years to come. To that end, wherever appropriate, I will share my thoughts on the relevancy of these technologies to more traditional content businesses. My recommendation to you is that you play with these various applications, making sure that you understand them, then think about how they might impact your business in the years to come.

The first post in this new series is on Last.fm. 

As always, I welcome your feedback and suggestions.



March 01, 2006

Is Social Software the new Knowledge Management?

For those who were involved in KM or KM-related solutions a few years ago, the words "Knowledge Management" bring to mind terms like "black hole", "no ROI" or worse.
The good news is that some of the new social software technologies are enabling KM results without high costs and lengthy implementation efforts.
Recently, I have been using JotSpot, a simple wiki application, to create an intranet for Alacra.  For those interested, I've blogged my experiences on the Alacra Blog.

February 28, 2006

What's a Swicki?

Swicki
You may have noticed a new tag cloud on the Content Matters blog.  Entitled "Content Buzz Cloud", it's on the upper left of the blog.

The cloud is a "Swicki" and it comes courtesy of social software provider Eurekster.  At first glance, it looks simply like a web 2.0 tag cloud interface a la del.icio.us.  But the swicki is more than that.  Behind the tag cloud is a vertical search engine.  Similar to Rollyo, the Swicki lets you select specific web sites to include in your index.  It also includes a full web crawl, but the matches from your vertical sites show up higher in your results.

So, what's the social software angle?  Well, a Swicki is designed for community tuning.  Users can tune the results by removing sites that don't match, by adding new sites or by marking certain sites to go to the top of the results.
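
In ranking terms, community tuning is a layer on top of the base search: excluded sites are dropped, pinned sites go to the top, and searchroll sites get a boost over the general web crawl.  A simplified sketch of how that re-ranking might work (Eurekster's actual scoring is not public):

```python
def rerank(results, roll_sites=(), pinned=(), excluded=(), boost=2.0):
    """Apply swicki-style community tuning to base relevance scores."""
    tuned = []
    for r in results:
        if r["site"] in excluded:
            continue                      # removed by the community
        score = r["score"]
        if r["site"] in roll_sites:
            score *= boost                # vertical-site matches rank higher
        if r["site"] in pinned:
            score = float("inf")          # marked to go to the top
        tuned.append({**r, "score": score})
    return sorted(tuned, key=lambda r: r["score"], reverse=True)
```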

For Content Matters, I've focused the Swicki on content-related topics and sites.  Swicki is still in beta, and some key features, such as including select blogs, are yet to be released (today, it includes all blogs in its index).  So, how are the results?  Try a few searches and let me know.

February 21, 2006

Blogburst to syndicate blog content

BlogBurst is a new service that syndicates blog content to mainstream media outlets.  Created by social software provider Pluck, BlogBurst will distribute content from select blogs (BlogBurst is by invitation only) to the online sites of newspapers and other traditional media.

BlogBurst receives licensing fees from the newspapers it sells the content to.  Initially, bloggers receive no compensation, but they gain wider exposure.  According to Search Engine Journal, Pluck plans to compensate bloggers once the beta period ends.

So, is it worthwhile for bloggers to participate?  I'd say so.  Prior to BlogBurst, one of my Content Matters posts was picked up by WashingtonPost.com and generated a nice spike in traffic.  Most bloggers today are more focused on exposure than on revenue, so syndication is a positive.  Conversely, as Darren Rowse of ProBlogger points out, you have to be OK with the idea that your syndicated posts may show up higher in search results on a partner site than on your own.  I have begun to syndicate Content Matters on BlogBurst, but that's a decision each blogger will have to make for themselves.

For more takes on BlogBurst, take a look at BlogSEO and, of course, TechCrunch, who hosted the party where BlogBurst was announced.