

June 29, 2006

Alacra Store adds RSS feeds for Credit and Investment Research

Alacra today announced the availability of free RSS feeds in the Alacra Store containing alerts on the latest credit and investment research from premium content publishers such as CreditSights, Fitch Ratings, Moody’s Investors Service, and Thomson Financial.

The Alacra Store has Company Snapshots on over 200,000 public and private companies. Now, on every Company Snapshot, users can subscribe to RSS feeds containing the latest credit and investment research on the company of interest.

For example, take a look at either of the following:

  • Yahoo
  • General Motors

All you have to do is click the RSS icon at the top right of the company snapshot page and add the feed to your favorite newsreader.  Then you will receive alerts on new research reports, each including the report's title and a brief snippet of text.  If you're interested in purchasing a report, just click the link in the feed to reach a page where you can add it to your shopping cart and proceed with checkout.
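
For illustration, here's a minimal Python sketch of how a newsreader might pull the title and snippet out of such a feed.  The feed structure below is a hypothetical RSS 2.0 example, not Alacra's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical alert feed; element names follow plain RSS 2.0.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Research Alerts (example)</title>
    <item>
      <title>Credit Update: General Motors</title>
      <link>http://www.alacrastore.com/example-report</link>
      <description>A brief snippet of text from the report...</description>
    </item>
  </channel>
</rss>"""

def parse_alerts(feed_xml):
    """Return (title, link, snippet) for each item in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"),
             item.findtext("link"),
             item.findtext("description"))
            for item in root.iter("item")]

for title, link, snippet in parse_alerts(SAMPLE_FEED):
    print(title, "->", link)
```

A real newsreader would, of course, fetch the feed over HTTP and poll it periodically; the parsing step is the same.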

For more details, read Jarid Lukin's post on the AlacraBlog or visit the Alacra Store.

The 50 Content Companies that Matter: Midpoint review

It's been nearly a year since I began the "50 Content Companies that Matter" series with a post on Google.  With 25 companies on the list, I'm halfway to my goal of fifty.  I thought it would be a good time to assess the list, reader feedback and what I've learned in the process.

In terms of feedback, it's been interesting.  Most of the comments have been positive, though there have been a few minor quibbles with my choices.  Others have proposed possible additions (often their own company), mostly smaller companies looking to break through.  Most comments have come via email, with only a handful of people commenting directly on the blog.  I hope this changes, as the comments themselves are quite interesting and I'd like to share them.

From the time I began the list, my goal was to recognize innovation in the content industry or among technologies which can be used in the content industry.  Not surprisingly, most of the innovation tends to be coming from the smaller companies, rather than the established players.  A few of you have pointed this out in your comments, asking why I've ignored some of the largest players in the content market.  While I have posted on a number of traditional content providers (recent posts on Nature and Morningstar, as well as earlier profiles of Zagat and Consumer Reports), much of the innovation has been coming from content companies without the long legacy.  That being said, there are a number of traditional publishers who are doing interesting things and you can expect to see some posts on them in the weeks to come.

On the other end of the spectrum, I've looked at a number of startups with compelling offerings, but which just don't seem to qualify for the list.  In my definition, a "company that matters" is one we'd miss if it were gone.  There are a number of companies vying for position in the social software market, but I'm not sure that any of them have established that type of position yet.  That's why a few months ago I began a second list, of Emerging Content Technologies.  This list is focused on technologies that should have an impact, whether or not the specific company profiled ends up as the market leader.  A good example of that is the hosted wiki space, where Jotspot, SocialText, Wetpaint and others are vying for position.  I can't tell you which of those companies will be the eventual winner, but I can tell you that departments and project teams are quickly adopting wikis to improve project communications.  While the Emerging Content Technologies list focuses on the bleeding edge, in each post I include my thoughts on how traditional content providers might apply that technology in a mainstream business.

Another question I've been asked (really) is "why 50?"  In hindsight, I think 50 was a pretty good number to use.  Keeping the list to a "top 10" would have excluded many, many companies who are developing provocative content solutions.  At the other end of the spectrum, if I had aimed for "100", I'd find myself scraping the barrel towards the end.  Originally, I thought that I could complete the 50 in a year.  One per week would have done the trick.  Instead, it looks as though it will take 18-24 months.  Thankfully, my commuter train from the burbs gives me some quiet time in the morning for writing.

The experience of developing this list has been fantastic.  It forces me to continually assess what others in this industry are doing, something that's not so easy to do when we're all facing deadlines. 

What do you think about the list so far?  Please post your comments or email them to me, along with any suggestions for future additions to the list.

June 27, 2006

On Wire Transfers and National Security

This week, in a virtual replay of the recent NSA data mining disclosures, major newspapers, including the New York Times and Los Angeles Times, disclosed that the federal government has been accessing the SWIFT database to identify potential terrorist financing activities.

Politicos and pundits on both sides have expressed their outrage, yet when you peel away a few layers, there’s not an awful lot to be outraged about.

First, Democrats, moderate Republicans and civil libertarians have expressed their concern at the idea that the Government was using this private database to gather information.  Meanwhile, White House supporters (and Republican Congressmen up for reelection) have criticized the newspapers for making this information public, claiming that it threatens the usefulness of the program.  Some, like Representative Peter King, have charged the Times with treason.

I think that both sides need to stop politicizing this issue and look at the facts.

First, in terms of the program itself, I think that most Americans would support the idea of using data mining of the SWIFT database to identify potential terrorist financing activities, provided that there was adequate oversight, controls and reasonable disclosure to Congress.  After all, the PATRIOT Act and the Bank Secrecy Act have provisions requiring banks to report any suspicious movements of cash to the Government, and new provisions on foreign correspondent banking have brought much greater scrutiny to the issue of wire transfers.

For the exact same reasons, it’s disingenuous at best for the government to claim that these recent newspaper articles have “blown their cover”, allowing the terrorists to know that their transactions are being monitored.  The PATRIOT Act has been in effect for almost five years.  There are thousands of articles in industry publications and on federal government websites, instructing financial institutions on their KYC requirements and provisions for SARs for suspicious wire transfers.  During this period, the government has boasted of how its efforts in this area have led to arrests.  According to experts in the field, in recent years terrorists have stopped using electronic transfer for their funds because of these changes, resorting to hawala - the use of informal money brokers and couriers to move money around.  To think that the terrorists were unaware that banking transactions could be monitored is simply naïve.

As with the NSA’s data mining of telephone records, there are a few core principles that we should all be able to agree upon:

  • First, issues of homeland security should not be politicized by either side. 
  • Second, advanced technologies such as data mining and text mining can be extremely useful, but require oversight so that security does not trump our civil rights.
  • Third, transparency is always a good idea, even in issues of national security.  The reason that the media and others were quick to condemn this data mining is that this administration has cloaked too many things under the guise of national security, giving them little credibility on such matters.

If the Administration would engage the Congress in an honest, meaningful way, I believe they would get Congressional approval to utilize advanced technology to strengthen our national security.  A healthy debate would educate, inform and, in my opinion, rally support.  Unfortunately, with midterm elections just a few months away, that’s unlikely to happen.

UPDATE: Richard Clarke and Roger Cressey's editorial in today's NYTimes, "The Secret the Terrorists Already Knew," provides further insights on the inappropriate politicization of this issue.

June 23, 2006

Lead Generation Survey

Twice this week I received simply awful telesales calls.  The two were remarkably similar, but seemed to come from different companies.  Both callers were calling from an overseas (Indian) call centre.  Both were pitching professional services (one for development of web 2.0 solutions, the other for outsourced QA) and the purpose of the call was to get me to commit to a subsequent call with a sales rep.  In both cases, the telesales rep threw a bunch of buzzwords at me, without being able to explain what they were trying to sell.  In neither case did they last beyond 30 seconds (and, no, I did not agree to the subsequent call).

Unfortunately, most of us get calls like that all the time.  Despite the many lessons learned about selling and lead generation, it seems that most companies still think it's simply dialing for dollars and a numbers game.

This week, KnowledgeStorm, in conjunction with SiriusDecisions, distributed the results of a market survey of buyers of IT products and services.  The paper, entitled Demand Creation: the Prospect's View, was a summary version of a recent webcast they had done, and compiled results from surveying about 1,000 respondents.

Considering the state of telesales, it was not surprising that when asked which delivery mechanism they most often responded to, cold-calling got the lowest score, behind email and direct mail.  In fact, when asked which delivery mechanism vendors most get wrong, a whopping 61% of the respondents said cold-calling, with email and direct mail each getting about 10%. 

When asked for their most trusted sources, 29% indicated industry analysts, followed by peers and internal groups at 22% and 16%, respectively.  Search followed at 14%, seemingly low for the IT community.  The bottom of the list comprised vendors, VARs and partners.  Well, those three are hardly independent voices, though many would argue that the IT analysts are not as independent as some might perceive.

Perhaps the most interesting result in the survey, as shown in this table, is how it maps the sources used at different stages of the buying cycle.  Early in the cycle, webinars, white papers and industry analysts are the leading sources, as users seek to learn more about a general solution.  During the middle stage, as prospects do further analysis, demos and trials join white papers as key sources, while webinars begin to decline.  At the later stages, when users are making final vendor decisions, trials, along with analyst reports, are the key elements.  That combination at the end makes me think that people, while legitimately testing out the software, also want a "magic quadrant" to hide behind if the implementation blows up on them.
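
The mapping described above can be restated as a simple lookup table (the stage labels are mine; the sources are as reported in the summary):

```python
# Sources prospects lean on at each stage of the buying cycle,
# per the KnowledgeStorm/SiriusDecisions survey discussed above.
BUYING_CYCLE_SOURCES = {
    "early":  ["webinars", "white papers", "industry analysts"],
    "middle": ["demos", "trials", "white papers"],
    "late":   ["trials", "analyst reports"],
}

def content_for_stage(stage):
    """Return the content types most useful at a given stage."""
    return BUYING_CYCLE_SOURCES[stage]
```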

In developing marketing materials for your website and to support direct sales efforts, it's critical to provide different content to support prospects' needs as they move through the buying cycle.  This study reinforces the idea that white papers and webinars are among the most effective parts of the marketing mix early in the sales cycle.

June 21, 2006

Sphere adds relevancy to blog search

While many consider the search market to be locked up by GYM (Google, Yahoo and Microsoft), there are still many niche areas of search where the big three do not dominate.

One of these areas is blog search.

Blog search is different from web search for a few reasons.  First, the time-sensitivity of blog posts makes them more like news than like web pages.  Second, Google's PageRank approach doesn't hold up very well for blogs, where the reputation of the blogger is more important than the number of links to a given post (particularly since links won't yet point to a post that's just been authored).  Third, blog search results have typically been displayed in reverse chronological order (newest posts at the top), simply because it is hard to otherwise rank the relevance of the results.

While Technorati, Feedster, Icerocket and others have made inroads into this space, there have recently been a few new entrants.

One of the more interesting recent entrants is Sphere.  Sphere aims to improve blog search by applying algorithms that order results by relevance, not simply chronology.  Sphere uses a combination of inbound and outbound links, metadata for the post and the blog, and semantic text analysis to gain insights into what a blog is focused on.  The combination of semantic analysis and analysis of inbound links also helps push blog spam to the bottom of the results (since no reputable blogs are likely to link to spam posts).
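
As a rough illustration of the idea (this is my own toy sketch, not Sphere's actual algorithm), a relevance score might blend term overlap with link-based reputation, so that posts nothing reputable links to sink toward the bottom:

```python
import math

def relevance_score(post_text, query_terms, inbound_links, spam_threshold=1):
    """Toy blend of semantic match and link reputation.

    Term-overlap fraction, weighted by the log of inbound links;
    posts with almost no inbound links are sharply demoted,
    roughly how link analysis can suppress spam blogs.
    """
    words = set(post_text.lower().split())
    overlap = len(words & {t.lower() for t in query_terms}) / len(query_terms)
    reputation = math.log1p(inbound_links)
    if inbound_links < spam_threshold:
        reputation *= 0.1  # demote posts no one links to
    return overlap * reputation
```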

In my testing, Sphere's relevance seems to hold up well.  While results for some searches are similar to those of Technorati or Google, in others I found Sphere's results to be clearly superior.  Particularly when searching areas where spammers are active, Sphere's ability to suppress those results provides a better search experience.  For those who prefer the Technorati-like chronological view, Sphere allows you to order the results that way as well.  You can also do a custom date search, entering start and end dates, particularly useful for looking back in time, but also for seeing the most relevant posts of the past week, for example.

Where Sphere really shows its stuff, however, is in its tools.  In particular, Sphere offers a bookmarklet called "Sphere It".  A user looking at any piece of content on the web can click "Sphere It" and the system will retrieve blog posts that are similar to that content.

While "Technorati This!" is a similar offering, the approach is very different.  Technorati This! uses links to find posts that point to the article you are reading.  Sphere It uses semantic analysis (most likely a Bayesian-like categorization engine) to find posts that are similar in content to your article, not simply those which link to it.
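
That kind of content-based matching can be sketched with simple bag-of-words cosine similarity (far cruder than a real categorization engine, but it shows the "more like this" mechanic):

```python
from collections import Counter
import math

def cosine_similarity(text_a, text_b):
    """Bag-of-words cosine similarity between two texts."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def more_like_this(article, candidate_posts, k=3):
    """Rank candidate posts by content similarity to the article."""
    return sorted(candidate_posts,
                  key=lambda p: cosine_similarity(article, p),
                  reverse=True)[:k]
```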

For an example, take a look at what happens when I apply Sphere It to an A.P. article on political parties' fundraising for the upcoming congressional races.

The results are 17 blog posts from the past day talking about the fund-raising efforts of the two parties, none of which seem to link back to the underlying AP article.  The results are what you might expect to find using an enterprise classification application like Autonomy, as opposed to a simple link-based search.

Content providers looking to blend user generated content with their own editorial may also find Sphere It a useful tool.  Time Magazine has begun to use Sphere It on their Time.com site.  When reading an article, you'll see a box marked "related blogs", which uses Sphere It to find blog posts with similar content.

Sphere It is a compelling way to bring user generated content into your site with a simple "more like this" function.

Today, I find that I use a combination of Sphere and Technorati in my daily searches.  More and more I turn to Sphere first, but Technorati still covers a wider universe and I check it when I don't find what I need in Sphere.

Blog search remains a wide open market, and I won't be surprised if the GYM crowd improve their performance over time (through acquisition or otherwise).  In the interim, users will continue to reap the benefits of continued innovation.  I recommend you give Sphere and Sphere It a spin.  You'll be pleased with the results.

June 20, 2006

Craigslist Adds More Cities, International Coverage

According to the Wall Street Journal, Craigslist has just added 100 new cities, including 20 outside the United States, bringing its total coverage to 300 cities.

According to Staci at PaidContent, the expansion includes 72 US cities plus 28 international cities from Beirut to Helsinki.

In the article, WSJ editor Brian Carney interviews Craigslist CEO Jim Buckmaster over lunch and bloody marys.  They hit on most of the standard topics (why do you leave a half billion dollars on the table?), but the piece provides some interesting insights into Buckmaster's thinking.

Carney quotes Buckmaster: "I do think that the Internet is a spectacular tool for any information business -- newsgathering and other journalistic enterprises are essentially in the information business. Another aspect to it that gets reported on is drawing the lines within the Internet itself with respect to content generators and various kinds of aggregation and search tools.

Where does the revenue end up in those kinds of scenarios over time? I think you'll see the lines will move from side to side in terms of where the revenue lands among the various players in the information economy, which is still very young."

Craigslist continues to be a very interesting experiment in how a content technology company can deliver fantastic value to its users while taking a low cost, minimalist approach to its entire business.  While those outside the classified business may think Craigslist is not their problem, we can expect others to try to emulate Craigslist and freemium models in all aspects of the content space.

June 16, 2006

Google launches vertical search for Government market

Google today announced the (re)launch of Google U.S. Government Search, an updated version of their vertical search of federal government websites.

The Google Government search has actually been around for at least five years, though it was not heavily promoted.

The updated site indexes content from federal, state and local government sites - basically it's the subset of the Google index from .gov or .mil domains, along with some hand-picked sites from .com, .us and .edu domains.
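
The domain restriction itself is easy to sketch.  This hypothetical Python filter mimics the idea of keeping only results from .gov or .mil hosts plus a hand-picked whitelist (the whitelist entry here is invented):

```python
from urllib.parse import urlparse

GOV_SUFFIXES = (".gov", ".mil")
# Hand-picked non-.gov sites would go here; this name is made up.
EXTRA_SITES = {"www.example-state-portal.us"}

def is_government_result(url):
    """True if the URL's host is a .gov/.mil domain or a whitelisted site."""
    host = urlparse(url).hostname or ""
    return host.endswith(GOV_SUFFIXES) or host in EXTRA_SITES

def filter_results(urls):
    """Keep only the government-site subset of a result list."""
    return [u for u in urls if is_government_result(u)]
```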

Users can submit searches as they would on Google, but results will be targeted to those from government sites.  For example, looking for information on John Wood, who serves as Chief of Staff at the Department of Homeland Security, I first used the standard Google search.  It returned 289 million pages, led by John Wood Community College and John Wood Water Heaters.  The DHS Chief of Staff was nowhere to be found among the first few results pages.  The same search on Google Government yielded 3.5 million pages, with the DHS John Wood right at the top.

Google's relaunch of this site was timed to tout the hiring of Mike Bradshaw as head of federal sales.  Clearly, Google is looking to sell its Google appliance more heavily in the federal markets and hopes that as users find Google useful for searching their extranets, they will want to use Google for their internal file access as well.

For publishers serving vertical markets, this should serve as a reminder of how compelling vertical search can be.  Leveraging their domain expertise and the skills of database editors, content providers are uniquely situated to create useful vertical search solutions to drive traffic to their sites for advertising or to drive transactions.

While Google's U.S. Government Search helps improve search results, a publisher focused on the government space could take it even further, providing search by level (federal, state or local), by department or agency, or even by individual state or locality.

Update: An interesting comment from Outsell's Chuck Richard, who notes that "the SERPs are already populated with AdWords ads from information companies such as BidNet, Onvia, GovernmentBids, BidLink, and many more. This drives home once again that what on the surface looks competitive to information companies in the government space is being used by them to steer traffic back to their professional-quality, finely matched content."  While these AdWords bids were purchased for Google (and not specifically for Google U.S. Government Search), it's clear that they will benefit from the positioning. 

June 15, 2006

Notes from SLA

Baltimore was home to 5,000+ attendees of the Special Libraries Association annual conference this week.

For a recap of the week's events, take a look at the following blogs:

  • The InfoToday blog has extensive coverage, with details of everything from recaps of the presentations to new product releases to the Martini Luge from the LexisNexis party.
  • For an international perspective, take a look at the Filipino Librarian blog by Vonjobi.
  • Steve Goldstein has posted some photos from the Alacra party including a surprise visit by LA Clipper and reputed Alacra fan Sam Cassell.

June 14, 2006

The New "Rock Stars" of Information Services

As access to information becomes more decentralized, the role of the corporate librarian has undergone significant change.  Once the guardians of access, today they are more frequently involved in developing the strategies for how information is consumed at the desktop and throughout the enterprise.

But even as more and more end-users become proficient in accessing information, there are areas where the skills and experience of information services professionals are critical.  One emerging area for those skills is Compliance.

This week, at the Special Libraries Association annual conference in Baltimore, Alacra hosted a breakfast entitled “Getting to Know Your Clients”.  The speakers for this event were two executives from Lehman Brothers' Compliance Department, Marty Cullen, Vice President, Client Identification and Verification, and Jim Holderman, Senior Vice President, Financial Crimes Prevention.  Each leads a critical function within Lehman’s efforts for Anti Money Laundering and Bank Secrecy Act Compliance.

As Marty pointed out in his presentation, the Know Your Customer provisions require financial institutions to understand “Who”, “What” and “How” about all of their customers.  And, he adds, “are those questions not the definition of the library profession?”  In performing this due diligence, his group’s role is to tell a story about a person, an organization or situation, utilizing public information.  And, while many financial institutions have hired former law enforcement and regulatory personnel, the key skills for performing and interpreting the research are a perfect fit for information services professionals.

That being said, the transition from the library to the compliance department is not always an easy jump.  Jim Holderman described the complexity of moving from a service function to a control function.  It requires you to know your own capabilities and to understand the corporate culture of your organization.  According to Jim, there are three key things to consider:

  • First, does your organization have a culture of compliance?  If there’s no commitment from the top, you won’t succeed.
  • Second, do you believe that your opinion really adds value?  Are you able to bring disparate information together and add valued interpretation?  If not, you will quickly be marginalized.
  • Third, are you prepared to deal with regulatory scrutiny?  Unlike most jobs, failure to perform your compliance function can lead to civil or even criminal penalties. 

While decentralization and outsourcing continue to impact the traditional corporate library role, information services professionals are finding that their MLS degree and research skills can gain them prominence in other roles.  Whether it's Compliance for financial institutions or eDiscovery at law firms, information services professionals willing to consider a switch may find lucrative, high profile opportunities for their skills.

June 09, 2006

Google Spreadsheets - Initial Test Drive

Now that I've had the chance to play with Google Spreadsheets a bit, here are my initial reactions:

Overall, it's a well-constructed implementation of Ajax.  In most ways, the app behaves like desktop software, not a web app.  Response times are fairly snappy and you don't feel like you're working on a hosted application.

The basic look and feel is, well, like a spreadsheet.  It's the same basic experience that we've had for more than 20 years.  It's hard to get excited about Google Spreadsheets as an application.  At the same time, the fact that Google has delivered a compelling ASP spreadsheet should come as a big boost to supporters of the Software as a Service (SaaS) model.  It's not that this application takes us to new capabilities, but rather that it proves that there's not a lot that you can't do in a hosted model anymore.

As previously mentioned, I don't see this replacing Excel for typical U.S. business users.  Instead, it may be used for a lot of the non-business-critical tasks that spreadsheets handle, such as maintaining contact lists.  And, as with any hosted application, users may not want to entrust confidential data to Google.

In related news, TechCrunch posts that social software provider SocialText has struck a deal to be the exclusive distributor of wikiCalc, another web-based spreadsheet.  WikiCalc is the brainchild of Dan Bricklin, father of the spreadsheet, who launched VisiCalc for the Apple II in 1979.

The significance of this could be bigger than Google Spreadsheets, according to TechCrunch.  Integrating WikiCalc within SocialText will enable users to host the content inside their firewalls.  Also, as a wiki, wikiCalc offers a complete audit trail, so you can see who edited what.  WikiCalc is an open source application and the look and feel are more like a web page than a spreadsheet.  While that may turn off the number cruncher types, I think it makes row and column data much more accessible to a wider audience.

While Google Spreadsheets seems like a great tool for non-mission-critical applications, wikiCalc could see quick adoption within workgroups for sharing project documents.
