Rich Skrenta, co-founder of the news site Topix and the Open Directory Project, writes that PageRank wrecked the web.
I am not sure I follow his argument, but it's interesting.
Over at Smartmobs, Marius Chitosca writes about Skrenta's new project, a competing Web search service:
Convinced that Google, with its services, is not yet the best thing on a market worth nearly $100 billion, Skrenta is determined to give birth to his baby, helped by a team of six and $2 million in funding. Still, as TechCrunch recently reported, the best prognosis for Blekko’s public prototype arriving online is 2009. Though trying to “hurt” the Google Goliath might seem like shooting blanks to fearful, unimaginative minds (Google was itself a small but ambitious startup 10 years ago), Blekko might just prove to be its smarter, deadly David. As a niche player, Blekko doesn’t need to be conceived right from the start as a serious Google crippler. 1% of the market would be enough for starters… since it means $1 billion.

Comments (1)
In recent congressional testimony, I've argued that there's very little chance that others can compete with Google in search. Here are some reasons why, from the article I co-authored with Bracha:
1) The Search Engine Algorithm. The heart of a search engine, and the key to its success, is its search algorithm. Effective algorithms are protected by a veil of secrecy and by various intellectual property rights, so new entrants cannot easily appropriate existing algorithms. Moreover, many algorithms are trade secrets; unlike patents, which the patent holder must disclose and which eventually expire, trade secrets may never enter the public domain. Search algorithms may be analogous to the high-cost infrastructure required for entry into the utility or railroad markets. (For a sense of what even the published core of such an algorithm looks like, see the PageRank sketch after this list.)
2) Network Effects in Improving Search Responsiveness. The more searches an engine handles, the better able it is to sharpen and perfect its algorithm. Each additional user thus lowers the cost of providing a better-quality service to all subsequent users, and incumbents with large numbers of users enjoy substantial advantages over smaller entrants (the second sketch after this list illustrates this feedback loop).
3) Licensing Costs. A key to competition in the search market is a comprehensive database of searchable materials. The ability of content owners to obtain exclusive legal rights over searchable materials, however, may substantially increase the cost of obtaining and displaying that data and the metadata needed to organize it. Exclusion rights entail licensing (or legal advice) fees, which in the aggregate may raise fixed costs substantially. Google’s notable fight to obtain favorable fair use treatment for an index of books, for example, obscures its licensing deals with some content providers. The extent to which exclusion power through licensing is the industry norm is the subject of a host of legal battles on various fronts. If such licenses become the industry practice, only the wealthiest players will be able to afford to develop a comprehensive database of searchable material.
4) Consumer Habit. Many searchers are accustomed to using a small number of providers, use them relatively habitually, and are reluctant to switch, despite the existence of alternatives. Exactly how high search engine switching costs are is an empirical question that has not been satisfactorily answered to date. To win over a substantial number of users, a new entrant has to supply a product of significantly better quality, again steeply raising fixed costs. Another factor that may raise switching costs is the trend toward personalized search, which effectively “trains” a service to tailor its results to match the patterns of a user (the final sketch after this list illustrates this training effect). Just as users “invest” in learning how to use Microsoft Word or Excel and are reluctant to switch to a new program, they “invest” in training personalized search engines to find the materials most suited to their interests. The correlation between search quality and length of use in personalized search is likely to further lock users in to their existing provider.
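A few toy sketches may make points 1, 2, and 4 more concrete. On point 1: the only major ranking algorithm whose core is fully public is the original PageRank that Brin and Page published in 1998; what engines run today is layered with secret refinements. Here is a minimal power-iteration sketch of that published formulation; the three-page link graph and all names are purely illustrative.

```python
# A minimal power-iteration sketch of the published (1998) PageRank
# formulation. Production ranking systems add layers of secret signals
# on top of ideas like this; the graph below is purely illustrative.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}

    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # A page with no outlinks spreads its rank evenly.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(web))  # "c", linked to by both other pages, ranks highest
```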
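On point 2, the feedback loop can be simulated. Everything below is invented (the learning rate, the query volumes, the saturating update rule), but it shows the compounding dynamic: an engine that improves in proportion to click feedback pulls away faster the more queries it already serves.

```python
# Hypothetical model of point 2's feedback loop: quality improves in
# proportion to click feedback, which grows with query volume, so the
# larger engine compounds its lead. All constants are invented.

def simulate(queries_per_day, days=100, learn_rate=1e-8):
    quality = 0.5  # fraction of queries answered well (0 to 1)
    for _ in range(days):
        clicks = queries_per_day * quality              # feedback signal
        quality += learn_rate * clicks * (1 - quality)  # diminishing returns
    return quality

print(f"incumbent, 10M queries/day: {simulate(10_000_000):.3f}")
print(f"entrant,  100k queries/day: {simulate(100_000):.3f}")
```

In this toy run the incumbent's quality saturates near its ceiling after a hundred simulated days while the entrant barely moves off its starting point, which is the entry barrier the point describes.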
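Finally, on point 4: the “training” effect can be sketched as a per-user click history that re-ranks results. The class and domain names below are hypothetical; the point is only that the accumulated history lives with the incumbent engine and does not follow the user to a rival.

```python
# Hypothetical sketch of point 4's lock-in: a per-user click history
# re-ranks results, so history accumulated with one engine (and not
# portable to a rival) raises the cost of switching.
from collections import Counter

class PersonalizedEngine:
    def __init__(self):
        self.clicks = Counter()  # the user's accumulated "investment"

    def record_click(self, domain):
        self.clicks[domain] += 1

    def rank(self, results):
        # Boost domains this user has clicked before; with no history,
        # the generic ordering comes back unchanged.
        return sorted(results, key=lambda d: -self.clicks[d])

results = ["site-a.com", "site-b.com", "site-c.com"]

veteran = PersonalizedEngine()          # engine the user has trained
for _ in range(5):
    veteran.record_click("site-c.com")

newcomer = PersonalizedEngine()         # rival engine after a switch

print(veteran.rank(results))   # ['site-c.com', 'site-a.com', 'site-b.com']
print(newcomer.rank(results))  # generic order; the training is lost
```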