The downside of URL shorteners

Very thoughtful post by Joshua Schachter.

The worst problem is that shortening services add another layer of indirection to an already creaky system. A regular hyperlink implicates a browser, its DNS resolver, the publisher’s DNS server, and the publisher’s website. With a shortening service, you’re adding something that acts like a third DNS resolver, except one that is assembled out of unvetted PHP and MySQL, without the benevolent oversight of luminaries like Dan Kaminsky and St. Postel.

There are three other parties in the ecosystem of a link: the publisher (the site the link points to), the transit (places where that shortened link is used, such as Twitter or Typepad), and the clicker (the person who ultimately follows the shortened links). Each is harmed to some extent by URL shortening.

The transit’s main problem with these systems is that a link that used to be transparent is now opaque and requires a lookup operation. From my past experience with Delicious, I know that a huge proportion of shortened links are just a disguise for spam, so examining the expanded URL is a necessary step. The transit has to hit every shortened link to get at the underlying link and hope that it doesn’t get throttled. It also has to log and store every redirect it ever sees.
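The lookup operation Schachter describes can be made concrete. Here is a minimal sketch (the function names and the example blocklist are my own, for illustration — not anything Delicious or the shorteners actually ran) of how a transit service might peek behind a shortened URL with a single HEAD request, without following the redirect, and screen the destination against a known-spam list:

```python
from http.client import HTTPSConnection
from urllib.parse import urlparse

def expand(short_url, timeout=5):
    """Issue one HEAD request and return the Location header,
    i.e. the underlying URL, without following the redirect.
    Assumes an https:// short URL for simplicity."""
    parts = urlparse(short_url)
    conn = HTTPSConnection(parts.netloc, timeout=timeout)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    conn.close()
    return resp.getheader("Location")

def is_suspicious(expanded_url, blocklist):
    """Screen the expanded destination's domain against a spam blocklist."""
    return urlparse(expanded_url).hostname in blocklist

if __name__ == "__main__":
    # Network call; any live shortened URL would do here.
    print(expand("https://bit.ly/example"))
```

Note that this is one extra round trip per link — exactly the cost, multiplied across every link it ever sees, that the transit has to absorb.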

The publisher’s problems are milder. It’s possible that the redirection step steals search juice — I don’t know how search engines handle these kinds of redirects. It certainly makes it harder to track down links to the published site if the publisher ever needs to reach their authors. And the publisher may lose information about the source of its traffic.

But the biggest burden falls on the clicker, the person who follows the links. The extra layer of indirection slows down browsing with additional DNS lookups and server hits. A new and potentially unreliable middleman now sits between the link and its destination. And the long-term archivability of the hyperlink now depends on the health of a third party…

I hadn’t thought of this, and indeed have been cheerfully using bit.ly without thinking about the consequences. And then I came upon this perceptive post by Om Malik on the business model underpinning bit.ly:

Yesterday, New York-based startup incubator Betaworks raised $2 million in funding for its URL-shortener project, bit.ly, and spun it out as an independent company. The funding raised some eyebrows, with some speculating whether bit.ly, one of the dozens of link-shortening services, was worth a rumored $8 million. I fall in the camp of those who think bit.ly is worth the money.

Here’s why: The most important aspect of bit.ly is not that it can shorten URLs. Instead, its real prowess lies in its ability to track the click-performance of those URLs, and the conversations around those links. It doesn’t matter where those URLs are embedded — Facebook, Twitter, blogs, email, instant messages or SMS messages — a click is a click, and bit.ly counts it, in real time. Last week alone, nearly 25 million of these URLs were clicked.
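The core mechanism Malik describes — counting every click in real time, wherever the link is embedded — comes down to a redirect endpoint that increments a counter before sending the browser on its way. A toy sketch (the data structures and names here are hypothetical; the real pipeline is of course far more elaborate):

```python
from collections import Counter

# Hypothetical in-memory tables; a real service would use a database.
destinations = {"a1b2": "http://gigaom.com/some-story"}
clicks = Counter()

def record_click(code):
    """Resolve a short code to its destination, tallying the click.

    This is what a redirect endpoint does before answering with a
    301 and the destination in the Location header. The counter
    doesn't care whether the link lived in a tweet, an email or an
    IM -- a click is a click.
    """
    url = destinations[code]
    clicks[code] += 1
    return url
```

Because every click funnels through the shortener's own servers first, the service sees the aggregate traffic no individual publisher can.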

By clicking on these URLs, people are essentially voting on the stories behind these links. Now if bit.ly collated all these links and ranked them by popularity, you would have a visualization of the top stories across the web. In other words, it would be a highly distributed form of Digg, the social news service that depends on people submitting and voting for stories from across the web. Don’t be surprised if bit.ly formally launches such an offering real soon. This will help them monetize their service via advertising…
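The popularity ranking Malik imagines is, at its simplest, a sort over the per-link click tallies. A self-contained sketch with made-up numbers:

```python
from collections import Counter

# Made-up click tallies per underlying link (illustrative only).
clicks = Counter({
    "http://example.com/story-a": 120,
    "http://example.com/story-b": 45,
    "http://example.com/story-c": 300,
})

# Rank links by click count, most popular first: a distributed,
# passive version of Digg's submit-and-vote front page.
top_stories = clicks.most_common()
print(top_stories[0])  # → ('http://example.com/story-c', 300)
```

The difference from Digg is that nobody has to submit or vote for anything: the act of clicking is itself the vote.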