Comments Off on Technologies We’ll be Watching in 2010 | This entry was posted on February 9th 2010
2009 was an exciting year for the Web and Web-related technologies, as advancements brought further shifts in the way we interact with content and data online.
2010 looks set to be just as exciting as these trends advance further, with existing technologies continuing to develop and new ones coming into the picture.
Below are some things that we will be watching eagerly in 2010.
Read the rest of this entry »
Comments Off on tr.im resurrected. Rules Still Firmly Apply in URL Shortening Caution. | This entry was posted on August 11th 2009
I reported Sunday that tr.im had announced it was terminating its URL shortening service, which would have the knock-on effect of discontinuing all links shortened with the service starting in early 2010.
Nambu, the development firm behind tr.im, announced today that it is retracting this decision, making its service available once more and promising that links will NOT become unavailable after 2009.
Unfortunately, it's not safe to sleep at night in the URL shortening world just yet.
Read the rest of this entry »
1 Comment | This entry was posted on August 9th 2009
In a coincidental but shockingly relevant turn of events, the URL shortening service tr.im announced on Sunday, August 9th 2009 that it is terminating its service, effective immediately.
Regretfully, we here at Nambu have decided to shutdown tr.im, the first step in shutting down all of our products and services within that brand.
tr.im did well for what it was, but, alas, it was not enough. We simply cannot find a way to justify continuing to work on it, or pay its network costs, which are not inconsequential. tr.im pushes (as I write this) a lot of redirects and URL creations per day, and this required significant development investment and server expansion to accommodate.
(Read on at blog.tr.im – blog since terminated)
The coincidental part of the equation lies in my post to the Visual Blaze blog last week about the many vulnerabilities of using URL shortening services and why URL shortening should be done with caution.
Shorten URLs with Caution. Here’s why.
tr.im is one of the larger players in the URL shortening market, and its move to shut down the service really drives home the need to use external URL shortening services with caution.
To the Nambu Network we express our regrets for the service's discontinuation. To all URL shortening users we express our regret for any loss of link traffic that this news may bring.
Read the rest of this entry »
2 Comments | This entry was posted on August 3rd 2009
The rise of URL shortening
Anyone familiar with Twitter will no doubt have used URL shortening. URL shortening is a technique used to, as the name clearly suggests, reduce the number of characters required for a given URL/link. With Twitter’s 140 character limit, this has become an invaluable technique for adding links to tweets.
But URL shortening is also being used in places other than Twitter (and has been for a number of years). Many are using URL shortening to consolidate long URLs on Web sites, for external links in blog posts, links to downloadable content, and many other things. Some are using these shortening services to make long or complicated URLs easier for users to read and retype, and some to reduce the risk of malformed URLs being sent in emails, where long, complicated URLs can be corrupted by varying email clients’ interpretation and multi-line breakage.
Without doubt these services have great value – but they should be used with caution.
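To make the mechanics concrete, here is a minimal sketch of what a shortening service does behind the scenes: it stores a mapping from a short code to the original long URL and answers requests for that code with an HTTP 301 redirect. The names and the in-memory store below are illustrative assumptions, not how tr.im or any particular service is actually built.

```python
# Minimal sketch of the core URL-shortening idea (hypothetical example,
# not the implementation of tr.im or any real service).
import string
from http.server import BaseHTTPRequestHandler, HTTPServer

ALPHABET = string.digits + string.ascii_letters  # base-62 characters
url_store = {}   # short code -> original long URL
next_id = 0      # simple incrementing counter

def encode(n: int) -> str:
    """Turn a numeric ID into a short base-62 code, e.g. 125 -> '21'."""
    if n == 0:
        return ALPHABET[0]
    code = ""
    while n:
        n, rem = divmod(n, 62)
        code = ALPHABET[rem] + code
    return code

def shorten(long_url: str) -> str:
    """Register a long URL and return its short code."""
    global next_id
    code = encode(next_id)
    url_store[code] = long_url
    next_id += 1
    return code

class Redirector(BaseHTTPRequestHandler):
    def do_GET(self):
        code = self.path.lstrip("/")
        long_url = url_store.get(code)
        if long_url:
            # Permanent redirect back to the original URL.
            self.send_response(301)
            self.send_header("Location", long_url)
            self.end_headers()
        else:
            self.send_error(404, "Unknown short code")

if __name__ == "__main__":
    code = shorten("http://example.com/some/very/long/path?with=parameters")
    print(f"Short link: http://localhost:8000/{code}")
    HTTPServer(("localhost", 8000), Redirector).serve_forever()
```

The fragility discussed in these posts follows directly from this design: the moment the service holding that mapping disappears, every short link it ever issued stops resolving.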
Read the rest of this entry »
Comments Off on Search Engine Optimized Research | This entry was posted on July 20th 2009
This has been bugging me for a while now and I even started writing about it back in April. I thought perhaps it was just me being overly sensitive since we build Google-optimized websites.
My curious mind and affinity for crunching numbers went to work (finally I get to use my Math minor). I wanted a decent sample size (1400+ websites) and a standard set of criteria to gauge how well Google reads their sites. The criteria mainly looked at how each site was built (indexing-friendly code), image/media tagging and meta description handling. The resulting report generated a score out of 100, and anything over 70 got the thumbs up.
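The exact criteria and weights aren't published in the post, but a simplified sketch of this kind of automated check might look like the following. The individual weights and helper names are illustrative assumptions; only the 100-point scale and the 70-point pass mark come from the report.

```python
# Hypothetical sketch of a crawl-friendliness score like the one described
# above; the actual criteria and weights used in the report are not published.
from html.parser import HTMLParser

class AuditParser(HTMLParser):
    """Collects the on-page signals the score is based on."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_meta_description = False
        self.has_h1 = False
        self.images = 0
        self.images_with_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.has_title = True
        elif tag == "h1":
            self.has_h1 = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            if (attrs.get("content") or "").strip():
                self.has_meta_description = True
        elif tag == "img":
            self.images += 1
            if (attrs.get("alt") or "").strip():
                self.images_with_alt += 1

def audit_score(html: str) -> int:
    """Return a 0-100 score; 70 or more would 'get the thumbs up' (illustrative weights)."""
    p = AuditParser()
    p.feed(html)
    score = 0
    score += 25 if p.has_title else 0             # indexable page title
    score += 30 if p.has_meta_description else 0  # meta description handling
    score += 15 if p.has_h1 else 0                # basic semantic structure
    if p.images:                                  # image/media tagging
        score += round(30 * p.images_with_alt / p.images)
    else:
        score += 30                               # no images, nothing left untagged
    return score

if __name__ == "__main__":
    sample = """<html><head><title>Demo</title>
    <meta name="description" content="A demo page."></head>
    <body><h1>Demo</h1><img src="a.png" alt="chart"><img src="b.png"></body></html>"""
    print(audit_score(sample))  # 25 + 30 + 15 + 15 = 85 -> thumbs up
```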
Read the rest of this entry »