More and more website publishers are deciding to drop support for Internet Explorer 6. This browser has notoriously poor, incomplete, or outright missing support for modern standards in web design and interactivity. Thus, making a site that works well in both IE6 and current browsers requires significant additional effort. Many web design firms are moving to a model of charging extra for IE6 support on top of regular design fees.
IE6’s days are numbered.
However, many organizations wrestle with when to drop it from the list of supported browsers for their site. Total traffic from that browser version is often cited as the metric to use. Only 6% of our visitors use it? Drop that browser like a hot potato!
Hold on, though: what if those 6% do something important, like making 20% of the purchases in your online store? Do you want to walk away from that income? Probably not.
Here are a few criteria for assessing whether you should continue to support IE6 for the time being while we wait for it to finally die off (which is happening more rapidly now).
Do a significant proportion of IE6 users on your site:
Log in to a members-only area or frequently use other core functionality?
Make purchases in your store? If so, how much revenue do they represent?
Represent key constituents or prospects for your organization?
You get the idea. If IE6 users are not part of a relevant audience for your key goals online, then you are safe dropping them even if they make up a fairly high percentage of your traffic. If IE6 users are valuable to your organization, then you may very well be better off investing in supporting them.
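The traffic-versus-revenue comparison above is easy to run against your own analytics export. A back-of-the-envelope sketch, using made-up numbers for illustration (the record format and figures here are hypothetical, not from any real report):

```python
# Hypothetical analytics records: (browser, visits, revenue).
traffic = [
    ("IE6",      600, 2000.00),
    ("IE7",     2400, 3000.00),
    ("Firefox", 4000, 4000.00),
    ("Other",   3000, 1000.00),
]

total_visits = sum(v for _, v, _ in traffic)
total_revenue = sum(r for _, _, r in traffic)
ie6_visits, ie6_revenue = next((v, r) for b, v, r in traffic if b == "IE6")

visit_share = ie6_visits / total_visits      # a small slice of traffic...
revenue_share = ie6_revenue / total_revenue  # ...can be a big slice of revenue

print(f"IE6: {visit_share:.0%} of visits, {revenue_share:.0%} of revenue")
# With these numbers: 6% of visits but 20% of revenue.
```

If the revenue share dwarfs the visit share, as in this example, dropping the browser is walking away from real money.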
Seth Gottlieb, of Content Here, posts some very good points about the extent to which you should separate the management of your content from the systems that actually publish it for your online audiences.
However, as I have warned in earlier posts, the flexibility may not be worth the cost for all publishers. Unless your business model depends on aggressively leveraging your content and you can afford to play on the cutting edge, a lighter-weight “website in a box” style architecture may give you the flexibility you need without the additional complexity and cost of building and integrating these decoupled systems.
In short, you have to balance elegant engineering against the value of the outcomes you are pursuing with your web site. If you are in the content publishing business and are of sufficient size, then extreme separation can pay off in a significant way. Outside of those two conditions, the pursuit of architectural elegance may actually be counterproductive, with insufficient return on the big investment it requires.
I saw via Twitter last week (sorry, can’t remember who posted it!) this post about the New York Times Newswire API. In essence, the Times has published an interface with which you can access their latest headlines, including tons of metadata options to slice and dice your query.
This kind of API tends to spark a lot of experimentation and new value in how the content available via the system gets presented. Twitter’s API is a great example of this.
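To give a flavor of how such a query gets built: a minimal sketch of constructing a newswire-style request URL. The endpoint path and parameter names below are assumptions for illustration only; check the Times’ developer documentation for the actual contract, and note that a registered API key is required.

```python
from urllib.parse import urlencode

# Assumed base path for the newswire content service (illustrative only).
BASE = "https://api.nytimes.com/svc/news/v3/content"

def newswire_url(source="all", section="all", limit=5, api_key="YOUR-KEY"):
    """Build a query URL for the latest items from a given source/section."""
    query = urlencode({"limit": limit, "api-key": api_key})
    return f"{BASE}/{source}/{section}.json?{query}"

# Latest technology headlines, capped at five items.
print(newswire_url(section="technology"))
```

From there it is one HTTP GET and a JSON parse away from remixing the headlines however you like.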
Pretty innovative stuff. If you are in the business of moving content this is something to check out.
R is also the name of a popular programming language used by a growing number of data analysts inside corporations and academia. It is becoming their lingua franca partly because data mining has entered a golden age, whether being used to set ad prices, find new drugs more quickly or fine-tune financial models. Companies as diverse as Google, Pfizer, Merck, Bank of America, the InterContinental Hotels Group and Shell use it.
But R has also quickly found a following because statisticians, engineers and scientists without computer programming skills find it easy to use.
This is pretty interesting stuff on a couple of levels. One, it is a programming language created by and for statisticians and data analysts rather than programmers. The story indicates it is easier to use for non-programmers who want to do custom analysis. Two, it is an open source project begun over ten years ago that is now starting to challenge SAS, the dominant statistics package. It is a classic example of an established firm being disrupted by an upstart, innovative technology.
Rafe Colburn, a long-time blogger, posted today about how he is seeing more and more web developers who don’t know SQL very well.
It seems to me, though, that actual knowledge of SQL seems to be falling. I blame this on the growing popularity of persistence frameworks that abstract the database away, allowing developers to interact with databases without writing much (or any) SQL. … Many developers don’t even learn SQL in depth, period.
Rafe goes on to explain why knowledge of SQL (a way to query databases directly and with great flexibility) is key knowledge even when your team uses a development framework that abstracts away the database.
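The kind of query that frameworks tend to hide is exactly the kind worth knowing how to write by hand. A minimal sketch of an aggregate JOIN in raw SQL, run here through Python’s built-in sqlite3 module with made-up table and column names:

```python
import sqlite3

# In-memory database with a toy schema (names are illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# Revenue per customer: a JOIN plus GROUP BY, expressed directly in SQL
# rather than assembled by an ORM.
rows = conn.execute("""
    SELECT c.name, SUM(o.total) AS revenue
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY revenue DESC
""").fetchall()

print(rows)  # [('Acme', 150.0), ('Globex', 75.0)]
```

A developer who can reason about this query directly can also read the SQL their framework generates, and spot when it is doing something slow or wrong.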
For those of you who manage web teams and developers, make sure you are investing in these fundamental skills as well as in the specific technology that is unique to your operations.
Google posted to their official weblog today a bit of background on how they process dynamic URLs. A dynamic URL is one that contains lots of junk that humans can’t read, including symbols. A lot of systems still create URLs like this, especially when the web page is created from data stored in a database.
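To make the distinction concrete, here is a dynamic URL next to its static-looking equivalent, with Python’s standard urllib.parse pulling the machine-oriented parameters back out. The example.com URLs are invented for illustration:

```python
from urllib.parse import urlparse, parse_qs

# A "dynamic" URL: the content is identified by opaque query parameters.
dynamic = "https://example.com/product.php?id=42&cat=7&sessionid=9f3a"

# A "static"-looking equivalent: human-readable path segments.
static = "https://example.com/products/garden/42-red-wheelbarrow"

# Machines parse the query string easily; humans mostly can't read it.
params = parse_qs(urlparse(dynamic).query)
print(params["id"], params["cat"])  # ['42'] ['7']
```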
It used to be common knowledge that URLs with natural language words in them did better in natural search results placement than dynamic URLs, because Google was better able to process them. It appears that is no longer the case. The key paragraph from today’s post on that topic:
While static URLs might have a slight advantage in terms of clickthrough rates because users can easily read the urls, the decision to use database-driven websites does not imply a significant disadvantage in terms of indexing and ranking. Providing search engines with dynamic URLs should be favored over hiding parameters to make them look static.
While there is still benefit to displaying static URLs for the humans using your site, it seems that, from Google’s perspective, it’s not worth doing just for search engine placement. Interesting!
Target has settled a class action lawsuit with the National Federation of the Blind over accessibility complaints about Target.com. Despite the law being unclear as to whether the Americans with Disabilities Act (ADA) applies to websites, the company will pay a substantial sum and update its web site to make it accessible to the blind.
Another case study in how building accessible, standards-compliant web sites is not only the right thing to do but can save you millions. Plus, doing the right thing in this regard is easier than ever before with improved browsers, web application frameworks, and agreed-upon standards.
A complicating factor for some organizations is that they are using systems for their sites that have been developed and added to since the early days of the web, when accessibility wasn’t even an afterthought. However, had Target committed to upgrading its system before the lawsuit was filed, it could have invested a portion of that $6 million in improving its site rather than paying a settlement.