Content Management Requirements Toolkit 2.0

James Robertson has announced the release of the second version of his Content Management Requirements Toolkit:

The first version of the Toolkit has been used by organisations the world over, from Fortune 500 companies to government agencies and small businesses. This new version builds upon these successes, and delivers even greater value.

I highly recommend picking up this report if you are embarking upon a CMS selection. It was very helpful to us when we conducted our last CMS selection.

Timing AMS and CMS Implementations

I’ve heard from a couple of organizations in the past few months that are considering deploying their next association management system (CRM for the rest of the world) in conjunction with a content management system.

An AMS deployment alone will suck up all the oxygen in a typical nonprofit IT department, occupying resources for months on end. A CMS deployment is resource-intensive as well, especially if it is the first such system for an organization. Both usually drive a lot of business process changes during deployment. Trying to do both simultaneously is biting off more than most organizations can chew.

This is not to say that it isn’t critically important to identify your AMS-CMS integration needs and plan on how to eventually do the integration. It is important. However, that doesn’t mean you have to plonk them both down at the same time in order to have an effective integration. Do one and get it right before you do the other if at all possible.

Column Two: CMS Myth #1: Installing a CMS must be hard

James Robertson takes a stab at busting some CMS myths. First up: Installing a CMS must be hard.

Installing the CMS software can be easy. It should be easy. If it isn’t easy, ask why.

During our last CMS selection we required that the finalists on our list provide demo copies for our web admin to install. That ended up ruling out a couple of companies who couldn’t provide installable code without sending their own engineer over to do it.

Our theory was that if the company hadn’t thought about making the initial installation easy, there were probably lots of other areas that hadn’t received proper attention either. I’m pretty sure I picked up the idea of an effective installer as a sign of quality from reading Joel on Software, though I couldn’t find the specific post on his site.

In the careful what you ask for department…

RSS Traffic Burdens Publisher’s Servers:

InfoWorld, which is hosted by Verio, is committed to RSS. But Dickerson says he’s spoken with other large media sites that have delayed implementing RSS feeds, citing potential overhead on IT infrastructure. Some major publishers of RSS feeds are high-traffic sites that already use content distribution and caching to manage server load, such as Yahoo.

While a relatively small number of sites are currently seeing RSS traffic on the scale of InfoWorld, that’s likely to change as the technology becomes more popular. “If RSS is going to go from fairly big to absolutely huge, we’re all going to need to do a little more work on the plumbing,” Dickerson writes.
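Part of that plumbing work, presumably, is making feed fetches cheaper on the server side. One standard piece (not something the article spells out, so take this as my own illustration) is HTTP conditional GET: the client remembers the `ETag` and `Last-Modified` validators from its last fetch, and the server can answer `304 Not Modified` instead of resending the whole feed. A minimal sketch in Python, with function names of my own invention:

```python
# Sketch: polling a feed with HTTP conditional GET so the server can
# answer 304 Not Modified instead of resending the full feed body.
# Illustrative only; real aggregators persist the cache between runs.
import urllib.request
import urllib.error

def build_conditional_headers(cache):
    """Build request headers from the validators saved on the last fetch."""
    headers = {}
    if cache.get("etag"):
        headers["If-None-Match"] = cache["etag"]
    if cache.get("last_modified"):
        headers["If-Modified-Since"] = cache["last_modified"]
    return headers

def fetch_feed(url, cache):
    """Fetch a feed; return its bytes, or None if unchanged since last poll."""
    req = urllib.request.Request(url, headers=build_conditional_headers(cache))
    try:
        with urllib.request.urlopen(req) as resp:
            # Save the validators for the next poll.
            cache["etag"] = resp.headers.get("ETag")
            cache["last_modified"] = resp.headers.get("Last-Modified")
            return resp.read()
    except urllib.error.HTTPError as e:
        if e.code == 304:
            return None  # feed unchanged; server sent no body
        raise
```

For a high-traffic publisher, every aggregator that honors the 304 response turns most polls into a few hundred bytes of headers instead of a full feed download.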

I wonder if any of the RSS client authors/producers have thought about randomizing their collection times a bit to spread out the load?
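The randomization could be as simple as adding jitter to the polling interval, so thousands of clients don't all hit a feed at the top of the hour. A hypothetical sketch (the function name and defaults are my own):

```python
# Sketch: spread aggregator polls by adding random jitter to the base
# interval, so many clients don't all fetch a feed at the same moment.
import random

def next_poll_delay(base_seconds=3600, jitter_fraction=0.25):
    """Return a delay within +/- jitter_fraction of the base interval."""
    jitter = base_seconds * jitter_fraction
    return base_seconds + random.uniform(-jitter, jitter)
```

With the defaults above, an hourly poll lands anywhere between 45 and 75 minutes after the last one, which flattens the load spikes at predictable times.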

Update: Bloglines‘ CEO has a few thoughts on how to improve the load issue.

Successfully deploying a content management system

James Robertson has posted another great article on content management systems: Successfully deploying a content management system. This quote sums up what the piece covers:

Our experience has shown that there are five key elements that must be addressed in a content management project:

  • strategy
  • change & communications
  • content
  • design
  • technology

The following sections discuss each of these five key elements, and give some examples of activities that should be considered.

Recommended reading.

Managing Search and Taxonomy

Lou Rosenfeld’s recent post on where to position search and taxonomy management within the organization was a nice validation of how we have it set up at our office. According to Lou:

To rant a bit, it really drives me nuts to hear people talk of “search and IA” (which they often understand as browsable taxonomies). This is an absolutely false distinction, and leads to poor search design, poor taxonomy design, and perhaps worst of all, missed opportunities to better integrate the two to support finding, IA’s ultimate goal. For example, search often is greatly improved when it leverages metadata tags. Metadata therefore should be designed with search in mind. So why separate teams? I don’t see any good reason, just a lot of bad ones.

At ASHA, we have two teams in the Web Cluster (our label for a division): the Content Management Team (CMT) and the Knowledge and Community Management Team (KCMT). CMT has responsibility for IA, visual design, general content development and managing the stream of content that comes from our 40+ content contributors. KCMT is responsible for managing our search engine, the ASHA intranet, the member community, online events and the ASHA thesaurus of terms. The two teams sit next to each other in our office and have easy access to one another. We also have a full staff meeting every two weeks where the topic of discussion is often how we can improve the overall findability of content and services on our site by tweaking our search, metadata, etc.

While they are technically two separate teams, they operate as one in effect. I’m very happy with how well this arrangement has been working for us.

Microsoft Content Management Server

We have been using Microsoft’s Content Management Server in production for almost a year at this point. A while back I added a page of MCMS resources and links to the High Context Wiki. Feel free to add to the page if you have additional info/links/etc.

There must not be many MCMS information sites out there. My little wiki page shows up very high in the Google results for many MCMS searches.

The Role of Technology in a CMS Selection Process

James Robertson has published a briefing on Specifying technology in a CMS tender. I agree with his overall premise but have a few comments on some of the specifics. First a quote:

In short, by focusing on the technology aspects, these tenders often fail to select the best product, and don’t deliver the desired business benefits.

For this reason, we encourage those developing tenders to concentrate on the business requirements, and minimise the technical details.

That being said, there is a legitimate need to specify specific technology issues. This briefing presents some guidelines for doing so, in a way that will generate the best outcomes.

The main point, that the technology is irrelevant if you don’t have criteria that will support your overall business objectives, is right on the money. Assuming you have that part down, I think it is very important to play to your IT strengths if at all possible.

One factor not mentioned specifically in the article is that CMSs are typically high-maintenance beasts (in my experience). If yours runs on a platform for which you already have experienced admins, your life will be much easier. There are a lot of poorly documented tweaks and tricks for keeping servers and systems running optimally, and you’ll need knowledgeable admins for a CMS that bears significant load.

Also, staff expertise in the CMS coding language is more important than the briefing allows, I think. Without it you are completely at the mercy of contractors for modifications, no matter how minor. If you have one or more staff who know the language, you can make the minor adjustments that come up fairly frequently without spending consulting money or taking the time to secure outside resources. You can be more nimble by making those small changes yourself and save the cash for major development and integration projects.

And winner of the goofiest web traffic analysis tool goes to….

VisitorVille!

This thing monitors your site traffic in real time by embedding code in your pages that pings VisitorVille’s database each time a page is delivered. It then renders the data as a SimCity-style town, with visitors from search engines getting off buses to visit your pages. Pages are rendered as houses for low traffic or skyscrapers for high traffic.

I dunno if this would actually help analyze your traffic better than other tools out there but it certainly looks more entertaining.

Thanks to Dennis on The Well for pointing this out.