This blog provides a beginner's, bird's-eye view of the entire process of implementing a department-wide content management system. For more information on enterprise content management systems, click on one of the companies to the right.
This is simply an account of how successfully (or not) a new CMS was integrated into the literature department of a Fortune 500 company over the span of six months. I am a user of the content management system, not a developer, so I do not detail the actual technical side of constructing and programming a system.
I have written about each touchstone moment in the entire process and have labeled those posts by week. Note that it took longer than the number of weeks presented here; I just listed the touchstones this way to give you the semblance of a structured, if somewhat artificial, timeline.
Week 1 - Implementing a Content Management System
Week 2 - Deciding on a Content Management System Team
Week 3 - CMS Training
Week 4 - My Content Management Process
Week 5 - Under Construction
Week 6 - Server Incompatible with Content Management System
Week 7 - Content Management and Knowledge Transfer
Week 8 - Content Management Goals and Consistency
Week 9 - Content Management in a Bad Economy
Week 10 - The Content Management Light at the End of the Tunnel
Week 11 - XMAX versus XMetal
Week 12 - Migration and Legacy Content
Saturday, March 21, 2009
XMAX versus XMetal
I was at a meeting with Just Systems today. Just Systems is a content management software company; their most popular product is XMetal. Today's meeting was about a new product they have called XMAX. Basically, this is a lightweight version of XMetal. It has some of the same capabilities as XMetal, such as providing an XML editing component. The cool thing about this new content management product is that it can be customized and embedded into any container or holder that an organization may already have set up for its document needs.
At its core, XMAX simply looks like a basic window with a few buttons that switch views and allow importing and exporting; beyond that, it's not really the full user interface we think of today. It's slim and serves as an editor window via ActiveX. This setup makes the work easy and natural: it's as if you open a notepad file, edit the structured content within, then embed or integrate that updated content into whatever container you have (such as anything created in Java or HTML).
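To make that "edit structured content, then drop it into a container" flow concrete, here is a minimal sketch in plain Python. It has nothing to do with XMAX's actual API (which I have not seen); it just mimics the pattern the product automates: load an XML topic, change it, and embed the result in an HTML container.

```python
# Sketch of the edit-then-embed pattern that an embedded XML editor automates.
# Illustrative only; this does not use any actual XMAX or XMetal API.
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape

def edit_topic(xml_source: str, new_text: str) -> str:
    """Load a structured topic, update its <body>, and return the new XML."""
    root = ET.fromstring(xml_source)
    body = root.find("body")
    if body is not None:
        body.text = new_text
    return ET.tostring(root, encoding="unicode")

def embed_in_container(topic_xml: str) -> str:
    """Drop the edited topic into a plain HTML container, escaping the markup."""
    return f"<div class='topic-container'><pre>{escape(topic_xml)}</pre></div>"

source = "<topic><title>Rates</title><body>Old copy</body></topic>"
print(embed_in_container(edit_topic(source, "Updated copy, live on the page")))
```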
All in all, XMAX seems intuitive and slick, seemingly allowing you to update your content on the fly through this app and then having that content fit into a web site, for example. USA Today uses XMAX on a regular basis; go to any one of their entertainment web pages to see it in action.
Wednesday, March 18, 2009
Week 10 - The Content Management Light at the End of the Tunnel
This week, we are slated to finish our third and final stage of testing. Already we have gone through the development and stage environments; we are now moving into the final test phase, called the production environment. As you can see from my previous posts, it's been a long road to get here. Years in the making. Two previous systems have already come and gone; however, it seems the third time is the charm: working with Flatirons has gone fairly well (for the most part), and now we are in the home stretch.

Testing today will comprise working through several different workflows for a few different deliverables (sales material, technical material, field material, and updates). The workflows for each have corresponding test scripts (steps) written out in fine detail in an Excel spreadsheet. As we move through each step of a workflow, we mark the respective script as either a pass or a fail. The steps that have failed us as of late have mainly been due to permission and preference settings in Webtop. Those were all hammered out in the last two environments, so we should be all set.
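If you ever want a quick tally of those pass/fail results outside of Excel, the idea is simple enough to script. Here is a minimal sketch, assuming the scripts were exported to a CSV file with hypothetical "workflow" and "result" columns; our actual spreadsheet layout is more detailed than this.

```python
# Sketch: tally pass/fail test-script results from an exported CSV.
# The file name and column names ("workflow", "result") are hypothetical.
import csv
from collections import Counter

def tally_results(path: str) -> Counter:
    """Count results per workflow, e.g. ("sales material", "fail") -> 3."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[(row["workflow"], row["result"].strip().lower())] += 1
    return counts

for (workflow, result), n in sorted(tally_results("test_scripts.csv").items()):
    print(f"{workflow}: {n} {result}")
```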
Update: I just got handed a new copy of Adobe Acrobat 9. I will have to install it today and run it side by side with the CMS, since some preferences in Webtop require that we select a commenting program to work from. (When reviewing topics and maps, we have to use Acrobat to comment and make suggestions for the writer of the topic or map; see previous entry.)
Tuesday, February 17, 2009
Week 9 - Content Management in a Bad Economy
The economy came down this past month. Things have gotten pretty bad out there in the wilderness; layoffs around the U.S. are becoming a normal, day-to-day occurrence. Thankfully, my company has a lot of cash reserves and therefore enough liquidity to get over this hellish economic bump in the road. With regard to our content management status, however, we seem to be at a standstill. As stated in my previous post, Dakota had impressed us with its knowledge transfer proposal, which we all felt would have put us back on track with implementing our content management system. But due to the current economic climate, we can't get the money to hire them.
The only light in this dire situation is that Flatirons is getting worried they won't get paid. That leaves us with a little bit of wiggle room. Rumor has it that the server issue will be resolved soon: Flatirons has offered to subsidize a new server with our Documentum people for just the content management system to run off of, which is a workable solution for both parties.
Hopefully, we can get that end of it handled and finish the implementation of the system. However, the Flatirons contract is about to conclude, giving us just over two weeks to finish testing in three different environments (development, stage, and production). Yikes!
Monday, January 26, 2009
Week 8 - Content Management Goals and Consistency
After meeting with Dakota, everyone was excited about the new content management system again. Dakota had given us a well-written and well-priced proposal as well as a direction for how we should approach bringing our content into the system and conducting the knowledge transfer.
Since we were stalled with Flatirons (see previous post), we decided to take advantage of our time and move ahead with the advice Dakota had given us. In doing so, a few of us sat down to really analyze our current documents and find out how much of our literature was truly reusable. By reusable, I mean a chunk of content from a particular suite of documents that appears over and over again on a continual basis. Thus far, we had never really created a library of such content, and that's what we planned on doing today.
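Incidentally, a first cut at that kind of library can be scripted: look for paragraphs that repeat across documents. Here is a minimal sketch, assuming the documents have been exported to plain-text files in a folder; this is just an illustration of the idea, not the process we actually followed.

```python
# Sketch: flag candidate reusable chunks by finding paragraphs that appear
# in more than one exported plain-text document. Folder name is hypothetical.
from collections import defaultdict
from pathlib import Path

def find_repeated_paragraphs(folder: str, min_docs: int = 2) -> dict:
    """Map each repeated paragraph to the set of documents it appears in."""
    seen = defaultdict(set)
    for doc in Path(folder).glob("*.txt"):
        for para in doc.read_text(encoding="utf-8").split("\n\n"):
            normalized = " ".join(para.split())  # collapse whitespace
            if normalized:
                seen[normalized].add(doc.name)
    return {p: docs for p, docs in seen.items() if len(docs) >= min_docs}

for para, docs in find_repeated_paragraphs("exported_docs").items():
    print(f"Appears in {sorted(docs)}: {para[:60]}...")
```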
The problem that we ran into, and that I am sure any company would run into, is consistency. Consistency is critical to ensuring that the new content system operates smoothly. Our first issue with consistency was figuring out which topics were reusable in which kinds of deliverables. Further, we asked whether we could always assume that a given piece of content would be used with a particular deliverable.
The second issue with consistency came in how we were to save the topics, or chunks of information; in other words, what would the actual file name be versus what would be included in the metadata? Having a naming convention is critical because that's how future users of the content management system will search the repository (the place where all the content lives).
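To make the trade-off concrete, here is one possible shape for such a convention: a terse, predictable file name paired with richer metadata for searching. This is only a sketch; the fields (deliverable type, subject, version) are hypothetical and not the convention we settled on.

```python
# Sketch: a terse file name plus searchable metadata. All fields hypothetical.
import json

def make_entry(deliverable: str, subject: str, version: int, summary: str) -> dict:
    """Build a repository entry: a file name plus the metadata to index."""
    slug = subject.lower().replace(" ", "-")
    return {
        "filename": f"{deliverable}_{slug}_v{version:02d}.xml",
        "metadata": {"deliverable": deliverable, "subject": subject,
                     "version": version, "summary": summary},
    }

entry = make_entry("sales", "Rate Disclosure", 3,
                   "Standard legal wording on rate changes")
print(entry["filename"])                      # sales_rate-disclosure_v03.xml
print(json.dumps(entry["metadata"], indent=2))
```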
Again we found ourselves in a relatively heated debate. Should we name the files with a series of numbers, or actually describe the content within each particular topic? Would legal wording in one type of deliverable make its way into a second type of deliverable? How extensive should a reusable topic remain if parsing it might make it more reusable? Would we always have to take out parts of a topic that were not needed for certain deliverables? These types of questions, and many more, permeated the meeting and left us confused as to how we were going to approach the entire situation.
Then one writer suggested something a little unorthodox. He said that we should first start playing around with the system, adding in a large chunk of test content to see how it works in real time (for example, against a scheduled due date), and see how individuals intrinsically search for content (whether text or image files). After that, we might be able to decipher the naming convention. When the meeting ended, things were still up in the air, but it seemed that the only way to move ahead might just be as the writer suggested: dive right into it, putting the initial content in before a realistic due date.
Thursday, January 15, 2009
Week 7 - Content Management and Knowledge Transfer
So, ladies and gentlemen, today was a very interesting day in the world of content management. We had a serious sit-down with a knowledge transfer firm by the name of Dakota Systems. They prepared a nice proposal, reasonably priced, that reflects their knowledge transfer services. What is knowledge transfer, you ask? Simple: it's when one entity teaches another entity how to do things in an efficient and effective manner (and people, rather than instruction manuals, are needed to transfer the knowledge, because most of it is tacit, ingrained in the minds of the experts and hardly ever extracted into the written word).
More specifically, in the field of content management, knowledge transfer equates to a team that teaches others how to use a CMS: how to push the system to its boundaries, how to evaluate technical issues effectively, how to use the system in the most efficient way, and how to ensure that the system remains expandable (so that, in the future, content can be built up, figures or diagrams can be easily altered, and so on). The knowledge transfer team also offers coding training in case the system actually needs to be reprogrammed for any reason. Lastly, and maybe most importantly, the knowledge transfer team should define and shape the process by which the actual content moves from the old suite of deliverables and documents into the new content management system.
Piece of cake, right? No. That's why it's imperative to use a knowledge transfer team that knows what the hell they're talking about. And in our meeting today we had some individuals who did not hesitate to answer the tough questions. They seemed to know the system (even though they did not build it) inside and out. They also easily grasped our current process and illuminated the direction in which we now need to head. And we haven't even hired these guys yet.
Friday, January 2, 2009
Week 12 - Migration and Legacy Content
Today we had a basic refresher on the benefits of content management via a presentation given by Vasont (this was done to keep the CMS a viable option given these rough economic conditions). The first and most important benefit would obviously be ROI. But return on investment can only be calculated once it's possible to figure out what the cost to implement the system is in the first place. In figuring that cost out, one must examine a major aspect of implementing the CMS: migration. Migration is the process by which the current documentation will "get" into the new content management system.
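As a back-of-the-envelope example (these numbers are made up for illustration, not from the presentation): if migration, licensing, and training together cost $250,000, and content reuse saves $100,000 a year in writing and translation labor, then first-year ROI is (100,000 - 250,000) / 250,000, or -60%. By the end of year three, the cumulative savings of $300,000 overtake the cost and the ROI turns positive. The catch is that the cost figure, and especially the migration piece of it, is the hard number to pin down.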
We can begin (and finish) discussing migration with the topic of legacy content. Legacy content is content that was created before the content management system was implemented, but that will still be around after the CMS is launched. It's content created with the idea that it will be converted into ASCII text or a .csv file in the future.
This conversion can happen in one of two ways: legacy content can be converted via a modular-source or a book-source method. With modular source, you manually convert the legacy content across all documents, increasing conversion costs and time spent, since each piece of content is converted as a stand-alone, piecemeal effort. The benefit is a granular, micro-level examination of the content before it is embedded into the CMS. Book-source conversion, on the other hand, breaks down text by topics and headings and maximizes reuse. It is automated and does not require reauthoring; this method is, of course, cheaper and less time consuming.
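Here is a minimal sketch of the book-source idea: split an exported plain-text manual into topic chunks wherever a heading appears. The numbered-heading pattern is an assumption for the sake of illustration; real conversion tools key off the document's actual structure.

```python
# Sketch: naive book-source conversion, splitting a plain-text "book" into
# topic chunks at headings. The numbered-heading pattern is an assumption.
import re

HEADING = re.compile(r"^\d+(\.\d+)*\s+.+$")  # e.g. "1.2 Fees and Rates"

def split_into_topics(book_text: str) -> dict:
    """Return {heading: body} chunks, one chunk per heading in the source."""
    topics, current, lines = {}, None, []
    for line in book_text.splitlines():
        if HEADING.match(line.strip()):
            if current is not None:
                topics[current] = "\n".join(lines).strip()
            current, lines = line.strip(), []
        elif current is not None:
            lines.append(line)
    if current is not None:
        topics[current] = "\n".join(lines).strip()
    return topics

sample = "1 Overview\nIntro copy.\n1.1 Fees\nFee table goes here."
for title, body in split_into_topics(sample).items():
    print(f"[{title}] {body}")
```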
Once a method is decided upon, or if both methods are used, the next thing one should look at is the best practices of migration. These best practices comprise five activities:
- Understanding the requirements of the system
- Understanding the requirements of your legacy content
- Test/Pilot conversion
- Flexing the system; getting to know its boundaries
- Producing; loading the system with the legacy content