Though I find Gordon Elliott thoroughly irritating, I like the idea behind his show, Doorknock Dinners. The basic premise is taking what you already have and creating something that’s better than the sum of its parts. To a very great degree the same can be done when deploying information management solutions.
Far too many organizations don’t understand what they have in their cupboards. So off they go and procure some massively expensive ECM tools, then try to build out an enterprise-grade solution, and fall flat on their faces. This is exacerbated, in many cases, by not understanding what the end result is even supposed to look like and by not knowing what internal capabilities (skills & tools) they already possess.
I was having lunch yesterday with some guys who are in the early stages of a project to implement physical records management. They identified some issues related to manually loading records metadata into the tool they had recently acquired. The issues include:
- Users don’t like the UI of the new tool;
- Remote users sometimes get disconnected and don’t know which was the last successful transaction;
- Some remote users work on a different cycle.
The initial thinking on the part of management was to spend time and effort building an application solely to let users input the metadata (in real time) and be assured that what they entered actually got in. My lunch companions didn’t give me hard numbers, but my guess is that the estimates came out to around 4-6 weeks of full-time effort for a consultant to build this tool. On top of building the custom tool, effort would be required after every patch or upgrade to the tool they had recently purchased.
My question to them was simply “How are you capturing metadata today?” As it turns out, they already have a database tool that they use to capture the metadata. The tool works and is accepted by the users. They have the skills in-house to modify and maintain it. The new RM tool has batch loading capabilities, and their current tool can output metadata batches that can be loaded into the new tool. The incremental effort to modify their current tool and test the batch loading would be less than one week, with no external consulting required. Any updates/patches/upgrades to the tool they recently purchased would have zero impact.
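To make the batch-loading idea concrete, here is a minimal sketch of the kind of export step their existing tool would need. Everything here is an assumption for illustration: the field names, the record structure, and the idea that the new RM tool’s batch loader accepts a delimited file like CSV (most do, but check the vendor’s import spec).

```python
import csv
import io

# Hypothetical sketch: dump records metadata from the existing capture
# tool as a CSV batch that a typical RM batch loader could ingest.
# The field names below are invented for this example.
BATCH_FIELDS = ["record_id", "title", "location", "date_created"]

def export_batch(records, out):
    """Write a list of record dicts as a CSV batch to a file-like object."""
    writer = csv.DictWriter(out, fieldnames=BATCH_FIELDS)
    writer.writeheader()
    for rec in records:
        # Missing fields become empty strings rather than failing the batch.
        writer.writerow({f: rec.get(f, "") for f in BATCH_FIELDS})

# Example run with one made-up record.
records = [
    {"record_id": "R-0001", "title": "Q3 contracts",
     "location": "Box 12", "date_created": "2009-06-01"},
]
buf = io.StringIO()
export_batch(records, buf)
print(buf.getvalue())
```

The point is how small this is: a scheduled export plus the RM tool’s own batch loader replaces weeks of custom real-time integration work, and the vendor’s patches never touch it.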
My point is that it’s all well and good to go out and buy new tools and engage consultants. Before you do so, however, understand what your current capabilities are.