Guest Post by John Gillies, Director of Practice Support at Cassels Brock & Blackwell
More and more law firms are looking to adopt an enterprise search engine as a way into (and out of) the mass of information they manage. The right search tool can give a firm immediate access to content that would otherwise be essentially unfindable. Choosing and implementing the right search tool is, therefore, a critical knowledge management activity. I’ll share here the steps we used in our process, and provide detail about the first, and perhaps most critical, step: establishing business requirements.
We are a one-office firm of about 200 lawyers, with about eight million documents in our iManage document management system (DMS). We have three databases: the Legal db, for client-matter workspaces and precedents; the Support db, which is used by the administrative side of the firm; and the Admin db. Every firm member has a personal workspace on Admin, with a public and a private folder. The intention, when these workspaces were created, was that lawyers would use their public folders to save items such as precedents, business development content, articles they had written, and so forth.
Our process involved the following steps:
• Establishing the business requirements
• Identifying the search engines to test and picking the “winner”
• Running a proof of concept
• Cleaning up the databases
• Pilot testing
There are several search engines that either could serve the legal market or are designed specifically to do so, and the only way to choose the right one is to know exactly what your needs are. That is why a detailed business requirements document is essential.
This document starts off as a wish list that itemizes all the features you would want from a search engine. Ideally, you will have a good sense of what your users need, which will get you started. (If you don’t have a good sense, or even if you do, a user survey asking about searching and pain points around finding work product could be helpful.) Your list will then be supplemented by your reading, comments from counterparts at other firms who have already gone through this process, and any other research you can do. (In preparing your list, make sure that you review Doug Cornelius’ posts on Four Types of Document Searches, which should help you focus your ideas.)
We ended up with a list of approximately 150 requirements.
The next step is to group all the items under related topics and then to prioritize them. We had five categories: “Essential,” “Very Important,” “Important,” “Nice to have,” and “Useful but not critical.” We also established a list of the top ten essential items, which proved very useful in making a focused comparison between the two finalist search engines.
We used an Excel spreadsheet for our business requirements document, which allowed us to rate each item on the list (on a scale of one to five) and also to compute a weighted score. (For example, the “Essential” items got a weighting of five and the “Useful but not critical” items a weighting of one. Accordingly, an “Essential” item that received a rating of 3 would have a weighted score of 15.) We then rated each search engine we tested against each function, and Excel did the computations to come up with an overall score.
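The spreadsheet arithmetic described above can be sketched in a few lines of code. This is purely illustrative: the requirement names and ratings are invented, and I have assumed intermediate weights of four, three, and two for the middle categories, which the text does not specify.

```python
# Hypothetical sketch of the weighted-scoring approach described above.
# Category weights for the middle three categories are assumed, not stated
# in the original post; requirement names and ratings are made up.
WEIGHTS = {
    "Essential": 5,
    "Very Important": 4,      # assumed
    "Important": 3,           # assumed
    "Nice to have": 2,        # assumed
    "Useful but not critical": 1,
}

# Each requirement: (description, category, rating for a given engine, 1-5)
requirements = [
    ("Full-text search of the DMS", "Essential", 3),
    ("Faceted refinement of results", "Very Important", 4),
    ("Saved searches", "Nice to have", 5),
]

def overall_score(reqs):
    """Sum of (rating x category weight) across all requirements."""
    return sum(rating * WEIGHTS[category] for _, category, rating in reqs)

# 3*5 + 4*4 + 5*2 = 41
print(overall_score(requirements))
```

Running the same function once per candidate engine, each with its own ratings column, gives the comparable overall scores the spreadsheet produced.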
At the end of the process, we had strong overall scores for two different search engines, which helped us make the final choice.
One aspect to bear in mind, though, is that this scoring and weighting process is designed to select a search engine to test. There will necessarily be other relevant factors that cannot be reduced to a mere score. This process does, however, concentrate the mind wonderfully on all of the necessary features of your search engine.