Contributed by Kristy L. Peters and Mark S. Yacano, Hudson Legal
There is no question that undertaking a document review project is daunting. Whether your firm already has the necessary resources in place or instead selects specialized companies to handle all or part of the process, there are many mission-critical decisions that need to be made quickly. Given the incredible volume of data that typifies even today’s simplest matters, the consequences of e-Discovery mistakes are terrifying. Staggering expenses, sanctions and the loss of client confidence all hang in the balance.
By taking a bit of time at the outset to strategize, however, you can avoid common problems and increase your likelihood of conducting a successful and efficient review. Here are five keys:
Assemble the Right Team, Right Away
It is tempting to take each piece of the Electronic Discovery Reference Model (EDRM) one at a time and hire only those resources strictly necessary so as not to bust your client’s budget from the get-go―first turning to collection and processing, then choosing a platform, and then picking a review team, etc. Instead, as soon as you’re faced with a document review situation, consider putting together an experienced team of e-discovery experts, with a responsible point-person for each phase. Then, get all team members talking to each other right away. You’ll realize greater efficiencies with end-to-end solutions and companies that have demonstrated an ability to work well together. If you choose a provider that only handles one piece, be sure to loop other members of the team in as soon as is practicable so tasks can be divided as efficiently as possible. You don’t want to be faced with a situation, for example, where data is collected by a vendor in a way that turns out to be incompatible with your review platform. You’ll end up having to recollect or switch platforms―and being forced to do either is expensive and time-consuming. The last thing you want to do is have to go back to your client to say “we didn’t handle this correctly the first time and we’ve got to start over.”
It’s advisable to vet providers before you need them. You don’t want to have to scramble to figure out who does what―and of equal importance, who does what well―at the eleventh hour. Try to assess whether your unique needs and preferences are being understood and ask for references. All good service providers are willing to customize solutions and will treat you as their number one client no matter how big or small the assignment.
Communicate, Communicate, Communicate
It is critical to make your expectations clear and to also understand the expectations of others. This starts with getting on the same page with opposing counsel and the judge, and the dialogue continues with your internal team and those you hire. Don’t assume that other people can read your mind. Often, there are several valid and reasonable approaches to an issue. If you’ve got a preference for your review to be done a certain way, express that. You may find that after a sit-down with your team, they’re seeing issues that you’ve missed that make their approach the best course of action. Other times, they’re the ones who haven’t seen the forest for the trees. In either case, communication saves the day.
One critical group of people who are often overlooked for substantive input on reviews is the reviewers themselves. They’re the ones actually looking at the documents, so they may notice trends or themes that can only be gleaned by going through the data. Here’s an example: you queue up a set of documents and ask reviewers to tag documents whenever the terms “flooding” or “water damage” are encountered in a context that makes sense for your case. The review team learns after diving into the documents that the term “mold” also occurs in overlapping documents but that there is no tag for “mold.” “Mold” may or may not be relevant to your matter, but if you don’t know that it’s popping up, you’re not able to make an informed call about whether to create a new tag. Your reviewers (typically fully-licensed attorneys) are experienced, capable thinkers who may have excellent suggestions.
Consider having regular check-in meetings to see whether anything that seems relevant but didn’t make it onto the coding form has caught reviewers’ attention. Another option is to set up (or have your provider set up) email accounts where reviewers can send questions or comments to the attorney from your team who is responsible for overseeing the review. The attorney can then “reply all” to the team to indicate, for example, how a certain class of documents should be handled. Just like in school, if one reviewer has a question about how to approach something, you can be sure that others in the group do too. The end result of putting forth this extra effort? A higher quality deliverable.
Technology Is Your Friend―Use It
The days of linear review are numbered. If cost is a concern (and when isn’t it?), talk to your team about new and different review methodologies that can help you conduct your review at maximum efficiency. Familiarize yourself with the latest approaches so you can decide whether a given approach makes sense for your situation.
Here are some concepts you may want to explore for your next document review:
In the past, most review platforms only allowed data to be reviewed in order, page by page. Searches could be performed for key words, but there was no higher-level way to sort and filter the data so that only potentially relevant documents were presented for review. How many of us have spent hours as junior attorneys going one by one through a custodian’s Fantasy Football draft pick emails even though they had nothing whatsoever to do with our case? It’s wasted time and money. Non-linear review describes any method by which documents are reviewed in a way other than chronological, page-by-page order.
Predictive coding is a method of review whereby a computer program can categorize entire collections of documents as responsive or non-responsive without further human intervention. Typically, the program ranks documents from most to least likely to be responsive based on the parameters articulated at the outset. These rankings can then be used to determine which documents warrant further attention by human reviewers to QC the decisions made by the computer. Just as no spam filter perfectly categorizes all emails as junk or legitimate, predictive coding is not yet able to perfectly identify all relevant documents. The application of this concept is nuanced and there is more than one way to approach it, but it is definitely worth exploring.
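To make the ranking idea concrete, here is a minimal sketch of how a responsiveness model might be trained on a small seed set of human-coded documents and then used to score the rest of the collection. It is an illustration only, not any review platform’s actual implementation; the sample data, the scikit-learn components and every name in it are assumptions chosen for demonstration.

```python
# Illustrative sketch only: rank unreviewed documents by predicted responsiveness
# using a small seed set of human-coded examples. scikit-learn and all names here
# are assumptions for demonstration, not any review platform's actual workflow.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

seed_docs = [
    "flooding caused extensive water damage to the warehouse",
    "mold found after the basement flooding incident",
    "fantasy football draft picks for this season",
    "lunch order for the office party",
]
seed_labels = [1, 1, 0, 0]  # 1 = responsive, 0 = non-responsive (human coding)

unreviewed_docs = [
    "insurance claim for water damage in unit 4",
    "who is everyone taking in the football draft",
]

# Learn a simple text model from the human-coded seed set.
vectorizer = TfidfVectorizer(stop_words="english")
model = LogisticRegression().fit(vectorizer.fit_transform(seed_docs), seed_labels)

# Score the rest of the collection so the highest-ranked documents are reviewed first.
scores = model.predict_proba(vectorizer.transform(unreviewed_docs))[:, 1]
for score, doc in sorted(zip(scores, unreviewed_docs), reverse=True):
    print(round(score, 2), doc)
```

In practice, the seed set is typically refined over several rounds of human feedback and QC sampling before the rankings are relied upon.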
Remember Boolean searches? And/or/not/within 3? Concept searching is the next step in searching evolution. A concept search (or conceptual search) is an automated information retrieval method that searches data for the ideas expressed rather than the proximity and appearance of search terms. Using semantic and statistical algorithms, related data can be grouped. With concept searching and the right inputs, your search for “fruit” would also yield data about strawberries, guava, jams and pies.
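The snippet below sketches one way the underlying idea can work, using latent semantic analysis to match a query against documents through statistically related terms rather than literal keyword hits. The toy corpus, the query and the scikit-learn components are illustrative assumptions, not a depiction of any particular concept-search product.

```python
# Illustrative sketch of conceptual search via latent semantic analysis (LSA):
# documents are matched on statistically related ideas, not literal keyword hits.
# Corpus, query and library choices are assumptions for demonstration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.pipeline import make_pipeline

docs = [
    "invoice for a shipment of fruit: strawberry, guava and mango",
    "strawberry jam production schedule for the spring",
    "guava pie recipe from the bakery",
    "fantasy football draft picks",
]

# Project term counts into a low-dimensional "concept" space.
lsa = make_pipeline(TfidfVectorizer(), TruncatedSVD(n_components=2))
doc_vectors = lsa.fit_transform(docs)

# A query for "fruit" can surface the jam and pie documents even though the word
# "fruit" never appears in them, because their terms co-occur with "fruit" elsewhere
# in the corpus.
query_vector = lsa.transform(["fruit"])
print(cosine_similarity(query_vector, doc_vectors))
```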
Clustering refers to any of a number of software applications that organize your documents by grouping them into clusters based on the similarity of the text they contain. Then, by looking at sample documents from a cluster, you can often make accurate judgment calls on behalf of the entire group.
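As a rough illustration of the idea (not any vendor’s implementation), the sketch below groups a handful of hypothetical documents with a k-means clusterer over their text; all data and parameter choices are assumptions for demonstration.

```python
# Illustrative sketch of text clustering: group documents by textual similarity
# so a reviewer can sample each cluster instead of reading every document.
# Data and the choice of k-means are assumptions for demonstration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "flooding caused water damage in the basement",
    "report on basement flooding and water damage",
    "fantasy football draft order for the league",
    "football playoff picks for the league pool",
]

X = TfidfVectorizer(stop_words="english").fit_transform(docs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Documents sharing a label form one cluster; sampling a few per cluster
# often tells you how to treat the rest of that group.
for label, doc in sorted(zip(labels, docs)):
    print(label, doc)
```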
De-duplication and near de-duplication
De-duplication removes exact copies of the same document or email from your dataset so that each item is reviewed only once. With near de-duplication, functionally equivalent data that also appears elsewhere can be culled from your dataset prior to review so that you do not waste time reviewing multiple near-identical copies of the same data.
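The sketch below illustrates both ideas in miniature: exact duplicates are removed by hashing normalized text, and near-duplicates are flagged when a text-similarity score crosses a chosen cutoff. The sample documents, the 0.8 threshold and the library choices are assumptions for demonstration only.

```python
# Illustrative sketch: exact duplicates found by hashing normalized text, and
# near-duplicates flagged by a cosine-similarity threshold. The threshold value
# and the data are assumptions for demonstration only.
import hashlib
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Please review the attached flooding report.",
    "Please review the attached flooding report.",          # exact duplicate
    "Please review the attached flooding report. Thanks!",  # near duplicate
    "Fantasy football draft is tonight.",
]

# Exact de-duplication: identical normalized text hashes to the same value.
seen, unique_docs = set(), []
for doc in docs:
    digest = hashlib.sha256(doc.lower().strip().encode()).hexdigest()
    if digest not in seen:
        seen.add(digest)
        unique_docs.append(doc)

# Near de-duplication: flag pairs whose text similarity exceeds the chosen cutoff.
X = TfidfVectorizer().fit_transform(unique_docs)
sims = cosine_similarity(X)
for i in range(len(unique_docs)):
    for j in range(i + 1, len(unique_docs)):
        if sims[i, j] > 0.8:
            print("near-duplicates:", unique_docs[i], "|", unique_docs[j])
```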
Technology is not nearly as daunting as it seems once you dig in a bit. People plus technology produce the best results.
Trust, but Verify
One of the biggest mistakes you can make is ignoring a review once it gets underway by assuming that everything is clicking along smoothly. It’s your case, your review, and your head on the line.
If you completely ignore your review through six months of clicking and categorizing, you can find out on the eve of production that some aspect of the review was flat out incorrect or that some of your reviewers were not up to snuff. Informing your client that they need to budget more time or money for the review because you failed to manage or delegate management responsibilities is not a conversation you want to have. The best way to prevent problems is to be hands-on with your review or hire someone, such as a project manager, to report back to you with regularity regarding project specifics to ensure that you are satisfied. If there are problems―big or small―let the responsible parties know what they need to cure and follow up to make sure things are as they should be.
Reviews don’t run themselves, and no one wins when they are ignored, however benevolently or excusably.
Build on Your Successes and Failures
Once you’ve got a review protocol that works for your firm, consider formalizing your processes and relationships. At the end of the day, you want to feel good that the individuals you’ve trusted with your client’s data are well suited to the task. Like any relationship, good e-discovery relationships are built over time and it can take a project or two to work out all of the quirks. If you feel that one of your providers doesn’t listen to you, let them know. If they’re not responsive, find a provider that is a better fit for your firm.
Did you learn any tough lessons from a bad review? Share your experiences with your colleagues who may be less familiar with the field so they don’t have to make easily avoidable mistakes. Stay abreast (or task someone with staying abreast) of new legal developments and technological advancements. Your decision to be mindful and informed about alternatives will help keep your clients happy.
May your next review be a great one!
© 2011 Bloomberg Finance L.P. Originally published by Bloomberg Finance L.P. Reprinted with permission. The opinions expressed are those of the author.