ILTA White Papers

Litigation and Practice Support

RESPONSE VARIATION

This is a strategy to help engage the reviewer by building variety into the task. It may involve altering behaviors, such as reversing the order in which subtasks are performed. In the context of linear review, response variation can help organize review batches so that review teams alternate classifying documents for responsiveness, privilege and confidentiality. Another approach is to mix the review documents and assign each one a specific target classification to be reviewed for.

FREE-FORM EXPLORATION

Combining aspects of early case assessment and linear review is one form of exploration known to offer both a satisfying experience and effective results. While performing linear review, the ability to suspend the current document and switch to other similar documents and topics gives the reviewer a cognitive stimulus that improves knowledge acquisition. This offers an opportunity for the reviewer to learn facts of the case that would normally be difficult to obtain and to approach the knowledge level of a senior litigator on the matter. After all, knowledge of the matter helps guide reviewers, so attempts to increase their knowledge of the case can only be beneficial.

EXPANDING THE WORK PRODUCT

Besides simply judging the review disposition of a document, generating higher-value output (e.g., document summaries, critical snippets and document metadata) also contributes to the overall assessment. This can help reduce any boredom experienced by the current reviewer and can contribute valuable insights to other reviewers.

REVIEW TECHNOLOGIES

Enhancing linear review with technologies that can change the review workflow is worth considering, but the process must also increase the reviewers’ knowledge and their ability to apply judgment. The Electronic Discovery Reference Model (EDRM) is an excellent resource for evaluating technologies and planning review projects. Some options it identifies include:

• Initial Culling: Keyword sampling and culling is an effective way to filter for relevant information. The highly regarded industry think tank, The Sedona Conference, recommends a number of culling strategies. Chief among them is a technique called “judgmental sampling,” where knowledge of the case is brought to bear on sample selections. Specific keywords, in combination with Boolean searches, can reduce the review workload dramatically (see the first sketch following this list). Another effective approach is to perform data analytics on the ESI. Automated analytics techniques can organize ESI in a form that allows for easy data manipulation. This reduces review workload by eliminating large classes of data, such as e-mail messages from irrelevant senders or file types that are not likely to contain user-generated content.

• Deduplication: Further review savings are achieved by processing that identifies duplicate documents. Applying appropriate deduplication processes can reduce the number of documents to review by 30 to 80 percent (see the second sketch following this list).

• E-Mail Discussion Threads: Nearly 80 percent of e-discovery involves e-mail messages. Organizing
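To make the initial culling item concrete, here is a minimal Python sketch of keyword-based culling combined with exclusion of irrelevant senders. The document structure, keyword list and sender addresses are illustrative assumptions, not part of the EDRM guidance or of any particular review platform; negotiated search terms and Boolean queries would replace the simple keyword test in practice.

```python
# Hypothetical keyword set and sender exclusions; real terms would come
# from case knowledge and negotiated search-term lists.
INCLUDE_TERMS = ["contract", "settlement", "invoice"]
EXCLUDE_SENDERS = {"newsletter@example.com", "no-reply@example.com"}

def matches_keywords(text, terms):
    """Return True if any culling keyword appears in the document text."""
    lowered = text.lower()
    return any(term in lowered for term in terms)

def cull(documents):
    """Keep documents that hit a keyword and are not from an excluded sender."""
    kept = []
    for doc in documents:  # each doc assumed to be a dict with 'sender' and 'text'
        if doc.get("sender") in EXCLUDE_SENDERS:
            continue
        if matches_keywords(doc.get("text", ""), INCLUDE_TERMS):
            kept.append(doc)
    return kept

# Example: a two-document collection, only one of which survives culling.
docs = [
    {"sender": "cfo@example.com", "text": "Please review the attached contract."},
    {"sender": "newsletter@example.com", "text": "Weekly industry news."},
]
print(len(cull(docs)))  # -> 1
```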
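The deduplication savings described above are typically achieved with hash-based comparison. The second sketch, again in Python and with an assumed document structure, shows the basic idea of collapsing exact duplicates by hashing document content; production deduplication (across custodians, or near-duplicate detection) is considerably more involved.

```python
import hashlib

def dedupe(documents):
    """Collapse exact duplicates by hashing normalized document text.
    Review platforms typically hash raw file bytes or selected metadata
    fields; hashing text is used here only as the simplest illustration."""
    seen = {}
    for doc in documents:  # each doc assumed to be a dict with 'id' and 'text'
        digest = hashlib.sha1(doc["text"].strip().lower().encode("utf-8")).hexdigest()
        # Keep the first copy of each hash; later copies are duplicates.
        seen.setdefault(digest, doc)
    return list(seen.values())

docs = [
    {"id": 1, "text": "Quarterly results attached."},
    {"id": 2, "text": "Quarterly results attached."},   # exact duplicate
    {"id": 3, "text": "Board meeting agenda."},
]
unique = dedupe(docs)
print(f"{len(docs) - len(unique)} duplicate(s) removed")  # -> 1 duplicate(s) removed
```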
