Study: Best E-Discovery Review Requires Humans and Technology

Before choosing technology-assisted review (TAR) over human review, consider that the best e-discovery systems combine people and software, new research has found.

The joint research project by the nonprofit Electronic Discovery Institute (EDI) and tech giant Oracle Corp. found that TAR is often faster and cheaper at identifying relevant documents. When it comes to identifying privileged or sensitive information, however, human reviewers outperformed machines.

While software programmed to identify responsive documents could cut up to 80% from firms’ discovery expenses, researchers found that privileged and so-called “hot” documents, which require conceptual knowledge rather than keyword searches or predictive coding, are better reviewed by humans skilled in the practice area.

“It’s not a magic bullet. Nobody says that predictive coding is a good way of finding privileged documents, but it’s a good supplement if you’re using it for quality control,” TAR proponent U.S. Magistrate Judge Andrew J. Peck of the Southern District of New York told Legaltech News. “It’s a combination of the technology, the people involved, and the workflow process.”

According to Legaltech News, EDI’s research is the most comprehensive evaluation of document review in e-discovery ever conducted in the United States. The project, which began in 2012 and lasted four years, studied how effective technology is at complying with Rule 26(g) of the Federal Rules of Civil Procedure. It involved more than 1.6 million documents produced in Oracle’s response to a federal government probe of Sun Microsystems.

In 2010, Oracle acquired Sun, but the deal brought with it expensive litigation after the government charged Sun with violating the False Claims Act and anti-kickback regulations. Investigators said Sun paid systems-integrator companies to recommend its products to federal agencies and submitted false claims to the General Services Administration and other government payers. The case was resolved in 2011, but it cost Oracle $46 million, Legaltech News reported.

Documents from the settled case provided a real-life data sample that allowed researchers to compare computer efficiency against human review in high-stakes litigation.

John Rosenthal, a partner at the firm Winston & Strawn, which runs one of the largest e-discovery operations in the country, served as a quality-control check. Rosenthal is a former member of The Sedona Conference® steering committee of judges, lawyers, and technology vendors focused on developing e-discovery and document-retention guidelines.

“For those of us steeped in how the algorithms work, it really was not a surprise,” he said. “TAR is somewhat faster, somewhat less expensive, but not as dramatic as what the vendors would lead you to believe. I’m a proponent of it in that perspective, but if you’re choosing TAR because you think it will be better than human review, I don’t think the study substantiates that. You’ve got to have the right process, the right people, and they’ve got to be trained.”

November 21, 2016