DOI: 10.1145/3295750.3298945

Variations in Assessor Agreement in Due Diligence

Published: 08 March 2019

Abstract

In legal due diligence, lawyers identify a variety of topic instances in a company's contracts that may pose risk during a transaction. In this paper, we present a study of nine lawyers conducting a simulated review of 50 contracts for five topics. We find that lawyers agree on the general location of relevant material at a higher rate than in other assessor agreement studies, but they do not entirely agree on the extent of the relevant material. Additionally, we do not find strong differences between lawyers with differing levels of due diligence expertise.
When we train machine learning models to identify these topics from each lawyer's judgments, the resulting models agree with each other at levels similar to those of the lawyers who trained them. This indicates that the models are learning the types of behaviour exhibited by their trainers, even if they are doing so imperfectly. Accordingly, we argue that additional work is necessary to improve the assessment process to ensure that all parties agree on identified material.
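
Agreement findings of this kind are typically quantified with a chance-corrected statistic such as Cohen's kappa. The Python sketch below is illustrative only, not the paper's actual analysis: it assumes binary per-passage relevance judgments, and the assessor names and data are invented for illustration.

from itertools import combinations

def cohens_kappa(a, b):
    """Chance-corrected agreement between two equal-length binary judgment lists."""
    assert len(a) == len(b)
    n = len(a)
    # Observed agreement: fraction of passages judged identically.
    p_observed = sum(x == y for x, y in zip(a, b)) / n
    # Expected agreement if each assessor labelled at their own base rate.
    pa, pb = sum(a) / n, sum(b) / n
    p_expected = pa * pb + (1 - pa) * (1 - pb)
    if p_expected == 1.0:  # degenerate case: both assessors are constant
        return 1.0
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical data: each assessor's relevant (1) / not relevant (0) call per passage.
judgments = {
    "lawyer_1": [1, 1, 0, 1, 0, 0, 1, 0],
    "lawyer_2": [1, 0, 0, 1, 0, 1, 1, 0],
    "lawyer_3": [1, 1, 0, 1, 1, 0, 1, 0],
}

for (name_a, ja), (name_b, jb) in combinations(judgments.items(), 2):
    print(f"kappa({name_a}, {name_b}) = {cohens_kappa(ja, jb):.2f}")

The same pairwise computation can be run over the predictions of models trained on each assessor's judgments, so model-model agreement can be compared against lawyer-lawyer agreement in the manner the abstract describes.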


Cited By

(2020) The Utility of Context When Extracting Entities From Legal Documents. In Proceedings of the 29th ACM International Conference on Information & Knowledge Management, pp. 2397-2404. DOI: 10.1145/3340531.3412746. Online publication date: 19 October 2020.


Published In

CHIIR '19: Proceedings of the 2019 Conference on Human Information Interaction and Retrieval
March 2019
463 pages
ISBN:9781450360258
DOI:10.1145/3295750


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. assessor agreement
  2. due diligence
  3. legal retrieval
  4. user study

Qualifiers

  • Short-paper

Conference

CHIIR '19

Acceptance Rates

Overall Acceptance Rate: 55 of 163 submissions, 34%

