Gechuan Zhang ; Paul Nulty ; David Lillis - Enhancing Legal Argument Mining with Domain Pre-training and Neural Networks

jdmdh:9147 - Journal of Data Mining & Digital Humanities, 10 June 2022, NLP4DH - https://doi.org/10.46298/jdmdh.9147


    The contextual word embedding model BERT has proven its effectiveness on downstream tasks even with limited quantities of annotated data. BERT and its variants help to reduce the burden of complex annotation work in many interdisciplinary research areas, for example, legal argument mining in digital humanities. Argument mining aims to develop text analysis tools that can automatically retrieve arguments and identify relationships between argumentation clauses. Since argumentation is one of the key aspects of case law, argument mining tools for legal texts are applicable to both academic and non-academic legal research. Domain-specific BERT variants (pre-trained on corpora from a particular background) have also achieved strong performance in many tasks. To our knowledge, previous machine learning studies of argument mining on judicial case law still rely heavily on statistical models. In this paper, we provide a broad study of both classic and contextual embedding models and their performance on practical case law from the European Court of Human Rights (ECHR). During our study, we also explore a number of neural networks combined with different embeddings. Our experiments provide a comprehensive overview of a variety of approaches to the legal argument mining task. We conclude that domain pre-trained transformer models have great potential in this area, although traditional embeddings can also achieve strong performance when combined with additional neural network layers.


    Volume: NLP4DH
    Published: 10 June 2022
    Accepted: 6 April 2022
    Submitted: 1 March 2022

    Files

    Name: JDMDH_submission.pdf.pdf
    Size: 304.04 KB
    md5: 5ed2ef657fb43940d8fd181684da2aa9

    Publications

    Other: 1 Zenodo

    2 documents citing this article

    Viewing statistics

    This page has been viewed 2594 times.
    The PDF of this article has been downloaded 659 times.