Liebeskind, Chaya and Liebeskind, Shmuel - Deep Learning for Period Classification of Historical Hebrew Texts

jdmdh:5864 - Journal of Data Mining & Digital Humanities, June 13, 2020, Volume 2020

In this study, we address the interesting task of classifying historical texts by their assumed period of writing. This task is useful in digital humanities studies, where many texts have unidentified publication dates. For years, the typical approach to temporal text classification was supervised machine learning. These algorithms require careful feature engineering and considerable domain expertise to design a feature extractor that transforms the raw text into a feature vector from which the classifier can learn to classify any unseen valid input. Recently, deep learning has produced extremely promising results for various tasks in natural language processing (NLP). The primary advantage of deep learning is that the feature layers are not designed by human engineers; instead, the features are extrapolated from the data with a general-purpose learning procedure. We investigated deep learning models for period classification of historical texts. We compared three common models: paragraph vectors, convolutional neural networks (CNN), and recurrent neural networks (RNN), against conventional machine-learning methods. We demonstrate that the CNN and RNN models outperformed the paragraph vector model and the conventional supervised machine-learning algorithms. In addition, we constructed word embeddings for each time period and analyzed semantic changes of word meanings over time.
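The abstract mentions building word embeddings per time period and analyzing semantic change over time. A common way to quantify such change, sketched below with hypothetical vectors in plain Python, is to compare a word's embedding across two period-specific spaces via cosine similarity. Note that this is only an illustration, not the authors' procedure: embeddings trained separately per period live in different coordinate systems and would first need to be aligned (e.g., via orthogonal Procrustes), a step elided here.

```python
from math import sqrt

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical (already aligned) embeddings of the same word
# in two different time periods.
vec_early = [0.8, 0.1, 0.2]
vec_late = [0.2, 0.7, 0.3]

# Cosine distance as a drift score: higher means the word's
# usage has shifted more between the two periods.
drift = 1.0 - cosine(vec_early, vec_late)
```

Ranking all vocabulary words by such a drift score is one simple way to surface candidates whose meaning changed between periods.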

Volume: 2020
Published on: June 13, 2020
Submitted on: October 23, 2019
Keywords: Machine Learning, Deep Learning, Diachronic Corpus, Period Classification, [INFO] Computer Science [cs], [INFO.INFO-CL] Computer Science [cs]/Computation and Language [cs.CL]
