An end-to-end deep learning system for medieval writer identification
Cilia N. D.; De Stefano C.; Fontanella F.; Marrocco C.; Molinara M.; Scotto Di Freca A.
2020-01-01
Abstract
This paper presents an end-to-end system for identifying the writers of medieval manuscripts. The proposed system consists of a three-step model for detecting and classifying the text lines of a manuscript and identifying the writer of each page. The first two steps are based on deep neural networks trained with transfer learning techniques and specialized for the task at hand. The third stage is a weighted majority vote combiner that aggregates the row-level decisions and assigns a writer to each page. The main goal of this paper is to study the applicability of deep learning in this context when only a relatively small training dataset is available. We tested our system with several state-of-the-art deep architectures on a digitized manuscript known as the Avila Bible, using only 9.6% of the total pages for training. Our approach proves very effective in identifying page writers, reaching a peak accuracy of 96.48% and an F1 score of 96.56%.
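As a rough illustration of the third stage described in the abstract, the sketch below shows one way a weighted majority vote over row-level decisions could assign a writer to a page. The function name, the (writer label, confidence) representation of each row decision, and the confidence-based weighting are assumptions for illustration, not the authors' actual implementation.

```python
# Hypothetical sketch of a weighted majority vote row-decision combiner.
# Assumes each detected text row yields a (writer_label, confidence) pair
# from the row-level classifier; the weighting scheme is an assumption.
from collections import defaultdict
from typing import Iterable, Tuple


def assign_page_writer(row_decisions: Iterable[Tuple[str, float]]) -> str:
    """Combine per-row (writer, confidence) pairs into one page-level writer label."""
    scores = defaultdict(float)
    for writer, confidence in row_decisions:
        scores[writer] += confidence  # each row casts a confidence-weighted vote
    if not scores:
        raise ValueError("no row decisions available for this page")
    # The writer with the highest accumulated weight wins the page.
    return max(scores, key=scores.get)


# Usage example with made-up row decisions from a single page:
rows = [("A", 0.92), ("A", 0.88), ("H", 0.55), ("A", 0.97)]
print(assign_page_writer(rows))  # -> "A"
```

Weighting each vote by the classifier's confidence is only one plausible choice; an unweighted vote (counting each row equally) would be a simpler variant of the same combiner.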