Neural Decipherment via Minimum-Cost Flow: From Ugaritic to Linear B
Author(s)
Luo, Jiaming; Cao, Yuan; Barzilay, Regina
Publisher with Creative Commons License
Creative Commons Attribution
Terms of use
Abstract
© 2019 Association for Computational Linguistics
In this paper we propose a novel neural approach for automatic decipherment of lost languages. To compensate for the lack of strong supervision signal, our model design is informed by patterns in language change documented in historical linguistics. The model utilizes an expressive sequence-to-sequence model to capture character-level correspondences between cognates. To effectively train the model in an unsupervised manner, we innovate the training procedure by formalizing it as a minimum-cost flow problem. When applied to the decipherment of Ugaritic, we achieve a 5.5% absolute improvement over state-of-the-art results. We also report the first automatic results in deciphering Linear B, a syllabic language related to ancient Greek, where our model correctly translates 67.3% of cognates.
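The flow formulation mentioned in the abstract can be illustrated in miniature: lost-language words on one side of a bipartite network, candidate known-language cognates on the other, unit capacities, and edge costs standing in for model-derived character-level dissimilarity. The word names and cost values below are invented placeholders, and the solver is a textbook successive-shortest-path min-cost flow, not the authors' training procedure (which derives the costs from the sequence-to-sequence model and alternates flow solving with model updates).

```python
def min_cost_flow(n, arcs, s, t, units):
    """Push `units` of unit flows from s to t at minimum total cost.

    arcs: list of (u, v, capacity, cost) with nonnegative costs.
    Returns (total_cost, used) where used holds the positive-cost
    arcs that ended up carrying flow.
    """
    graph = [[] for _ in range(n)]  # arc = [to, residual cap, cost, reverse-arc index]
    for u, v, cap, cost in arcs:
        graph[u].append([v, cap, cost, len(graph[v])])
        graph[v].append([u, 0, -cost, len(graph[u]) - 1])  # residual arc
    total = 0
    while units > 0:
        # Bellman-Ford over the residual graph (residual costs can be negative)
        dist = [float("inf")] * n
        dist[s], prev = 0, [None] * n
        changed = True
        while changed:
            changed = False
            for u in range(n):
                if dist[u] == float("inf"):
                    continue
                for i, (v, cap, cost, _) in enumerate(graph[u]):
                    if cap > 0 and dist[u] + cost < dist[v]:
                        dist[v], prev[v], changed = dist[u] + cost, (u, i), True
        if dist[t] == float("inf"):
            break  # no augmenting path left
        v = t
        while v != s:  # push one unit along the cheapest path
            u, i = prev[v]
            graph[u][i][1] -= 1
            graph[v][graph[u][i][3]][1] += 1
            v = u
        total += dist[t]
        units -= 1
    used = {(u, v) for u in range(n) for v, cap, cost, _ in graph[u]
            if cost > 0 and cap == 0}  # saturated word-pair arcs = the matching
    return total, used

# Hypothetical two-word example: lower cost = better cognate candidate.
lost = ["ug-word-1", "ug-word-2"]
known = ["heb-word-1", "heb-word-2"]
cost = {(0, 0): 2, (0, 1): 9, (1, 0): 8, (1, 1): 3}

S, T = 0, 1 + len(lost) + len(known)  # source, sink node ids
arcs = [(S, 1 + i, 1, 0) for i in range(len(lost))]
arcs += [(1 + len(lost) + j, T, 1, 0) for j in range(len(known))]
arcs += [(1 + i, 1 + len(lost) + j, 1, c) for (i, j), c in cost.items()]

total, used = min_cost_flow(T + 1, arcs, S, T, len(lost))
pairs = {lost[u - 1]: known[v - 1 - len(lost)]
         for u, v in used if 1 <= u <= len(lost)}
print(total, pairs)  # 5 {'ug-word-1': 'heb-word-1', 'ug-word-2': 'heb-word-2'}
```

With unit capacities on every arc, the minimum-cost flow reduces to a minimum-cost bipartite assignment, which is why the solver returns a one-to-one cognate matching rather than fractional flows.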
Date issued
2019-08
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Journal
ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference
Publisher
Association for Computational Linguistics (ACL)
Citation
Luo, Jiaming, Cao, Yuan and Barzilay, Regina. 2019. "Neural Decipherment via Minimum-Cost Flow: From Ugaritic to Linear B." ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference.
Version: Final published version