Predicting ConceptNet Path Quality Using Crowdsourced Assessments of Naturalness
Author(s)
Zhou, Yilun; Schockaert, Steven; Shah, Julie A
Abstract
In many applications, it is important to characterize the way in which two concepts are semantically related. Knowledge graphs such as ConceptNet provide a rich source of information for such characterizations by encoding relations between concepts as edges in a graph. When two concepts are not directly connected by an edge, their relationship can still be described in terms of the paths that connect them. Unfortunately, many of these paths are uninformative and noisy, which means that the success of applications that use such path features crucially relies on their ability to select high-quality paths. In existing applications, this path selection process is based on relatively simple heuristics. In this paper we instead propose to learn to predict path quality from crowdsourced human assessments. Since we are interested in a generic task-independent notion of quality, we simply ask human participants to rank paths according to their subjective assessment of the paths' naturalness, without attempting to define naturalness or steering the participants towards particular indicators of quality. We show that a neural network model trained on these assessments is able to predict human judgments on unseen paths with near optimal performance. Most notably, we find that the resulting path selection method is substantially better than the current heuristic approaches at identifying meaningful paths.
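The minimal Python sketch below (not from the paper; the triples, concept names, and scoring rule are all illustrative assumptions) shows the setting the abstract describes: enumerating the relation paths that connect two concepts in a toy ConceptNet-style graph, then ranking them with a simple length-based heuristic of the kind the paper argues is insufficient and proposes to replace with a model learned from crowdsourced naturalness rankings.

# Illustrative sketch: path enumeration and heuristic ranking in a toy
# ConceptNet-style graph. All triples here are hypothetical examples.
from collections import defaultdict

# Toy knowledge graph: (head, relation, tail) triples in ConceptNet style.
triples = [
    ("coffee", "IsA", "beverage"),
    ("coffee", "HasProperty", "hot"),
    ("beverage", "UsedFor", "drinking"),
    ("hot", "RelatedTo", "fire"),
    ("fire", "UsedFor", "cooking"),
    ("coffee", "AtLocation", "kitchen"),
    ("kitchen", "UsedFor", "cooking"),
]

adj = defaultdict(list)
for h, r, t in triples:
    adj[h].append((r, t))

def enumerate_paths(start, goal, max_hops=3):
    """Depth-first enumeration of all simple paths (chains of
    (head, relation, tail) edges) from start to goal, up to max_hops edges."""
    paths = []
    def dfs(node, visited, path):
        if node == goal and path:
            paths.append(list(path))
            return
        if len(path) == max_hops:
            return
        for rel, nxt in adj[node]:
            if nxt not in visited:
                visited.add(nxt)
                path.append((node, rel, nxt))
                dfs(nxt, visited, path)
                path.pop()
                visited.discard(nxt)
    dfs(start, {start}, [])
    return paths

def heuristic_score(path):
    """Stand-in for the 'relatively simple heuristics' the paper criticizes:
    shorter paths score higher, regardless of how natural they read."""
    return 1.0 / len(path)

for path in sorted(enumerate_paths("coffee", "cooking"),
                   key=heuristic_score, reverse=True):
    chain = " -> ".join(f"{h} -[{r}]-> {t}" for h, r, t in path)
    print(f"{heuristic_score(path):.2f}  {chain}")

On this toy graph the heuristic ranks coffee -[AtLocation]-> kitchen -[UsedFor]-> cooking above the longer path through hot and fire; the paper's contribution is to replace such fixed scoring rules with a neural model trained on human rankings of path naturalness.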
Date issued
2019
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Journal
World Wide Web Conference (WWW)
Publisher
ACM Press
Citation
Zhou, Yilun, Steven Schockaert, and Julie A. Shah. "Predicting ConceptNet path quality using crowdsourced assessments of naturalness." WWW '19: The World Wide Web Conference, May 2019, San Francisco, CA. ACM, 2019, pp. 2460-2471. doi:10.1145/3308558.3313486 ©2019 Author(s)
Version: Final published version
ISBN
978-1-4503-6674-8