Semantic parsing aims to map natural language to structured meaning representations. Traditional approaches rely on discrete categorical labels for semantic roles and relations; such labels can be brittle in the face of non-prototypical language use and difficult to port across domains or languages.
Universal Decompositional Semantics (UDS) parsing addresses these limitations by jointly learning to predict both semantic graph structures and their associated real-valued attributes. The approach decomposes complex semantic phenomena into scalar-valued dimensions that capture graded properties of predicates and their arguments.
The parsing models employ neural architectures that perform joint syntactic and semantic analysis, learning compositional representations that can handle cross-lingual transfer and domain adaptation. These parsers are trained on the UDS corpus, which provides continuous-valued annotations for multiple semantic properties anchored to Universal Dependencies trees.
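To make the representation concrete, the following is a minimal sketch of a UDS-style semantic graph for the sentence "Mary left": predicate and argument nodes are anchored to token positions in the dependency tree, and each node and edge carries real-valued attributes rather than categorical labels. The class names, property names, and scores here are illustrative stand-ins, not the actual UDS corpus schema or annotation values.

```python
# Hedged sketch of a UDS-style semantic graph with real-valued attributes.
# Property names (factuality, protoroles-volition, ...) and scores are
# illustrative; real UDS annotations are crowd-sourced scalar judgments.

from dataclasses import dataclass, field


@dataclass
class UDSNode:
    """A semantic node anchored to a token in the dependency tree."""
    token_index: int                                 # anchor into the UD tree
    attributes: dict = field(default_factory=dict)   # real-valued properties


@dataclass
class UDSGraph:
    nodes: dict = field(default_factory=dict)        # name -> UDSNode
    edges: dict = field(default_factory=dict)        # (head, dep) -> attributes


# "Mary left": predicate "left" (token 1), argument "Mary" (token 0)
graph = UDSGraph()
graph.nodes["pred-left"] = UDSNode(1, {"factuality": 2.7, "genericity-dynamic": 1.9})
graph.nodes["arg-Mary"] = UDSNode(0, {"wordsense-person": 3.0})
graph.edges[("pred-left", "arg-Mary")] = {"protoroles-volition": 2.4}

# Graded attributes replace categorical role labels: instead of a hard
# "Agent" label, the Mary edge carries a scalar volition score.
print(graph.edges[("pred-left", "arg-Mary")]["protoroles-volition"])
```

The key design point is that the parser predicts both the graph skeleton (which nodes and edges exist) and a vector of continuous attribute values for each node and edge, so non-prototypical cases are handled by intermediate scores rather than forced category assignments.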
Key technical contributions include multitask learning frameworks that jointly model syntax and semantics, graph-based neural architectures for structured prediction, and cross-lingual transfer learning techniques that leverage the language-neutral nature of the decompositional semantic representations.
Publications
- Zhang, Sheng, Xutai Ma, Rachel Rudinger, Kevin Duh, and Benjamin Van Durme. 2018. Cross-lingual Semantic Parsing. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 1664–1675. Brussels, Belgium: Association for Computational Linguistics.
- Stengel-Eskin, Elias, Aaron Steven White, Sheng Zhang, and Benjamin Van Durme. 2020. Universal Decompositional Semantic Parsing. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 8427–8439. Online: Association for Computational Linguistics.
- Stengel-Eskin, Elias, Kenton Murray, Sheng Zhang, Aaron Steven White, and Benjamin Van Durme. 2021. Joint Universal Syntactic and Semantic Parsing. Transactions of the Association for Computational Linguistics, 756–773.