Empirical sufficiency lower bounds for language modeling with locally-bootstrapped semantic structures
In this work we build upon negative results from an attempt at language modeling with predicted semantic structure, in order to establish empirical lower bounds on what could have made the attempt successful. More specifically, we design a concise binary vector representation of semantic structure at the lexical level and evaluate in depth how good an incremental tagger needs to be in order to achieve better-than-baseline performance with an end-to-end semantic-bootstrapping language model. We envision such a system as consisting of a (pretrained) sequential-neural component and a hierarchical-symbolic component working together to generate text with low surprisal and high linguistic interpretability. We find that (a) the dimensionality of the semantic vector representation can be dramatically reduced without losing its main advantages and (b) lower bounds on prediction quality cannot be established via a single score alone, but need to take the distributions of signal and noise into account.
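As a rough illustration of the kind of representation the abstract describes, a lexical-level semantic tag can be encoded as a fixed-width binary vector with one bit per semantic category. The category inventory and function names below are hypothetical, chosen only to make the idea concrete; they are not the paper's actual feature set.

```python
# Hypothetical sketch: a token's semantic structure as a concise binary
# vector, one bit per category. The inventory is illustrative only.
CATEGORIES = ["entity", "event", "property", "quantifier",
              "negation", "modality", "connective", "anchor"]

def encode(token_categories):
    """Map a token's set of semantic categories to a fixed binary vector."""
    return [1 if c in token_categories else 0 for c in CATEGORIES]

# A token marked as an event with modal force:
vec = encode({"event", "modality"})
print(vec)  # most positions stay 0, so the vector is compact and sparse
```

Reducing dimensionality, as finding (a) suggests, would amount to shrinking the category inventory while preserving the distinctions that matter for downstream prediction.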


| Author: | Jakob Prange, Emmanuele Chersoni |
|---|---|
| URN: | urn:nbn:de:bvb:384-opus4-1178271 |
| Frontdoor URL | https://opus.bibliothek.uni-augsburg.de/opus4/117827 |
| ISBN: | 978-1-959429-76-0 |
| Parent Title (English): | Proceedings of the 12th Joint Conference on Lexical and Computational Semantics (*SEM 2023), July 13-14, 2023, Toronto, Canada |
| Publisher: | Association for Computational Linguistics (ACL) |
| Place of publication: | Stroudsburg, PA |
| Editor: | Alexis Palmer, Jose Camacho-Collados |
| Type: | Conference Proceeding |
| Language: | English |
| Year of first Publication: | 2023 |
| Publishing Institution: | Universität Augsburg |
| Release Date: | 2025/01/07 |
| First Page: | 456 |
| Last Page: | 468 |
| DOI: | https://doi.org/10.18653/v1/2023.starsem-1.40 |
| Institutes: | Fakultät für Angewandte Informatik |
| | Fakultät für Angewandte Informatik / Institut für Informatik |
| | Fakultät für Angewandte Informatik / Institut für Informatik / Lehrstuhl für Computerlinguistik |
| Dewey Decimal Classification: | 0 Computer science, information & general works / 00 Computer science, knowledge & systems / 004 Data processing; computer science |
| Licence: | CC-BY 4.0: Creative Commons: Attribution |



