Generalised Winograd Schema and its Contextuality

Kin Ian Lo
(University College London, London, UK)
Mehrnoosh Sadrzadeh
(University College London, London, UK)
Shane Mansfield
(Quandela, Paris, France)

Ambiguities in natural language give rise to probability distributions over interpretations. These distributions often range over several ambiguous words at a time, a multiplicity that makes them a suitable subject for sheaf-theoretic models of quantum contextuality. Previous research showed that different quantitative measures of contextuality correlate well with psycholinguistic research on lexical ambiguities. In this work, we focus on coreference ambiguities and investigate the Winograd Schema Challenge (WSC), a test proposed by Levesque in 2011 to evaluate the intelligence of machines. The WSC consists of a collection of multiple-choice questions that require disambiguating pronouns in sentences structured according to the Winograd schema, in a way that makes it difficult for machines to determine the correct referents but remains intuitive for humans. In this study, we propose an approach that models the Winograd schema analogously to an experiment in quantum physics. However, we argue that the original Winograd schema is inherently too simplistic to admit contextuality. We introduce a novel mechanism for generalising the schema, rendering it analogous to a Bell-CHSH measurement scenario. We report an instance of this generalised schema, complemented by human judgements gathered via a crowdsourcing platform. The resulting model violates the Bell-CHSH inequality by 0.192, thus exhibiting contextuality in a coreference resolution setting.
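The quoted violation of 0.192 means the model's CHSH value exceeds the classical bound of 2. The sketch below shows how such a value is computed from four joint outcome distributions, one per measurement context; the probability tables are hypothetical placeholders for illustration, not the crowdsourced data reported in the paper.

```python
# Bell-CHSH check over four joint outcome distributions, one per
# measurement context (a,b), (a,b'), (a',b), (a',b').
# The probability tables below are hypothetical, not the paper's data.

def correlation(p):
    """E = p(0,0) + p(1,1) - p(0,1) - p(1,0) for a joint distribution p."""
    return p[(0, 0)] + p[(1, 1)] - p[(0, 1)] - p[(1, 0)]

def chsh(contexts):
    """CHSH value |E1 + E2 + E3 - E4|; classically bounded above by 2."""
    e = [correlation(p) for p in contexts]
    return abs(e[0] + e[1] + e[2] - e[3])

# Hypothetical empirical model: strongly correlated outcomes in three
# contexts, anti-correlated in the fourth.
corr = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}   # E = 0.6
anti = {(0, 0): 0.1, (1, 1): 0.1, (0, 1): 0.4, (1, 0): 0.4}   # E = -0.6

value = chsh([corr, corr, corr, anti])
violation = value - 2   # positive => the Bell-CHSH inequality is violated
```

For these placeholder tables the CHSH value is 2.4, a violation of 0.4; any positive `violation` rules out a classical (non-contextual) joint distribution over all four contexts.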

In Shane Mansfield, Benoît Valiron and Vladimir Zamdzhiev: Proceedings of the Twentieth International Conference on Quantum Physics and Logic (QPL 2023), Paris, France, 17-21st July 2023, Electronic Proceedings in Theoretical Computer Science 384, pp. 187–202.
Published: 30th August 2023.

DOI: https://dx.doi.org/10.4204/EPTCS.384.11