"Language is renormalization (and its implications in physics, linguistics and machine learning)"

Who: Roman Orus, Johannes Gutenberg-Universität Mainz, Germany

Place: Donostia International Physics Center

Date: Thursday, 3 May 2018, 12:00

In this talk I will consider some well-known facts in syntax from a physics perspective, which allows one to establish some remarkable equivalences. Specifically, I will show how Chomsky's linguistic MERGE operation can be interpreted as a physical information coarse-graining.

MERGE in linguistics thus entails information renormalization in physics, operating across different time scales. I will make this point mathematically precise in terms of language models, i.e., probability distributions over word sequences, widely used in natural language processing as well as in other fields. The probability vectors of meaningful sentences are naturally given by tensor networks (TNs) that are mostly loop-free, such as Tree Tensor Networks and Matrix Product States. These structures have geodesic correlations by construction and, because of the peculiarities of human language, are extremely efficient to manipulate computationally. I will also show how to obtain such language models from the probability distributions of certain TN quantum states that can be efficiently prepared on a quantum computer.
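(As a rough illustration only, not material from the talk: the sketch below shows, in Python/NumPy, how a loop-free tensor network such as a Matrix Product State can define a probability distribution over word sequences, with the probability of a sentence given by the squared amplitude of a matrix-product contraction, in the spirit of a "Born machine". The vocabulary size, bond dimension, and random tensors are purely illustrative assumptions.)

    import numpy as np

    # Illustrative sketch: an MPS assigning probabilities to word sequences.
    # Each word w is a symbol from a vocabulary of size V; the amplitude of a
    # sentence is a product of matrices, and its probability is the squared
    # amplitude, normalized over all sequences of the same length.

    rng = np.random.default_rng(0)

    V = 4      # vocabulary size (assumed, illustrative)
    D = 3      # bond dimension of the MPS (assumed, illustrative)
    n = 5      # sentence length

    # One rank-3 tensor per position: tensors[i][w] is a D x D matrix for word w.
    tensors = [rng.normal(size=(V, D, D)) for _ in range(n)]
    # Boundary vectors closing the open ends of the chain.
    left = rng.normal(size=D)
    right = rng.normal(size=D)

    def amplitude(sentence):
        """Contract the MPS along the chain for one word sequence."""
        vec = left
        for site, w in zip(tensors, sentence):
            vec = vec @ site[w]          # O(D^2) per word: the network is loop-free
        return vec @ right

    def normalization():
        """Sum of amplitude^2 over all V^n sentences, computed efficiently
        by contracting the doubled (ket x bra) transfer matrices."""
        E = np.einsum('i,j->ij', left, left).reshape(-1)
        for site in tensors:
            T = np.einsum('wab,wcd->acbd', site, site).reshape(D * D, D * D)
            E = E @ T
        return E @ np.einsum('i,j->ij', right, right).reshape(-1)

    Z = normalization()

    def probability(sentence):
        return amplitude(sentence) ** 2 / Z

    print(probability([0, 2, 1, 3, 2]))  # probability of one sequence of word indices

The efficiency claim in the abstract is visible here: evaluating one sentence costs O(n D^2), and the normalization over exponentially many sequences costs only O(n V D^4), precisely because the network contains no loops.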

The entanglement properties of these states are related to the information-theoretic properties of the distribution. Moreover, I will explain why this formally justifies the empirical observation of critical correlations in language, as well as the adequacy of neural networks for language processing. Other implications of these results will also be discussed.
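(Again purely as an illustration, under the assumption that sentence probabilities arise from measuring a TN quantum state: the entanglement entropy across a cut of that state constrains how strongly the two halves of a sentence can be correlated. The sketch below computes this entropy for a bipartition of a generic state vector; the example states are illustrative.)

    import numpy as np

    def entanglement_entropy(psi, n_left, d):
        """Von Neumann entropy (in bits) of the left block of a pure state psi,
        given as a vector over n sites of local dimension d."""
        n = int(round(np.log(psi.size) / np.log(d)))
        M = psi.reshape(d ** n_left, d ** (n - n_left))
        s = np.linalg.svd(M, compute_uv=False)
        p = s ** 2 / np.sum(s ** 2)      # Schmidt spectrum = eigenvalues of rho_left
        p = p[p > 1e-12]                 # drop numerical zeros before taking logs
        return -np.sum(p * np.log2(p))

    # A product state carries zero entropy across the middle cut (no correlations
    # between the halves); a GHZ-like state carries exactly one bit, the analogue
    # of a single shared binary feature between the two halves of a sentence.
    d, n = 2, 4
    product = np.zeros(d ** n); product[0] = 1.0
    ghz = np.zeros(d ** n); ghz[0] = ghz[-1] = 1 / np.sqrt(2)
    print(entanglement_entropy(product, 2, d))  # ~0.0
    print(entanglement_entropy(ghz, 2, d))      # ~1.0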
