Semantic Vector Models and Functional Models of Pregroup Grammars
Abstract
We show that vector space semantics and functional semantics in two-sorted first-order logic are equivalent for pregroup grammars. An algorithm translates functional expressions into vector expressions and vice versa. The semantics is compositional, variable-free, and invariant under permutations and repetitions. It includes the semantic vector models of information retrieval systems and has an interior logic that includes a comprehension schema. The truth of a sentence in the interior logic is equivalent to the truth of the 'usual' first-order formula rendering the sentence. The examples include negation, universal quantification, and relative pronouns.