Cocos, Anne O'Donnell
Publication: Paraphrase-Based Models of Lexical Semantics (2019-01-01)
Cocos, Anne O'Donnell

Models of lexical semantics are a key component of natural language understanding. The bulk of work in this area has focused on learning the meanings of words and phrases, and their inter-relationships, from signals present in large monolingual corpora -- including the distributional properties of words and phrases, and the lexico-syntactic patterns within which they appear. Each of these signals, while useful, has drawbacks, such as difficulty modeling polysemy or limited coverage. The goal of this thesis is to examine bilingually induced paraphrases as a different and complementary source of information for building computational models of semantics. First, focusing on the two tasks of discriminating word sense and predicting scalar adjective intensity, we build models that rely on paraphrases as a source of signal. In each case, the performance of the paraphrase-based models is compared to that of models incorporating more traditional feature types, such as monolingual distributional similarity and lexico-syntactic patterns. We find that combining these traditional signals with paraphrase-based features yields the highest-performing models overall, indicating that the different types of information are complementary. Next, we shift focus to the use of paraphrases to model the fine-grained meanings of a word. This idea is leveraged to automatically generate a large resource of meaning-specific word instances called Paraphrase-Sense-Tagged Sentences (PSTS). Using PSTS as a sense-tagged corpus, we successfully train distributional models for sense embedding, word sense induction, and contextual hypernym prediction. In this way we reaffirm the notion that signals from paraphrases and monolingual distributional properties can be combined to construct robust models of lexical semantics.
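To make the last step concrete, below is a minimal sketch, not taken from the thesis, of how a sense-tagged corpus in the spirit of PSTS could be used to train sense-specific embeddings. It assumes gensim as the embedding library; the word#sense tagging scheme, the sense labels, and the example sentences are hypothetical illustrations.

```python
# Minimal sketch (assumptions: gensim is available; the word#sense tagging
# scheme and the tiny corpus below are hypothetical, not from PSTS itself).
# Each occurrence of a target word is rewritten as word#sense_id, so the
# skip-gram model learns a separate vector per sense.
from gensim.models import Word2Vec

sense_tagged_corpus = [
    ["the", "bug#insect", "crawled", "across", "the", "leaf"],
    ["a", "bug#defect", "in", "the", "parser", "crashed", "the", "build"],
    ["testers", "filed", "another", "bug#defect", "report"],
    ["the", "bug#insect", "was", "trapped", "in", "amber"],
]

# Train skip-gram embeddings; with a realistically sized sense-tagged corpus,
# bug#insect and bug#defect acquire distinct vectors reflecting their contexts.
model = Word2Vec(
    sentences=sense_tagged_corpus,
    vector_size=100,
    window=5,
    min_count=1,
    sg=1,       # skip-gram
    epochs=50,
)

# Compare the two sense vectors for the same surface word.
print(model.wv.similarity("bug#insect", "bug#defect"))
```

With a large sense-tagged resource, the same tagging-then-training pattern could plausibly support downstream uses such as word sense induction or contextual hypernym prediction, as described in the abstract.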