Joshi, Aravind K
Publication: Feature Structures Based Tree Adjoining Grammars (1988-10-01). Joshi, Aravind K; Shanker, K. Vijay.

We have embedded Tree Adjoining Grammars (TAG) in a feature structure based unification system. The resulting system, Feature Structure based Tree Adjoining Grammars (FTAG), captures the principle of factoring dependencies and recursion, fundamental to TAGs. We show that FTAG has an enhanced descriptive capacity compared to the TAG formalism. We consider some restricted versions of this system and some possible linguistic stipulations that can be made. We briefly describe a calculus to represent the structures used by this system, extending the work of Rounds and Kasper [Rounds et al. 1986, Kasper et al. 1986] on the logical formulation of feature structures.

Publication: Parsing With Lexicalized Tree Adjoining Grammar (1990-02-01). Joshi, Aravind K.

Most current linguistic theories give lexical accounts of several phenomena that used to be considered purely syntactic. The information put in the lexicon is thereby increased in both amount and complexity: see, for example, lexical rules in LFG (Kaplan and Bresnan, 1983), GPSG (Gazdar, Klein, Pullum and Sag, 1985), HPSG (Pollard and Sag, 1987), Combinatory Categorial Grammars (Steedman, 1987), Karttunen's version of Categorial Grammar (Karttunen 1986, 1988), some versions of GB theory (Chomsky 1981), and Lexicon-Grammars (Gross 1984). We would like to take this fact into account when defining a formalism. We therefore explore the view that syntactic rules are not separated from lexical items. We say that a grammar is lexicalized (Schabes, Abeillé and Joshi, 1988) if it consists of: (1) a finite set of structures each associated with lexical items; each lexical item will be called the anchor of the corresponding structure; the structures define the domain of locality over which constraints are specified; (2) an operation or operations for composing the structures.
The notion of anchor is closely related to the word associated with a functor-argument category in Categorial Grammars. Categorial Grammars (as used, for example, by Steedman, 1987) are 'lexicalized' according to our definition since each basic category has a lexical item associated with it.

Publication: Dynamically Altering Agent Behaviors Using Natural Language Instructions (2000-06-03). Allbeck, Jan M.; Badler, Norman I; Joshi, Aravind K.; Palmer, Martha.

Smart avatars are virtual human representations controlled by real people. Given instructions interactively, smart avatars can act as autonomous or reactive agents. During a real-time simulation, a user should be able to dynamically refine his or her avatar's behavior in reaction to simulated stimuli without having to undertake a lengthy off-line programming session. In this paper, we introduce an architecture that allows users to input immediate or persistent instructions using natural language and see the agents' resulting behavioral changes in the graphical output of the simulation.

Publication: Flexible Margin Selection for Reranking with Full Pairwise Samples (2004-03-22). Joshi, Aravind K.

Perceptron-like large margin algorithms are introduced for experiments with various margin selections. Compared to previous perceptron reranking algorithms, the new algorithms use full pairwise samples and allow us to search for margins in a larger space. Our experimental results on the data set of  show that a perceptron-like ordinal regression algorithm with uneven margins can achieve Recall/Precision of 89.5/90.0 on section 23 of the Penn Treebank. Our result on margin selection can be employed in other large margin machine learning algorithms as well as in other NLP tasks.

Publication: The Convergence of Mildly Context-Sensitive Grammar Formalisms (1990). Joshi, Aravind K; Shanker, K. Vijay; Weir, David.

Investigations of classes of grammars that are nontransformational and at the same time highly constrained are of interest both linguistically and mathematically. Context-free grammars (CFG) obviously form such a class. CFGs are not adequate (both weakly and strongly) to characterize some aspects of language structure. Thus, how much more power beyond CFG is necessary to describe these phenomena is an important question. Based on certain properties of tree adjoining grammars (TAG), an approximate characterization of a class of grammars, mildly context-sensitive grammars (MCSG), has been proposed earlier. In this paper, we describe the relationship between several different grammar formalisms, all of which belong to MCSG. In particular, we show that head grammars (HG), combinatory categorial grammars (CCG), linear indexed grammars (LIG), and TAG are all weakly equivalent. These formalisms are all distinct from each other at least in the following aspects: (a) the formal objects and operations in each formalism, (b) the domain of locality over which dependencies are specified, (c) the degree to which recursion and the domain of dependencies are factored, and (d) the linguistic insights that are captured in the formal objects and operations in each formalism. A deeper understanding of this convergence is obtained by comparing these formalisms at the level of the derivation structures in each formalism. We describe a formalism, the linear context-free rewriting system (LCFRS), as a first attempt to capture the closeness of the derivation structures of these formalisms. LCFRSs thus make the notion of MCSGs more precise. We show that LCFRSs are equivalent to multicomponent tree adjoining grammars (MCTAGs), and also briefly discuss some variants of TAGs: lexicalized TAGs, feature structure based TAGs, and TAGs in which local domination and linear precedence are factored, TAG(LD/LP).
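The LCFRS idea mentioned in the abstract above can be made concrete: a nonterminal may yield a tuple of strings rather than a single string, and the start rule concatenates the components. A minimal sketch of my own (not code from the paper), using a fan-out-2 nonterminal to derive the classic beyond-context-free language a^n b^n c^n d^n:

```python
def generate(n: int) -> str:
    """Simulate an LCFRS-style derivation (illustrative sketch).
    The grammar, written clause-style, is:
      A(e, e).
      A(a x1 b, c x2 d) <- A(x1, x2).   # fan-out 2: A yields a string pair
      S(x1 x2)          <- A(x1, x2).   # start rule concatenates the pair
    """
    x1, x2 = "", ""                      # base case A(e, e)
    for _ in range(n):                   # apply the recursive A-rule n times
        x1, x2 = "a" + x1 + "b", "c" + x2 + "d"
    return x1 + x2                       # S(x1 x2)

print(generate(2))  # -> aabbccdd
```

Because each rule uses every variable exactly once (linearity), recognition stays polynomial, in line with the properties claimed for LCFRS.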
Publication: The Linguistic Relevance of Tree Adjoining Grammar (1985-04-01). Joshi, Aravind K.

In this paper we apply a new notation for the writing of natural language grammars to some classical problems in the description of English. The formalism is the Tree Adjoining Grammar (TAG) of Joshi, Levy and Takahashi 1975, which was studied initially only for its mathematical properties but which now turns out to be an interesting candidate for the proper notation of meta-grammar; that is, for the universal grammar of contemporary linguistics. Interest in the application of the TAG formalism to the writing of natural language grammars arises out of recent work on the possibility of writing grammars for natural languages in a metatheory of restricted generative capacity (for example, Gazdar 1982 and Gazdar et al. 1985). There have also been several recent attempts to examine the linguistic metatheory of restricted grammatical formalisms, in particular context-free grammars. The inadequacies of context-free grammars have been discussed both from the point of view of strong generative capacity (Bresnan et al. 1982) and weak generative capacity (Shieber 1984, Postal and Langendoen 1984, Higginbotham 1984, the empirical claims of the last two having been disputed by Pullum (Pullum 1984)). At this point TAG becomes interesting because, while it is more powerful than context-free grammar, it is only "mildly" so. This extra power of TAG is a direct corollary of the way TAG factors recursion and dependencies, and it can provide reasonable structural descriptions for constructions like Dutch verb raising where context-free grammar apparently fails. These properties of TAG and some of its mathematical properties were discussed by Joshi 1983.
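The Dutch verb-raising construction mentioned above exhibits cross-serial dependencies, whose standard formal analogue is the copy language {ww : w in {a,b}*}: a TAG can generate it, while no context-free grammar can. A minimal illustrative recognizer (my sketch, not from the paper), with each dependency in the first half matched in the same serial order in the second:

```python
def is_copy(s: str) -> bool:
    """Recognize the copy language {ww}: the string must consist of two
    identical halves, so position i depends on position i + len(s)//2,
    mirroring the crossed noun-verb pairings of Dutch verb raising."""
    if len(s) % 2:          # odd length can never split into two copies
        return False
    half = len(s) // 2
    return s[:half] == s[half:]

print(is_copy("abab"))  # -> True
print(is_copy("abba"))  # -> False (nested, not crossed, dependencies)
```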
Publication: A Default Temporal Logic for Regulatory Conformance Checking (2008-04-05). Dinesh, Nikhil; Joshi, Aravind K; Lee, Insup; Sokolsky, Oleg.

This paper considers the problem of checking whether an organization conforms to a body of regulation. Conformance is cast as a trace checking question: the regulation is represented in a logic that is evaluated against an abstract trace or run representing the operations of an organization. We focus on a problem in designing a logic to represent regulation. A common phenomenon in regulatory texts is for sentences to refer to others for conditions or exceptions. We motivate the need for a formal representation of regulation to accommodate such references between statements. We then extend linear temporal logic to allow statements to refer to others. The semantics of the resulting logic is defined via a combination of techniques from Reiter's default logic and Kripke's theory of truth. This paper is an expanded version of .

Publication: Centering: A Framework for Modelling the Local Coherence of Discourse (1995). Grosz, Barbara J.; Joshi, Aravind K.; Weinstein, Scott.

This paper concerns relationships among focus of attention, choice of referring expression, and perceived coherence of utterances within a discourse segment. It presents a framework and initial theory of centering which are intended to model the local component of attentional state. The paper examines interactions between local coherence and choice of referring expressions; it argues that differences in coherence correspond in part to the inference demands made by different types of referring expressions given a particular attentional state. It demonstrates that the attentional state properties modelled by centering can account for these differences.
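In the centering framework summarized above, the coherence of adjacent utterances is classified by comparing the backward-looking center Cb and the preferred center Cp. A minimal sketch of that transition classification, with discourse entities simplified to plain strings (my illustration, not code from the paper):

```python
def transition(cb_prev, cb, cp):
    """Classify a centering transition between adjacent utterances,
    following the Continue/Retain/Shift scheme of the framework.
    cb_prev is None when the previous utterance has no Cb."""
    if cb == cb_prev or cb_prev is None:
        # Same center carried forward: CONTINUE if it is also the
        # preferred center of the current utterance, else RETAIN.
        return "CONTINUE" if cb == cp else "RETAIN"
    return "SHIFT"  # the backward-looking center has changed

# "John went to the store. He bought milk." Cb stays John and John
# remains the preferred center, so the segment is maximally coherent:
print(transition(None, "John", "John"))  # -> CONTINUE
```

CONTINUE transitions impose the lowest inference demands, which is the intuition behind the paper's link between transition types and perceived coherence.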
Publication: Sense Annotation in the Penn Discourse Treebank (2008-02-10). Miltsakaki, Eleni; Lee, Alan; Joshi, Aravind K; Robaldo, Livio.

An important aspect of discourse understanding and generation involves the recognition and processing of discourse relations. These are conveyed by discourse connectives, i.e., lexical items like because and as a result, or by implicit connectives expressing an inferred discourse relation. The Penn Discourse TreeBank (PDTB) provides annotations of the argument structure, attribution and semantics of discourse connectives. In this paper, we provide the rationale of the tagset, detailed descriptions of the senses with corpus examples, simple semantic definitions of each type of sense tag, as well as informal descriptions of the inferences allowed at each level.

Publication: Characterizing Structural Descriptions Produced by Various Grammatical Formalisms (1988-09-01). Vijay-Shanker, K.; Joshi, Aravind K.

We consider the structural descriptions produced by various grammatical formalisms in terms of the complexity of the paths and the relationship between paths in the sets of structural descriptions that each system can generate. In considering the relationships between formalisms, we show that it is useful to abstract away from the details of the formalism and examine the nature of their derivation process as reflected by properties of their derivation trees. We find that several of the formalisms considered can be seen as being closely related since they have derivation tree sets with the same structure as those produced by Context-Free Grammars. On the basis of this observation, we describe a class of formalisms which we call Linear Context-Free Rewriting Systems, and show that they are recognizable in polynomial time and generate only semilinear languages.
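Semilinearity, claimed for Linear Context-Free Rewriting Systems in the last abstract, concerns Parikh images: the vector of symbol counts of every string in the language must lie in a finite union of linear sets. A small sketch of my own (not from the paper) computing Parikh vectors:

```python
from collections import Counter

def parikh(s: str, alphabet: str) -> tuple:
    """Parikh vector of s: the count of each alphabet symbol, in order.
    Word order is discarded; only occurrence counts remain."""
    counts = Counter(s)
    return tuple(counts[a] for a in alphabet)

# Every string of the tree-adjoining language {a^n b^n c^n d^n} maps to
# n * (1, 1, 1, 1): a single linear set, so the language is semilinear.
print(parikh("aabbccdd", "abcd"))  # -> (2, 2, 2, 2)
```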