
2 editions of Computational Analysis of Lexical Cohesion with Applications in Information Retrieval found in the catalog.

Computational Analysis of Lexical Cohesion with Applications in Information Retrieval

by Mark A. Stairmand


Published by UMIST in Manchester.
Written in English


Edition Notes

Statement: Mark A. Stairmand; supervised by W.J. Black.
Contributions: Black, W. J.; Language Engineering.
ID Numbers
Open Library: OL17466861M

Lexical analysis for information retrieval systems is the same as lexical analysis for other text-processing systems; in particular, it is the same as lexical analysis for program translators. This problem has been studied thoroughly, so we ought to adopt the solutions in the program-translation literature (Aho, Sethi, and Ullman, 1986). Exploring lexical patterns in text involves notions such as high or low informational load and informational status (Given/New), as well as computational processing, e.g., for applications such as information extraction or information retrieval. Another type of cohesion, coacting with reference to create texture, is lexical cohesion (cf. Halliday and Hasan, 1976).
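As a rough illustration of the claim that lexical analysis for retrieval resembles a compiler's lexical analysis, the sketch below scans text with a regular-expression token pattern and normalizes the matches. The token pattern and stop-word list are illustrative assumptions, not anything prescribed by Stairmand's thesis.

    import re

    # A minimal lexical analyser for an IR pipeline: scan the text with a
    # token pattern, lower-case the matches, and drop function words,
    # much as a compiler's scanner emits a filtered token stream.
    TOKEN_PATTERN = re.compile(r"[A-Za-z]+(?:'[A-Za-z]+)?")   # illustrative pattern
    STOP_WORDS = {"the", "a", "an", "of", "and", "in", "is"}  # illustrative stop list

    def tokenize(text):
        """Return lower-cased content-word tokens from raw text."""
        tokens = (m.group(0).lower() for m in TOKEN_PATTERN.finditer(text))
        return [t for t in tokens if t not in STOP_WORDS]

    print(tokenize("Lexical cohesion is the cohesion that arises between words."))
    # ['lexical', 'cohesion', 'cohesion', 'that', 'arises', 'between', 'words']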

Related references: A Computational Analysis of Lexical Cohesion with Applications in Information Retrieval; A critique and improvement of an evaluation metric for text segmentation; Advances in domain independent linear text segmentation; Cohesion in English; Detecting and correcting malapropisms with lexical chains. Author: Nicola Stokes, Joe Carthy and Alan F. Smeaton.

Lexical knowledge bases (LKBs), also known as lexico-semantic resources, provide information about words and potentially entities, and are at the core of knowledge-based approaches. They are widely used in a variety of NLP tasks, e.g., word sense disambiguation, information retrieval, and question answering.

The quantification of lexical semantic relatedness has many applications in NLP, and many different measures have been proposed. We evaluate five of these measures, all of which use WordNet as their central resource, by comparing their performance in detecting and correcting real-word spelling errors. Author: Alexander Budanitsky and Graeme Hirst.

Natural Language Processing and Computational Linguistics 2: Semantics, Discourse and Applications, Mohamed Zakaria Kurdi. Natural Language Processing (NLP) is a scientific discipline found at the intersection of fields such as Artificial Intelligence, Linguistics, and Cognitive Psychology.
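The WordNet-based relatedness measures described above can be experimented with through NLTK's WordNet interface. The sketch below compares two path-based similarity scores over all noun-sense pairs of two words; the word pairs and the max-over-senses strategy are illustrative simplifications, not the evaluation protocol of the cited study.

    from nltk.corpus import wordnet as wn  # requires a one-time nltk.download('wordnet')

    def max_relatedness(word1, word2, pos=wn.NOUN):
        """Best path similarity and Wu-Palmer similarity over all noun-sense pairs."""
        best_path, best_wup = 0.0, 0.0
        for s1 in wn.synsets(word1, pos=pos):
            for s2 in wn.synsets(word2, pos=pos):
                best_path = max(best_path, s1.path_similarity(s2) or 0.0)
                best_wup = max(best_wup, s1.wup_similarity(s2) or 0.0)
        return best_path, best_wup

    print(max_relatedness("car", "automobile"))  # synonyms: both scores are high
    print(max_relatedness("car", "banana"))      # unrelated words: much lower scores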


You might also like

The big screen
Records of the Joint Chiefs of Staff.
Whatcom museum of history and art (cover title)
concise English dictionary, literary, scientific and technical
Subject Catalog of the World War I Collection
IV Workshop on Atomic and Molecular Physics
Eucharist and the confessional
Reforming international institutions
The reef-coral fauna of Carrizo Creek, Imperial County, California, and its significance
Country houses of the midlands
Effective teaching in secondary schools
Talk Is Cheap
Increasing the number of naval aviators, and other bills before the committee

Computational analysis of lexical cohesion with applications in information retrieval, by Mark A. Stairmand

A Computational Analysis of Lexical Cohesion with Applications in Information Retrieval. Author: Stairmand, Mark. Awarding body: The University of Manchester. Current institution: University of Manchester.

Applications of lexical cohesion analysis in the topic detection and tracking domain. Ph.D. thesis, National University of Ireland, Dublin, April. Stairmand, M.A.: A Computational Analysis of Lexical Cohesion with Applications in IR. Ph.D. thesis, Dept. of Language Engineering, UMIST. Stokes, N., Carthy, J.: First Story Detection using a Composite Document. A Computational Analysis of Lexical Cohesion with Applications in Information Retrieval. Ph.D. thesis, UMIST. Green, S.J.: Automatically Generating Hypertext By Comparing Semantic Similarity. University of Toronto, Technical Report.

Lexical cohesion is the textual quality responsible for making the sentences of a text seem 'to hang together' [4], while coherence refers to the fact that 'there is sense in the text' [4].

Computational Lexical Semantics is one of the first volumes to provide models for the creation of various kinds of computerized lexicons for the automatic treatment of natural language, with applications to machine translation, automatic indexing, database front-ends, and knowledge extraction, among other things.

Computational Analysis of Lexical and Cohesion Differences in Deceptive Language: The Role of Accordance. Ali Heidari, Meredith D'Arienzo, Scott A. Crossley, Nicholas Duran (Department of Applied Linguistics & ESL).

Coh-Metrix: The Analysis of Cohesion and Lexical Network Density. Coh-Metrix is a computational linguistics tool that was designed for the analysis of cohesion in native-speaker written discourse, so that text readability could be matched to educational levels.

The user experiences them as Web sites or software applications that permit searching and browsing lexical information; that is, information about the words of a language.

However, behind that interaction is typically a database that contains many highly structured lexical entries. An analysis of lexical cohesion, primarily by counting repetitions, synonyms, superordinate terms and paraphrases, leads to the establishment of a network of cohesive ties between the words of a text.
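A minimal sketch of that counting approach, assuming WordNet as the lexical resource: tokens are linked when they repeat, share a synonym set, or stand in a one-step hypernym (superordinate) relation, and the ties are tallied. Paraphrase detection is omitted, and the relation tests are deliberately crude.

    from collections import Counter
    from itertools import combinations
    from nltk.corpus import wordnet as wn  # requires a one-time nltk.download('wordnet')

    def tie(w1, w2):
        """Classify the cohesive tie between two tokens, if any."""
        if w1 == w2:
            return "repetition"
        s1, s2 = set(wn.synsets(w1, pos=wn.NOUN)), set(wn.synsets(w2, pos=wn.NOUN))
        if s1 & s2:
            return "synonym"
        up1 = {h for s in s1 for h in s.hypernyms()}
        up2 = {h for s in s2 for h in s.hypernyms()}
        if s2 & up1 or s1 & up2:
            return "superordinate"
        return None

    def cohesion_network(tokens):
        """Count cohesive ties of each type over all token pairs."""
        ties = Counter()
        for w1, w2 in combinations(tokens, 2):
            kind = tie(w1, w2)
            if kind:
                ties[kind] += 1
        return ties

    print(cohesion_network(["car", "automobile", "engine", "motor", "car"]))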

Lexical Chains: Accuracy. Example chain (term and its frequency): entertainment-service 1, auto-maker 1, enterprise 1, massachusetts-institute 1, technology-microsoft 1, microsoft 10, concern 1, company 6.

  • The accuracy is bounded by the quality of the lexical resource.
  • The need for disambiguation makes the task harder; disambiguation accuracy is around 60%.
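Because chain accuracy depends on picking the right sense for each word, a chain builder typically disambiguates before linking. The snippet below uses NLTK's simplified Lesk implementation as a stand-in; the example sentence is invented, and this is not the disambiguation method of the cited slides.

    from nltk.tokenize import word_tokenize  # requires nltk.download('punkt')
    from nltk.wsd import lesk                # requires nltk.download('wordnet')

    sentence = "Microsoft is a software company, not an auto maker."
    context = word_tokenize(sentence)

    # Pick a WordNet noun sense for 'company' in context before chaining it.
    sense = lesk(context, "company", pos="n")
    print(sense, "-", sense.definition() if sense else "no sense found")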

Stairmand, M.: A Computational Analysis of Lexical Cohesion with Applications in Information Retrieval. Ph.D. Dissertation, Center for Computational Linguistics.

Computing Lexical Cohesion as a Tool for Text Analysis. Hideki Kozima. Course in Computer Science and Information Mathematics, Graduate School of Electro-Communications, University of Electro-Communications. Doctoral thesis, December 13. Abstract: Recognizing coherent structure of a text is an essential task in natural language.

Work on cohesion has underscored its importance as an indicator of text unity. Lexical cohesion is the cohesion that arises from semantic relationships between words. All that is required is that there be some recognizable relation between the words. Collocational word similarity is considered a source of text cohesion that is hard to measure and quantify.

The work presented here explores the use of information from a training corpus in measuring word similarity and evaluates the method in the text segmentation task. The purpose of this paper is to provide a detailed analysis of how lexical differences related to cohesion and connectionist models can distinguish first language (L1) writers of English from second language (L2) writers of English.
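A rough sketch of the corpus-based word-similarity idea mentioned above, under the assumption that simple co-occurrence counts stand in for the training-corpus statistics: each word gets a sparse context vector, and cosine similarity between vectors approximates word similarity; a segmenter would then score similarity across candidate boundaries. The toy corpus and window size are invented for illustration.

    from collections import Counter, defaultdict
    from math import sqrt

    def cooccurrence_vectors(sentences, window=2):
        """Build a sparse co-occurrence vector for every word in a tokenized corpus."""
        vectors = defaultdict(Counter)
        for tokens in sentences:
            for i, word in enumerate(tokens):
                for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                    if i != j:
                        vectors[word][tokens[j]] += 1
        return vectors

    def cosine(v1, v2):
        """Cosine similarity between two sparse count vectors."""
        dot = sum(v1[w] * v2[w] for w in v1)
        norm = sqrt(sum(c * c for c in v1.values())) * sqrt(sum(c * c for c in v2.values()))
        return dot / norm if norm else 0.0

    corpus = [["court", "heard", "case", "today"],
              ["judge", "ruled", "case", "court"],
              ["team", "won", "match", "today"]]
    vectors = cooccurrence_vectors(corpus)
    print(cosine(vectors["court"], vectors["judge"]))  # share contexts such as 'case'
    print(cosine(vectors["court"], vectors["team"]))   # share no contexts here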

Key to this analysis is the use of the computational tool Coh-Metrix, which measures cohesion and text difficulty. Stairmand, M.A.: A Computational Analysis of Lexical Cohesion with Applications in Information Retrieval. Ph.D. thesis, Center for Computational Linguistics, UMIST. This book presents in four chapters the state of the art and fundamental concepts of key NLP areas.

The first chapter presents the fundamental concepts of lexical semantics, lexical databases, knowledge representation paradigms, and ontologies. The second chapter covers combinatorial and formal semantics.

Marzena H. Makuta: A computational model of lexical cohesion analysis and its application to the evaluation of text. A computational analysis of lexical cohesion with applications in information retrieval. Department of Language Engineering, University of Manchester Institute of Science and Technology.

Author: Caroline Chibelushi, Mike Thelwall. In text, lexical cohesion is the result of chains of related words that contribute to the continuity of lexical meaning.

These lexical chains are a direct result of units of text being "about the same thing," and finding text structure involves finding units of text that are about the same thing.

Recently, a few studies proposed to fill this gap by means of automatic indexes of lexical cohesion obtained from Latent Semantic Analysis. Author: Mihai Dascalu.

Book Abstract (with a preface by George Miller): WordNet, an electronic lexical database, is considered to be the most important resource available to researchers in computational linguistics, text analysis, and many related areas.

Its design is inspired by current psycholinguistic and computational theories of human lexical memory.
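To make the Latent Semantic Analysis idea mentioned a few paragraphs above more concrete, here is a minimal sketch using scikit-learn: a term-sentence count matrix is reduced with truncated SVD, and adjacent sentences are compared in the reduced space as a crude cohesion signal. The sentences, component count, and adjacent-sentence comparison are assumptions for illustration, not the indexes used in the cited study.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    sentences = [
        "The court heard the fraud case on Monday.",
        "The judge ruled on the case after the hearing.",
        "The football team won the match in the final minute.",
    ]

    # Term-sentence count matrix reduced to a low-rank LSA space.
    counts = CountVectorizer().fit_transform(sentences)
    lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(counts)

    # Crude cohesion signal: similarity between adjacent sentences in LSA space.
    for i in range(len(sentences) - 1):
        sim = cosine_similarity(lsa[i:i + 1], lsa[i + 1:i + 2])[0, 0]
        print(f"sentences {i} and {i + 1}: similarity = {sim:.2f}")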