Memory-Based Language Processing (Studies in Natural Language Processing) by Walter Daelemans

By Walter Daelemans

Memory-based language processing, a machine learning and problem solving approach to language technology, is based on the idea that the direct re-use of examples using analogical reasoning is better suited to solving language processing problems than the application of rules extracted from those examples. This book discusses the theory and practice of memory-based language processing, showing its comparative strengths over alternative methods of language modelling. Language is complex, with few generalizations and many sub-regularities and exceptions, and the advantage of memory-based language processing is that it does not abstract away from this valuable low-frequency information.



Similar AI & machine learning books

Artificial Intelligence Through Prolog

Artificial Intelligence Through Prolog book

Language, Cohesion and Form (Studies in Natural Language Processing)

As a pioneer in computational linguistics, working in the earliest days of language processing by computer, Margaret Masterman believed that meaning, not grammar, was the key to understanding languages, and that machines could determine the meaning of sentences. This volume brings together Masterman's groundbreaking papers for the first time, demonstrating the importance of her work in the philosophy of science and the nature of iconic languages.

Handbook of Natural Language Processing

This study explores the design and application of natural language text-based processing systems, based on generative linguistics, empirical corpus analysis, and artificial neural networks. It emphasizes the practical tools needed to implement the chosen approach.

Additional resources for Memory-Based Language Processing (Studies in Natural Language Processing)

Example text

A new example obtains its class by finding its position as a point in this space, and extrapolating its class from the k nearest examples in its neighborhood. Nearness is defined as the inverse of Euclidean distance. An early citation that nicely captures the intuitive attraction of the nearest-neighbor approach is the following: This "rule of nearest neighbor" has considerable elementary intuitive appeal and probably corresponds to practice in many situations. For example, it is possible that much medical diagnosis is influenced by the doctor's recollection of the subsequent history of an earlier patient whose symptoms resemble in some way those of the current patient.
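As a minimal sketch of this extrapolation step (not taken from the book; the function names and the simple majority vote are assumptions of the example), plain k-nearest-neighbor classification over numeric feature vectors with Euclidean distance can be written as follows.

```python
from collections import Counter
import math

def euclidean(a, b):
    # Distance between two numeric feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(train, new_example, k=3):
    """Extrapolate the class of new_example from its k nearest neighbors.

    train is a list of (feature_vector, class_label) pairs.
    """
    # Rank stored examples by distance to the new point and keep the k closest.
    neighbors = sorted(train, key=lambda ex: euclidean(ex[0], new_example))[:k]
    # Majority vote over the classes of the k nearest examples.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy usage: two numeric features, two classes.
memory = [((1.0, 1.0), "A"), ((1.2, 0.9), "A"), ((5.0, 5.1), "B"), ((4.8, 5.3), "B")]
print(knn_classify(memory, (1.1, 1.0), k=3))  # -> "A"
```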

3.2 provide a great deal of information about the problem being studied: feature weighting about the relevance of different information sources, and MVDM about the implicit clustering of feature values in a task-dependent way. In addition, individual classifications can reveal interesting details such as the actual nearest neighbors used; also, a fully classified test set offers the possibility to compute detailed statistics more informative than accuracy, the overall percentage of correctly classified test instances.
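As an illustration of one common feature-weighting scheme in memory-based learning, the sketch below computes information-gain weights for symbolic features; the function names and data layout are assumptions, and information gain is only one of several weighting metrics in use.

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a class-label distribution.
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(examples, feature_index):
    """Weight of one symbolic feature: H(C) minus the expected class
    entropy after splitting the examples on that feature's values.

    examples is a list of (feature_tuple, class_label) pairs.
    """
    labels = [label for _, label in examples]
    base = entropy(labels)
    # Group class labels by the value this feature takes.
    by_value = {}
    for features, label in examples:
        by_value.setdefault(features[feature_index], []).append(label)
    remainder = sum(
        (len(subset) / len(examples)) * entropy(subset)
        for subset in by_value.values()
    )
    return base - remainder

# Toy usage: feature 0 predicts the class perfectly, feature 1 does not.
data = [(("a", "x"), "pos"), (("a", "y"), "pos"), (("b", "x"), "neg"), (("b", "y"), "neg")]
print(information_gain(data, 0), information_gain(data, 1))  # 1.0 vs 0.0
```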

The clustering for gender (after mapping double gender assignments such as "Masculine or Feminine" to the most frequent gender) and for the nucleus of the last syllable shows that MVDM implicitly constructs sensible phonological and lexical knowledge. The lumping together of masculine and neuter gender, and of phonological categories such as front and back vowels, makes sense, and such groupings are sometimes used in morphological theories of German. The main advantage of MVDM is that these categories are grouped automatically, tuned to the task at hand, and arguably with more subtlety and finer granularity than a representation in terms of phonological features and lexical classes.
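To make this task-dependent grouping of feature values concrete, here is a hedged sketch of the (Modified) Value Difference Metric, under which two values of a feature count as close when they co-occur with similar class distributions; the function names and the toy data are illustrative only.

```python
from collections import Counter, defaultdict

def mvdm_distance(examples, feature_index, value1, value2):
    """MVDM between two values of one symbolic feature:
    sum over classes of |P(class | value1) - P(class | value2)|.

    examples is a list of (feature_tuple, class_label) pairs.
    """
    # Collect class counts conditioned on each value of the chosen feature.
    class_counts = defaultdict(Counter)
    for features, label in examples:
        class_counts[features[feature_index]][label] += 1

    def cond_prob(value):
        counts = class_counts[value]
        total = sum(counts.values())
        return {c: n / total for c, n in counts.items()}

    p1, p2 = cond_prob(value1), cond_prob(value2)
    classes = set(p1) | set(p2)
    return sum(abs(p1.get(c, 0.0) - p2.get(c, 0.0)) for c in classes)

# Toy usage: "a" and "o" behave alike with respect to the class, so MVDM
# places them close together; "i" behaves differently and ends up far away.
data = [(("a",), "neuter"), (("o",), "neuter"), (("a",), "neuter"),
        (("i",), "feminine"), (("i",), "feminine"), (("o",), "neuter")]
print(mvdm_distance(data, 0, "a", "o"))  # 0.0: identical class distributions
print(mvdm_distance(data, 0, "a", "i"))  # 2.0: completely different
```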

