By W.J. Hutchins
Machine Translation: Past, Present, Future
Best AI & machine learning books
Artificial Intelligence through Prolog
As a pioneer in computational linguistics, working in the earliest days of language processing by computer, Margaret Masterman believed that meaning, not grammar, was the key to understanding languages, and that machines could determine the meaning of sentences. This volume brings together Masterman's groundbreaking papers for the first time, demonstrating the significance of her work for the philosophy of science and the nature of iconic languages.
This study explores the design and application of natural language text-based processing systems, based on generative linguistics, empirical corpus analysis, and artificial neural networks. It emphasizes the practical tools for the chosen approach.
Additional info for Machine Translation: Past, Present, Future
In the first case, it was the coastguard who had the binoculars; therefore the PP with the binoculars modifies the verb. But in the second case, the PP with a beard modifies the preceding noun man. Only semantic information can assist the analysis, by assigning semantic codes that allow binoculars, as an 'instrument', to be associated with 'perceptual' verbs such as observe, but prohibit beards from being associated with the objects of verbs such as sell. Such solutions have been applied in many MT systems since the mid-1960s (as the following descriptions of systems will show).
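The mechanism described above can be sketched in a few lines. This is a hypothetical illustration, not code from any system Hutchins describes: the semantic codes, the attachment rule, and the function name are all assumptions made for the example.

```python
# Sketch of PP-attachment disambiguation via semantic codes, in the
# spirit of mid-1960s MT systems. Codes and rules are illustrative only.

# Semantic codes assigned to lexical items in the dictionary.
SEM_CODES = {
    "binoculars": "instrument",
    "beard": "body-feature",
    "observe": "perceptual",
    "sell": "transaction",
}

# Pairs (verb code, PP-noun code) for which the PP may modify the verb;
# e.g. an 'instrument' can accompany a 'perceptual' verb.
VERB_ATTACH = {("perceptual", "instrument")}

def attach_pp(verb: str, pp_noun: str) -> str:
    """Decide whether a 'with X' PP modifies the verb or the preceding noun."""
    pair = (SEM_CODES.get(verb), SEM_CODES.get(pp_noun))
    return "verb" if pair in VERB_ATTACH else "noun"

print(attach_pp("observe", "binoculars"))  # -> verb (the coastguard had them)
print(attach_pp("observe", "beard"))       # -> noun (the man had the beard)
```

The point of the sketch is only that attachment is decided by table lookup over semantic codes rather than by syntax, which is exactly why such tables had to be hand-built for each lexicon.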
Routines devised for one corpus were tested on another, improved, tested on a third corpus, improved again, and so forth. The empirical approach was in fact fully in accord with the dominant linguistic methodology of the 1940s and 1950s in the United States, the descriptivist and structuralist 'tradition' associated particularly with Leonard Bloomfield (1933). The descriptivists adopted the behaviourist and positivistic method which insisted that only interpersonally observed phenomena should be considered 'scientific' data, and which rejected introspections and intuitions.
E.g. He left home → His leaving home, Dogs bark → The barking of dogs. For Harris, transformations were a descriptive mechanism for relating surface structures, while in Chomsky's model, transformational rules derive surface structures from 'deeper' structures, i.e. semantically equivalent surface structures should be generated from a single 'deep' structure. Although Chomsky's syntactic theory has undoubtedly had the most influence, the formalisation of transformations by Harris had considerable impact in MT research, particularly in the representation of SL-TL structural transfer rules.
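Harris's view of a transformation as a pattern relating two surface forms can be sketched directly from the two examples above. This is a minimal illustration, not a formalisation from Harris or from any MT system; the clause representation, the small lexicons, and the function name are assumptions made for the example.

```python
# Sketch of a surface-to-surface transformation (nominalisation), in the
# spirit of Harris-style rules and SL-TL structural transfer rules.
# All tables below are illustrative assumptions.

# Tiny lexicons: possessive forms and gerund forms.
POSSESSIVE = {"he": "his", "she": "her"}
GERUND = {"left": "leaving", "bark": "barking"}

def nominalise(subj, verb, obj=None):
    """Relate a clause (subj, verb, obj) to its nominalised surface form."""
    if obj:
        # Transitive pattern: "He left home" -> "His leaving home"
        possessive = POSSESSIVE.get(subj.lower(), subj + "'s")
        return f"{possessive} {GERUND[verb]} {obj}"
    # Intransitive pattern: "Dogs bark" -> "The barking of dogs"
    return f"the {GERUND[verb]} of {subj.lower()}"

print(nominalise("He", "left", "home"))  # -> his leaving home
print(nominalise("Dogs", "bark"))        # -> the barking of dogs
```

Note that the rule relates one surface pattern to another without positing any intermediate 'deep' structure, which is the sense in which Harris's transformations suited SL-TL structural transfer better than Chomsky's derivational rules.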