Neural Computing Architectures: The Design of Brain-Like Machines by Igor Aleksander

By Igor Aleksander

McClelland and Rumelhart's Parallel Distributed Processing was the first book to present a definitive account of the newly revived connectionist/neural net paradigm for artificial intelligence and cognitive science. While Neural Computing Architectures addresses the same issues, there is little overlap in the research it reports. These 18 contributions provide a timely and informative overview and synopsis of both pioneering and recent European connectionist research. Several chapters focus on cognitive modeling; however, most of the work covered revolves around abstract neural network theory or engineering applications, bringing important complementary perspectives to currently published work in PDP.

In four parts, the chapters take up neural computing from the classical perspective, including both foundational and current work, and from the mathematical perspective (of logic, automata theory, and probability theory), presenting less familiar work in which the neuron is modeled as a logic truth function that can be implemented directly as a silicon read-only memory. They present new material both in the form of analytical tools and models and as suggestions for implementation in optical form, and they summarize the PDP perspective in a single extended chapter covering PDP theory, application, and speculation in U.S. research. Each part is introduced by the editor.
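The lookup-table view of a neuron mentioned above is easy to make concrete. Below is a minimal sketch, not taken from the book, of the general idea: a neuron treated as a Boolean truth function over its binary inputs, stored as a table addressed by the input pattern, which is exactly what a read-only memory holds. The class and its example truth table are illustrative assumptions.

```python
# Minimal sketch (not from the book): a neuron as a stored truth function.
# The binary input pattern is used as a memory address; reading the memory
# evaluates the function -- the role a 2^n-entry ROM plays in hardware.

class RAMNeuron:
    def __init__(self, n_inputs, truth_table):
        # One 0/1 entry per possible input pattern, 2^n entries in total.
        assert len(truth_table) == 2 ** n_inputs
        self.n_inputs = n_inputs
        self.table = list(truth_table)

    def address(self, bits):
        # Interpret the input bits as a memory address (bits[0] = LSB).
        return sum(b << i for i, b in enumerate(bits))

    def fire(self, bits):
        # "Reading the memory" evaluates the stored truth function.
        return self.table[self.address(bits)]

# Example: a 2-input neuron storing XOR -- not computable by a single
# threshold unit, but trivially stored in a 4-entry table.
xor_neuron = RAMNeuron(2, [0, 1, 1, 0])
print(xor_neuron.fire([1, 0]))  # -> 1
print(xor_neuron.fire([1, 1]))  # -> 0
```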

Igor Aleksander is Professor of Computer Science at Imperial College in London.



Related AI & machine learning books

Artificial Intelligence Through Prolog

Artificial Intelligence Through Prolog book

Language, Cohesion and Form (Studies in Natural Language Processing)

As a pioneer in computational linguistics, working in the earliest days of language processing by computer, Margaret Masterman believed that meaning, not grammar, was the key to understanding languages, and that machines could determine the meaning of sentences. This volume brings together Masterman's groundbreaking papers for the first time, demonstrating the importance of her work for the philosophy of science and the nature of iconic languages.

Handbook of Natural Language Processing

This study explores the design and application of natural language text-based processing systems, based on generative linguistics, empirical corpus analysis, and artificial neural networks. It emphasizes the practical tools needed to deal with the selected systems.

Additional info for Neural Computing Architectures: The Design of Brain-Like Machines

Example text

The state of the system might be described by a two-dimensional vector whose elements are the angles θ1 and θ2 of the two pivot arms. However, it is obvious that this system has only one degree of freedom, because θ1 and θ2 are completely correlated. Fig. 10(b) shows a pattern-space description of the possible states, which are clearly shown to be inherently one-dimensional. This is a very common type of natural structure: the inherent dimensionality of a set of patterns may be much less than the number of pattern dimensions.

[Figure 10: Natural pattern structure: constrained freedom.]
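The point can be made numerically. The following is a minimal sketch, not from the book, using an assumed coupling θ2 = 2·θ1 to stand in for the "completely correlated" pivot arms: the pattern vectors are two-dimensional, yet the data they form have rank one, i.e. a single degree of freedom.

```python
# Minimal sketch (assumed coupling theta2 = 2*theta1, not the book's figure):
# 2-D pattern vectors whose components are fully correlated lie on a
# one-dimensional subset of the pattern space.

import numpy as np

theta1 = np.linspace(0.0, np.pi / 2, 50)       # the single free parameter
theta2 = 2.0 * theta1                          # fully determined by theta1
patterns = np.stack([theta1, theta2], axis=1)  # shape (50, 2): 2-D vectors

# Rank 1: the patterns are inherently one-dimensional despite living in 2-D.
print(np.linalg.matrix_rank(patterns))         # -> 1
```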

If this constraint is not obeyed, the map will be topologically correct when viewed locally, but not ordered globally.

[Figure 12: A one-dimensional neural array.]

Neural map applications

In our work, we have used the scalar product of X and Wi as the similarity metric, S. This metric seems to have many advantages over other metrics, particularly in the speech recognition applications. This is because the ranking of the excitations of the neural elements is unchanged by a change in magnitude of X, and consequently neural maps become ordered in terms of the profiles of input patterns alone and are not affected, for example, by the loudness of a particular sound.
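The rank-invariance property described above follows directly from linearity of the scalar product: scaling X by a positive factor scales every excitation Si = X·Wi by the same factor, leaving the ordering of the map units unchanged. A minimal sketch, with made-up weight vectors rather than the authors' data, shows this:

```python
# Minimal sketch (random weights, not the authors' speech data): with the
# scalar product S_i = X . W_i as similarity, scaling the input X does not
# change the ranking of the map units, so the map depends on the input
# profile only, not on its magnitude ("loudness").

import numpy as np

rng = np.random.default_rng(0)
W = rng.random((10, 16))        # 10 map units, 16-dimensional weight vectors
X = rng.random(16)              # an input pattern

def ranking(x, weights):
    excitations = weights @ x   # S_i = x . w_i for every unit
    return np.argsort(-excitations)

quiet = ranking(X, W)           # original magnitude
loud = ranking(3.0 * X, W)      # same profile, three times "louder"

print(np.array_equal(quiet, loud))  # -> True: ranking unchanged
```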


