12. Features of modern neural network architectures

Lecture



Modern neural network architectures. Current areas of basic research. Software and hardware implementations of neural networks. Neuroprocessors. Scientific and industrial applications.

Features of modern architectures.

Classical studies performed in the postwar years, and the subsequent rapid progress of neuroinformatics in the 1980s, identified some common features of promising architectures and research directions. Although any assessment in this area is highly subjective, the author has found it possible to state his own view of the observed trends. Let us dwell on some of them.

  1. Close coupling of theoretical research with the search for new physical principles and physical media for the hardware implementation of neural networks. Here, first of all, one should note optical systems, both linear and nonlinear: Fourier optics, holograms, nonlinear photorefractive crystals, optical waveguide fibers, electron-optical multipliers and others. Media with natural autowave properties (chemical and biological) are also promising. All these media realize an important property: massive parallelism in information processing. In addition, they usually contain "self-regulation" mechanisms that make it possible to organize unsupervised learning.
  2. Hierarchy of architectures and separation of neuron functions. Modern architectures use layers or individual neurons of several distinct types: command (switching) neurons, threshold neurons, and neural layers with lateral inhibition operating on the "winner takes all" principle. Such a priori separation of neuron functions greatly simplifies learning, since the network is structurally matched to the task from the start.
  3. Predominant use of unsupervised learning methods based on self-organization. These methods have deep biological foundations and ensure the local character of learning, which makes global network connectivity unnecessary. Only the outer, output layers of neurons are trained with a teacher, and the teacher's role is often reduced to an overall expert assessment of the network's quality.
  4. Orientation of research and architectures directly toward applications. General-purpose models, such as the Hopfield network or the multilayer perceptron, are now mainly of scientific interest, since they admit a relatively complete theoretical analysis.
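The "winner takes all" competition and local, teacher-free weight updates mentioned in points 2 and 3 can be sketched in a few lines. The following Python fragment is purely illustrative (the layer size, learning rate, and function names are assumptions, not part of any system discussed in this chapter):

```python
import numpy as np

def winner_take_all(weights, x):
    """Competitive layer: the neuron whose weight vector is closest to
    the input wins; lateral inhibition silences all the others."""
    distances = np.linalg.norm(weights - x, axis=1)
    return int(np.argmin(distances))

def competitive_step(weights, x, lr=0.1):
    """Local, unsupervised update: only the winner's weights move toward
    the input; no global error signal or teacher is required."""
    w = weights.copy()
    k = winner_take_all(w, x)
    w[k] += lr * (x - w[k])
    return w

rng = np.random.default_rng(0)
weights = rng.normal(size=(3, 2))       # 3 competing neurons, 2 inputs
for x in rng.normal(size=(100, 2)):     # self-organization over a data stream
    weights = competitive_step(weights, x)
```

After such training, each neuron's weight vector drifts toward a cluster of inputs, which is the essence of self-organizing ("teacherless") learning.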

This list is, of course, far from complete. It does not include, for example, modern research on hybrid neuro-expert systems that use both formal logic and associative recognition. The reader may also analyze the types of neural networks considered here to identify common properties and trends.


Neuroscience today.

The reader has already gathered some information on the history of neuroscience from the introduction. Fundamental research in the theory of neural networks and intelligent methods of information processing entered a new phase after a series of specialized conferences, held since 1986 and devoted directly to neuroscience. In the autumn of 1988, the International Neural Networks Society (INNS) was established, which coordinates the world's "neuroactivity".

The World Congress on Neural Networks, organized by this society in the summer of 1994, will sum up the main results and demonstrate the current state of basic research. To outline the development trends of neuroscience as a whole, we will follow the main thematic sections of the congress program.

1. Biological vision. This section is chaired by S. Grossberg.

2. Machine vision. The section covers aspects of modeling visual functions in technical systems. Special attention will be paid to the principles of selective attention to objects of the visual scene.

3. Speech and language. Various aspects of speech synthesis and recognition.

4. Biological neural networks. The topics of the section cover the properties of individual neurons, neural networks controlling movement and hearing, aspects of learning in biological networks, as well as ways of moving from biological neurons to artificial ("silicon") ones.

5. Neurocontrol and robotics.

6. Supervised learning.

7. Unsupervised learning.

8. Pattern recognition.

9. Forecasting and system identification. Methods of cybernetic modeling of complex systems based on neural networks are considered.

10. The neuroscience of consciousness. Aspects of the organization and modeling of higher nervous activity.

11. Links between the science of consciousness and artificial intelligence.

12. Fuzzy neural systems. Construction of neural models of fuzzy logic.

13. Signal processing. One of the oldest areas of application of neural networks and pattern recognition theory: the extraction and analysis of signals in the presence of noise.

14. Neurodynamics and chaos. This includes the properties of neural networks as nonlinear dynamic systems.

15. Hardware implementations. The key issue for promising applications is new physical principles and environments for information processing.

16. Associative memory.

17. Applications. This section promises to be the most widely represented.

18. Neurocomputation and virtual reality. Here the possibility of using neural networks, and highly parallel computation on them, to create artificial reality is considered. A complex hardware-software virtual reality system models the main signals a person perceives from the outside world and responds to the user's actions, replacing the real world.

19. Networks and systemic neuroscience. The focus of this section will be on the temporal behavior of signals in the neural circuits of both biological and artificial networks.

20. Mathematical foundations.


Some sections, such as supervised and unsupervised learning, neurodynamics and associative memory, pattern recognition, and the solution of mathematical problems on neural networks, have been touched upon in this book as the main classical results. Others may be familiar to the reader from other books (including science fiction). Some appear completely new. For all of them, we await the results of the congress with interest.

Comment on the 1998 electronic version. The 1994 congress was held successfully. Following it and other forums, neuroinformatics has been enriched with new applications. Interest in applications in economics and finance has proved particularly strong.

Software and hardware implementations. Neurocomputers.

To date, an extensive market for neural network products has emerged. The vast majority of products are offered as modeling software. Leading firms are also developing specialized neurochips and neural network boards as add-ons for ordinary computers (as a rule, personal computers of the IBM PC AT line). The software can run either with or without these neuro-boards; with them, the speed of the hybrid computer increases by hundreds or thousands of times.

We list some of the most well-known and popular neural systems and their manufacturers.

NeuralWorks Professional II Plus. This is one of the latest versions of the NeuralWorks software product developed by NeuralWare. The package contains software models of dozens of neural network architectures (including some of those discussed in this book). The company has also announced a version of the package for SUN workstations and nCUBE parallel processors.

ExploreNet 3000 software package. Developed by HNC, founded by Prof. Robert Hecht-Nielsen. The package provides ample facilities for modeling and data management. Hardware acceleration is provided by HNC's ANZA and ANZA+ neuroprocessors, which were among the first hardware implementations. The firm has also offered a tool for developing application programs: the specialized AXON programming language, based on C.

NeuroShell 2.0 shell. The advantage of this program is its compatibility with the popular Microsoft Excel data management package, which makes the product convenient for mass use.

In Russia, well-known developments include those of the Research Institute of Multiprocessor Computing Systems in Taganrog (VLSI chips for digital neurocomputers with about 100,000 gates, operating at 20 MHz) and the Moscow Center for Neurocomputers (hardware systems based on transputers). Among software systems, one should note the developments of the Neurocybernetics Department of Krasnoyarsk University, the image recognition system of the Research Institute of Neurocybernetics at Rostov University, and the Institute of Applied Physics in Nizhny Novgorod.

In 1993, the German company Siemens announced the release of the fastest neurocomputer to date, called SYNAPSE-1. This neurocomputer is, as a whole, a system consisting of a controlling (host) machine and a specialized neural processor with local memory for the synaptic weights. In every neural network paradigm one can identify a relatively small set of operations specific to neural networks that can be performed very efficiently in parallel on a specialized processor. Such operations include, for example, multiplication and addition of matrices and vectors, matrix transposition, computation of threshold transformations, parallel evaluation of table functions, and others. The remaining fragments of the algorithm, which have well-developed logic but usually require only a few percent of the total computation time, can successfully run on an ordinary computer. In the SYNAPSE-1 neurocomputer, a Sun SPARCstation II workstation serves as such a host machine. The planned speedup on neuro-operations in SYNAPSE-1 is 8000 times (!) compared with the host station. The user is provided with a convenient neural-network problem-oriented programming language, nAPL, a C++ programming environment, and a UNIX-compatible operating system.
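The division of labor described above, heavy parallelizable neuro-operations on the specialized processor and light control logic on the host, can be shown schematically. The sketch below uses NumPy merely as a stand-in for the coprocessor; the function and parameter names are hypothetical, not part of SYNAPSE-1 or nAPL:

```python
import numpy as np

def neuro_kernel(W, x, theta):
    """The kind of operation a neuroprocessor executes in parallel:
    a matrix-vector product of synaptic weights with the input,
    followed by an elementwise threshold transformation."""
    s = W @ x                          # weighted sums: the heavy, parallel part
    return (s >= theta).astype(float)  # threshold nonlinearity

# "Host" side: lightweight control logic around the heavy kernel
W = np.array([[0.5, -0.2],
              [0.1,  0.9]])           # synaptic weight matrix
x = np.array([1.0, 1.0])              # input vector
y = neuro_kernel(W, x, theta=0.3)     # → array([1., 1.])
```

On a real neuroprocessor the `W @ x` line is where nearly all the time goes, which is why offloading just this small set of operations yields such large speedups.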

The neural systems listed above are relatively expensive and intended primarily for professional use. For educational and research purposes, the appendix to this book contains a simple program implementing the learning and recognition algorithms of a single-layer perceptron. A reader familiar with the Pascal programming language can use this program, supplying it with I/O modules, both to experiment with a neural network and to become acquainted with the technology of building neural software.
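The appendix program itself is in Pascal; for orientation, a compact sketch of the same classical single-layer perceptron learning and recognition rules in Python might look as follows (the toy data and all names are illustrative, not taken from the appendix):

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, epochs=20):
    """Rosenblatt's learning rule: on each misclassified example,
    shift the weights toward (or away from) the input vector."""
    w = np.zeros(X.shape[1] + 1)                # last component is the bias
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append constant bias input
    for _ in range(epochs):
        for xi, target in zip(Xb, y):
            pred = 1.0 if xi @ w >= 0 else 0.0
            w += lr * (target - pred) * xi      # no change when correct
    return w

def perceptron_recognize(w, x):
    """Recognition: threshold the weighted sum of the input plus bias."""
    return 1.0 if np.append(x, 1.0) @ w >= 0 else 0.0

# Linearly separable toy problem: logical AND
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)
w = perceptron_train(X, y)   # converges: AND is linearly separable
```

By the perceptron convergence theorem, this loop is guaranteed to find a separating weight vector for any linearly separable training set.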

Results

This book is complete, but in neuroscience it is, of course, too early to draw a final line. The author hopes that this textbook will not only fulfill its main function, a systematic introduction to the theory of neural networks, but also help bring us closer to answering an important question: are artificial neural networks the long-awaited mainstream along which the development of artificial intelligence will continue, or will they prove to be a passing fashion, as happened with expert systems and some other scientific instruments (Feynman diagrams, for example) from which revolutionary breakthroughs were initially expected? Gradually, however, those methods revealed their limitations and took an appropriate (but worthy!) place in the overall structure of science.

Today, neural networks are no longer the preserve of a small group of theorists. Engineers and researchers of various specialties are joining in neural network applications. Particularly encouraging is the progress in building successful neural network models of studied phenomena based entirely on experimental data. Here the remarkable properties of artificial neural systems manifest themselves most fully: massively parallel information processing, associativity of memory, and the ability to learn from experience. This opens new prospects for systematizing extensive experimental information in fields of knowledge where mathematical formalism has traditionally taken root with difficulty, for example, in medicine, psychology and history.

