Neurocomputer

Figure: Frank Rosenblatt and the Mark-1 perceptron.

A neurocomputer is a device that processes information on the principles of operation of natural neural systems. [1] The formalization of these principles gave rise to the theory of artificial neural networks. The task of neurocomputing is to build real physical devices that not only simulate artificial neural networks on an ordinary computer, but change the very principles of computer operation, so that the machine itself can be said to work in accordance with the theory of artificial neural networks.

Contents

  • 1 History
  • 2 The main idea - connectionism
  • 3 The problem of efficient parallelism
  • 4 Modern neurocomputers
  • 5 New twist - “wet product”
  • 6 Applications
  • 7 See also
  • 8 Literature
  • 9 Notes

History

The terms neurocybernetics, neuroinformatics, and neurocomputer entered scientific use relatively recently, in the mid-1980s. However, the electronic and the biological brain have been compared throughout the history of computing. Norbert Wiener's famous book "Cybernetics" (1948) [2] bears the subtitle "Control and Communication in the Animal and the Machine."

The first neurocomputers were Rosenblatt's perceptrons, Mark-1 (1958) and Tobermory (1961-1967) [3], as well as Adaline, developed by Widrow and Hoff (1960) on the basis of the delta rule (the Widrow formula) [4]. Today Adaline (an adaptive adder that learns by the Widrow formula) is a standard element of many signal processing and communication systems [5]. The program "Cora", developed in 1961 under the leadership of M. M. Bongard [6], belongs to the same series of first neurocomputers.
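
As a concrete illustration of the delta rule (Widrow-Hoff formula) mentioned above, here is a minimal Python sketch of an Adaline-style adaptive adder. The learning rate, the toy data, and all numerical values are assumptions made for the example, not part of the original designs; only the update w ← w + η(t − y)x itself follows the Widrow-Hoff rule.

```python
import numpy as np

def train_adaline(X, t, eta=0.01, epochs=50):
    """Train a single adaptive linear element (Adaline) with the delta rule.

    X: input samples, shape (n_samples, n_features)
    t: target values, shape (n_samples,)
    eta: learning rate (an arbitrary value chosen for this sketch)
    """
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.01, size=X.shape[1])   # connection weights
    b = 0.0                                       # bias term
    for _ in range(epochs):
        for x, target in zip(X, t):
            y = np.dot(w, x) + b                  # linear output of the adaptive adder
            error = target - y
            w += eta * error * x                  # Widrow-Hoff (delta rule) update
            b += eta * error
    return w, b

# Toy usage: recover the weights of a simple linear signal.
X = np.random.default_rng(1).uniform(-1, 1, size=(200, 3))
t = X @ np.array([0.5, -1.2, 2.0]) + 0.3
w, b = train_adaline(X, t)
```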

Rosenblatt's monograph (1958) [7] played a major role in the development of neurocomputing.

The idea of neurobionics (the creation of technical devices on neural principles) began to be put into practice intensively in the early 1980s. The impulse was the following contradiction: the size of the elementary components of computers had become comparable to the size of the elementary "information converters" of the nervous system, and the speed of individual electronic elements was millions of times higher than that of biological systems, yet in the efficiency of solving problems, above all tasks of orientation and decision making in the natural environment, living systems remained unattainably far ahead.

Theoretical advances of the 1980s in neural network theory (the Hopfield network, the Kohonen network, and the backpropagation method) gave a further impetus to the development of neurocomputers.

The main idea - connectionism

Unlike digital systems, which are combinations of processing and memory units, neuroprocessors contain memory distributed in the connections between very simple processors, which can often be described as formal neurons or blocks of similar formal neurons. Thus the main burden of implementing specific functions falls on the architecture of the system, whose details are in turn determined by the interneuron connections. The approach in which both data memory and algorithms are represented by a system of connections (and their weights) is called connectionism.
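
To make the connectionist idea of memory living in the connections concrete, the following minimal sketch shows a layer of formal neurons whose entire "program" is its weight matrix and biases. The particular weights, thresholds, and the AND/OR example are invented purely for illustration.

```python
import numpy as np

def formal_neuron_layer(x, W, b):
    """One layer of formal neurons: the output depends only on the connection
    weights W and biases b; there is no separate program or data memory."""
    return np.where(W @ x + b > 0, 1.0, 0.0)   # threshold activation

# The "program" of this toy connectionist system is entirely in the weights.
# These numbers are chosen so that the two neurons compute AND and OR
# of two binary inputs.
W = np.array([[1.0, 1.0],    # neuron 1: fires only if both inputs are 1 (AND)
              [1.0, 1.0]])   # neuron 2: fires if at least one input is 1 (OR)
b = np.array([-1.5, -0.5])

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, formal_neuron_layer(np.array(x, dtype=float), W, b))
```

Changing the behavior of this system means changing the weights, not rewriting a program, which is exactly the point of the connectionist representation.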

Three main advantages of neurocomputers:

  1. All neuroinformatics algorithms are highly parallel, which promises high speed.
  2. Neural systems can easily be made highly resistant to noise and damage.
  3. Stable and reliable neural systems can be built even from unreliable elements with considerable variation in their parameters.

Developers of neurocomputers seek to combine the robustness, speed and parallelism of analog computers with the versatility of modern digital computers. [8]

The problem of efficient parallelism

A. Gorban [9] proposed efficient parallelism as the central problem addressed by all of neuroinformatics and neurocomputing. It has long been known that computer performance grows much more slowly than the number of processors. M. Minsky formulated the hypothesis that the performance of a parallel system grows (approximately) in proportion to the logarithm of the number of processors, which is far slower than linear growth (Minsky's hypothesis).
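
To make the gap concrete, here is a small sketch comparing ideal linear speedup with the logarithmic growth suggested by Minsky's hypothesis; the proportionality constant is set to 1 purely for illustration.

```python
import math

# Ideal linear scaling versus the logarithmic scaling of Minsky's hypothesis.
# The proportionality constant is taken as 1 only for this illustration.
for processors in (2, 16, 256, 4096):
    linear = processors
    minsky = math.log2(processors)
    print(f"{processors:5d} processors: linear speedup {linear:5d}, "
          f"Minsky's hypothesis ~{minsky:.0f}")
```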

To overcome this limitation, the following approach is used: for various classes of problems, maximally parallel solution algorithms are built on some abstract architecture (paradigm) of fine-grained parallelism, while for concrete parallel computers, means are created for implementing parallel processes of that abstract architecture. The result is an effective toolkit for producing parallel programs.

Neuroinformatics supplies universal fine-grained parallel architectures for solving different classes of problems. For a specific task, an abstract neural-network implementation of the solution algorithm is built and then realized on concrete parallel computing devices. In this way neural networks make it possible to exploit parallelism effectively.

Modern neurocomputers

The long-term efforts of many research groups have produced a large number of different "learning rules" and network architectures, their hardware implementations, and techniques for using neural networks to solve applied problems.

These intellectual inventions [10] exist as a "zoo" of neural networks. Each network in the zoo has its own architecture and learning rule and solves its own specific set of tasks. In the last decade serious efforts have been made to standardize the structural elements and turn this "zoo" into a "technopark" [11]: each neural network from the zoo is implemented on an ideal universal neurocomputer with a given structure.

The basic rules for separating the functional components of an ideal neurocomputer (according to Mirkes), illustrated by the sketch after the list:

  1. Relative functional isolation: each component has a clearly defined set of functions, and its interaction with other components can be described by a small number of requests.
  2. The ability to interchange different implementations of any component without changing other components.
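
A minimal sketch of how such component separation might look in code; the component names (TaskBook, Network, Trainer) and their request methods are hypothetical and serve only to illustrate interaction through a small, fixed set of requests, not Mirkes' actual specification.

```python
from typing import Protocol, Sequence, Tuple

# Hypothetical component interfaces illustrating the two rules above:
# each component exposes a small set of requests and can be replaced
# by another implementation without touching the other components.

class TaskBook(Protocol):
    def samples(self) -> Sequence[Tuple[Sequence[float], float]]:
        """Return (input, target) training pairs."""

class Network(Protocol):
    def forward(self, x: Sequence[float]) -> float: ...
    def update(self, x: Sequence[float], error: float, eta: float) -> None: ...

class Trainer:
    """Interacts with the other components only through their requests."""
    def __init__(self, eta: float = 0.01) -> None:
        self.eta = eta

    def train(self, net: Network, tasks: TaskBook, epochs: int = 10) -> None:
        for _ in range(epochs):
            for x, target in tasks.samples():
                error = target - net.forward(x)
                net.update(x, error, self.eta)
```

Because the trainer talks to the other components only through samples, forward and update, any of them can be swapped for a different implementation without changing the rest.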

The neurocomputer market is gradually taking shape. Various highly parallel neuro-accelerators [12] (coprocessors) for different tasks are now widely available. There are few models of universal neurocomputers on the market, partly because most of them are built for special applications. Examples of neurocomputers are the Synapse neurocomputer (Siemens, Germany) [13] and the NeuroMatrix processor [14]. A specialized scientific and technical journal, "Neurocomputers: Development, Application", is published [15], and annual conferences on neurocomputers are held [16]. From a technical point of view, today's neurocomputers are computing systems with parallel streams of identical commands and multiple data streams (MSIMD architecture). This is one of the main directions in the development of computing systems with massive parallelism.

An artificial neural network can be transferred from one (neuro)computer to another, just like a computer program. Moreover, specialized high-speed analog devices can be built on its basis. There are several levels of alienation of a neural network from the universal (neuro)computer [17]: from a network that learns on a universal device and uses rich possibilities for manipulating the task book, the learning algorithms and the architecture, up to complete alienation, where only the functioning of the trained network remains, with no possibility of training or modification.

One way to prepare a neural network for transfer is its verbalization: the trained neural network is minimized while retaining its useful skills. The description of the minimized network is more compact and often admits a clear interpretation.
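
As a rough illustration of the minimization step only (the extraction of an explicit, interpretable description is not shown), here is a minimal sketch that prunes near-zero connections from a trained weight matrix; the threshold and the toy matrix are arbitrary assumptions made for the example.

```python
import numpy as np

def prune_small_weights(W, threshold=0.05):
    """Remove connections whose weights are negligible.

    The threshold is an arbitrary choice for this sketch; in practice the
    network is re-checked after pruning to confirm that its useful skills
    are retained.
    """
    W_pruned = np.where(np.abs(W) < threshold, 0.0, W)
    kept = np.count_nonzero(W_pruned)
    print(f"kept {kept} of {W.size} connections")
    return W_pruned

# Toy usage on a random stand-in for a trained weight matrix.
W = np.random.default_rng(2).normal(scale=0.1, size=(4, 6))
W_small = prune_small_weights(W)
```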

New twist - “wet product”

A new direction is gradually maturing in neurocomputing, based on connecting biological neurons with electronic elements. By analogy with software ("soft product") and hardware ("hard product"), these developments have been called wetware ("wet product").

A technology already exists for connecting biological neurons to ultra-miniature field-effect transistors via nanowires [18]. The development relies on modern nanotechnology; in particular, carbon nanotubes are used to create connections between neurons and electronic devices [19].

Another meaning of the term "wetware", the human component of human-computer systems, is also common.

Applications

  1. Real-time control [20][21], including:
    • airplanes and rockets [22],
    • technological processes of continuous production (in power engineering, metallurgy, etc.) [23],
    • hybrid car engines [24],
    • pneumatic cylinders [25],
    • welding machines [26],
    • electric furnaces [27],
    • turbogenerators [28].
  2. Pattern recognition:
    • images [29], human faces [30], letters and hieroglyphs, fingerprints in forensics, speech, radar and sonar signals,
    • elementary particles and the physical processes involving them (in accelerator experiments or cosmic-ray observation),
    • diseases from symptoms (in medicine) [31],
    • areas where mineral deposits should be sought (in geology, from indirect evidence),
    • danger signs in security systems,
    • properties of chemical compounds from their structure (in chemoinformatics) [32].
  3. Real-time forecasting:
    • weather,
    • stock prices (and other financial indicators) [33],
    • treatment outcomes,
    • political events (election results, international relations, etc.) [34][35],
    • the behavior of an opponent (real or potential) in military conflict or economic competition,
    • the stability of marital relations.
  4. Optimization - searching for the best options:
    • when designing technical devices [36],
    • when choosing an economic strategy,
    • when selecting a team (from company employees to athletes and members of polar expeditions),
    • in the treatment of a patient.
  5. Signal processing in the presence of heavy noise.
  6. Prosthetics (“smart prostheses”) and the enhancement of natural functions [37], including through direct connection of the human nervous system to computers (neuro-computer interfaces).
  7. Psychodiagnostics [38][39][40].
  8. Detection and prevention of telecommunications fraud using neural network technologies; according to some experts [41], this is one of the most promising technologies in the field of information security in telecommunication networks.
  9. Information security [42].
