Introduction to the history of computer technology

Lecture



3rd century. An abacus with movable beads made it possible to speed up calculations.

At all times, people have had to count. In the misty prehistoric past they counted on their fingers or made notches on bones. About 4,000 years ago, at the dawn of human civilization, quite sophisticated number systems had already been invented, making it possible to carry out trade transactions, calculate astronomical cycles, and perform other computations. A few millennia later the first hand-held computing devices appeared. And today the most complicated computational problems, like many other operations that would seem to have nothing to do with numbers, are solved with the help of an "electronic brain" known as the computer.

Experts will probably be quick to point out that a computer is not a brain (at least not yet, some will add). It is simply another tool, another device invented to ease our labor or to extend our power over nature. Indeed, for all its apparent brilliance, a modern computer has essentially one and only one talent: reacting with lightning speed to pulses of electrical voltage. The true greatness lies in human genius, which found a way to convert the endlessly varied information of the real world into a sequence of the zeros and ones of binary code, that is, to write it in a mathematical language ideally suited to the computer's electronic circuits.

And yet perhaps no other machine in history has brought such rapid and profound changes to our world. Thanks to computers, achievements as significant as landing spacecraft on the lunar surface and exploring the planets of the solar system became possible. Computers provide thousands of conveniences and services in our daily lives. They operate anesthesia equipment in operating rooms, help children learn in schools, and "invent" visual effects for the cinema. Computers have taken over the functions of typewriters in newspaper offices and of calculating machines in banks. They improve the quality of the television image, manage telephone exchanges, and total up purchases at the grocery-store checkout. In short, they have become so firmly embedded in modern life that it is almost impossible to do without them.

In recent years, truly dizzying progress has been made in the power of computers and the breadth of their use. This was made possible above all by the appearance, in the early 1970s, of a tiny technological miracle called the microprocessor. On a small silicon chip, smaller than an infant's fingernail, hundreds of thousands of electronic components are placed, surpassing in performance the room-filling dinosaurs that dominated the computer world only a few years before.

Despite such rapid progress in our day, the laying of the foundation of the computer revolution proceeded slowly and far from smoothly. The starting point of this process can be considered the invention of the abacus, made more than 1,500 years ago, apparently in the countries of the Mediterranean. Merchants used this simple device, consisting of a set of beads strung on rods, for their calculations. Arithmetically, the rods of the abacus correspond to the digit positions of a number system: each bead on the first rod has a value of 1, each bead on the second rod a value of 10, and so on. The abacus proved to be a very effective tool, soon spread around the world, and in some countries is still in use to this day. Until the 17th century, which was marked by an unprecedented upsurge of creative thought, the abacus remained practically unrivaled as a computing tool.
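To make the place-value idea concrete, the following minimal Python sketch (the function name and the sample number are purely illustrative) decomposes a number into the count of beads that would be moved on each rod of such an abacus.

def abacus_rods(n):
    # Rod 0 holds ones, rod 1 holds tens, rod 2 holds hundreds, and so on;
    # the value kept on each rod is the number of beads moved on it.
    rods = []
    while n > 0:
        rods.append(n % 10)
        n //= 10
    return rods

# 1582 = 2 ones + 8 tens + 5 hundreds + 1 thousand
print(abacus_rods(1582))   # [2, 8, 5, 1]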

1617. In Napier's counting device, multiplication was performed by adding the numbers located in adjacent segments.

European thinkers of that era were captivated by the idea of creating counting devices. One of the most prolific inventors was the Scotsman John Napier, a theologian, mathematician, and deviser of a "weapon of death": he proposed building a system of mirrors and lenses that would strike a target with a deadly ray of sunlight. A far more noticeable mark in history, however, was left by his invention of logarithms, published in 1614. A logarithm is the power to which a number (the base of the logarithm) must be raised in order to obtain another given number. Napier realized that any number can be expressed in this way. For example, 100 is 10 squared, and 23 is 10 to the power 1.36173. Moreover, he discovered that the sum of the logarithms of two numbers a and b is equal to the logarithm of their product. Thanks to this property, the difficult operation of multiplication is reduced to the simple operation of addition. To multiply two large numbers, one has only to look up their logarithms in a table, add the values found, and then find the number corresponding to this sum in an inverse table, a table of antilogarithms.
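In modern terms, that table-lookup procedure can be imitated with a computer's built-in logarithm function. The short Python sketch below (the values 100 and 23 come from the example above; everything else is illustrative) multiplies two numbers by adding their base-10 logarithms and taking the antilogarithm of the sum.

import math

# Multiplication via logarithms, in the spirit of Napier's tables:
# look up the logarithms, add them, then take the antilogarithm of the sum.
a, b = 100, 23
log_sum = math.log10(a) + math.log10(b)   # 2 + 1.36173... = 3.36173...
product = 10 ** log_sum                   # the antilogarithm of the sum
print(round(product))                     # 2300, i.e. 100 * 23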

Napier's tables, whose calculation required a great deal of time, were later "built into" a convenient device that greatly speeds up computation: the slide rule, invented in the late 1620s. In 1617 (the year of his death), Napier also devised another, non-logarithmic, method of multiplying numbers. The instrument, known as Napier's bones, consisted of a set of segmented rods that could be arranged so that, by adding the numbers in horizontally adjacent segments, one obtained the result of their multiplication. Napier's theory of logarithms was destined to find extensive application, but his "bones" were soon supplanted by the slide rule and other computing devices, mostly of the mechanical type.
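The way such rods are read off can be sketched in Python for the basic case of multiplying a multi-digit number by a single digit (the function name and sample values are illustrative and not taken from the original text): each rod contributes a two-digit product, and the answer is assembled by adding the figures that fall into adjacent diagonal positions.

def napier_bones(number, digit):
    # Each rod carries the product of one digit of `number` with `digit`,
    # split into a tens part and a units part.
    digits = [int(c) for c in str(number)]
    entries = [d * digit for d in digits]
    tens = [e // 10 for e in entries]
    units = [e % 10 for e in entries]
    # Read the diagonals from right to left: the units of the last entry,
    # then units[i] + tens[i + 1] for each pair of adjacent rods,
    # and finally the tens of the first entry.
    diagonals = [units[-1]]
    for i in range(len(entries) - 2, -1, -1):
        diagonals.append(units[i] + tens[i + 1])
    diagonals.append(tens[0])
    # Propagate carries to turn the diagonal sums into the final number.
    result, carry, place = 0, 0, 1
    for s in diagonals:
        s += carry
        result += (s % 10) * place
        carry = s // 10
        place *= 10
    return result + carry * place

print(napier_bones(425, 6))   # 2550, the same as 425 * 6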

