The history of the creation of the computer: the emergence of computers. It is customary to divide electronic computers into generations.

History of origin: The word "computer" came to us from the distant eighteenth
century, when it first appeared in the Oxford Dictionary.
Initially, a computer was understood to be a calculator; that is the literal
translation of the word from English. The meaning differed from today's
in that it could be applied to absolutely any computing device,
not necessarily an electronic one.

If we take the literal meaning of the word, then any counting device
can be considered a computer (even the counting sticks you may remember
from childhood). But the most logical device to call a computer, a device
for computing, is the ancient abacus. Why the abacus? Many probably
remember the time when shop sellers had not calculators but abacuses
with wooden beads. Remember how deftly the saleswomen handled them?
On an abacus you can not only add and subtract; you can multiply and
divide, and perform calculations with fractions and even powers. You
just have to know how. So, in essence, the abacus is the first
calculator, that is, the first computer.
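
To see why the abacus really is a digital computing device, here is a minimal sketch in C (a hypothetical `Abacus` model, not a historical reconstruction) that treats each rod as a decimal digit and performs addition the way an operator does: rod by rod, carrying a bead upward on overflow.

```c
#include <stdio.h>

#define RODS 8  /* number of rods (decimal digit positions); an arbitrary choice */

/* rods[0] is the ones rod, rods[1] the tens rod, and so on.
   Each rod holds 0..9 beads, exactly like a decimal digit. */
typedef struct { int rods[RODS]; } Abacus;

static void abacus_set(Abacus *a, unsigned long value) {
    for (int i = 0; i < RODS; i++) { a->rods[i] = (int)(value % 10); value /= 10; }
}

/* Addition as done on a real abacus: combine bead counts rod by rod,
   carrying one bead to the next rod whenever a rod overflows past 9. */
static void abacus_add(Abacus *a, unsigned long value) {
    int carry = 0;
    for (int i = 0; i < RODS; i++) {
        int sum = a->rods[i] + (int)(value % 10) + carry;
        a->rods[i] = sum % 10;
        carry = sum / 10;
        value /= 10;
    }
}

static void abacus_print(const Abacus *a) {
    for (int i = RODS - 1; i >= 0; i--) printf("%d", a->rods[i]);
    printf("\n");
}

int main(void) {
    Abacus a;
    abacus_set(&a, 4728);
    abacus_add(&a, 359);   /* 4728 + 359, carried rod by rod */
    abacus_print(&a);      /* prints 00005087 */
    return 0;
}
```

Multiplication and division on an abacus work the same way in principle, as sequences of shifted additions and subtractions, which is also how early mechanical calculators implemented them.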

However, mankind invented a great many such computing devices and
mechanisms in the interval between the ancient abacus and the modern
computer.
One of them is a Greek counting mechanism, found at the turn of the
twentieth century and dated to roughly the eightieth year BC, known as
the Antikythera mechanism.
Even the legendary Leonardo da Vinci did not pass over the topic of
calculating machines. True, all we have received from him are drawings
of his designs, and it is not known whether the inventor himself ever
built them in reality. But our contemporaries have, and the device is
said to be quite workable.

In the seventeenth century mechanical calculators appeared, the
so-called counting clocks. Why clocks? Because the mechanism resembled
a clock: a gear transmission. The "parent" of these counting clocks
was Wilhelm Schickard.
A little later, in the same seventeenth century, slide rules, straight
and circular, were invented.
In general, the seventeenth century turned out to be very rich in the
invention of all sorts of computing devices.

In that same seventeenth century a mechanism was invented that could
add and subtract decimal numbers. And it was not only invented, but
quite successfully built! It was called the "Pascaline", after the name
(or rather the surname) of its creator, Blaise Pascal.
At the end of the same seventeenth century, the German scientist
Gottfried Leibniz created an adding machine.

The adding machine kept being improved; many people had a hand in it,
but the modern computer was still far away.
By the end of the nineteenth century adding machines were already
working from special tables, and thanks to the advent of electricity
an electrical tabulating system was invented.
In general, from the second half of the nineteenth century the process
of improvement began, as they say, to gather momentum like a snowball.
And who did not take part in it! Scientists of all countries, as if
competing with one another, improved, refined, and added.
By the beginning of the twentieth century adding machines were not only
adding and subtracting; they performed complex calculations with
fractions and integrals.

In the first half of the twentieth century, in America, at the
Massachusetts Institute of Technology, an analog computer appeared
thanks to Vannevar Bush. Well, not quite a computer, of course;
rather, a mechanical integrating calculator. And it was used (who
would doubt it) by the military, for calculating projectile
trajectories.
Only in the forties of the last century did a machine appear that had
the basic functions of a modern computer. It was the so-called Z3,
created by the German scientist Konrad Zuse.

With the outbreak of World War II, work on improving computers
intensified.
The Americans built the Mark I computer for their armed forces,
similar to Bush's existing computing machine but more advanced and
faster in its calculations.
The Germans invented a cipher machine; in response, the British built
a decryption machine. And so on.
A decade after the end of the war, when the "competition" between the
USA and the USSR had gathered momentum, American scientists created a
transistor computer.

A year later, scientists in the USSR created a computing machine
(the Setun) based on the ternary number system.
But all these electronic computers, though they ran on electricity
rather than on the mechanical "turning of a handle", were so bulky
that they occupied whole rooms. One room, one computer! They consumed
a great deal of energy, and a considerable staff of workers was needed
to service such a structure. In fact, the people who worked with such
computers worked inside them.
Only from the mid-sixties of the twentieth century, with the invention
of miniature components that gradually replaced vacuum tubes and bulky
resistors, did the size of the computer begin to shrink rapidly,
gradually reaching the dimensions familiar to us today.

In the early seventies of the twentieth century, the microprocessor
was developed by Ted Hoff. But the creator of the computer that gained
wide adoption and became truly personal was Steve Wozniak, one of the
founding fathers of the famous and popular Apple company.

That computer was bulky. The monitor, built on the principle of a
television set, took up a lot of space and was not color. The programs,
too, left much to be desired. In fact, it was an overgrown calculator.

It seems the computer is a child of many "parents". After all, a great
many people contributed to the improvement of the calculator, which is,
in essence, what a computer is. But the modern computer is not
concerned with calculation alone. After all, there had to be someone
able to "teach" our computers everything they can do.
Indisputably. Perhaps the computer as we know it today appeared thanks
to the competition between two firms, Microsoft and Apple.

That is, among the creators of the computer we are used to, we should
include Bill Gates and Steve Jobs, together with their associates,
developers, and programmers...
It was their firms that, competing for potential buyers of their
products, kept improving the software, refining the appearance of
their products, increasing performance, and achieving greater
compactness.

Today almost every home has a computer, and often more than one. It
has become an integral part of our life, especially with the advent of
the Internet. The computer is an assistant at work, a storehouse of
information, communication, entertainment... We can no longer imagine
life without a computer. Today it is difficult for us to understand
why the older generation cannot master simple actions and treats the
computer with an almost biased caution. And the younger generation
cannot even imagine that until recently there were no computers at all!

Computers and computer technology are constantly evolving, gaining
momentum in power and in their mechanical "mind". A huge drop in the
price of computer systems has meant that now even the smallest
enterprise does not do without a computer. Computers have penetrated
almost all spheres of human activity: production, medicine,
agriculture, educational institutions, and more.

It is now very hard to imagine the office worker of the past, with
thick piles of papers, shelves lined with folders, and bills in hand.
All of this has been replaced by electronic, virtually dimensionless
computer memory. The modern office has rid itself of bureaucratic
routine, the manual filling out of documents, and dusty clerical
drudgery. All thanks to our electronic assistant, the computer.

Imagine how much effort it once took an architect and a designer to
create the design of a new building or complex of structures. It was
necessary to make a scale model, create hundreds and thousands of
drawings, and calculate the characteristics of the building materials.
All this took a great deal of time and effort from a whole staff of
workers. Today most of these tasks are handed over to the computer.

With the arrival of the computer in trade, the era of mechanical
scales and cash registers ended. The seller's work, including weighing
the goods, pricing them, and totaling the amount, has accelerated
significantly. It is enough to pass the barcode of the goods in front
of the electronic eye, and the computer itself will calculate the
cost, announce the total, and issue a receipt to the buyer. Shops and
market stalls have finally rid themselves of agonizing time spent in
queues.

Remove the computer from medical centers and we return a hundred years
into the past, when making a diagnosis or examining the body was
incredibly difficult. Without an ultrasound examination, a young
mother could not know for certain, right up to the moment of birth,
how the pregnancy was progressing or what complications childbirth
might bring. With the help of computed tomography, the doctor can look
into the most hidden areas of the patient's brain and thus prevent
possible disorders in advance or, if they cannot be avoided, correctly
determine the method of treatment and increase the chances of
recovery.

A special place is occupied by the computer in educational
institutions. Such an impartial and patient teacher is impossible to
find among people. Assignments in the form of e-learning materials are
often entertaining or even playful in character, so the material is
absorbed by students far more effectively, without causing boredom or
fatigue. And the computer as an examiner will never lean on personal
grievances against a student; it will give a grade based only on
practical knowledge.

Conclusion

The computer has become an essential part of human life. Thanks to the
computer, we have progress in science, medicine, and much else. Now,
through a computer, you can make purchases online, transfer data, and
much more.

The first devices. We can only speculate about when mankind learned to count, but it is safe to say that our ancestors used their fingers for simple counting, a method we still successfully use today. And what to do if you need to remember the results of calculations, or to count something greater than the number of your fingers? In that case you can make notches in wood or bone, and most likely that is what the first people did, as archaeological excavations testify. Perhaps the oldest such instrument found is a notched bone discovered at the ancient settlement of Dolní Věstonice in the south-east of the Czech Republic, in Moravia. This object, called the Vestonice bone, was presumably in use around 30,000 years BC. Although rather complex systems of calculation had been invented by the dawn of human civilization, the use of notches for counting continued for quite some time. For example, around 2000 BC a ruler divided into sixteen equal parts was carved into the lap of a statue of the Sumerian ruler Gudea. One of these parts was in turn divided into two, the second into three, the third into four, the fourth into five, and the fifth into six equal parts; in that fifth part, the length of each division was 1 mm.


First generation of computers (years). This period marked the beginning of the commercial use of electronic computers for data processing. The computing machines of that time used vacuum tubes and external memory on a magnetic drum. They were entangled in wires and had an access time of 1×10⁻³ s. Production systems and compilers had not yet appeared. At the end of this period, magnetic-core memory devices began to be produced. The reliability of this generation of computers was extremely low.


Second generation of computers (years). The element base of the machines of this generation was semiconductor devices. The machines were designed to solve various labor-intensive scientific and technical problems, as well as to control technological processes in production. The advent of semiconductor elements in electronic circuits significantly increased the capacity of random-access memory and the reliability and speed of computers, and reduced their size, weight, and power consumption. With the advent of second-generation machines, the scope of electronic computing technology expanded significantly, mainly thanks to the development of software. Specialized machines also appeared: for example, computers for solving economic problems, for managing production processes, for information transmission systems, and so on.


This period is characterized by the widespread use of transistors and advanced magnetic-core memory circuits. Much attention began to be paid to the creation of system software, compilers, and input-output devices. At the end of this period, universal and fairly effective compilers for COBOL, FORTRAN, and other languages appeared. An access time of 1×10⁻⁶ s had already been achieved, although most of the elements of the computer were still connected by wires. The computing machines of this period were successfully used in areas involving the processing of data sets and problems that usually require routine operations: in factories, offices, and banks. These computers worked on the principle of batch processing; in essence, manual data-processing methods were copied, and the new possibilities offered by computers went practically unused. It was during this period that the profession of computer scientist emerged, and many universities began to offer education in this field.


Third generation of computers (years). The third-generation machines included the Dnepr-2, the Unified System computers (EC-1010, EC-1020, EC-1030, EC-1040, EC-1050, EC-1060 and several of their intermediate modifications, such as the EC-1021), the MIR-2, the Nairi-2, and a number of others. This period is associated with the rapid development of real-time computers. A tendency emerged whereby, in control tasks, there is a place for small machines alongside large computers: it turned out that a mini-computer copes exceptionally well with the control functions of complex industrial installations, where a large computer often fails. Complex control systems are divided into subsystems, each served by its own mini-computer, while a large real-time computer is assigned supervisory planning tasks in the hierarchy, coordinating the management of the subsystems and processing the central data about the controlled object.


Fourth generation of computers (years). The software for small computers was at first quite elementary, but by 1968 the first commercial real-time operating systems appeared, along with high-level programming languages specially developed for them and cross-development systems. This made small machines available for a wide variety of applications. Today it is hardly possible to find a branch of industry in which these machines are not successfully used in one form or another. Their functions in production are very diverse: simple data-collection systems, automated test benches, process control systems. It should be emphasized that the control computer now increasingly intrudes into the field of commercial data processing as well, where it is used to solve commercial problems.


Fifth generation of computers. In parallel with the hardware improvement of modern computers, technologies are being developed to extend the processor's instruction set. The first development in this area was MMX (MultiMedia eXtension) technology, which could turn a "simple" Pentium PC into a powerful multimedia system. With MMX, Intel aimed at two goals: first, to take advantage of unused capabilities of the processor, and second, to increase CPU performance on typical multimedia programs. For this purpose, additional instructions (57 in total) and additional data types were added to the processor's instruction set, with the registers of the floating-point unit doubling as working registers.
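
As a rough illustration of what those new instructions do, here is a minimal C sketch using the MMX compiler intrinsics from `mmintrin.h`: a single instruction adds four 16-bit values at once instead of looping over them one by one. This is only a sketch, assuming an x86 compiler with MMX support; modern code would use SSE or later extensions for the same purpose.

```c
#include <stdio.h>
#include <string.h>
#include <mmintrin.h>   /* MMX intrinsics; requires an x86 compiler with MMX enabled */

int main(void) {
    /* Pack four 16-bit samples into one 64-bit MMX register each. */
    __m64 a = _mm_set_pi16(1000, 2000, 3000, 4000);
    __m64 b = _mm_set_pi16(10, 20, 30, 40);

    /* One instruction (PADDW) adds all four pairs in parallel. */
    __m64 sum = _mm_add_pi16(a, b);

    short out[4];
    memcpy(out, &sum, sizeof out);

    /* Clear the MMX state: these registers are shared with the floating-point
       unit, which is exactly the register reuse described above. */
    _mm_empty();

    printf("%d %d %d %d\n", out[0], out[1], out[2], out[3]);  /* 4040 3030 2020 1010 */
    return 0;
}
```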



Slide number 1


Slide number 2

Do you know how computers actually evolved before taking on the look we are used to? It was a rather long story with numerous discoveries, each of which gradually propelled humanity towards the digital age.


Slide number 3

2700 BC: the abacus. Although the exact place and date of the abacus's invention are still in doubt, it is likely that it was invented by the Sumerians (a people of southern Mesopotamia) about 5,000 years ago. Its special beads made it possible to perform quick and rather complex calculations, so the abacus can be called the first computer.


Slide number 4


Slide number 5


Slide number 6

Blaise Pascal (1623 - 1662). The machine went down in the history of computing technology under the name "Pascaline". While working on the device, Pascal built more than 50 different models of his machine, experimenting not only with materials but also with the shape of the machine's parts. The first working machine was made as early as 1642, but its final version did not appear until 1654.


Slide number 7

Gottfried Wilhelm Leibniz (1646 - 1716). In 1670 the German philosopher, mathematician, and physicist Gottfried Wilhelm Leibniz gave the first description of his arithmetic instrument, which made it possible to add, subtract, multiply, divide, and extract square roots; he also described the binary number system. The final version of the machine was completed in 1710. It was a more sophisticated device, using a movable element (the prototype of the carriage) and a handle with which the operator rotated a wheel.
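
Since the passage mentions the binary number system that Leibniz described, here is a minimal C sketch of the place-value idea behind it: repeatedly dividing by two yields a number's binary digits. (Picking 1703 is just a nod to the year Leibniz published his paper on binary arithmetic.)

```c
#include <stdio.h>

/* Print the binary digits of n, most significant digit first. */
static void print_binary(unsigned n) {
    if (n > 1)
        print_binary(n / 2);   /* emit the higher-order digits first */
    putchar('0' + (int)(n % 2));
}

int main(void) {
    print_binary(1703);   /* prints 11010100111 */
    putchar('\n');
    return 0;
}
```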


Slide number 8

Charles Babbage (1792 - 1871) In 1822, a trial model of the Difference Engine was built, capable of calculating and printing large mathematical tables.


Slide number 9


Slide number 10

The development of computing technology, following the generally accepted classification, can be divided into the following stages: manual, from the 5th millennium BC; mechanical, from the middle of the 17th century; electromechanical, from the 1890s; electronic, from the 1940s.


Slide number 11

1801: the Jacquard loom. Designed by Joseph Marie Jacquard, it was the first machine to use punched cards to control a sequence of operations. To change the pattern of the fabric being woven, one changed the punched cards. This was a kind of binary code, on the principle "hole or no hole". The Jacquard loom was a key step in the development of computer programming.
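
To make the "hole or no hole" principle concrete, here is a small C sketch (with invented card data, not an actual Jacquard card layout) that treats each card row as a byte of bits, where a set bit, a "hole", lifts the corresponding warp thread.

```c
#include <stdio.h>

/* One row of a hypothetical 8-hook card is one byte: bit set = hole = thread lifted. */
static void weave_row(unsigned char card_row) {
    for (int hook = 7; hook >= 0; hook--)
        putchar((card_row >> hook) & 1 ? '#' : '.');  /* '#' lifted, '.' down */
    putchar('\n');
}

int main(void) {
    /* An invented sequence of card rows producing a simple diamond pattern. */
    unsigned char cards[] = { 0x81, 0x42, 0x24, 0x18, 0x18, 0x24, 0x42, 0x81 };
    for (size_t i = 0; i < sizeof cards; i++)
        weave_row(cards[i]);
    return 0;
}
```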


Slide number 12

At the end of the 19th century, Herman Hollerith in America invented counting-and-punching machines. They used punched cards to store data. Each such machine could execute only one specific program, manipulating the punched cards and the numbers punched on them. These machines carried out punching, sorting, summing, and the printing of numerical tables, and they could solve many typical problems of statistical processing, accounting, and the like.
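
The core of what such a tabulating machine did is easy to sketch. Below is a C illustration (invented data, not Hollerith's actual card format) of tallying a deck of cards by a single punched column, which is essentially how the 1890 census counts were produced.

```c
#include <stdio.h>

int main(void) {
    /* A hypothetical deck: each card's column holds a category code 0-9. */
    int cards[] = { 3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5 };
    int tally[10] = { 0 };

    /* The tabulator reads each card and advances the counter wired to the
       hole it senses; a sorter drops the card into the matching pocket. */
    for (size_t i = 0; i < sizeof cards / sizeof cards[0]; i++)
        tally[cards[i]]++;

    for (int code = 0; code < 10; code++)
        if (tally[code])
            printf("code %d: %d cards\n", code, tally[code]);
    return 0;
}
```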


Slide number 13

The first computer, a universal machine based on vacuum tubes, was built in the USA in 1945. This machine was called ENIAC (Electronic Numerical Integrator and Computer). Its designers were J. Mauchly and J. Eckert. The counting speed of this machine exceeded that of the relay machines of the time a thousandfold.


Slide number 14

In our country, the first computer was created in 1951. It was called MESM, the Small Electronic Calculating Machine. Its designer was Sergei Alekseevich Lebedev. Under Lebedev's leadership in the 1950s, the serial vacuum-tube computers BESM-1 (Large Electronic Calculating Machine), BESM-2, and M-20 were built. At the time, these machines were among the best in the world.


Slide number 15

Second-generation computers were built on transistors; they took up less space, consumed less electricity, and were more reliable. The highest achievement of the domestic computer technology created by S. A. Lebedev was the development in 1966 of the semiconductor computer BESM-6, with a speed of one million operations per second.


Slide number 16

The third generation of computers owes its existence to the integrated circuit (IC): a single crystal in whose miniature package transistors, diodes, capacitors, and resistors are concentrated. The processors were created on the basis of planar diffusion technology.


Slide number 17

Improvements in integrated circuits led to the emergence of microprocessors made on a single chip, including RAM on large-scale integrated circuits (LSI), which marked the transition to the fourth generation of computers. Computers became smaller, more reliable, and cheaper. The creation of fourth-generation computers led to the rapid development of mini- and especially micro-computers, that is, personal computers (1968), which gave the mass user a means of amplifying their intellectual capabilities.

Slide 1

Slide 2

In recent years, computer technology has been developing rapidly. The computer is being introduced into almost all areas of our life, but few people know where computer technology came from and who invented it. The purpose of my work is to study the history of one of the most important objects of modern life: the computer.

Slide 3

The word computer comes from the English verb to compute, "to calculate". At first, counting was inseparable from the fingers; fingers became the first computing technique. The turning point came with the invention of the abacus. Even if you have not heard this word, you have surely met, and more than once, the Russian version of this device: the counting frame.

Slide 4

But as life developed, calculations became more complex, and people wanted to entrust counting to a machine. Around 1623, the German scientist Wilhelm Schickard invented the first counting mechanism in history. In 1642, the French scientist Blaise Pascal created a machine that could add and subtract. In 1672, Wilhelm Leibniz created an adding machine that could also multiply and divide.

Slide 5

In the 19th century, the Englishman Charles Babbage developed the design of a machine that can be called the first computer. But he was never able to build it, since no one wanted to finance his project.

Slide 6

In 1944, the Mark-1 machine was created at IBM by order of the US Navy. It was a monster weighing about 35 tons.

Slide 7

But "Mark-1" did not work fast enough and in 1946 the first electronic machine ENIAC was built. Its weight was 30 tons and it required 170 m2 of space. The ENIAC contained 18,000 lamps, which emitted so much light that flying insects caused malfunctions.

Slide 8

In 1947, the Americans invented the transistor. One transistor replaced 40 vacuum tubes. As a result, operating speed increased tenfold, and the weight and size of the machines decreased. A new computer era had begun: second-generation computers appeared.

Slide 9

In 1959, integrated circuits (chips) were invented. Computer speed increased tenfold, and the dimensions of the machines shrank noticeably. The chip marked the birth of the third generation of computers. The first home computer kit, the Altair-8800, was simply a case and a set of parts: to work with it, you had to do the soldering yourself, assemble all the components, and master programming.

Slide 10

In the 1970s, the American company Apple created the first personal computer. In 1977, the Apple II was released, which already had a keyboard, a monitor, sound, and a plastic case.

Slide 11

The first computer to come with a mouse was the Xerox 8010. The manipulator got the name "mouse" because of the resemblance of its signal wire to a mouse's tail (in early models the wire came out of the back).

Slide 12

After that, computer technology began to develop rapidly. Every year new technologies and new models of computers appeared.

Slide 1

Slide 2

1930. Vannevar Bush designs the differential analyzer: in effect, the first successful attempt to create a computer capable of performing cumbersome scientific calculations. Bush's role in the history of computer technology is very large, but his name comes up most often in connection with the prophetic article "As We May Think" (1945, by the way), in which he described the concept of hypertext.

Slide 3

1934. Forced to perform many calculations of the same type, the German engineer Konrad Zuse tries to improve the adding machine. As a result, he arrives at a completely original idea: an automatic calculator consisting of a program control unit, a memory, and a computing module.

Slide 4

1937. Alan Turing first describes the Turing machine, and John Atanasoff develops the principles of the first electronic digital computer.

Slide 5

1938. William Hewlett and David Packard found the Hewlett-Packard Company. Initially the company is based in a garage, which in time would become a matter of good form for startups. According to legend, there were two candidate names for the company, and the familiar acronym HP was chosen by a coin toss.

Slide 6

1943. Construction begins on perhaps the most famous of the large computers, ENIAC (Electronic Numerical Integrator And Computer). Completed three years later, ENIAC weighs 30 tons, consists of 18 thousand vacuum tubes, and performs five thousand operations per second. The computer will live for nine years and will be switched on for the last time in 1955. In December, the British computer Colossus, the first fully electronic computing device, is completed. Its main purpose is the decryption of secret messages encoded by German cipher machines. A total of ten Colossi were built, but all were destroyed once they were no longer needed: according to the British intelligence services, the Colossus was such an advanced development that it was no sin to destroy it, as long as it did not fall into the wrong hands.

Slide 7

1945. It occurs to John von Neumann that it would be good to store programs somewhere instead of entering them anew every time. Grace Hopper, while working on the Mark II computer, discovers a moth that has burned out one of the relays. An entry appears in the laboratory journal: "First actual case of bug being found". So if some program of yours does not work correctly, it is quite possible that it is not a bug at all, just a moth that burned out a relay. Meanwhile, the same Konrad Zuse begins work on Plankalkül, the first algorithmic programming language.

Slide 8

1947. Bell Labs engineers William Shockley, John Bardeen, and Walter Brattain invent the transistor; nine years later they will share the Nobel Prize in physics. Norbert Wiener introduces the term "cybernetics". John Presper Eckert and John Mauchly of the ENIAC project start working on UNIVAC. The latter will convincingly demonstrate its power in 1952, processing preliminary voting data and "predicting" Eisenhower's victory in the presidential election.

Slide 9

1949. Popular Mechanics magazine makes a bold prediction: "In the future, there may be computers weighing less than one and a half tons", which comes true unexpectedly soon: a Mark is created in Manchester, a namesake of the Harvard Mark, nicknamed Baby because of its small size. The "baby" weighed only a ton. MIT is working on the Whirlwind project, the first real-time computer, and John Mauchly comes up with Short Order Code, the first high-level programming language.

Slide 10

1951. Grace Hopper creates the world's first compiler, A-0, and the indefatigable William Shockley builds the junction transistor.

Slide 11

1953. Magnetic-core memory is invented. There are already a hundred computers in the world. IBM launches its first electronic computer, the IBM 701; in three years, as many as nineteen units were sold.

Slide 12

1954-57. IBM engineer John Backus and his colleagues begin to develop the FORTRAN programming language (FORmula TRANslation); their work will be completed only three years later. The first dot-matrix printers and the prototypes of the first hard drives (IBM 305 RAMAC) appear. Isaac Asimov invents the MULTIVAC supercomputer; all of its surviving documentation is contained in his stories "Anniversary", "The Last Question", "Franchise", and others. Life is becoming more and more fun: inspired scientists hold the first conference on artificial intelligence.

Slide 13

1958. Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently invent the integrated circuit.

Slide 14

1959-64. The COBOL language is developed, which became the main programming language of the 1960s and 70s. A little later, ALGOL and BASIC are born. DEC begins selling the PDP-1, the first commercial mini-computer (about the size of a car) with a monitor and keyboard; fifty systems were sold at $120,000 each. The PDP-1 became, in fact, the first gaming platform, thanks to MIT student Steve Russell, who wrote the computer game Spacewar! for it. The American National Standards Institute adopts the ASCII encoding table. Doug Engelbart invents and patents the mouse: far from his only invention, but the one that would make him famous several decades later.
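
As a quick aside on what the ASCII table standardized: every character is just a small integer code, so text can be stored, compared, and sorted as numbers. A minimal C sketch:

```c
#include <stdio.h>

int main(void) {
    /* ASCII assigns each character a 7-bit code: 'A' is 65, 'a' is 97. */
    for (char c = 'A'; c <= 'E'; c++)
        printf("'%c' = %d\n", c, c);
    return 0;
}
```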

Slide 15

1965. Maurice Wilkes first talks about cache memory, Gordon Moore formulates Moore's Law, and Donald Davies invents packet switching. DEC produces a cheap computer, the PDP-8, for only $18,000; cheaper than that would be free. This year also sees an undoubtedly landmark event: the launch of Project MAC, whose development would lead to the Multics OS, which in turn would lead to the Unix OS.

Slide 16

1966-69. In 1967, the development of Simula, the first object-oriented programming language, is completed. In America, the YYMMDD format is adopted as the standard for storing dates, which thirty years later would cause the Y2K problem (which, in the end, led to nothing special). One of the most important events of the late 60s is the departure of Robert Noyce and Gordon Moore from Fairchild Semiconductor. At first they wanted to name their new company Moore Noyce, but in the end decided to call it Intel (INTegrated ELectronics). Unfortunately, this name was already taken: it belonged to a small motel chain, from which they had to buy all the rights to the Intel trademark. Soon a few more people left Fairchild Semiconductor, led by Jerry Sanders. Can you guess what company they organized? That's right: AMD. Around the same time, an unknown IBM engineer exclaimed about microchips: "But what is it for?", which probably explains why history has not preserved his name. The 60s end with the launch of the military ARPANet project, which will gradually be reborn as the Internet, forget its militaristic roots in 1990, and be "removed from the register".
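
A tiny illustration of why two-digit YYMMDD dates break at the century boundary: compared as plain numbers (or strings), a date in 2000 or 2001 sorts before a date in 1999. The sample dates below are invented.

```c
#include <stdio.h>

int main(void) {
    /* Two-digit-year YYMMDD timestamps, as many 1960s systems stored them. */
    int last_backup = 991231;   /* 31 Dec 1999 */
    int today = 10105;          /* YYMMDD 010105: 5 Jan 2001 (or is it 1901?) */

    /* Numeric comparison claims "today" comes before the 1999 backup. */
    if (today < last_backup)
        printf("Y2K bug: 010105 sorts before 991231\n");
    return 0;
}
```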

Slide 18

1971. The world's first microprocessor, the Intel 4004, is developed in the bowels of Intel. This miracle can perform as many as 60 thousand operations per second and costs only 300 dollars. Ray Tomlinson sends the first email. His great predecessors, Morse and Bell, did not rack their brains much over the content of their first messages either: Morse, for example, found nothing better than to tap out "What hath God wrought!". Tomlinson did not think long either; he sent himself the message "QWERTYUI". The first pocket calculator, the Pocketronic, is launched. Within a year the world will be swept by "calculator fever", and in most of the civilized world the slide rule will finally pass into history. Niklaus Wirth develops Pascal. The language, which Wirth saw as a means of teaching the principles of programming, took root among programmers thanks to the efforts of Borland, and it is still alive: one of the most popular RAD tools, Delphi, is based on Object Pascal.

Slide 19

1972. Two iconic computer companies are founded: Nolan Bushnell, inspired by the success of his video game Pong, founds Atari, and Seymour Cray founds Cray Research. Three foundational programming languages, each in its own field, also appear: C, Smalltalk, and Prolog. The C language (invented, by the way, by one of the fathers of Unix, Dennis Ritchie) got its name because its now-forgotten predecessor was called B. The Telnet protocol appears. In 1973, the development of TCP will begin, and the development of FTP will end.