An electronic computer is a machine that processes data according to a series of instructions. The technical study of computers is called computer science, while data-centered study is called information technology.
There are many kinds of computers, but at bottom all of them are tools for processing information. According to the theory of the Turing machine, a computer with a certain minimal set of capabilities can, in principle, do anything that any other computer can do. Therefore, setting aside limitations of time and storage, everything from personal digital assistants (PDAs) to supercomputers can accomplish the same jobs. In other words, computers of one and the same design can serve purposes as varied as company payroll management and unmanned-spacecraft control, so long as they are programmed accordingly. Thanks to rapid technological progress, each new generation of computers dramatically outperforms its predecessors, a trend sometimes referred to as "Moore's Law".
Computers take many physical forms. Early computers were the size of a house, while some of today's embedded computers are smaller than a deck of playing cards. Even now, of course, a great many supercomputers serve large organizations with special scientific-computing or transaction-processing needs. Relatively small computers designed for personal use are called microcomputers, or simply "micros"; this is usually what we mean when we use the word "computer" in everyday life. However, the most common form of computer today is the embedded computer. Embedded computers are usually simple and small, and are used to control other devices, whether airplanes, industrial robots, or digital cameras.
The definition of an electronic computer above covers many special-purpose devices that can calculate or that have limited functions. When we speak of a modern electronic computer, however, its most important characteristic is that, given the correct instructions, any such computer can simulate the behavior of any other (limited only by its own storage capacity and execution speed). For this reason, modern electronic computers are also called general-purpose electronic computers, in contrast to their early counterparts.
History
ENIAC is a milestone in the history of computer development. The English word "computer" originally referred to a person engaged in data calculation, one who often relied on mechanical calculating devices or analog computers. The ancestors of those early devices include the abacus and the Antikythera mechanism, which dates to about 87 BC and was used by the ancient Greeks to calculate planetary motion. With the revival of mathematics and engineering in Europe at the end of the Middle Ages, Wilhelm Schickard led the way in 1623 with the first European calculating device: a "calculating clock" that could add and subtract numbers of up to six digits, announced its answers with a bell, and performed its operations with rotating gears.
In 1642, the French mathematician Blaise Pascal, building on the slide rule of William Oughtred, produced a calculating device capable of eight-digit arithmetic. It sold in quantity and became a fashionable item of its day.
In 1801, Joseph Marie Jacquard improved the design of the loom, using a series of punched paper cards as a program for weaving complex patterns. Although the Jacquard loom is not considered a true computer, its appearance was indeed an important step in the development of modern computing.
Charles Babbage was the first to conceive of and design a fully programmable computer, in about 1820. However, because of the limits of the technology, financial constraints, and his inability to stop tinkering with the design, the machine never materialized in his lifetime. By the end of the 19th century, many technologies that would later prove significant to computing had appeared one after another, including the punched card and the vacuum tube. Hermann Hollerith designed a tabulating machine that used punched cards to carry out large-scale automatic data processing.
In the first half of the 20th century, many single-purpose, increasingly complex analog computers were developed to meet the needs of scientific computation. These were mechanical or electrical models of the specific problems they targeted. In the 1930s and 1940s, computers grew more powerful and more general, and the key features of modern computers accumulated steadily.
In 1937, Claude Elwood Shannon published his landmark paper A Symbolic Analysis of Relay and Switching Circuits, the first to describe the application of digital electronics: he showed how switches could realize logical and mathematical operations. He then consolidated these ideas further through his work on Vannevar Bush's differential analyzer. This was a pivotal moment, marking the beginning of binary electronic circuit design and the use of logic gates. Among the pioneers of these key ideas were Almon Strowger, who patented a device containing logic gates; Nikola Tesla, who applied for patents on circuit devices with logic gates as early as 1898; and Lee De Forest, who replaced the relay with the vacuum tube in 1907.
The Amiga 500 computer, produced by Commodore in the 1980s.
Along such a long road of development, it is quite difficult to single out the "first electronic computer". On May 12, 1941, Konrad Zuse completed his electromechanical "Z3", the first machine to feature automatic binary arithmetic and practical programmability, though it was not "electronic". Other notable achievements include the Atanasoff-Berry Computer, completed in the summer of 1941, the world's first electronic computing device, which used a vacuum-tube calculator, binary values, and reusable memory; the secret British Colossus of 1943, which showed that vacuum-tube computing was reliable and could be reprogrammed electrically, although its programming ability was extremely limited; Harvard University's Harvard Mark I; and the decimal-based ENIAC (1944), the first general-purpose computer, whose design was inflexible enough that every reprogramming meant rewiring its electrical and physical circuits.
The team that developed ENIAC improved on the design in light of its defects, eventually arriving at the von Neumann (stored-program) architecture we know today, which is the foundation of all computers now in use. In the mid-to-late 1940s, computers based on this architecture began to be developed in large numbers, with Britain earliest among them. Although the first to be built and running was the Small-Scale Experimental Machine (SSEM), the first truly practical machine developed was probably EDSAC.
Throughout the 1950s, vacuum-tube computers dominated. On September 12, 1958, Jack Kilby of Texas Instruments demonstrated the first integrated circuit (Robert Noyce, later a co-founder of Intel, independently developed a monolithic integrated circuit soon after). Before long, the microprocessor followed. Computers designed between 1959 and 1964 are generally called second-generation computers.
In the 1960s, transistor computers took their place. Transistors were smaller, faster, cheaper, and more reliable, which made commercialization possible. Computers from 1964 to 1972 are generally called third-generation computers; they used integrated circuits in large numbers, the typical model being the IBM System/360 series.
In the 1970s, integrated-circuit technology greatly reduced the cost of producing computers, and computers began to make their way into ordinary households. Computers after 1972 are customarily called fourth-generation computers, based on large-scale integration (LSI) and later very-large-scale integration (VLSI). In April 1972, Intel introduced the 8008 microprocessor. In 1976, Stephen Wozniak and Steve Jobs founded Apple Computer and launched the Apple I. In 1977, Apple's second-generation computer, the Apple II, was released. On June 1, 1979, Intel released the 8088 microprocessor, with an 8-bit external data bus.
In 1982, microcomputers became popular and entered schools and homes in large numbers. In January 1982, the Commodore 64 was introduced at a price of 595 USD. In February 1982, the Intel 80286 was released; its clock frequency eventually rose to 20 MHz, and it added protected mode, access to 16 MB of physical memory, and support for up to 1 GB of virtual memory. It executed 2.7 million instructions per second and integrated 134,000 transistors.
In November 1990, the first-generation MPC (Multimedia PC) standard was released: the processor had to be at least an 80286 at 12 MHz (later raised to an 80386SX at 16 MHz), and the optical drive's transfer rate had to be at least 150 KB/s. On October 10, 1994, Intel released the 75 MHz Pentium processor. On November 1, 1995, the Pentium Pro was released, with clock frequencies up to 200 MHz, 440 million instructions completed per second, and 5.5 million transistors integrated. On January 8, 1997, Intel released the Pentium MMX, with enhanced games and multimedia capability.
Since then, computers have changed with each passing day, and Moore's Law, put forward in 1965, has been borne out again and again; the prediction is expected to remain applicable for the next 10 to 15 years.
Principle
The main components of a personal computer:
Monitor
Motherboard
CPU
Main memory (RAM)
Expansion cards
Power supply
Optical disc drive
Secondary storage (hard disk)
Keyboard
Mouse
Although computer technology has advanced rapidly since the first electronic general-purpose computers of the 1940s, today's computers still basically adopt the stored-program architecture, that is, the von Neumann architecture. It is this architecture that made the practical general-purpose computer a reality.
The stored-program architecture describes the computer as four main parts: the arithmetic logic unit (ALU), the control circuitry, the memory, and the input/output (I/O) devices. These components are connected by bundles of wires (a bundle used to carry data for several different purposes is called a bus) and are driven by a clock (although other events can drive the control circuitry as well).
Conceptually, a computer's memory can be viewed as a group of "cells". Each cell has a number, called its address, and each can store a small, fixed-length piece of information. That information can be an instruction (telling the computer what to do) or data (what the instructions operate on). In principle, any cell can store either.
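To make the cell model concrete, here is a minimal sketch in Python (illustrative only: real memories hold raw bit patterns, and the instruction names and cell assignments here are invented):

```python
# Memory as a row of numbered cells: each address holds either an
# instruction or a data value, and nothing about the cell itself says which.
memory = [0] * 16             # sixteen cells, addresses 0 through 15

memory[0] = ("LOAD", 14)      # cells 0-2 happen to hold instructions...
memory[1] = ("ADD", 15)
memory[2] = ("HALT", None)
memory[14] = 7                # ...while cells 14 and 15 hold data
memory[15] = 35

print(memory[14] + memory[15])   # the same cells, read back as data: 42
```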
The arithmetic logic unit (ALU) may be called the brain of the computer. It performs two kinds of operation. The first is arithmetic, such as adding or subtracting two numbers; the arithmetic repertoire of an ALU can be quite limited, and some ALUs do not support multiplication or division at the circuit level at all (in which case users must implement them in software). The second is comparison: given two numbers, the ALU determines which is larger.
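As a rough illustration (a hypothetical toy, not any real ALU's design), the two operation families can be modeled like this:

```python
# A toy ALU: a handful of arithmetic operations plus comparison.
def alu(op, a, b):
    if op == "ADD":
        return a + b
    if op == "SUB":
        return a - b
    if op == "CMP":                # comparison: which operand is larger?
        return (a > b) - (a < b)   # 1 if a > b, 0 if equal, -1 if a < b
    # Multiplication and division are deliberately absent: some ALUs
    # leave them to software, as noted above.
    raise ValueError("unsupported operation: " + op)

assert alu("ADD", 2, 3) == 5
assert alu("CMP", 7, 4) == 1
```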
The input/output system is the computer's means of receiving information from the outside world and of feeding results back to it. On a standard personal computer, the input devices are mainly the keyboard and mouse, while the output devices are the monitor, printers, and the many other I/O devices that can be attached.
The control system ties the parts of the computer together. Its job is to read instructions and data from memory and from the input/output devices, decode the instructions, hand the ALU the correct inputs the instructions call for, tell the ALU what to do with that data, and direct where the results should go. An important part of the control system is a counter that records the address of the current instruction. Normally the counter advances as each instruction executes, but when an instruction indicates a jump, this rule is overridden.
Since the 1980s, the ALU and the control unit (the two together forming the central processing unit, or CPU) have progressively been placed on a single integrated circuit, called a microprocessor. The working pattern of such a computer is quite intuitive: in a clock cycle, the computer fetches instructions and data from memory, executes the instructions, stores the data, and then fetches the next instruction. The process repeats until a halt instruction is reached.
As interpreted by the controller, the instruction set executed by the arithmetic unit is a very limited collection of simple instructions, generally falling into four categories: 1) data movement (for example, copying a value from memory cell A to memory cell B); 2) arithmetic and logic (for example, computing the sum of cells A and B and returning the result to cell C); 3) condition testing (for example, if the value in cell A is 100, take the next instruction from a different address); and 4) changes of instruction sequence (for example, jumping to another instruction address).
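The following sketch is a deliberately tiny stored-program machine in Python. The instruction names (MOV, ADD, JEQ, HALT) and the memory layout are invented for illustration, but the fetch-advance-execute rhythm and the four instruction categories match the description above:

```python
def run(mem):
    pc = 0                        # the counter: address of the current instruction
    while True:
        op, a, b, c = mem[pc]
        pc += 1                   # normally the counter simply advances...
        if op == "MOV":           # 1) data movement: copy cell a into cell b
            mem[b] = mem[a]
        elif op == "ADD":         # 2) arithmetic: cell a + cell b -> cell c
            mem[c] = mem[a] + mem[b]
        elif op == "JEQ":         # 3) + 4) condition test and sequence change:
            if mem[a] == b:       #    if cell a holds the value b,
                pc = c            #    ...the counter is overridden (a jump)
        elif op == "HALT":
            break

mem = [("ADD", 8, 9, 10),         # mem[10] = mem[8] + mem[9]
       ("JEQ", 10, 100, 3),       # if mem[10] == 100, jump to the HALT at cell 3
       ("MOV", 8, 10, 0),         # (skipped when the sum is 100)
       ("HALT", 0, 0, 0),
       0, 0, 0, 0,                # unused cells
       60, 40, 0]                 # data: two addends and a result cell
run(mem)
print(mem[10])                    # 100
```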
Instructions, like data, are represented in binary inside the computer. For example, 10110000 is the code of a copy (move) instruction on Intel x86 microprocessors. The instruction set a computer supports is that computer's machine language; adopting a popular machine language therefore makes existing software much easier to run on a new computer. As a result, developers of commercial software usually concentrate on one or only a few machine languages.
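Concretely, on x86 the byte 10110000 (hexadecimal B0) begins a "move an immediate value into the AL register" instruction; the snippet below just shows those raw bytes (the value 42 is an arbitrary example):

```python
# MOV AL, 42 encoded as machine code: opcode 0xB0 followed by the operand.
code = bytes([0b10110000, 42])
print(code.hex())   # b02a
```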
More powerful minicomputers, mainframes, and servers may depart from the model above; they commonly divide their workload among several CPUs. Today, microprocessors and multi-core personal computers are also moving in this direction.
Supercomputers usually have architectures that differ markedly from the basic stored-program machine. They often have thousands of CPUs, although such designs seem useful only for specialized tasks. Among other kinds of computers, some microcontrollers use the Harvard architecture, which keeps programs and data separate.
Digital circuit implementation of computers
The physical realization of these conceptual designs has taken many forms. As mentioned earlier, a stored-program computer can be mechanical or can be based on digital electronics. Digital circuits use electrically controlled switches, such as relays, to carry out arithmetic and logic on binary numbers. Shannon's paper showed exactly how relays can be arranged into logic gates that implement simple Boolean operations. Other researchers quickly pointed out that vacuum tubes could take the place of relay circuits. Originally used as amplifiers in radio circuits, vacuum tubes came to be used more and more as fast switches in digital electronics: when one pin of the tube is energized, current can flow freely between the other two terminals.
Through the arrangement and combination of logic gates, many complex tasks can be designed and accomplished. The adder is one example: a device that performs the addition of two numbers electronically and retains the result. In computer science, such a method of achieving a specific goal through a sequence of operations is called an algorithm. Eventually, complete ALUs and controllers were successfully assembled out of a considerable number of logic gates. How considerable? Consider CSIRAC, perhaps the smallest practical vacuum-tube computer: it contained 2,000 tubes, many of them dual-purpose devices, for a total of 2,000 to 4,000 logic devices.
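The sketch below models gates as Python functions on single bits and chains them into a 4-bit ripple-carry adder. It is a conceptual illustration of the gate-to-adder idea, not a description of any particular machine's circuitry:

```python
# Logic gates as functions on bits (0 or 1).
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three one-bit inputs; return (sum_bit, carry_out)."""
    partial = XOR(a, b)
    sum_bit = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return sum_bit, carry_out

def add4(x, y):
    """Chain four full adders, as a ripple-carry hardware adder would."""
    carry, result = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

assert add4(0b0101, 0b0011) == 0b1000   # 5 + 3 = 8
```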
Vacuum tubes were clearly unsuited to building gate circuits on a large scale: expensive, unstable (especially in quantity), bulky, power-hungry, and not fast enough, even though they far outran mechanical switching circuits. All of this led to their replacement by transistors in the 1960s; transistors are smaller, easier to handle, more reliable, more energy-efficient, and cheaper.
The integrated circuit is the foundation of today's electronic computers. After the 1960s, transistors began to be superseded by integrated circuits, which place large numbers of transistors, other electronic components, and their connecting wires on a single piece of silicon. In the 1970s, the ALU and the controller, the two parts of the CPU, began to be integrated onto one chip, called a "microprocessor". Over the history of the integrated circuit, the number of devices on a chip has grown rapidly: the first integrated circuits contained only a few dozen components, whereas by 2006 an Intel Core Duo processor contained as many as 151 million transistors.
Vacuum tubes, transistors, and integrated circuits can all serve as the "memory" component of the stored-program architecture by means of the flip-flop mechanism, and flip-flops are indeed used for small amounts of ultra-fast storage. Almost no computer design, however, uses flip-flops for data storage on a large scale. The earliest computers used Williams tubes, which wrote data onto a television-style screen with an electron beam, or mercury delay lines, along which sound waves traveled slowly enough that the line could be considered to "store" them, to be read back on arrival. These effective but inelegant methods were eventually replaced by magnetic storage: in magnetic-core memory, for example, a current representing information induces a persistent weak magnetic field in a ferrous material, and reading that field back recovers the data. Dynamic random access memory (DRAM) was invented as well: an integrated circuit containing a great many capacitors that hold the data as charge, with the level of charge defining the data's value.
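As a simplified picture of why a flip-flop "stores" a bit, here is a NOR-based set-reset latch simulated in Python (an idealized model: real latches settle through continuous electrical feedback rather than discrete iteration):

```python
def NOR(a, b): return 1 - (a | b)

def sr_latch(s, r, q=0, q_bar=1):
    """Two cross-coupled NOR gates; iterate until the outputs settle."""
    for _ in range(4):   # a few rounds suffice for the feedback to stabilize
        q, q_bar = NOR(r, q_bar), NOR(s, q)
    return q

print(sr_latch(s=1, r=0))   # set: the latch now "stores" 1
print(sr_latch(s=0, r=1))   # reset: the latch now "stores" 0
```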
Input/output devices
Input/output (I/O) devices is the general name for the devices that bring information into the computer from outside and the devices that return processing results to the outside world. The returned results may be experienced directly by the user, or they may be the input of other devices under the computer's control: for a robot, the output of the controlling computer is essentially the robot itself, as it carries out its various behaviors.
The first-generation computers had very limited kinds of input and output devices. The usual input device was a reader for punched cards, used to bring instructions and data into memory; the output device used to store the results was usually magnetic tape. As technology advanced, the richness of input and output equipment grew. Taking the personal computer as an example: the keyboard and mouse are the main tools by which users enter information directly, while the monitor, printer, speakers, and headphones return the results. There are also many input devices that accept other kinds of information, such as digital cameras for images. Among input/output devices, two categories deserve special attention. The first is secondary storage devices, such as hard disks, optical discs, and other devices that are slow but have large capacity. The second is network access devices, through which direct data transfer between computers greatly enhances their value. Today, the Internet lets tens of millions of computers transfer data of all kinds to one another.
Programs
Simply put, a computer program is a sequence of instructions to be executed by a computer. It may be a few instructions performing a simple task, or a complex queue of instructions manipulating great quantities of data. Many computer programs contain millions of instructions, many of which may be executed repeatedly. In 2005, a typical personal computer could execute about 3 billion instructions per second. Computers do not usually get their extraordinary capabilities from complicated instructions; rather, they run simple but numerous short instructions arranged by programmers.
Generally speaking, programmers do not write instructions for the computer directly in machine language; to do so would be time-consuming, laborious, inefficient, and full of bugs. Instead, programmers write programs in a "high-level" language, which special computer programs, interpreters or compilers, then translate into machine language. Some programming languages, such as assembly language, look much like machine language and are considered low-level; others, such as Prolog, rest on abstract principles that entirely ignore the details of the machine's actual operation, and may be described as high-level. For a given task, the language is chosen according to the nature of the task, the skills of the programmers, the available tools, and the customer's requirements, the customer's requirements being the most important (engineering projects in the United States and China are often required to use the Ada language).
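To give the flavor of such translation, the toy "compiler" below turns the high-level statement c = a + b into one instruction of the toy machine sketched earlier; the variable-to-cell mapping is invented for the example:

```python
variables = {"a": 8, "b": 9, "c": 10}   # variable name -> memory cell address

def compile_add(stmt):
    """Translate 'target = left + right' into one toy ADD instruction."""
    target, expr = stmt.replace(" ", "").split("=")
    left, right = expr.split("+")
    return ("ADD", variables[left], variables[right], variables[target])

print(compile_add("c = a + b"))   # ('ADD', 8, 9, 10)
```

The point of the sketch: the high-level programmer names variables, and the translator, not the programmer, worries about machine addresses.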
"Computer software" is not simply another word for "computer program". Computer software is a more inclusive term, covering all the various programs needed to accomplish a task together with all their related materials. A video game, for example, includes not only the program itself but also the images, sounds, and other data that create its virtual environment. In the retail market, an application on a given computer is just one copy of software distributed to a large number of users. The well-worn example here is of course Microsoft's Office suite, a group of interrelated programs that meet general office needs.
Building countless powerful applications out of extremely simple machine instructions means that programming on a large scale is inevitable. Windows XP, an operating system, contains about 40 million lines of high-level C++ source code, and it is by no means the largest. Such enormous scale shows how important management is in the development process: in practice, a program is subdivided into pieces that each programmer can complete in an acceptable time.
Even so, software development is slow, unpredictable, and full of omissions. In response to the demands of the times, software engineering concentrates on how to speed up the work while improving efficiency and quality.
Libraries and operating systems
Shortly after the birth of the computer, people found that certain tasks had to be performed in many different programs, computing standard mathematical functions being one example. For efficiency, standard versions of these routines were collected into a "library" for every program to call. Many tasks also involve handling a variety of input and output interfaces, where libraries of connecting code likewise come in handy.
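The idea in miniature, using Python's standard math library (any language's standard library would illustrate the same point): the routines are written once and called from any program, instead of being rewritten each time.

```python
import math

# Standard mathematical functions, written once, called everywhere.
print(math.sqrt(2.0))      # 1.4142135623730951
print(math.sin(math.pi))   # approximately 0.0
```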
In the 1960s, as the computer industry grew, computers were used more and more to handle different tasks within organizations. Soon, special software appeared that could automatically schedule jobs and carry them out in succession. This software, which controls the hardware and is responsible for job scheduling, is called an "operating system". An example of an early operating system is IBM's OS/360.
As operating systems were refined, they introduced time-sharing, a mechanism for concurrency. This let many different users run their own programs on the machine at the same time, each as though they had a computer to themselves. To this end, the operating system provides every user with a "virtual machine" that keeps the different programs separate. The range of devices needing operating-system control kept growing; one of them was the hard disk, for which operating systems introduced file management and directory (folder) management, greatly simplifying the use of such permanent storage. The operating system is also responsible for security controls, ensuring that users can access only the files they are permitted to.
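As a loose analogy to time-sharing (threads within one program rather than whole user programs, so purely illustrative), the snippet below has the system interleave several independent tasks on one machine:

```python
import threading

def user_program(name):
    for step in range(3):
        print(name, "step", step)   # output from the tasks interleaves

# Three "users", each running their own program, scheduled onto one machine.
threads = [threading.Thread(target=user_program, args=("user" + str(n),))
           for n in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```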
The last important step so far in the development of operating systems has been providing programs with a standard graphical user interface (GUI). There is no technical reason why an operating system must supply these interfaces, but operating-system vendors naturally hope and encourage the software running on their systems to match, or resemble, the operating system in appearance and behavior.
Beyond these core functions, operating systems also ship with a range of other commonly used tools. Some are of little importance to managing the computer itself but are very useful to users; Apple's Mac OS X, for example, includes a video-editing application.
The operating systems of some smaller computers may forgo such generality. Early microcomputers, with their limited memory and processing power, provided no such extra functions, while embedded computers use specialized operating systems or do without one altogether, often expressing the operating-system functions they need directly within the application.
Applications
Machines controlled by computers are common in industry.
Many modern mass-produced toys, such as Furby, depend on cheap embedded processors.
At first, digital computers, huge and expensive, were used mainly for scientific computation, and especially for military projects. ENIAC, for example, was first used for artillery ballistics calculations and for computing neutron cross-section densities in hydrogen-bomb design (many supercomputers still play a huge role in simulating nuclear tests today). The CSIR Mk I, the first stored-program computer designed in Australia, evaluated rainfall over the catchment area of a hydroelectric project. Others were used for code-breaking, such as Britain's programmable "Colossus". Beyond these early scientific or military applications, computers spread quickly into other fields.
From the beginning, stored-program computers were closely tied to the solution of business problems. Long before the birth of IBM's first commercial computer, J. Lyons and Co. in Britain designed and built LEO for asset management and other commercial purposes. Continuous reductions in size and cost let computers spread to smaller organizations, and with the invention of the microprocessor in the 1970s, cheap computers became a reality. In the 1980s, personal computers became popular, and repetitive clerical work such as writing and printing electronic documents and calculating budgets came to rely more and more on computers.
As computers became cheaper and cheaper, creative and artistic work began to make use of them. People used synthesizers, computer graphics, and animation to create and modify sounds, images, and video. The industrialization of video games likewise shows that the computer opened a new chapter in entertainment.
Since the miniaturization of computers, the control of mechanical equipment has also come to depend on computer support. Indeed, it was the building of an embedded computer small enough to control the Apollo spacecraft that spurred the leap in integrated-circuit technology. Today it is much harder to find an actively used mechanical device that is not computer-controlled, even partially, than one that is. Perhaps the most famous computer-controlled devices are robots: machines with a more or less human-like appearance and some subset of human behaviors. In mass production, industrial robots have become commonplace, but fully anthropomorphic robots still exist only in science fiction and in laboratories.
Robotics is, in essence, the physical expression of the field of artificial intelligence. Artificial intelligence is a vague concept, but what is certain is that the discipline tries to give computers capabilities they do not yet possess but that are innate in human beings. Over the years, many new methods have enabled computers to do things once thought possible only for humans, such as reading or playing chess. Even so, progress toward computers with general, human-like "whole" intelligence has been very slow.
Networks and the Internet
Since the 1950s, computers have been used as tools to coordinate information across multiple locations. SAGE was the first large-scale system in this field, and a series of specialized commercial systems, such as SABRE, followed.