Etymology

The first use of the word "computer" was recorded in 1613 in a book called "The yong mans gleanings" by English writer Richard Braithwait: "I haue read the truest computer of Times, and the best Arithmetician that euer breathed, and he reduceth thy dayes into a short number." It referred to a person who carried out calculations, or computations, and the word continued with the same meaning until the middle of the 20th century. From the end of the 19th century, the word began to take on its more familiar meaning: a machine that carries out computations.[3]

History

Main article: History of computing hardware

Rudimentary calculating devices first appeared in antiquity, and mechanical calculating aids were invented in the 17th century. The first recorded use of the word "computer" is also from the 17th century, applied to human computers: people who performed calculations, often as employment.
The first computing devices were conceived of in the 19th century and only emerged in their modern form in the 1940s.

First general-purpose computing device
A portion of Babbage's Difference Engine.

Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer",[4] he conceptualized and invented the first mechanical computer in the early 19th century. After working on his revolutionary Difference Engine, designed to aid in navigational calculations, he realized in 1833 that a much more general design, an Analytical Engine, was possible. Programs and data were to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.[5][6]

The machine was about a century ahead of its time. All the parts for his machine had to be made by hand, which was a major problem for a device with thousands of parts. Eventually, the project was dissolved when the British Government decided to cease funding. Babbage's failure to complete the Analytical Engine can be chiefly attributed not only to difficulties of politics and financing, but also to his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the Analytical Engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906.
Analog computers

Sir William Thomson's third tide-predicting machine design, 1879-81.

During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.[7] The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the brother of the more famous Lord Kelvin.[8]

The art of mechanical analog computing reached its zenith with the differential analyzer, built by H. L. Hazen and Vannevar Bush at MIT starting in 1927. This built on the mechanical integrators of James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence became obvious.

The modern computer

The principle of the modern computer was first described by computer scientist Alan Turing, who set out the idea in his seminal 1936 paper "On Computable Numbers".[9] His hypothetical device became known as the Universal Turing machine. Turing reformulated Kurt Gödel's 1931 results on the limits of proof and computation, replacing Gödel's universal arithmetic-based formal language with the formal and simple hypothetical devices that became known as Turing machines. He proved that some such machine would be capable of performing any conceivable mathematical computation if it were representable as an algorithm.
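A Turing machine, as described above, reads one symbol at a time from a tape, writes a symbol, moves left or right, and changes state according to a fixed rule table. The following sketch is purely illustrative (the function names and the example rule table are not from any historical source); it shows one concrete machine, incrementing a binary number, running on a minimal simulator:

```python
# A minimal sketch of a Turing machine simulator. Names and the example
# machine are illustrative assumptions, not taken from Turing's paper.

def run_turing_machine(rules, tape, state="start", head=0, max_steps=1000):
    """Run a machine given as a dict:
    (state, symbol) -> (symbol_to_write, move, next_state)."""
    tape = dict(enumerate(tape))  # sparse tape; unwritten cells read as "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += {"L": -1, "R": 1}[move]
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Example machine: increment a binary number.
# The head starts on the rightmost bit and carries to the left.
rules = {
    ("start", "1"): ("0", "L", "start"),  # 1 plus carry -> 0, carry left
    ("start", "0"): ("1", "R", "halt"),   # absorb the carry
    ("start", "_"): ("1", "R", "halt"),   # carry past the leftmost bit
}

tape = list("1011")
print(run_turing_machine(rules, tape, head=len(tape) - 1))  # "1100"
```

Despite its simplicity, a rule table like this is the entire "hardware" of the machine; everything else is the tape's contents, which is what makes the model a useful yardstick for what is computable at all.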
He went on to prove that there was no solution to the Entscheidungsproblem by first showing that the halting problem for Turing machines is undecidable: in general, it is not possible to decide algorithmically whether a given Turing machine will ever halt. He also introduced the notion of a "Universal Machine" (now known as a Universal Turing machine), with the idea that such a machine could perform the tasks of any other machine; in other words, it is provably capable of computing anything that is computable by executing a program stored on tape, which makes the machine programmable. Von Neumann acknowledged that the central concept of the modern computer was due to this paper.[10] Turing machines are to this day a central object of study in the theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.

Electromechanical computers

Replica of Zuse's Z3, the first fully automatic, digital (electromechanical) computer.

Early digital computers were electromechanical: electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electronic computers, originally using vacuum tubes. The Z2, created by German engineer Konrad Zuse in 1939, was one of the earliest examples of an electromechanical relay computer.[11]

In 1941, Zuse followed his earlier machine up with the Z3, the world's first working electromechanical, programmable, fully automatic digital computer.[12][13] The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz.[14] Program code and data were stored on punched film. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers.
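The common thread between the Universal Turing machine and the Z3's punched film is the stored program: the machine's instructions are data that it reads and obeys, so changing the stored program changes the behavior without changing the hardware. As a sketch only (the three-instruction set below is invented for illustration, not the Z3's actual instruction set), a stored-program machine can be as small as this:

```python
# A sketch of the stored-program idea. The instruction set ("set", "add",
# "jump_if") is an invented illustration, not any historical machine's.

def execute(program, registers):
    """Run a list of stored instructions over a dict of named registers."""
    pc = 0  # program counter: index of the next stored instruction
    while pc < len(program):
        op, *args = program[pc]
        if op == "set":            # registers[a] = constant
            registers[args[0]] = args[1]
        elif op == "add":          # registers[a] += registers[b]
            registers[args[0]] += registers[args[1]]
        elif op == "jump_if":      # conditional branch: if registers[a] != 0
            if registers[args[0]] != 0:
                pc = args[1]
                continue
        pc += 1
    return registers

# A stored program that sums 5 + 4 + 3 + 2 + 1 by looping.
program = [
    ("set", "total", 0),
    ("set", "n", 5),
    ("set", "minus_one", -1),
    ("add", "total", "n"),        # total += n
    ("add", "n", "minus_one"),    # n -= 1
    ("jump_if", "n", 3),          # while n != 0, go back to the add
]
print(execute(program, {})["total"])  # 15
```

Swapping in a different instruction list makes the same interpreter compute something entirely different, which is exactly the sense in which a stored program makes a machine programmable.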
Replacing the hard-to-implement decimal system (used in Charles Babbage's earlier design) with the simpler binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time.[15] The Z3 was probably Turing-complete.
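Binary suited relay machines because a relay has exactly two stable states, and two-state logic (AND, OR, XOR) is already enough to build arithmetic. The sketch below is an illustration of that general principle, not of the Z3's actual circuitry: it chains "full adders", the standard one-bit building block, to add two binary numbers.

```python
# Why binary suits two-state switches: AND/OR/XOR alone give arithmetic.
# This is a generic full-adder sketch, not a model of the Z3's circuits.

def full_adder(a, b, carry_in):
    """One column of binary addition, built from three two-state inputs."""
    s = a ^ b ^ carry_in                         # sum bit for this column
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry into the next column
    return s, carry_out

def add_binary(x_bits, y_bits):
    """Add two little-endian bit lists (least significant bit first)."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)  # a possible final carry becomes the top bit
    return out

# 6 (little-endian bits [0, 1, 1]) + 3 ([1, 1, 0]) = 9 ([1, 0, 0, 1])
print(add_binary([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1]
```

A decimal machine needs hardware that distinguishes ten states per digit; here each wire, like each relay, only ever carries a 0 or a 1, which is what made the binary design easier to build reliably.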