The story of computers is the story of human efforts to solve problems. It required discoveries in several directions to lay the groundwork for computing as we know it: in speeding up calculations, in automating repetitive processes, and in learning to code and store information in ways that speeded up its handling. In the history of computing the first invention was probably the abacus. From around 3000 BC traders living around the Mediterranean worked out their prices and profits using this simple device of rods and beads - the beads in different sections representing different units of value. In 17th-century Europe the ferment of interest in the new sciences, such as astronomy and navigation, spurred creative minds to simplify computations.
It could take years for early scientists to calculate the vast quantities of numerical data whose patterns they were trying to unravel. In 1614 the Scotsman John Napier reported his discovery of logarithms, enabling the products of complex multiplications to be reduced to a process of simple addition. Very soon after, in the 1620s, the slide rule was invented, based upon the mathematical principles Napier had discovered. In the later 1600s the Frenchman Blaise Pascal and the German Gottfried Wilhelm von Leibniz devised simple calculators - they were not easy to use reliably, but the logical problems underlying such devices continued to challenge advanced thinkers over the following centuries. It was not until the 19th century that inventors began moving towards the design of a prototype computer.
The desire of a French cloth manufacturer to automate the weaving of complex patterns advanced the technology needed for computing. In 1804 Joseph Jacquard began using a loom in which punched cards controlled the creation of complex fabric designs.
The same technique came to be used in pianolas or 'player pianos', which used punched paper rolls to play back piano music, both popular and classical, and which were able to reproduce the performances recorded by famous artists. Here a most surprising name appears in the annals of computer enthusiasts - Lady Augusta Ada Byron, daughter of the poet Lord Byron. Her active interest in promoting a machine devised by the 19th-century inventor Charles Babbage has led some to describe her as 'the first computer programmer'.
She saw the machine as a sort of mathematical Jacquard loom that could carry out any pattern of calculation punched onto its cards. Babbage never succeeded in manufacturing his Analytical Engine, which conventional wisdom declared to be too advanced for the technology of his time. However, the Swedish inventor Scheutz demonstrated a simpler version (based on Babbage's earlier Difference Engine) at the Paris Exposition of 1855. Building on similar ideas in the USA, Herman Hollerith devised a punched-card machine to record census information on every citizen for the 1890 census. To public amazement, the census results were ready in only six weeks. Hollerith's success in selling his machine - he even sold one to Czarist Russia - led him to found the company which eventually became known as IBM.
From Leibniz's time, advanced thinkers had seen the merits of a simple logic system, such as EITHER-OR or TRUE-FALSE, to test a series of logical propositions. In calculating devices, such dualities could be converted into SWITCH ON or SWITCH OFF choices to record data.
After Jacquard, punched cards could be used to indicate the presence or absence of information at any point - a perforation showing the presence of data and a non-perforation showing its absence. The cards were manipulated by cogs and wheels or, on Hollerith's census machine, by pins, which recorded the answers given by every citizen to every question included in the census.
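The presence-or-absence idea described above can be sketched in a few lines of modern code. This is an illustrative model only (the card layout, questions, and function names are assumptions, not anything from Hollerith's actual machine): each position in a card row is a two-state value, with True standing for a perforation.

```python
# Illustrative sketch: a punched-card row as a list of booleans,
# True = perforation (data present), False = no hole (data absent).

def read_card_row(row):
    """Decode one card row into yes/no answers, one per question."""
    return ["yes" if punched else "no" for punched in row]

# Hypothetical answers by one citizen to three census questions.
row = [True, False, True]
print(read_card_row(row))  # ['yes', 'no', 'yes']
```

The point of the model is that a single physical duality (hole / no hole, or switch on / switch off) is enough to record any answer that can be phrased as a series of yes-no choices.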
In 1930, Vannevar Bush at MIT in the USA designed the differential analyser, marking the start of our computer age; the analyser was an electromechanical machine which measured degrees of change in a model. The machine took up most of a large room. In order to analyse a new problem, engineers had to change the gear ratios, and they would emerge two or three days later, hands coated in oil. Nevertheless, the machine's ability to handle complex calculations far surpassed any previous invention. In 1936 the maverick British scientist Alan Turing captured scientific attention with his influential paper On Computable Numbers, with an Application to the Entscheidungsproblem, suggesting that if his vision of a universal computer were implemented, solutions might be found to previously unsolvable problems.
The history of computers starts about 2000 years ago in Babylonia (Mesopotamia), with the birth of the abacus, a wooden rack holding two horizontal wires with beads strung on them. Blaise Pascal is usually credited with building the first digital calculator in 1642. It added numbers entered with dials and was made to help his father, a tax collector. The basic principle of his calculator is still used today in water meters and modern-day odometers. Instead of having a carriage wheel turn the gear, he made each ten-toothed wheel accessible to be turned directly by a person's hand (later inventors added keys and a crank), with the result that when the wheels were turned in the proper sequence, a series of numbers was entered and a cumulative sum was obtained. The gear train supplied a mechanical answer equal to the answer obtained by arithmetic. This first mechanical calculator, called the Pascaline, had several disadvantages. Although it offered a substantial improvement over manual calculation, only Pascal himself could repair the device, and it cost more than the people it replaced! In addition, the first signs of technophobia emerged, with mathematicians fearing the loss of their jobs to progress.

In contrast to Pascal, Leibniz (1646-1716) successfully introduced a calculator onto the market. It was designed in 1673 but took until 1694 to complete. The calculator could add, subtract, multiply, and divide. Wheels were placed at right angles and could be displaced by a special stepping mechanism. The speed of calculation for multiplication or division was acceptable.
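The odometer-style carry that the Pascaline description relies on can be sketched as a short simulation. This is a simplified model under the analogy stated above (ten-toothed decimal wheels, least-significant wheel first; the class and method names are invented for illustration), not a reconstruction of Pascal's actual gearing.

```python
# Simplified model of Pascaline-style addition: each wheel holds one
# decimal digit, and turning a wheel past 9 advances the next wheel,
# like a modern odometer. Overflow past the last wheel is not modeled.

class Pascaline:
    def __init__(self, n_wheels=6):
        self.wheels = [0] * n_wheels  # least-significant digit first

    def turn(self, wheel, teeth):
        """Advance one wheel by `teeth` steps, propagating carries."""
        self.wheels[wheel] += teeth
        for i in range(wheel, len(self.wheels) - 1):
            if self.wheels[i] >= 10:
                self.wheels[i + 1] += self.wheels[i] // 10  # carry
                self.wheels[i] %= 10

    def add(self, number):
        """Enter a number digit by digit, as an operator would on dials."""
        for i, digit in enumerate(reversed(str(number))):
            self.turn(i, int(digit))

    def value(self):
        return int("".join(str(d) for d in reversed(self.wheels)))

m = Pascaline()
m.add(764)
m.add(358)
print(m.value())  # 1122
```

Entering each operand digit by digit and letting the carries ripple through the wheels is exactly the "cumulative sum" behaviour the text describes: the gear train, not the operator, does the arithmetic.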
But like the Pascaline, Leibniz's calculator required that the operator understand how to turn the wheels and know how to perform calculations with it. Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th century. After working on his revolutionary difference engine, designed to aid in navigational calculations, he realized in 1833 that a much more general design, an Analytical Engine, was possible. A step towards automated computing was the development of punched cards, which were first successfully used with computers in 1890 by Herman Hollerith and James Powers, who worked for the US Census Bureau.
They developed devices that could read the information that had been punched into the cards automatically, without human help. Because of this, reading errors were reduced dramatically, work flow increased, and, most importantly, stacks of punched cards could be used as easily accessible memory of almost unlimited size. Furthermore, different problems could be stored on different stacks of cards and accessed when needed.
These advantages were seen by commercial companies and soon led to the development of improved punched-card machines by International Business Machines (IBM), Remington (yes, the same people that make shavers), Burroughs, and other corporations. These computers used electromechanical devices in which electrical power provided mechanical motion -- like turning the wheels of an adding machine. Such systems included features to:
o feed in a specified number of cards automatically
o add, multiply, and sort
o feed out cards with punched results
The start of World War II produced a large need for computing capacity, especially for the military. New weapons were made for which trajectory tables and other essential data were needed. In 1942, John P. Eckert, John W. Mauchly, and their associates at the Moore School of Electrical Engineering of the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC (Electronic Numerical Integrator And Calculator).