The first substantial computer was the giant ENIAC machine, built by John W. Mauchly and J. Presper Eckert at the University of Pennsylvania. ENIAC (Electronic Numerical Integrator and Computer) used ten-decimal-digit words rather than the binary words of earlier automated calculators and computers. ENIAC was also the first machine to use more than 2,000 vacuum tubes; it contained nearly 18,000 of them. Housing all of those tubes and the machinery needed to keep them cool took up over 167 square meters (1,800 square feet) of floor space. It had punched-card input and output, and its arithmetic hardware comprised one multiplier, one divider/square-rooter, and twenty adders based on decimal "ring counters", which served both as adders and as fast-access read-write register memory.

The executable instructions that made up a program were embedded in separate ENIAC units, which were plugged together to form a path through the machine for the flow of computations. These connections had to be redone for each new problem, along with presetting the function tables and switches. This "wire-your-own" instruction technique was inconvenient, and only with some license could ENIAC be considered programmable; it was, however, efficient at handling the particular programs for which it was designed. ENIAC is generally recognized as the first successful high-speed electronic digital computer (EDC) and was used productively from 1946 to 1955.

In 1971, however, a controversy developed over the patentability of ENIAC's basic digital concepts, when it was claimed that another US physicist, John V. Atanasoff, had already used the same ideas in a simpler vacuum-tube device he built in the 1930s while at Iowa State College. In 1973, the court found in favor of the company that had drawn on Atanasoff's work, and Atanasoff received the acclaim he rightly deserved.

In the 1950s, two devices were invented that would advance the field of computing and kick off the information revolution. The first was the transistor. Invented in 1947 by William Shockley, John Bardeen, and Walter Brattain of Bell Labs, the transistor was destined to supplant the vacuum tube in computers, radios, and other electronic devices. Transistors, however, still had to be wired together by hand, and in 1958 this problem was solved by Jack St. Clair Kilby of Texas Instruments, who produced the first integrated circuit, or chip. A chip is a collection of tiny transistors that are connected together during the manufacture of the chip itself. The need to solder large numbers of transistors together was therefore virtually eliminated; only connections to other electronic components were still required. In addition to saving space, this increased the machine's speed, since the distance the electrons had to travel decreased.

The 1960s saw large mainframe computers become much more common in large industries and in the US military and space program. IBM became the undisputed market leader in selling these large, expensive, error-prone, and very difficult to use machines.

A real explosion of personal computers came in the late 1970s, beginning when Steve Jobs and Steve Wozniak exhibited the first Apple II at the First West Coast Computer Faire in San Francisco in 1977. The Apple II boasted a built-in BASIC programming language, color graphics, and 4 KB of memory (roughly 4,100 characters) for just $1,298.