Monday, June 15, 2009

Transistor Radios and the Start of Summer



The start of summer always reminds me of my first transistor radio.

By 1960 vacuum tubes were rapidly being replaced by small, lightweight transistors.
Jack Kilby (Texas Instruments, 1958) is credited with developing the concept of integrating device and circuit elements onto a single silicon chip (see graphic).

Robert Noyce (Fairchild Semiconductor, 1959) is credited with conceiving the method for interconnecting the separate elements.

The advantages of ICs over vacuum tubes were obvious: they were less expensive, did not burn out in service, and were far smaller and more reliable.


Early ICs contained about 10 individual components on a silicon chip 3 mm (0.12 inch) square. By 1970 the number was up to 1,000 on a chip of the same size at no increase in cost. Late in the following year the first microprocessor was introduced. The device contained all the arithmetic, logic, and control circuitry required to perform the functions of a computer’s central processing unit (CPU). This type of large-scale IC was developed by a team at Intel Corporation, the same company that also introduced the memory IC in 1971. The stage was now set for the computerization of small electronic equipment.

Until the microprocessor appeared on the scene, computers were essentially discrete pieces of equipment used primarily for data processing and scientific calculations. They ranged in size from minicomputers, comparable in dimensions to a small filing cabinet, to mainframe systems that could fill a large room.

The microprocessor enabled computer engineers to develop microcomputers—systems about the size of a lunch box or smaller but with enough computing power to perform many kinds of business, industrial, and scientific tasks. Such systems made it possible to control a host of small instruments or devices (e.g., numerically controlled lathes and one-armed robotic devices for spot welding) by using standard components programmed to do a specific job. The very existence of computer hardware inside such devices is not apparent to the user.

The large demand for microprocessors generated by these initial applications led to high-volume production and a dramatic reduction in cost. This in turn promoted the use of the devices in many other applications—for example, in household appliances and automobiles, for which electronic controls had previously been too expensive to consider.

Continued advances in IC technology gave rise to very large-scale integration (VLSI), which substantially increased the circuit density of microprocessors. These technological advances, coupled with further cost reductions stemming from improved manufacturing methods, made feasible the mass production of personal computers for use in offices, schools, and homes.

By the mid-1980s inexpensive microprocessors had stimulated computerization of an enormous variety of consumer products. Common examples included programmable microwave ovens and thermostats, clothes washers and dryers, self-tuning television sets and self-focusing cameras, videocassette recorders and video games, telephones and answering machines, musical instruments, watches, and security systems. Microelectronics also came to the fore in business, industry, government, and other sectors. Microprocessor-based equipment proliferated, ranging from automatic teller machines (ATMs) and point-of-sale terminals in retail stores to automated factory assembly systems and office workstations.

By mid-1986 memory ICs with a capacity of 262,144 bits (binary digits) were available. In fact, Gordon E. Moore, one of the founders of Intel, observed as early as 1965 that the complexity of ICs was approximately doubling every 18–24 months, which was still the case in 2000. This empirical “Moore’s law” is widely used in forecasting the technological requirements for manufacturing future ICs.
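Moore's observation can be sketched as a simple doubling model. This is just a back-of-the-envelope illustration of the idea, not Intel's actual forecasting method; the 24-month doubling period and the starting figures are assumptions picked from the numbers quoted above:

```python
# Back-of-the-envelope Moore's law: components(t) = components(0) * 2^(months / doubling_period)

def projected_components(initial: int, months: int, doubling_months: int = 24) -> int:
    """Project a component count forward, doubling every `doubling_months`."""
    return int(initial * 2 ** (months / doubling_months))

# Starting from the ~1,000 components per chip cited for 1970,
# ten 24-month doublings (20 years) projects roughly a million:
print(projected_components(1000, months=20 * 12))  # -> 1024000

# The 262,144-bit memory chips of 1986 are exactly 2^18 bits (a "256K" part):
print(262144 == 2 ** 18)  # -> True
```

The exponential form makes clear why even a modest doubling period compounds so quickly: each decade under a 24-month cadence multiplies density by 32.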


from

http://www.britannica.com/EBchecked/topic/183904/electronics
