Tracking Tech History

A look at the evolution of three critical innovations

8 June 2009
Clockwise from top: Geographic map of the ARPANET; Fairchild Semiconductor’s first commercial planar transistor; Motorola’s Razr cellphone on top of the company’s first cellphone, a DynaTAC 8000X.

IEEE’s 125th anniversary is intended not only to honor engineering’s past but also to celebrate its future. Here we look at three technologies IEEE members have been involved with that have had a glorious past and still promise a shining future: the telephone, the integrated circuit, and the Internet.

MOBILITY RULES
Who doesn’t like mobile phones? But from 1946, when mobile telephone service (MTS) was introduced, until the mid-1980s, when cellular phones went into wide commercial use, only the wealthy could afford them. The MTS system was not sophisticated, offering only a dozen two-way channels. And because it could not reuse frequencies within a metropolitan area, it served relatively few subscribers. The system covered its area from a single high-power base station, with each call taking up an entire channel.

Today, thanks to the cellular concept with its multiple low-power base stations and much broader frequency allocation, nearly everyone has a cellphone. Some developing countries might even skip installing landlines altogether and move right to cellphones.
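To see roughly why frequency reuse matters, here is a back-of-the-envelope sketch in Python. The cell count and the seven-cell reuse pattern are illustrative assumptions, not historical figures; only the dozen MTS channels come from the text above.

```python
# A rough, illustrative comparison (hypothetical numbers, except the dozen
# MTS channels): a single high-power transmitter uses each channel once,
# while a cellular layout reuses the same channels in every cluster of cells.

channels = 12        # two-way channels available to the old MTS system
cells = 100          # hypothetical number of cells covering the same metro area
cluster_size = 7     # classic seven-cell reuse pattern: each cluster gets all channels

mts_capacity = channels                                  # 12 simultaneous calls, citywide
cellular_capacity = channels * (cells // cluster_size)   # channels reused in every cluster

print(f"MTS:      {mts_capacity} simultaneous calls")
print(f"Cellular: {cellular_capacity} simultaneous calls")
# With these assumed figures, the same 12 channels carry 168 calls at once.
```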

As cellphones became more popular, they gained features not traditionally associated with telephony. In fact, making a phone call seems almost a side issue today when you consider the number of functions provided by so-called smartphones. Basically, the tiny tykes are computers, many of them built on open operating systems. Their many applications come from the phone manufacturer, the network operator, and third-party software developers. Applications include e-mail, Web browsing, and the ability to take, display, and transmit photos. Users can play games, listen to music, watch videos, and read documents. Some phones include a GPS receiver, which is leading to a variety of location-based services, from simply finding an address to locating a movie theater or restaurant. Certainly, Alexander Graham Bell, the 1891–1892 president of the American Institute of Electrical Engineers, wouldn’t know what to do first. Who can tell what else the future will bring?

FROM PLANAR TO 3-D
The planar process, developed at Fairchild Semiconductor in the late 1950s, covered a transistor’s p-n junctions with a protective layer of silicon dioxide to guard against contamination, which in some transistors caused amplification instability. The immediate result was better transistors.

But that flat oxide layer also turned out to be an excellent substrate for depositing metal traces for interconnecting other components fabricated on the same piece of silicon. Thus the modern IC was born. In the early days, a typical IC, cut from a 50-millimeter-diameter silicon wafer, held a few dozen transistors. Today’s ICs, fabricated on 300-mm wafers, can hold more than a billion.

The history of IC development is a series of victories over problems that arose as transistors kept shrinking. The number of transistors per chip has grown exponentially, doubling about every 18 months, following Moore’s Law. Many times since Gordon Moore, an IEEE Life Fellow, first described the phenomenon in 1965, problems threatened to end the law’s predictive sway. Most recently, two related issues were a threat: chips had to dissipate too much power, and signal transmission delays were too long. But chip designers solved those problems, and the validity of Moore’s Law is poised to extend into the future.
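As a rough illustration of that exponential growth, here is a small Python calculation. The starting count and the time spans are illustrative assumptions, chosen only to echo the “few dozen to more than a billion” range mentioned above.

```python
# Moore's Law as stated above: transistor counts doubling roughly every
# 18 months. The starting count and time spans are illustrative assumptions.

start_transistors = 50          # "a few dozen" transistors on an early IC
doubling_period_years = 1.5     # doubling about every 18 months

for years in (10, 20, 30, 40):
    count = start_transistors * 2 ** (years / doubling_period_years)
    print(f"after {years} years: ~{count:,.0f} transistors")

# After roughly four decades of 18-month doublings, a few dozen transistors
# grow past the billion mark, consistent with the figures in the text.
```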

The power problem was solved by gating off parts of the chip circuit that weren’t being used at a particular time. And signal delays were trimmed by making chip interconnect lines shorter, using three-dimensional wiring that replaces long horizontal interconnects with short vertical ones, a technique described in February at the IEEE International Solid-State Circuits Conference. The technique involves stacking planar devices and interconnecting them with metal placed in through-silicon vias, or holes. (Through-hole vias have been used for years on dense printed circuit boards.) Other benefits of the construction include the ability to integrate, in a single device, circuit layers made with different, even incompatible, processes; to pack more functions into a given footprint; and to make a device difficult to copy by reverse engineering.
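To make the power-gating idea concrete, here is a toy model in Python. The block counts and per-block power figures are invented for illustration, not measurements from any real chip.

```python
# A toy model (made-up block counts and per-block power figures) of why
# gating off idle circuitry saves power: blocks that are switched off
# neither switch nor leak.

blocks = 10                  # functional blocks on the chip (illustrative)
active = 4                   # blocks actually doing work at this moment
dynamic_per_block = 1.0      # watts per active block (illustrative)
leakage_per_block = 0.3      # watts per powered-on block (illustrative)

without_gating = active * dynamic_per_block + blocks * leakage_per_block
with_gating = active * (dynamic_per_block + leakage_per_block)  # idle blocks powered off

print(f"without gating: {without_gating:.1f} W, with gating: {with_gating:.1f} W")
# 7.0 W versus 5.2 W here; the saving is exactly the leakage of the six idle blocks.
```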

THE INTERNET
Beginning life as the ARPANET in the 1960s, the Internet was meant to allow scientists funded by the U.S. Department of Defense to run programs on widely separated computers and to share software. Its designers chose to implement the new network with then-untried packet-switching technology, which could carry data more efficiently than conventional circuit switching could.
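What packet switching means in practice is easy to sketch in Python. The field names, payload size, and host names below are illustrative assumptions, not the actual ARPANET formats; the point is simply that a message is chopped into small, independently routable packets that the destination reassembles.

```python
# A minimal sketch of packetizing a message. The field names, payload size,
# and host names are illustrative, not the real ARPANET formats.

def packetize(message, src, dst, payload_size=8):
    """Chop a message into small packets, each carrying its own addressing
    and sequencing information so it can travel independently."""
    chunks = [message[i:i + payload_size] for i in range(0, len(message), payload_size)]
    return [{"src": src, "dst": dst, "seq": seq, "total": len(chunks), "payload": chunk}
            for seq, chunk in enumerate(chunks)]

def reassemble(packets):
    """Packets may arrive out of order; sequence numbers restore the message."""
    return "".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

pkts = packetize("LOGIN REQUEST FROM UCLA", src="ucla", dst="sri")
print(reassemble(reversed(pkts)))   # prints the original message despite reordering
```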

Packet switching had been conceived a few years earlier, not in search of efficiency but for its efficacy in moving information across a distributed data network then being developed for the U.S. military. The network, which was never built, was to have had a distributed architecture to ensure that it could survive an enemy attack. This architecture, inspired by Cold War military concerns, became the Internet’s bedrock.

But another factor, also of military origin, was perhaps even more important. Realizing that military field operations often relied on radio, and increasingly on satellite, communications, the Department of Defense’s Advanced Research Projects Agency (then referred to as ARPA, and now as DARPA) built a pair of packet-radio networks: the terrestrial PRNET and the satellite-based SATNET. It quickly became clear that the two networks had to be connected. From the effort to link the dissimilar packet networks came most of the key ideas behind today’s Internet, including the multilayered protocol stack in which computers at the end points of a communication path, rather than the network itself, take responsibility for communications reliability.
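That end-to-end idea can be sketched in a few lines of Python. Everything below is a toy model with invented names and a made-up loss rate: the network is free to drop packets, and it is the sending host, not the network, that retransmits until an acknowledgment arrives.

```python
import random

# A toy model of end-to-end reliability: the "network" may silently drop
# packets, and the sending host, not the network, retransmits until the
# receiver's acknowledgment gets back. All names and the loss rate are invented.

def unreliable_network(packet, loss_rate=0.3):
    """Deliver the packet, or drop it without telling anyone."""
    return packet if random.random() > loss_rate else None

def send_reliably(payload, max_tries=50):
    """Stop-and-wait at the end point: keep retransmitting until acknowledged."""
    for attempt in range(1, max_tries + 1):
        delivered = unreliable_network({"seq": 0, "data": payload})
        if delivered is not None:                  # the receiver got it and replies
            ack = unreliable_network({"ack": 0})   # the acknowledgment can be lost too
            if ack is not None:
                return attempt
    raise RuntimeError("gave up after too many retransmissions")

print("delivered after", send_reliably("hello from PRNET"), "attempt(s)")
```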

The Internet has grown enormously in the decades since its birth, but its basic structure has remained unchanged. It has incorporated technical advances from other fields, including fiber-optic cables, faster semiconductors, and more powerful computers.

By far the Internet’s most important application is the World Wide Web. Timothy Berners-Lee, the 2008 IEEE/RSE Wolfson James Clerk Maxwell Award recipient, developed the foundation for the Web while at CERN, the European Organization for Nuclear Research, near Geneva. Once the public realized what the Web could do, the Internet took its final step in moving from military project to specialists’ tool to everyone’s principal communications medium.

Recently, marketers heralded the arrival of Web 2.0, a label for the Internet’s latest capabilities compared with what’s now retrospectively called Web 1.0. Web 2.0 fosters innovation by making it easier for anyone to create Web sites and services by combining existing features. As for what direction Web 2.0 will take next, stay tuned.

—Compiled with the help of the IEEE History Center
