Ever wonder how humans went from the massive computers of the 1940s to the handheld smartphones of the modern era? The answer lies within the transistor. In every electronic device, from cell phones to smart fridges, these components are the building blocks of modern technology.
75 years ago, American physicists William Shockley, Walter Brattain and John Bardeen of Bell Labs demonstrated the first working semiconductor transistor as a “magnificent Christmas present” to their executives, according to Shockley. This prototype, known as the point-contact transistor, consisted of a block of germanium, a semiconductor, with gold foil wrapped around a spring-held plastic wedge designed to make point contact with the germanium base and facilitate the flow of current.
In January 1948, Shockley, the lead physicist of the research group, conceived a more robust design called the junction transistor. These smaller, more reliable junction transistors replaced unstable, bulky vacuum tubes, allowing for more compact and efficient computer designs.
In 1958, Jack Kilby, an employee at Texas Instruments and future Nobel Prize laureate, and Robert Noyce, co-founder of Fairchild Semiconductor and Intel Corporation, independently developed methods to integrate, or group, transistors on semiconductor substrates to form integrated circuits (ICs).
The basis of transistors lies within semiconductors (most commonly silicon), materials that conduct electricity only under certain conditions. Pure silicon alone conducts current poorly, so impurities are introduced into the material in a process called doping, forming two types of semiconductors: p-type and n-type. When placed side by side, they form a PN junction, a configuration that allows the amplification of current as well as the regulation of its flow.
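The way a PN junction passes current readily in one direction while blocking it in the other can be sketched with the textbook Shockley ideal diode equation. The saturation current and ideality factor below are illustrative assumptions, not measurements of any real device.

```python
import math

# Illustrative constants, not measurements of a real device.
I_S = 1e-12    # assumed saturation current, in amperes
N = 1.0        # assumed ideality factor
V_T = 0.02585  # thermal voltage at room temperature, in volts

def junction_current(v):
    """Current through an ideal PN junction at applied voltage v (volts),
    per the Shockley ideal diode equation."""
    return I_S * (math.exp(v / (N * V_T)) - 1)

# Forward bias: the junction conducts freely (roughly half an amp here).
print(junction_current(0.7))
# Reverse bias: the junction blocks nearly all current (about a picoamp).
print(junction_current(-0.7))
```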
Nowadays, highly complicated integrated circuits contain billions of transistors on one chip. Nvidia’s H100 GPU, a computing engine behind much of today’s AI, packs 80 billion transistors.
“We used to build things with either relays, mechanical switches or vacuum tubes, which are electrically controlled switches, but they’re big and finicky,” upper school computer science teacher Marina Peregrino said. “To get the computers to function, they’d have to go over and replace all the broken vacuum tubes, and same with mechanical relay. Something [physical] is actually switching, so that’s what wears out and breaks them down. A transistor is just a crystal. If you use them properly, they don’t wear out, because there’s nothing moving.”
So how exactly is this feat possible? Wafer fabrication squeezes billions of nanometer-scale transistors onto miniature computer chips.
First, companies produce a silicon wafer: a thin disc of the semiconductor material that serves as the foundation for the integrated circuit. A rotating machine then coats the circular wafer with photoresist, a substance that softens upon exposure to ultraviolet light. Next, they shine light through masks containing circuit patterns onto the wafer, similar to how graffiti artists use stencils to paint specific shapes onto a surface. After a chemical wash removes the exposed photoresist, they fill the gaps of the imprinted patterns with a conductive metal, most commonly copper.
Moore’s Law, first articulated by and named after Intel co-founder Gordon Moore, empirically predicts that the total number of transistors on an integrated circuit doubles approximately every two years, a trend that has continued since the 1970s. The law has developed into a self-fulfilling prophecy, as engineers base their advancements on these expectations, attempting to meet the benchmark each year.
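To see the scale of that doubling, a quick back-of-the-envelope sketch helps. It starts from the widely cited figure of roughly 2,300 transistors on Intel’s 4004 chip in 1971 and doubles the count every two years; the numbers are illustrative, not a production roadmap.

```python
BASE_YEAR = 1971
BASE_COUNT = 2_300  # widely cited transistor count of Intel's 4004

def projected_transistors(year, doubling_period=2):
    """Project a transistor count assuming a doubling every two years."""
    doublings = (year - BASE_YEAR) / doubling_period
    return BASE_COUNT * 2 ** doublings

for year in (1971, 1991, 2011, 2022):
    print(year, f"{projected_transistors(year):,.0f}")
# By 2022 the projection reaches roughly 100 billion, the same ballpark
# as the 80 billion transistors on Nvidia's H100.
```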
Upper school robotics instructor Martin Baynes, who worked in the semiconductor industry for almost two decades and stood in the same room as Moore during Baynes’ career at Fairchild Semiconductor, shares his perspective on the law.
“Gordon Moore was a visionary, and he could see, without knowing the details, that problems were solvable,” Baynes said. “Effectively, what Gordon Moore did was he said, ‘This is the hurdle we’re going to jump in two years, this is the hurdle we’re going to jump in four years.’ The industry can work at that kind of pace. He wasn’t telling the industry how to do it, but he could see that that could be done right.”
But growth inevitably slows down, and transistors cannot keep shrinking forever. The debate over when exactly Moore’s Law will die has long existed within the scientific community. Many researchers worry that the law will become obsolete: some claim it has already ended, while others, including Moore himself, predict that it will halt in the near future. Baynes remains optimistic that improvements will continue.
“I think Moore’s law will continue to hold because it’s not a physical law,” Baynes said. “It’s a law of business behavior … For a number of decades, people have been saying you can’t pass this point, and we keep passing them. So I think it will hold. It may tail off a bit, but it’s going to hold on by and large.”
Besides powering everyday electronic devices, transistors also drive the AI revolution. Tools like ChatGPT or OpenArt seem smooth and efficient, yet behind the scenes, these models must churn through vast amounts of data. Without billions of transistors, processing hundreds of gigabytes would take an impractically long time.
“A lot of the ideas and math behind AI was developed even before we really had access to so much computing power, but it wasn’t really feasible,” Peregrino said. “Now it’s feasible. For example, with wave functions, you could have a great wave function with mathematics, but [you] can’t solve it. Well, you could put all this data into the computer, and the computer can march through and estimate the solution to that. Science increases engineering, engineering increases science, and so it snowballs.”
Recent developments in transistor production show this self-perpetuating growth. Companies are experimenting with artificial intelligence to resolve design issues.
For instance, light diffraction poses a major problem when shining light through small and intricate mask patterns. Gaps in the “stencil” are now so narrow, comparable to the wavelength of the light itself, that they interfere with the light waves, spreading them out and essentially blurring the projected pattern. Artificial intelligence may help compensate for these difficulties and improve transistor technology, which in turn leads to further enhancements in AI.
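The scale of the problem can be estimated with the Rayleigh criterion, a standard rule of thumb relating the smallest printable feature to the light’s wavelength and the numerical aperture of the projection optics. The parameter values below are typical textbook figures, not the specifications of any particular fab.

```python
def min_feature_nm(wavelength_nm, numerical_aperture, k1):
    """Rayleigh criterion: smallest printable feature ~ k1 * wavelength / NA."""
    return k1 * wavelength_nm / numerical_aperture

# Deep-ultraviolet immersion lithography (193 nm light): roughly 43 nm.
print(min_feature_nm(193, 1.35, 0.3))
# Extreme-ultraviolet lithography (13.5 nm light): roughly 16 nm.
print(min_feature_nm(13.5, 0.33, 0.4))
```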
Whether development continues or slows, transistors remain the cornerstone of technology. They enable key functionalities in computers and phones and help programs perform calculations.
“I have a feeling transistors will stay important just because they’re so versatile,” AI Club Lecturer Aniketh Tummala (12) said. “You can make logic circuits out of transistors, memory too, so they’ll definitely play a heavy hand. I don’t see us phasing them out anytime soon.”
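Tummala’s point that logic circuits can be built out of transistors can be illustrated with a toy model: treat each transistor as a switch that conducts when its input is on, and wire two in series to form a NAND gate, a building block from which other logic can be assembled. The sketch below is a simplification of a real CMOS gate, not a circuit-level description.

```python
def transistor(gate_input):
    """Idealized switch: conducts (True) when its gate input is 1."""
    return gate_input == 1

def nand(a, b):
    # Two transistors in series pull the output low only when both conduct;
    # otherwise the output stays high, mirroring a CMOS NAND gate.
    pulled_low = transistor(a) and transistor(b)
    return 0 if pulled_low else 1

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", nand(a, b))
# NAND is functionally complete: NOT, AND, OR and the rest can be built by
# combining NAND gates, and simple memory cells from cross-coupled gates.
```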