While the way CPUs work may seem like magic, it’s the result of decades of clever engineering. As transistors—the building blocks of any microchip—shrink to microscopic scales, the way they are produced grows ever more complicated.
Transistors are now so impossibly small that manufacturers can't build them using normal methods. While precision lathes and even 3D printers can make incredibly intricate creations, they usually top out at micrometer levels of precision (about one twenty-five-thousandth of an inch) and aren't suitable for the nanometer scales at which today's chips are built.
Photolithography solves this issue by removing the need to move complicated machinery around very precisely. Instead, it uses light to etch an image onto the chip—like a vintage overhead projector you might find in classrooms, but in reverse, scaling the stencil down to the desired precision.
The image is projected onto a silicon wafer, which is polished to very high precision and handled in cleanroom laboratories, as a single speck of dust on the wafer could mean losing out on thousands of dollars. The wafer is coated with a material called a photoresist, which reacts to the light; the exposed areas are washed away, leaving an etched pattern of the CPU that can be filled in with copper or doped to form transistors. This process is then repeated many times, building up the CPU much like a 3D printer would build up layers of plastic.
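The repeat-per-layer cycle described above can be sketched as a simple loop. The step names and layer count here are simplified illustrations drawn from the description, not a real fab recipe:

```python
# Illustrative sketch of the repeated photolithography cycle.
# Step names and layer count are simplified assumptions, not a real fab recipe.

STEPS_PER_LAYER = [
    "coat wafer with photoresist",
    "project the scaled-down mask image onto the resist",
    "wash away the exposed resist",
    "fill the pattern (deposit copper or dope silicon)",
    "strip the remaining resist",
]

def build_chip(num_layers):
    """Return the ordered list of process steps for a multi-layer chip."""
    log = []
    for layer in range(1, num_layers + 1):
        for step in STEPS_PER_LAYER:
            log.append(f"layer {layer}: {step}")
    return log

steps = build_chip(3)
print(len(steps))  # 3 layers x 5 steps per layer = 15 steps
```

A real process can run through dozens of such layers, which is part of why a single wafer takes weeks to finish.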
The Issues With Nano-Scale Photolithography
It doesn't matter if you can make the transistors smaller if they don't actually work, and nano-scale tech runs into a lot of issues with physics. Transistors are supposed to stop the flow of electricity when they're off, but they're becoming so small that electrons can slip right through them anyway. This is called quantum tunneling, and it's a massive problem for silicon engineers.
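To see why shrinking makes tunneling worse, here's a rough back-of-the-envelope sketch using the textbook rectangular-barrier approximation, where the tunneling probability falls off as e^(-2κd) with barrier width d. The 1 eV barrier height is an illustrative assumption, not a real transistor's value:

```python
import math

# Rough rectangular-barrier estimate of tunneling probability: T ~ exp(-2*kappa*d).
# The 1 eV barrier height is an illustrative assumption, not a real device value.
HBAR = 1.0546e-34  # reduced Planck constant, J*s
M_E = 9.109e-31    # electron mass, kg
EV = 1.602e-19     # one electronvolt, J

def tunneling_probability(barrier_nm, barrier_ev=1.0):
    """Approximate probability that an electron tunnels through the barrier."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * barrier_nm * 1e-9)

for d in (3, 2, 1):
    print(f"{d} nm barrier: T ~ {tunneling_probability(d):.1e}")
```

Because the dependence is exponential, shaving just one nanometer off the barrier raises the leakage probability by several orders of magnitude, which is why the problem gets dramatically worse with each shrink.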
Defects are another problem. Even photolithography has a cap on its precision. It's analogous to a blurry image from the projector; it's not quite as clear when blown up or shrunk down. Currently, foundries are trying to mitigate this effect by using "extreme" ultraviolet light, a much shorter wavelength than humans can perceive, generated with lasers in a vacuum chamber. But the problem will persist as the size gets smaller.
Defects can sometimes be mitigated with a process called binning—if the defect hits a CPU core, that core is disabled, and the chip is sold as a lower-end part. In fact, most lineups of CPUs are manufactured using the same blueprint, but have cores disabled and are sold at a lower price. If the defect hits the cache or another essential component, that chip may have to be thrown out, resulting in a lower yield and higher prices. Newer process nodes, like 7nm and 10nm, have higher defect rates early on and are more expensive as a result.
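The link between defect rate and yield is often approximated with a simple Poisson model: the chance a die escapes with zero defects is e^(-area × defect density). The die area and defect densities below are made-up illustrative numbers, not real fab data:

```python
import math

# Classic Poisson yield approximation: P(die has zero defects)
# = exp(-die_area * defect_density). Numbers are illustrative, not real fab data.

def poisson_yield(die_area_cm2, defects_per_cm2):
    """Fraction of dice expected to come out fully defect-free."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

DIE_AREA = 1.5  # cm^2, a made-up large desktop die

for density in (0.05, 0.2, 0.5):  # mature node -> brand-new node
    y = poisson_yield(DIE_AREA, density)
    print(f"defect density {density}/cm^2 -> {y:.0%} of dice defect-free")
```

This is also why binning matters so much economically: a die that fails this "fully defect-free" test isn't necessarily scrap, because a defect that lands in a disable-able core still produces a sellable lower-end part.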
Packaging it Up
Packaging the CPU for consumer use is more than just putting it in a box with some styrofoam. When a CPU is finished, it's still useless unless it can connect to the rest of the system. The "packaging" process refers to the way the delicate silicon die is attached to the green board most people think of as the "CPU."
This process requires a lot of precision, but not as much as the previous steps. The CPU die is mounted to a substrate (a small circuit board), and electrical connections are run to all of the pins that make contact with the motherboard. Modern CPUs can have thousands of pins, with the high-end AMD Threadripper using 4094 of them.
Since the CPU produces a lot of heat, and the die also needs physical protection, an "integrated heat spreader" is mounted on top. It makes contact with the die and transfers heat to a cooler mounted above it. For some enthusiasts, the thermal paste used to make this connection isn't good enough, which leads them to delid their processors and apply a more premium solution.
Once it’s all put together, it can be packaged into actual boxes, ready to hit the shelves and be slotted into your future computer. With how complex the manufacturing is, it’s a wonder most CPUs are only a couple hundred bucks.