The Death of Moore's Law and Semiconductor Miniaturisation

Pranav AVN
Jan 30, 2024

In 1965, Gordon Moore observed that the number of transistors in a dense integrated circuit doubles roughly every year, a rate he revised in 1975 to every two years, with a corresponding increase in processing power. In 1968, Moore co-founded Intel with Robert Noyce, and his observation became the driving force behind Intel's success in semiconductor chips. That Moore's Law survived for over 50 years as a guide for innovation surprised Moore himself; in a 2015 interview he described several potential obstacles to further miniaturisation: the speed of light, the atomic nature of materials, and growing costs.
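
As a back-of-envelope illustration (not from the original article), the two-year doubling can be written as N(t) = N0 · 2^((t − t0)/2). A minimal Python sketch, assuming the Intel 4004 (1971, 2,300 transistors) as the starting point:

```python
# Back-of-envelope Moore's Law projection: the transistor count doubles
# every two years, starting from the Intel 4004 (1971, 2,300 transistors).
def transistors(year: int, base_year: int = 1971, base_count: int = 2_300,
                doubling_period: float = 2.0) -> float:
    """Projected transistor count under a fixed doubling period (years)."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

for year in (1971, 1989, 2007, 2023):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

By 2023 this simple rule projects roughly 10^11 transistors per chip, which is the right order of magnitude for today's largest processors.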

A CPU (Central Processing Unit) performs basic arithmetic and logic operations. A microprocessor implements a CPU on a single integrated circuit, which is itself built from transistors; nowadays a CPU consists of billions of them. The first Intel microprocessor, the Intel 4004, had 2,300 transistors fabricated on a 10-micron process. As of 2019, most mass-market chips were built on 14-nanometre (nm) processes, with 10 nm parts entering the market in 2018. On its 10 nm process, Intel managed to pack over 100 million transistors into each square millimetre.
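
To put those figures side by side, here is a rough density comparison. The ~12 mm² die area for the 4004 is an assumed, illustrative value; the 100 million/mm² figure comes from the paragraph above:

```python
# Rough density comparison using the figures quoted above.
# The ~12 mm^2 die area of the 4004 is an assumed, illustrative value.
old_density = 2_300 / 12        # transistors per mm^2 on the Intel 4004
new_density = 100_000_000       # transistors per mm^2 on Intel's 10 nm process
print(f"4004:  ~{old_density:,.0f} transistors/mm^2")
print(f"10 nm: ~{new_density:,} transistors/mm^2")
print(f"Gain:  ~{new_density / old_density:,.0f}x")
```

That is an improvement of roughly half a million times in transistors per unit area over about five decades.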

The speed of light is finite and constant, and it places a natural limit on how fast a single transistor can compute: information cannot travel faster than light. In practice, bits are carried by electrons moving through transistors, so the speed of computation is bounded by how quickly electrons move through matter. Wires and transistors are characterised by their capacitance C and resistance R, and the product RC sets how quickly a signal node can charge or discharge. With miniaturisation, R goes up while C goes down, and it becomes more difficult to perform correct computations at speed.
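
A minimal sketch of the two limits just described, using assumed, illustrative values for the die size and the interconnect R and C:

```python
# Two physical limits on switching speed, with assumed illustrative values.
C_LIGHT = 3.0e8                   # speed of light in m/s
die_width = 0.02                  # assumed 20 mm die, in metres
light_time = die_width / C_LIGHT  # minimum one-way signal time across the die
print(f"Light across a 20 mm die: {light_time * 1e12:.0f} ps")

# The RC time constant tau = R * C sets how fast a wire can charge.
R = 1_000    # ohms   (assumed interconnect resistance)
C = 1e-15    # farads (assumed ~1 fF wire capacitance)
print(f"RC time constant: {R * C * 1e12:.2f} ps")
```

Both numbers land in the tens-of-picoseconds to picosecond range, the same scale as a single clock cycle of a multi-gigahertz processor, which is why neither limit can be ignored.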

As chips continue to shrink, Heisenberg's uncertainty principle comes into play: it limits how precisely position and momentum can be known at the quantum level, and with them the reliability of ever-smaller transistors. Some experts have estimated that, due to the uncertainty principle alone, Moore's Law will be obsolete by 2036.
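
As a back-of-envelope sketch (illustrative, not a device model), confining an electron to a gate of length Δx forces a momentum uncertainty Δp ≥ ħ/(2Δx), and hence a velocity spread Δv = Δp/mₑ that grows as gates shrink:

```python
# Back-of-envelope Heisenberg estimate (illustrative, not a device model):
# confining an electron to a gate of length dx forces a momentum
# uncertainty dp >= hbar / (2 * dx) and hence a velocity spread dv = dp / m_e.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837e-31      # electron mass, kg

for dx_nm in (14, 5, 1):
    dx = dx_nm * 1e-9            # gate length in metres
    dp = HBAR / (2 * dx)         # minimum momentum uncertainty
    dv = dp / M_E                # corresponding velocity spread, m/s
    print(f"{dx_nm:>2} nm gate: dv >= {dv:,.0f} m/s")
```

At a 1 nm gate the spread reaches tens of kilometres per second, the same order as typical carrier velocities in silicon (~10⁵ m/s), which is one way to see why transistor behaviour becomes unreliable at that scale.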

Another factor threatening the future of Moore's Law is the growing cost of energy, cooling and manufacturing. Developing new CPUs or GPUs (graphics processing units) is expensive: designing a new 10 nm chip costs around $170 million, a 7 nm chip almost $300 million, and a 5 nm chip over $500 million. Those numbers only grow for specialised chips; Nvidia, for example, spent over $2 billion on research and development to produce a GPU designed to accelerate AI.

Written by Pranav AVN

Aspiring Electrical Engineer
