“Moore’s Law” is an observation and forecast rather than a genuine law. It originated in an April 1965 article in the 35th anniversary issue of “Electronics” magazine by Gordon Moore, who later co-founded Intel. Moore observed that ongoing improvements in semiconductor manufacturing had doubled the number of transistors on integrated circuits every year, and he predicted that this trend would continue for the next ten years, until 1975.
In 1975 Moore halved the pace of his prediction, stating, “The number of transistors incorporated in a chip will approximately double every 24 months.”
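Moore’s revised 1975 forecast is easy to express in code. The sketch below is a back-of-the-envelope illustration of exponential doubling, not anything from Moore’s own writing; the starting count and time span are made up for the example:

```python
def transistors_after(years, start_count, doubling_period_years=2.0):
    """Project a transistor count under Moore's 1975 forecast:
    a doubling roughly every 24 months."""
    return start_count * 2 ** (years / doubling_period_years)

# Ten years at a 24-month doubling period means 2**5 = 32x growth.
print(transistors_after(10, 1_000))  # 32000.0
```

The same function, run over five decades, shows why even a “mere” two-year doubling compounds into billions of transistors.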
Many pundits mistakenly quote the law as a doubling of chip components every 18 months. That figure actually comes from Moore’s Intel colleague David House, who predicted that denser, faster transistors would lead to integrated circuits doubling in performance (not density) every 18 months.
In any case, Moore’s Law has proven remarkably accurate, thanks to various innovative manufacturing technologies, in particular deep-ultraviolet (DUV) excimer laser photolithography.
Feature sizes on chips are typically measured in nanometers, each nanometer being one billionth of a meter. Chips had 800-nanometer transistors in 1990; by 2012 the components had shrunk to 22 nanometers.
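The payoff of that shrink can be roughed out with a little arithmetic: if transistor area scales with the square of the feature size (a first-order simplification that ignores many real-world factors), the 1990-to-2012 numbers imply a density gain of better than a thousandfold:

```python
def density_gain(old_nm, new_nm):
    """Rough density improvement from a feature-size shrink:
    transistor area scales roughly with the square of the feature
    size (a first-order simplification, not a fab-accurate model)."""
    return (old_nm / new_nm) ** 2

# From 800 nm (1990) down to 22 nm (2012):
print(round(density_gain(800, 22)))  # 1322, i.e. ~1,300x more dense
```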
Today, chips are being manufactured with features about 14 nanometers across, and an upcoming manufacturing process will reduce them to 10 nanometers.
The scientists over at IBM Research, however, have recently demonstrated a chip having transistors a mere 7 nanometers in size. At that scale, one can cram more than 20 billion transistors on a single fingernail-sized chip.
This relentless shrinking yields chips with progressively higher performance and lower power consumption. For example, a chip with 7-nanometer transistors could deliver four times the computational ability of present-day chips at just half the size.
Thus, smaller transistors are faster transistors. A computer’s CPU consists of many transistors acting as switches, organized into “logic gates,” the structures that control output currents. In the binary world of chips, switching from a “1” to a “0” takes a certain amount of time, called the “gate delay.” As a signal moves from gate to gate, those delays accumulate, limiting the speed of computation.
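The gate-delay idea can be sketched numerically: the clock period must cover the cumulative delay along the longest chain of gates (the “critical path”), so the reciprocal of that delay bounds the clock frequency. The gate count and per-gate delay below are invented for illustration, not measurements of any real chip:

```python
def max_clock_hz(gate_delay_ps, gates_in_path):
    """Upper bound on clock frequency: the clock period must be at
    least the summed delay of the longest gate chain (critical path).
    Illustrative model only -- real timing analysis is far richer."""
    total_delay_s = gate_delay_ps * gates_in_path * 1e-12
    return 1.0 / total_delay_s

# 20 gates at 15 ps each -> 300 ps critical path -> ~3.3 GHz ceiling.
print(round(max_clock_hz(15, 20) / 1e9, 2))  # 3.33
```

Halve the per-gate delay (faster, smaller transistors) and the frequency ceiling doubles, which is exactly why shrinking transistors speeds up chips.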
The switches gobble up most of the power on a CPU chip. If you want to run your chip faster, you increase its clock speed or clock rate, which is the speed at which the microprocessor executes instructions. The processor’s “clock” is a tiny oscillator that sends out billions of pulses per second, a sort of metronome that regulates the workings of the whole chip.
One of the earliest PC CPUs, the Intel 8088 found in the original 1981 IBM PC, had a slow clock rate, poking along at a mere 4.77 MHz (megahertz), or 4.77 million pulses per second. In 2000, processor clocks reached 1 GHz (gigahertz), or one billion clock cycles per second. Today’s desktop CPUs, with clock rates around 3 GHz, perform 3 billion clock cycles per second.
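Clock rate and cycle time are reciprocals, which makes the 8088-to-today comparison concrete:

```python
def cycle_time_ns(clock_hz):
    """Duration of one clock cycle, in nanoseconds."""
    return 1e9 / clock_hz

# The original IBM PC's 8088 at 4.77 MHz: roughly 210 ns per cycle.
print(round(cycle_time_ns(4.77e6)))   # 210
# A modern 3 GHz desktop CPU: about a third of a nanosecond.
print(round(cycle_time_ns(3e9), 2))   # 0.33
```

Each tick of a modern CPU is some 600 times shorter than a tick of the original PC’s clock.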
If you try to increase the clock speed to make the processor go faster, you must supply more power, and heat becomes a problem. At a certain speed — these days a bit over 4 GHz — the processor simply overheats and computations fail.
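Why faster clocks mean more heat shows up in the classic first-order model of CMOS dynamic power, P = C · V² · f: power grows linearly with clock frequency, and faster still when the voltage must also be raised to sustain that frequency. The capacitance and voltage figures below are illustrative assumptions, not values from any datasheet:

```python
def dynamic_power_watts(capacitance_f, voltage_v, frequency_hz):
    """First-order CMOS dynamic (switching) power: P = C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

# Made-up figures: 1 nF of total switched capacitance.
p_stock = dynamic_power_watts(1e-9, 1.0, 3.0e9)       # 3 W at 3 GHz
# Pushing the clock higher usually also requires a higher voltage,
# so power climbs faster than linearly with frequency.
p_overclocked = dynamic_power_watts(1e-9, 1.2, 4.5e9) # ~6.5 W
print(p_stock, p_overclocked)
```

The V² term is the overclocker’s enemy: a 20 percent voltage bump alone raises switching power by 44 percent before the frequency increase is even counted.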
Back in 2011, at AMD headquarters in Austin, Texas, an AMD FX “Bulldozer” CPU made it into the “Guinness Book of World Records” when engineers submerged the chip in a container of incredibly cold liquid helium and revved its clock up to 8.4 GHz.
In the same vein, some hobbyists employ water cooling so they can “overclock” their computers for gaming. Some even try extreme cooling schemes using thermoelectric cooling (TEC), coolers based on vapor-phase-change refrigeration, or home-brew systems involving dry ice, liquid nitrogen, or liquid helium.
It’s easier, however, simply to wait for new, more energy efficient chips having smaller, faster, cooler transistors.
Moreover, there is the matter of what the chip itself is made of. Nearly all processors are fabricated out of silicon, but the IBM researchers worked silicon-germanium into their transistor channels, a tweak that improves how quickly electrons move and enables higher clock frequencies without overheating the chip.
So let’s root, root, root for the IBM home team and keep our fingers crossed that 7-nanometer chips are not too far off.
Richard Grigonis is an internationally known technology editor and writer. He was executive editor of Technology Management Corporation’s IP Communications Group of magazines from 2006 to 2009. The author of five books on computers and telecom, including the highly influential Computer Telephony Encyclopedia (2000), he was the chief technical editor of Harry Newton's Computer Telephony magazine (later retitled Communications Convergence after its acquisition by Miller Freeman/CMP Media) from its first year of operation in 1994 until 2003.
© 2022 Newsmax. All rights reserved.