Intel introduced the 4004 microprocessor on Nov. 15, 1971, billing it as a computer on a chip. It was a central processing unit (CPU), and its progeny would become the brains of most things electronic, including PCs, game consoles, supercomputers, digital cameras, servers, smartphones, iPods, tablets, automobiles, microwave ovens, and toys. Microprocessors have become an indispensable part of modern life, solving problems from displaying the time to calculating the effects of global warming.

“Today, there is no industry and no human endeavor that hasn’t been touched by microprocessors or microcontrollers,” said Federico Faggin, one of the microprocessor’s three inventors, in a 2009 speech.

The first chip wasn’t very powerful; it was originally designed to perform math operations in a calculator built by the Japanese firm Busicom. The 4-bit microprocessor ran at a speed of 740 kilohertz, compared to top speeds above 4 gigahertz today. If the speed of cars had increased at the same pace as chips, it would take about one second to drive from San Francisco to New York. Today’s fastest Intel CPUs for PCs run roughly 5,000 times faster than the 4004.
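
A quick back-of-the-envelope check of that comparison, sketched in Python using only the clock speeds quoted above (the modern figure is the article’s “above 4 gigahertz,” taken here as exactly 4 GHz for illustration):

# Rough sanity check of the clock-speed comparison in the article.
clock_4004_hz = 740_000          # Intel 4004: 740 kilohertz
clock_modern_hz = 4_000_000_000  # "above 4 gigahertz today," taken as 4 GHz

speedup = clock_modern_hz / clock_4004_hz
print(f"Clock-speed ratio: about {speedup:,.0f}x")  # ~5,400x, consistent with "5,000 times faster"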

The microprocessor was a pioneering piece of work created by Faggin, Ted Hoff, and Stan Mazor, all working for Intel. They created the chip at the behest of their Japanese customer, the calculator maker Busicom, where Masatoshi Shima worked.

The chips got better and faster through a phenomenon known as Moore’s Law. Observed by Intel chairman emeritus Gordon Moore in 1965, the law holds that the number of transistors on a chip doubles every two years. That is possible because chipmaking equipment keeps shrinking the circuitry, so the grid of components on a chip grows ever finer. As the circuits shrink, electrons travel shorter distances and the chips get faster, while consuming less electricity and throwing off less heat. The smaller circuits also mean the chips themselves can shrink and cost less to make.
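
To see what that doubling rate implies over the microprocessor’s first four decades, here is a small illustrative Python sketch; the 4004’s starting count of roughly 2,300 transistors is an assumption added here, not a figure from the article:

# Illustrative Moore's Law projection: transistor count doubling every two years.
transistors_1971 = 2300      # assumed starting point: roughly the 4004's transistor count
years = 40                   # 1971 to 2011
doublings = years // 2       # one doubling every two years

projected = transistors_1971 * 2 ** doublings
print(f"Projected transistors after {years} years: about {projected:,}")  # ~2.4 billion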

Full story: VentureBeat