
When chips were no longer a military secret

The birth of the microprocessor in the early 1970s sparked legal battles, conceptual challenges and a veritable race towards the personal computer. It was a commission from a Japanese calculator manufacturer that brought to market this technology, until then classified as a military secret, which has been key to the evolution towards today's digital society.

The Intel 4004, born in November 1971, was the first commercially produced microprocessor. Credit: Wikimedia.

 

ANTONIO LÓPEZ | Tungsteno

A computer's "brain" has no neurons, but rather millions of transistors packed into a tiny space, through which electrons flow. This is why the creation of the microprocessor was a technological challenge that also demanded an extremely precise manufacturing process. Until the 1970s, the electronic components of a processor could not be integrated into a single circuit to form a CPU (central processing unit), but in 1971 an Italian engineer and physicist trained in the heart of Silicon Valley (he came from the Research and Development Laboratories at Fairchild Semiconductor) found the key.

Federico Faggin joined the Intel team (then a start-up), where he developed, together with Marcian Hoff and Stan Mazor, the Intel 4004, the first microprocessor in the world to make it to market. To achieve this, Faggin used a new silicon gate technology that made it possible to double the number of transistors included in the microprocessor and increase its speed up to five times.

A microprocessor is a component as tiny as it is fundamental in any computer: a microscopic circuit with millions of integrated transistors that allows it to receive information and issue instructions (in a programming language) to the rest of the components. The Intel 4004 packed 2,300 transistors and could perform up to 60,000 operations per second. During the 1970s the microprocessor market grew exponentially, and after barely a decade of life it was already saturated and locked in a price war. However, the story of microprocessor innovation is by no means straightforward or transparent.

In the years leading up to the birth of the Intel 4004, the Silicon Valley semiconductor industry already had ambitions to integrate multiple circuits on a single chip. Not surprisingly, in that context, the development of military technology took precedence over commercial processors. An example of this parallel history is the MP944, a microprocessor developed in 1968 by Garrett AiResearch and designed and built by Steve Geller and Ray Holt, who were working on an integrated flight control system for the US Navy's F-14 Tomcat fighter. Their work was considered classified information by the US Navy until 1998, when, after many years of litigation, Ray Holt managed to have it declassified.

Silicon gate technology made it possible for the first time to manufacture high-speed, low-cost, reliable integrated circuits on a large scale. Credit: Intel.

The future and the end of Moore's Law

One of the founders of Intel when Faggin joined the team was Gordon Moore, who had also worked at Fairchild Semiconductor, where he had been director of research and development. One of Moore's observations led to the famous law that bears his name: the complexity of integrated circuits would double approximately every year, with a commensurate reduction in cost.
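Put as a simple formula (an illustrative sketch rather than Moore's own formulation), the observation describes exponential growth in transistor count:

\[ N(t) = N_0 \cdot 2^{t/T} \]

where N_0 is the transistor count of a reference chip, t is the elapsed time and T is the doubling period (about one year in the original estimate, later revised to roughly two). Taking the 4004's 2,300 transistors and T = 2 years, fifty years of doubling gives 2,300 · 2^25, around 7.7 × 10^10 transistors, the same order of magnitude as the largest processors made today.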

The fact is that less than fifty years separate the first 4-bit processors from today's 64-bit, multi-core, multi-gigahertz processors. However, many voices argue that Moore's law is reaching its limits, since increasing the number of transistors in a microprocessor alone no longer yields an exponential increase in capacity, and bolder innovations are required that go beyond the materials and concept of the original design. For example, a team of researchers at Eindhoven University of Technology is experimenting with photons to transport data on a processor. Will mixed circuits (optical and electronic) have the potential to challenge Moore's law itself?

New materials such as carbon nanotubes, which are more resistant and efficient, are emerging as substitutes for silicon in the development of the microchips of the future. Credit: Stanford University.

To overcome some of the limitations imposed by traditional microprocessor design, scientists are also exploring new materials such as carbon nanotubes or gallium nitride (GaN), a compound synthesized to order in the laboratory that will allow chips to keep shrinking. Much more efficient and more resistant to high temperatures than the silicon used over the last 50 years, GaN could provide the definitive boost to such promising technologies as 5G.

Although it is difficult to trace the family tree of microprocessors and their future offspring, what is clear is that this technological innovation, in addition to integrating circuits, has made it possible to integrate electronics into homes around the world, because nowadays chips are practically ubiquitous.

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

 
