A tech industry pioneer sees ways for the U.S. to lead in advanced chips
Sixty years have passed since Ivan Sutherland created Sketchpad, a software system that heralded the future of interactive and graphical computing. In the 1970s, he was instrumental in rallying the computer industry to build a new type of microchip with hundreds of thousands of circuits that would become the basis of today’s semiconductor industry.
Dr. Sutherland, now 84, believes the U.S. has failed, at a critical moment, to consider alternative chip-making technologies that could allow the country to regain its lead in building the most advanced computers.
By relying on supercooled electronic circuits that switch without resistance, and therefore do not generate excessive heat at higher speeds, computer designers will be able to sidestep the biggest technical obstacle to faster machines, he argues.
“The nation best able to seize the opportunity of superconducting digital circuits will enjoy a computing advantage for decades to come,” he and a colleague wrote in a recent article circulating among technologists and government officials.
Dr. Sutherland’s opinion matters in part because, decades ago, he was instrumental in helping create the dominant method of making computer chips today.
In the 1970s, Dr. Sutherland, chair of Caltech’s computer science department, and his brother Bert Sutherland, then a research manager at Xerox’s Palo Alto Research Center, introduced computer scientist Lynn Conway to physicist Carver Mead.
They pioneered a design based on an American-invented transistor called Complementary Metal Oxide Semiconductor, or CMOS. It enables the manufacture of microchips used in personal computers, video games and a host of commercial, consumer and military products.
Now, Dr. Sutherland argues, an alternative technology that predates CMOS and has had many false starts should be revisited. Superconducting electronics was pioneered at MIT in the 1950s, pursued by IBM in the 1970s, and largely abandoned after that. At one point, it even took an odd international detour before returning to the U.S.
In 1987, Mikhail Gorbachev, the last leader of the Soviet Union, read an article in the Soviet newspaper Pravda describing the amazing progress that the Japanese microelectronics giant Fujitsu had made in cryogenic computing.
Mr. Gorbachev was intrigued. Wasn’t this, he wondered, an area in which the Soviet Union could excel? The task of giving a five-minute briefing to the Soviet Politburo eventually fell to Konstantin Likharev, a young associate professor of physics at Moscow State University.
When he read the article, however, Dr. Likharev realized that the Pravda reporter had misread the press release and overstated the speed of Fujitsu’s superconducting memory chips by five orders of magnitude.
Dr. Likharev explained the error, but he noted that the field still held promise.
This set off a chain of events through which Dr. Likharev’s small laboratory received millions of dollars in research support, allowing him to build a small research team and, after the fall of the Berlin Wall, relocate to the United States. Dr. Likharev took a physics position at Stony Brook University in New York and helped start Hypres, a digital superconductor company that still exists.
The story might end there. But it seems the elusive technology may be gaining momentum again, as the cost of making modern chips has grown enormous. A new semiconductor factory costs $10 billion to $20 billion and takes up to five years to complete.
Dr. Sutherland believes that instead of pushing ever more expensive technologies that are increasingly inefficient, the U.S. should train a generation of young engineers who can think outside the box.
Computing systems based on superconductors, in which the resistance of switches and wires drops to zero, may solve cooling challenges that increasingly plague the world’s data centers.
CMOS chip manufacturing is dominated by Taiwanese and Korean companies. The U.S. is now planning to spend a third of the nearly $1 trillion in private and public funds in an effort to rebuild the country’s chip industry and regain its global dominance.
Dr. Sutherland, like others in the industry, believes that CMOS manufacturing is hitting fundamental limits that will make the cost of progress unbearable.
“I think we can safely say that we’re going to have to fundamentally change the way we design computers, because we’re really approaching the limits of what our current silicon-based technology can do,” said Jonathan Koomey, an expert on the energy requirements of computing at scale.
As transistors shrink in size to only a few hundred or thousands of atoms, the semiconductor industry is increasingly beset by various technical challenges.
Modern microprocessor chips are also affected by what engineers call “dark silicon.” If all the billions of transistors on a modern microprocessor chip were used at the same time, the heat they generate would melt the chip. As a result, entire parts of modern chips are turned off, with only a few transistors active at any one time — making them significantly less efficient.
Dr. Sutherland said the U.S. should consider alternative technologies for national security reasons. He suggested that the advantages of superconducting computing might first play out in the highly competitive market for the specialized computers inside cell towers that process wireless signals. China has become the dominant force in the current market for 5G technology, but next-generation 6G chips will benefit from the extreme speed and low power requirements of superconducting processors, he said.
Other industry executives agree. “Ivan is right, the power problem is a big problem,” said John L. Hennessy, an electrical engineer, Alphabet chairman and former president of Stanford University. He says there are only two ways to solve the problem — either to improve efficiency through a new design, which is unlikely for a general-purpose computer, or to create a new technology that is not bound by existing rules.
One such opportunity could be new computer designs that mimic the human brain, a marvel of low-power computing efficiency. So far, research in the field of neuromorphic computing has relied on conventional silicon fabrication.
“Using superconductivity to create something comparable to the human brain really has potential, but unfortunately the funding agencies are not paying attention to it,” said Elie Track, chief technology officer of the superconducting company Hypres.
The age of superconducting computing may not yet be here, in part because whenever the CMOS world seems about to hit its final hurdle, clever engineering has overcome it.
In 2019, a team of researchers at the Massachusetts Institute of Technology, led by Max Shulaker, announced that it had built a carbon nanotube microprocessor that promises to be 10 times as energy efficient as today’s silicon chips. Dr. Shulaker is working with Analog Devices, a semiconductor maker based in Wilmington, Mass., to commercialize a hybrid version of the technology.
“More and more, I believe you can’t beat silicon,” he said. “It’s a moving target and it’s really good at what it does.”
But as silicon approaches its atomic limit, alternative approaches look promising again. Mark Horowitz, a Stanford computer scientist who has helped start several Silicon Valley companies, said he would not underestimate Dr. Sutherland’s enthusiasm for superconducting electronics.
“People who change the course of history are always a little crazy, you know, but sometimes they’re right,” he said.