Extropic Unveils Superconducting AI Processor It Claims Will Outperform GPUs by Orders of Magnitude
On March 20th, Extropic, a startup developing and manufacturing superconducting artificial intelligence processors, announced that it had secured a $14.1 million seed funding round. The company says its mission is to build AI accelerators that are “orders of magnitude faster and more energy-efficient than digital processors (CPU/GPU/TPU/FPGA).”
Extropic argues that currently available digital processors are poorly suited to AI acceleration. In a research paper, the company presented a probabilistic computing paradigm that it says diverges from increasingly complex conventional digital computing methods: it implements Energy-Based Models (EBMs) directly as parameterized stochastic circuits, aiming to improve efficiency and unlock the full potential of generative AI. For algorithms based on sampling complex energy landscapes, Extropic expects its accelerator to deliver runtime and energy-efficiency improvements of multiple orders of magnitude over digital computers.
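To make “sampling a complex energy landscape” concrete, here is a minimal software sketch that assumes nothing about Extropic’s actual circuits: a toy EBM assigns each state x an unnormalized probability exp(−E(x)), and Langevin dynamics draws samples by repeatedly nudging x downhill on E while injecting noise. The energy function, couplings, and step size below are arbitrary illustrative choices; the pitch for thermodynamic hardware is that this kind of sampling would happen natively in device physics rather than as an iterative digital loop like this one.

```python
# Illustrative sketch only (not Extropic's implementation): sampling a
# toy energy-based model with Langevin dynamics, entirely in software.
import numpy as np

rng = np.random.default_rng(0)

def energy(x, W):
    """Pairwise couplings plus a double-well term per variable."""
    return -0.5 * x @ W @ x + 0.25 * np.sum((x ** 2 - 1.0) ** 2)

def grad_energy(x, W):
    # Gradient of the energy above (W is symmetric with zero diagonal).
    return -W @ x + x * (x ** 2 - 1.0)

def langevin_sample(W, n_vars, n_steps=5000, step=1e-2):
    """Unadjusted Langevin dynamics: drift downhill on the energy plus
    Gaussian noise, so states concentrate where exp(-E(x)) is large."""
    x = rng.standard_normal(n_vars)
    for _ in range(n_steps):
        noise = rng.standard_normal(n_vars)
        x = x - step * grad_energy(x, W) + np.sqrt(2.0 * step) * noise
    return x

W = 0.1 * rng.standard_normal((8, 8))
W = 0.5 * (W + W.T)        # symmetric couplings
np.fill_diagonal(W, 0.0)
sample = langevin_sample(W, n_vars=8)
print("energy of drawn sample:", energy(sample, W))
```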
According to reports, Extropic’s first processor is fabricated from nanoscale aluminum structures and operates at cryogenic, superconducting temperatures. Its basic building blocks are “neurons,” some of which resemble existing superconducting flux qubits; these combine into larger superconducting systems in which many linear and nonlinear neurons together form a circuit that samples from rich, high-dimensional probability distributions.
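As a loose software analogy for how many simple coupled neurons can jointly sample a high-dimensional distribution, the sketch below runs Gibbs updates on a small Boltzmann-machine-style network of binary units; the units, coupling matrix, and update rule are illustrative stand-ins, not Extropic’s circuit design.

```python
# Loose analogy only: a network of binary stochastic "neurons" with
# pairwise couplings J, updated by Gibbs sampling. Each neuron flips
# probabilistically given the field from its neighbors, and the whole
# network samples the Boltzmann distribution p(s) ∝ exp(0.5 * s^T J s).
import numpy as np

rng = np.random.default_rng(1)
n = 16
J = 0.3 * rng.standard_normal((n, n))
J = 0.5 * (J + J.T)                    # symmetric couplings
np.fill_diagonal(J, 0.0)

s = rng.choice([-1.0, 1.0], size=n)    # random initial state
for _ in range(1000):                  # Gibbs sweeps
    for i in range(n):
        field = J[i] @ s               # input from the other neurons
        p_up = 1.0 / (1.0 + np.exp(-2.0 * field))
        s[i] = 1.0 if rng.random() < p_up else -1.0

print("final sampled state:", s)
```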
△ Micrograph of the Extropic chip. The image shows two Josephson junctions, the devices that provide the critical nonlinearity in the processor.
Extropic’s superconducting chips are entirely passive: energy is consumed only when their states are measured or manipulated. The company claims these are among the most energy-efficient neurons in the universe, and that systems built from them will be extremely energy-efficient at scale. Extropic plans to serve low-volume, high-value customers such as governments, banks, and private clouds with these superconducting processors.
To expand its market reach, Extropic is also developing semiconductor devices that operate at room temperature, replacing Josephson junctions with transistors. This sacrifices some energy efficiency relative to the superconducting devices, but it allows mass production using standard manufacturing processes and supply chains. Because they run at room temperature, these devices can be packaged in form factors resembling GPU cards, potentially putting an Extropic accelerator in every household and democratizing access to thermodynamic AI acceleration.
To support this wide array of hardware, Extropic is building a software layer that compiles abstract specifications of EBMs into the relevant hardware control languages. The compilation layer is built on the formalism of factor graphs, which specify how a large probability distribution decomposes into local blocks; this lets Extropic accelerators partition and execute programs that are too large to fit on any single simulation core, as sketched below.
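A factor graph simply expresses a large joint distribution as a product of small local factors. The hypothetical sketch below (Extropic’s compiler and control languages are not public, so the data structures here are assumptions) shows such a factorized specification; its locality is what would let a compiler assign different factors, or connected groups of factors, to different cores.

```python
# Minimal factor-graph sketch (illustrative only). A joint distribution
# is specified as a product of local factors, each touching only a few
# variables; a compiler could map factors, or connected groups of
# factors, onto separate sampling cores.
import numpy as np
from itertools import product

# Each factor: (tuple of variable names, local function over them).
factors = [
    (("a", "b"), lambda a, b: np.exp(0.8 * a * b)),   # coupling a-b
    (("b", "c"), lambda b, c: np.exp(-0.5 * b * c)),  # coupling b-c
    (("c",),     lambda c: np.exp(0.3 * c)),          # bias on c
]
variables = ["a", "b", "c"]   # each variable takes values in {-1, +1}

def unnormalized_p(assignment):
    """Joint probability up to a constant: product of the local factors."""
    p = 1.0
    for vars_, f in factors:
        p *= f(*(assignment[v] for v in vars_))
    return p

# Brute-force normalization is only feasible for tiny models; it is shown
# here just to confirm the factorized spec defines a full joint distribution.
Z = sum(unnormalized_p(dict(zip(variables, vals)))
        for vals in product([-1, 1], repeat=len(variables)))
print("p(a=+1, b=+1, c=-1) =",
      unnormalized_p({"a": 1, "b": 1, "c": -1}) / Z)
```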
Many AI accelerator companies have struggled to find an edge because deep learning is limited by memory: today’s algorithms spend roughly 25% of their time moving data in memory. By Amdahl’s Law, a chip that only accelerates specific operations (such as matrix multiplication) therefore struggles to deliver more than about a 4x overall speedup (made explicit below). By natively accelerating a broad class of probabilistic algorithms through physical operation, Extropic argues its chips can unlock a whole new AI acceleration mechanism far beyond what was previously considered achievable.
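The Amdahl’s Law ceiling follows directly from the figures quoted above: if roughly 25% of runtime is data movement that a matrix engine cannot touch, only a fraction p = 0.75 of the work is accelerable, and even an infinitely fast accelerator caps the overall speedup at 4x.

```latex
% Amdahl's Law: overall speedup when a fraction p of the runtime
% is accelerated by a factor s and the rest is left untouched.
S(s) = \frac{1}{(1 - p) + p/s},
\qquad
\lim_{s \to \infty} S(s) = \frac{1}{1 - p} = \frac{1}{0.25} = 4
\quad \text{for } p = 0.75 .
```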
Founded in 2022 by CEO Guillaume Verdon, Extropic is backed by a team with extensive experience in physics and artificial intelligence. Before founding Extropic, Guillaume led quantum technology efforts in the Physics and AI team at Alphabet X, pioneering quantum technologies with broad applications in areas such as sensing, communication, and representation learning. He is regarded as a pioneer of quantum deep learning: during his doctoral studies at the University of Waterloo he initiated what later became the Google TensorFlow Quantum project, eventually joining Google Quantum AI. With a broader background in theoretical physics and information theory, he also holds master’s degrees from the Perimeter Institute and the Institute for Quantum Computing.
Extropic’s Chief Technology Officer, Trevor McCourt, originally trained as a mechanical engineer, first met Guillaume when he joined the founding team of TensorFlow Quantum at the University of Waterloo; their collaboration began with building differentiable quantum programming software from scratch. Trevor later returned to hardware engineering, working on cutting-edge devices and control technologies at Google Quantum AI. Drawn to self-organizing physical systems, he then pursued a doctorate at MIT, researching the role of noise in computation and biology.
Christopher Chamberland, Extropic’s Chief Architect, is widely regarded as one of the most prominent quantum computer architects. After defining core quantum architectures and roadmaps at AWS and IBM Quantum, he decided to shift from the quantum computing field to lead architecture work at Extropic. Before joining AWS and IBM, Christopher worked at Microsoft Quantum and earned his PhD from the Institute for Quantum Computing at the University of Waterloo.
Extropic’s broader team consists of scientists and engineers with backgrounds in physics and artificial intelligence who have previously worked at AWS, Meta, IBM, Nvidia, Xanadu, and many top academic institutions worldwide. This highly interdisciplinary team brings deep experience in physics-based artificial intelligence and is uniquely positioned to pioneer the unified approach to physics and AI that Extropic is pursuing.
Editor: Xinzixun – Lin Zi