Nick Harris, CEO of Lightmatter, discusses how the company's pioneering photonic AI chips use light instead of electrical signals to overcome the limitations of traditional semiconductor technology, enabling vastly faster and more energy-efficient data transmission for AI computing. He envisions this technology revolutionizing data centers and AI development by dramatically accelerating processing and cutting energy consumption, with future possibilities including photonic computation and even space-based data centers.
In this interview, Katherine Schwab from Forbes speaks with Nick Harris, CEO of Lightmatter, a company pioneering AI chip technology that uses light instead of electrical signals to move data. Nick explains that the motivation for using light stems from the limits of Moore's Law and Dennard scaling, both of which have faltered since around 2005. Light obeys fundamentally different physics, carrying data over vast distances with minimal loss and at extremely high speeds, as evidenced by the company's recent 800-gigabit-per-second optical fiber, roughly the bandwidth needed to serve the internet connections of 800 homes.
The conversation highlights the urgent need for new computing technologies given the explosive growth in AI demand. Semiconductor performance once doubled roughly every 18 months; AI workloads now demand a doubling roughly every three and a half months. Lightmatter's photonic chips let GPUs communicate over optical fibers at unprecedented speeds, dramatically increasing bandwidth and efficiency. Their M1000 platform, for example, delivers 114 terabits per second, comparable to the bandwidth needs of an entire city, enabling much larger and faster AI supercomputers.
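To put the figures above in perspective, here is a back-of-envelope sketch. The doubling-period arithmetic follows directly from the cadences quoted; treating the M1000's 114 Tbps as an aggregate of 800 Gbps fiber links is an assumption for illustration, not a detail from the interview.

```python
# Rough arithmetic on the figures quoted in the interview.
# Assumption (not stated in the interview): the M1000's 114 Tbps is
# an aggregate of many 800 Gbps-class optical links.

def yearly_growth(doubling_months: float) -> float:
    """Growth factor per year given a doubling period in months."""
    return 2 ** (12 / doubling_months)

moore = yearly_growth(18)    # classic Moore's Law cadence, ~1.6x/year
ai = yearly_growth(3.5)      # quoted AI demand cadence, ~10.8x/year

# How many 800 Gbps fibers fit inside 114 Tbps of aggregate bandwidth?
links = 114e12 / 800e9       # ~142 links

print(f"Moore's Law cadence: ~{moore:.1f}x compute per year")
print(f"AI demand cadence:   ~{ai:.1f}x compute per year")
print(f"800 Gbps links per 114 Tbps: ~{links:.0f}")
```

The gap between the two cadences (roughly 1.6x versus nearly 11x per year) is what drives the argument that incremental transistor improvements alone cannot keep pace with AI demand.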
Nick shares Lightmatter's origin story, tracing it back to his time as an R&D engineer at Micron and later as a graduate student at MIT. He recognized that transistor scaling was ending and saw the potential of silicon photonics to accelerate AI computing. Despite early challenges, including the need to invent new technologies and manufacturing processes, the company gained traction after winning prestigious entrepreneurship competitions and attracting investor interest. Its technology now addresses critical bottlenecks in data center efficiency and scalability, especially as AI workloads demand ever-larger clusters of GPUs communicating at high speed.
A major challenge facing data centers today is energy consumption, with AI compute alone projected to consume a significant fraction of the power in regions like Texas. Lightmatter's technology can make data centers more efficient by letting larger groups of GPUs, potentially millions, communicate as if they were a single giant chip, dramatically speeding up AI training and inference while reducing wasted energy. This capability could shorten AI development cycles from months to weeks or even seconds, unlocking new possibilities for AI applications, including much larger and more intelligent models.
Looking to the future, Nick envisions a radical transformation of computing where light replaces electrical signals not only for communication but also for computation itself. He predicts that in the next decade or more, photonic computing could enable fundamentally new types of AI models and supercomputers. In a bold and sci-fi-inspired vision, he even imagines Mars becoming a massive data center, leveraging space infrastructure to support the ever-growing demand for AI compute. This futuristic outlook underscores the profound impact that photonic technology could have on the evolution of AI and computing at large.