Lightmatter's Broader System Approach
The key difference is that Lightmatter is trying to own more of the AI system bottleneck, not just one link in it. Celestial AI is centered on moving data with its Photonic Fabric optical interconnect, especially between compute and memory. Lightmatter, by contrast, has built a broader product stack around its Passage interconnect products and earlier photonic compute efforts, giving it more ways to win design slots as AI clusters move from copper to optics.
Celestial AI raised a $175M Series C on March 27, 2024, expanded it with a $250M Series C1 in March 2025, and announced a $255M final close in August 2025. That funding pace shows strong demand for optical I/O, but it also underlines that Celestial is scaling around a focused interconnect wedge rather than a full compute platform.
Lightmatter’s current roadmap is concrete and system-level: Passage L200 is a 3D co-packaged optics product; Passage L20 is a 6.4 Tbps optical engine for rack-scale links; and the company is pairing its photonics with partners such as Alphawave, Qualcomm, Cadence, and GUC to fit directly into AI server and chiplet designs.
Ayar Labs is the closest proof point for the interconnect-only model. It has focused on optical I/O rather than compute, raised $155M, and attracted backing from Intel, AMD, and NVIDIA. That makes the market structure clearer: one set of companies sells optical links, while Lightmatter is pushing to bundle those links with more of the surrounding architecture.
The market is heading toward optical links becoming standard inside AI systems, from chiplets to racks. If that happens, focused interconnect companies can become critical suppliers, but the biggest upside will likely go to whoever packages optics as part of a broader system design. That is the lane Lightmatter is trying to occupy.