Why Nvidia Is Betting on an Open-Source Chip Startup to Power AI Data Centers

Nvidia just made a surprising move: it backed a startup building CPUs on completely different technology from the chips that power most AI systems today. SiFive, a company founded in 2015 by UC Berkeley engineers, raised $400 million in a new funding round that values it at $3.65 billion. The twist is that Nvidia, the dominant force in AI computing, is one of the investors. This signals a major shift in how the tech giant is thinking about data center architecture and the future of AI infrastructure.

What Makes SiFive's Approach Different from Nvidia's Competitors?

Most CPUs powering data centers today run on either Intel's x86 architecture or ARM-based designs. SiFive uses RISC-V, an open-source processor design that has historically been used for smaller applications like embedded systems. Until recently, RISC-V wasn't considered powerful enough for the demanding workloads of AI data centers. But with Nvidia's backing and fresh capital, SiFive is now positioning itself to compete in this high-stakes market.

The company's business model mirrors how ARM operated in its early years. Rather than manufacturing chips itself, SiFive licenses its chip designs to other companies, which can then modify them for their specific needs. This approach gives customers flexibility while keeping SiFive focused on design innovation. The company hadn't raised funding since March 2022, when it brought in $175 million at a pre-money valuation of $2.33 billion, making this new round a significant jump in valuation.

How Will SiFive's CPUs Actually Work in Nvidia's AI Ecosystem?

Here's where the strategy becomes clear. SiFive's RISC-V designs are being built to work seamlessly with Nvidia's CUDA software, the programming framework that powers most AI applications today. More importantly, they'll integrate with NVLink Fusion, Nvidia's rack-scale interconnect that allows different types of CPUs to plug into what Nvidia calls its "AI factory." This means companies building AI data centers won't have to choose between Nvidia's GPUs and alternative CPUs; they can use both together in the same system.

The funding round attracted a diverse group of investors beyond Nvidia. Atreides Management, founded by former Fidelity investor Gavin Baker, led the round. Other backers include Apollo Global Management, D1 Capital Partners, Point72, T. Rowe Price, Sutter Hill Ventures, and others. This breadth of support suggests confidence that open-source chip design could reshape data center infrastructure.

Why Does Nvidia Backing an Alternative CPU Matter for Data Centers?

On the surface, this seems counterintuitive. Nvidia dominates AI computing through its GPUs, and competitors like Intel and AMD are desperately trying to build their own GPU alternatives to challenge Nvidia's market position. By backing SiFive, Nvidia appears to be taking a different approach: rather than fighting competitors head-on, it's creating an ecosystem where different chip architectures can coexist and complement each other.

This strategy has several implications for how AI data centers will be built in the coming years:

  • Reduced Lock-In: Companies building AI infrastructure won't be forced to rely exclusively on a single chip vendor, potentially lowering costs and increasing flexibility in system design.
  • Open Standards Advantage: RISC-V's open nature means any company can build chips on the instruction set without paying licensing fees for the architecture itself, which could accelerate innovation and reduce dependency on proprietary architectures like x86.
  • Nvidia's Ecosystem Play: By supporting multiple CPU options that work with its CUDA software and NVLink Fusion, Nvidia strengthens its position as the central hub of AI computing rather than just a GPU maker.

SiFive's timing is strategic. The company was founded in 2015 by engineers who created the original open-source RISC-V design, giving it deep credibility in the chip design community. For over a decade, RISC-V remained a niche technology, but the explosive growth of AI and the resulting demand for specialized computing hardware have created an opportunity for alternative architectures to gain traction.

The $400 million round was oversubscribed (investors wanted to put in more money than the company was raising), underscoring how seriously the investment community is taking this bet. It's not just venture capitalists backing SiFive; the round includes private equity firms and hedge funds, suggesting institutional confidence that open-source chip design could become a significant part of AI infrastructure.

For data center operators and AI companies, this development means more options are coming. Rather than choosing between Nvidia's GPUs and Intel or AMD's CPUs, they'll soon be able to mix and match different architectures optimized for different workloads. SiFive's RISC-V designs could handle certain tasks more efficiently than traditional x86 or ARM chips, potentially reducing overall power consumption and costs in large-scale AI deployments.

The broader implication is that Nvidia's support signals confidence that the future of AI computing isn't about one company dominating every layer of the stack. Instead, it's about creating an open ecosystem where different technologies can integrate seamlessly. For an industry grappling with skyrocketing power demands and infrastructure costs, that kind of flexibility could be transformative.