The Chip That Powers AI Data Centers Is Getting a Quiet Upgrade: Here's Why It Matters
Navitas Semiconductor is betting its future on a fundamental shift in how AI data centers get their power. The semiconductor company reported first-quarter 2026 revenue of $8.6 million, up 18% from the previous quarter, driven by a 35% year-over-year surge in high-power applications that now make up the majority of its business. The pivot signals a broader industry recognition that as AI systems consume more electricity, the chips managing that power delivery have become just as critical as the processors doing the computing.
What Are GaN and SiC, and Why Do AI Data Centers Need Them?
Navitas is focusing on two semiconductor technologies: gallium nitride (GaN) and silicon carbide (SiC). These materials can handle higher voltages and temperatures than traditional silicon, making them ideal for the industrial-scale power conversion systems that modern AI data centers require. The company introduced new offerings including 2.3 kilovolt and 3.3 kilovolt modules, along with a 20 kilowatt platform that achieves approximately 97.5% peak efficiency in converting power from 800 volts down to 6 volts.
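To put that 97.5% figure in perspective, here is a minimal sketch of the loss arithmetic at the platform's rated load. The 20 kW load and 97.5% efficiency come from the product details above; the 92% figure for an older converter is an illustrative assumption, not a measured comparison.

```python
# Rough sketch: heat dissipated by a power-conversion stage at a given efficiency.
# The 20 kW load and 97.5% figure come from the platform described above; the
# 92% "older converter" efficiency is an illustrative assumption.

def conversion_loss_watts(output_power_w: float, efficiency: float) -> float:
    """Watts dissipated as heat: input power minus delivered output power."""
    input_power_w = output_power_w / efficiency
    return input_power_w - output_power_w

load_w = 20_000  # one 20 kW platform running at full load

for eff in (0.975, 0.92):
    loss = conversion_loss_watts(load_w, eff)
    print(f"{eff:.1%} efficient: ~{loss:,.0f} W lost as heat")

# ~513 W of heat at 97.5% vs ~1,739 W at 92% -- every point of efficiency is
# heat the facility's cooling plant no longer has to remove.
```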
To understand why this matters, consider the scale of the problem. Today's leading AI systems pack 72 processors into a single rack, linked by miles of high-speed cabling. These facilities require industrial-scale liquid cooling, dedicated power delivery systems, and redundancy infrastructure that didn't exist in conventional data center design a decade ago. A standard cloud data center from the 2010s might have cost $10 million per megawatt to build. The next generation of AI-optimized facilities costs $15 million to $20 million per megawatt, and some facilities built just two years ago are already considered insufficiently equipped for the chips being manufactured today.
How Is Navitas Positioning Itself in the AI Infrastructure Race?
Navitas management explicitly identified four target segments for growth: AI data center, energy and green infrastructure, performance computing, and industrial electrification. The company's gross margin improved to 39.0% in the first quarter, up from 38.7% in the prior quarter, driven by a greater proportion of higher-value, high-power revenue. Operating expenses remained flat at $15.0 million while the company reallocated spending to emphasize research and development supporting the strategic pivot.
"Navitas is back to growth, driven by our high power market," said Chris Allexandre, President and Chief Executive Officer at Navitas Semiconductor.
The company maintains a strong financial position with $221 million in cash and no debt, giving it runway to invest in the transition. For the second quarter, management guided revenue to $10.0 million, plus or minus $0.5 million, with non-GAAP gross margin around 39.25% and flat operating expenses. However, the path to profitability remains distant; management said the company would need revenue "in the high 30s," meaning in the high $30 million range, to reach profitability.
Why Is Power Infrastructure Becoming the Real Bottleneck for AI?
Navitas's pivot reflects a broader industry reality that Goldman Sachs recently highlighted. The investment bank projects roughly $7.6 trillion in cumulative AI capital expenditure between 2026 and 2031, covering chips, data centers, and power infrastructure. Annual spending is expected to more than double over that period, from $765 billion this year to $1.6 trillion by 2031.
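As a quick sanity check on those endpoints, the implied growth rate works out to roughly 16% a year. The sketch below assumes the $765 billion figure applies to 2026 and the $1.6 trillion figure to 2031, i.e., five compounding periods; that spacing is inferred from the projection window, not stated explicitly by the source.

```python
# Implied compound annual growth rate between the two endpoints cited above.
# Assumes $765B applies to 2026 and $1.6T to 2031 (five compounding periods);
# the spacing is inferred from the projection window, not stated by the source.
start, end, years = 765e9, 1.6e12, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied growth: ~{cagr:.1%} per year")  # prints roughly 16% per year
```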
What's striking is that power infrastructure represents a growing share of that total. Building the next generation of data centers at $19 million per megawatt instead of $15 million balloons total data center costs by more than $500 billion over the projection period. These are not minor variations; they represent the difference between a manageable investment and a potential economic constraint.
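Working backward from those figures shows roughly how much new capacity the greater-than-$500-billion delta implies. The per-megawatt costs and the $500 billion figure come from the projection above; the implied megawatts are simply the arithmetic consequence.

```python
# Back out how much new data-center capacity a >$500B cost difference implies.
# The $15M and $19M per-megawatt costs and the $500B delta come from the
# Goldman Sachs projection cited above; the capacity figure is derived from them.
cost_low_per_mw = 15e6    # $/MW, earlier-generation build cost
cost_high_per_mw = 19e6   # $/MW, next-generation AI-optimized build cost
extra_cost_total = 500e9  # >$500B increase over the projection period

delta_per_mw = cost_high_per_mw - cost_low_per_mw      # $4M extra per megawatt
implied_capacity_mw = extra_cost_total / delta_per_mw  # MW needed to reach $500B
print(f"Implied build-out: ~{implied_capacity_mw:,.0f} MW "
      f"(~{implied_capacity_mw / 1000:.0f} GW) of new capacity")
```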
Key Factors in AI Data Center Power Requirements
- Voltage Conversion Efficiency: Modern AI data centers require power conversion from high-voltage transmission lines down to the low voltages that processors use. Navitas's 97.5% efficiency rating means that for every 100 watts drawn, only about 2.5 watts are lost as heat, compared to older systems that might lose 5 to 10 watts.
- Thermal Management Complexity: High-power semiconductor modules generate significant heat. Silicon carbide and gallium nitride materials can operate at higher temperatures than traditional silicon, reducing the cooling load and allowing data centers to operate more efficiently in warmer conditions.
- Redundancy and Reliability: AI data centers cannot afford power interruptions. The power delivery infrastructure must include multiple independent systems that can take over instantly if one fails, requiring sophisticated switching and monitoring chips like those Navitas develops.
- Grid Integration Challenges: As data centers draw more power, they create new demands on electrical grids. Navitas's focus on "energy and green infrastructure" reflects the industry's need for chips that can help data centers integrate with renewable energy sources and manage power demand more intelligently.
The broader context reveals an uncomfortable truth about the AI boom. Despite $30 billion to $40 billion in enterprise investment in generative AI, research cited by Goldman Sachs found that 95% of organizations were getting zero return on their AI pilots. A 2025 EY survey found that 99% of companies in its sample reported financial losses due to AI-related risks, with an average loss of $4.4 million per company.
Yet the spending continues, driven largely by what Goldman Sachs describes as "insecurity, if not outright fear." Hyperscalers like Microsoft, Amazon, Google, and Meta have dramatically increased their spending on AI infrastructure even as their stocks have lagged the broader market. These companies have burned through all their free cash flow from operations and are now issuing debt to fund the build-out. Data center debt issuance doubled to $182 billion in 2025 alone.
For companies like Navitas, this dynamic creates both opportunity and risk. The opportunity is clear: as hyperscalers race to build more data centers, they need the power management chips that Navitas manufactures. The risk is equally clear: if the returns on AI investment eventually force hyperscalers to cut their capital expenditure, the entire supply chain supporting data center construction could contract sharply. For now, though, Navitas's pivot toward high-power markets appears well-timed, even if the underlying economics of the AI boom remain deeply uncertain.