EnCharge Secures $22.6 Million Funding to Launch AI-Accelerating Chip Technology

Around a year ago, TechCrunch wrote about a little-known company developing AI-accelerating chips to face off against hardware from titans of industry — e.g. Nvidia, AMD, Microsoft, Meta, AWS and Intel. Its mission at the time sounded a little ambitious — and still does. But to its credit, the startup, EnCharge AI, is alive and kicking — and just raised $22.6 million in a new funding round.

The VentureTech Alliance, the strategic VC associated with semiconductor giant TSMC, led the round with participation from RTX Ventures, ACVC Partners, Anzu Partners and Schams Ventures. The new capital, which brings EnCharge’s total raised to $45 million, will go toward growing the company’s team of 50 employees across the U.S., Canada and Germany and bolstering the development of EnCharge’s AI chips and “full stack” AI solutions, according to co-founder and CEO Naveen Verma.

“EnCharge’s mission is to provide broader access to AI for the 99% of organizations that can’t afford to deploy today’s costly and energy-intensive AI chips,” Verma said. “Specifically, we’re enabling new AI use cases and form factors that run sustainably, from both an economical and environmental perspective, to unlock AI’s full potential.”

Verma, the director of Princeton’s Keller Center for Innovation in Engineering Education, launched EnCharge last year with Echere Iroaga and Kailash Gopalakrishnan. Gopalakrishnan was until recently an IBM fellow, having worked at the tech giant for close to 18 years. Iroaga previously led semiconductor company Macom’s connectivity business unit as VP and then GM.

EnCharge has its roots in federal grants Verma received in 2017 alongside collaborators at the University of Illinois at Urbana-Champaign. As part of DARPA’s Electronics Resurgence Initiative, which aims to advance a range of computer chip technologies, Verma led an $8.3 million effort to investigate new types of non-volatile memory devices.

In contrast to the “volatile” memory prevalent in today’s computers, non-volatile memory can retain data without a continuous power supply, making it theoretically more energy efficient.

DARPA also funded Verma’s research into in-memory computing, an approach that runs calculations directly within memory arrays rather than shuttling data back and forth to a separate processor, reducing both the latency and the energy cost of moving data.

EnCharge was launched to commercialize Verma’s research. Using in-memory computing, EnCharge’s hardware can accelerate AI applications in servers and “network edge” machines, Verma claims, while reducing power consumption relative to standard computer processors.
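For a rough sense of where those power savings come from, here’s a back-of-envelope sketch in Python. The energy figures are generic, order-of-magnitude placeholders rather than EnCharge’s own numbers, and the function names and toy layer size are purely illustrative; the point is simply that fetching every weight from off-chip memory costs far more energy than the arithmetic itself, which is the overhead in-memory designs aim to eliminate.

```python
import numpy as np

# Illustrative, order-of-magnitude energy figures in picojoules; these are
# placeholders for a back-of-envelope comparison, not EnCharge's numbers.
E_DRAM_ACCESS_PJ = 640.0  # fetch one 32-bit word from off-chip DRAM
E_MAC_PJ = 3.0            # one digital multiply-accumulate
E_IMC_MAC_PJ = 0.3        # one multiply-accumulate done inside the memory array

def conventional_energy(weights: np.ndarray) -> float:
    """Every weight is fetched from DRAM, then multiplied and accumulated digitally."""
    return weights.size * (E_DRAM_ACCESS_PJ + E_MAC_PJ)

def in_memory_energy(weights: np.ndarray) -> float:
    """Weights stay resident in the memory array; the MAC happens in place."""
    return weights.size * E_IMC_MAC_PJ

# A toy fully connected layer: 1,024 inputs by 1,024 outputs.
w = np.random.randn(1024, 1024).astype(np.float32)

conv = conventional_energy(w)
imc = in_memory_energy(w)
print(f"conventional: {conv / 1e6:.1f} uJ, in-memory: {imc / 1e6:.1f} uJ "
      f"(~{conv / imc:.0f}x difference on this toy model)")
```

On these placeholder numbers the in-place version comes out orders of magnitude cheaper, which is the basic argument in-memory chip makers, EnCharge included, make for their approach.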

“Today’s AI compute is expensive and power-intensive; currently, only the most well-capitalized organizations are innovating in AI. For most, AI isn’t yet attainable at scale in their organizations or products,” he said. “EnCharge products can provide the processing power the market is demanding while addressing the extremely high energy requirement and cost roadblocks that organizations are facing.”

Lofty language aside, it’s worth noting that EnCharge hasn’t yet begun to mass-produce its hardware and has only “several” customers lined up so far. Another challenge: EnCharge is going up against well-financed competition in the already-saturated AI accelerator hardware market. Axelera and GigaSpaces are both developing in-memory hardware to accelerate AI workloads, and NeuroBlade has raised tens of millions in VC funding for its in-memory inference chip for data centers and edge devices.

It’s also tough to take EnCharge’s performance claims at face value, given that third parties haven’t had a chance to benchmark the startup’s chips. But, for what it’s worth, EnCharge’s investors are standing behind those claims.

“EnCharge is solving critical issues around computing power, accessibility and costs that are both limiting AI today and inadequate for handling AI of tomorrow,” the VentureTech Alliance’s Kai Tsang said via email. “The company has developed computing beyond the limits of today’s systems with a technologically unique architecture that fits into today’s supply chain.”