
Electricity is a key raw material for artificial intelligence, but new processing techniques are outstripping data center operators' ability to manage their interactions with the power grid, forcing them to curtail output by as much as 30%.
“There is an immense amount of energy wasted in these AI facilities,” Nvidia CEO Jensen Huang said during a keynote at the company’s annual GTC customer conference. “Every watt that goes unused equates to lost revenue.”
Now, the startup Niv-AI has surfaced from stealth mode with $12 million in seed funding aimed at addressing this issue by accurately gauging GPU power consumption with innovative sensors and creating tools to manage it more effectively.
Founded in Tel Aviv last year by CEO Tomer Timor and CTO Edward Kizis, the startup counts Glilot Capital, Grove Ventures, Arc VC, Encoded VC, Leap Forward, and Aurora Capital Partners among its backers. The company opted not to disclose its valuation.
As AI labs run thousands of GPUs in parallel to train and serve complex models, power demand spikes on millisecond timescales as the processors alternate between computation and communication with other GPUs.
These spikes make it difficult for data centers to manage their draw from the grid. To avoid potential power shortfalls, operators either invest in short-term energy storage to absorb the surges or cap their GPU workloads. Both options diminish the return on investment in high-cost chips.
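The economics above can be illustrated with a toy simulation. This sketch is not Niv-AI's system; the per-GPU wattages and the 50 ms phase length are assumptions chosen only to show why synchronized compute/communication swings force a facility to provision for peak draw while useful work tracks the average.

```python
# Toy model: a rack of GPUs alternating between compute bursts (near-peak
# draw) and communication phases (much lower draw). Because the GPUs are
# synchronized, they spike together, so the grid connection or battery
# buffer must be sized for the peak even though average draw is far lower.

NUM_GPUS = 8
COMPUTE_WATTS = 700.0   # assumed per-GPU draw during a compute burst
COMM_WATTS = 250.0      # assumed per-GPU draw while waiting on network I/O

def rack_power(ms: int) -> float:
    """Total rack draw at millisecond `ms`, alternating phases every 50 ms."""
    in_compute = (ms // 50) % 2 == 0
    per_gpu = COMPUTE_WATTS if in_compute else COMM_WATTS
    return NUM_GPUS * per_gpu

trace = [rack_power(ms) for ms in range(1000)]  # one second of samples
peak = max(trace)
average = sum(trace) / len(trace)
headroom = peak - average  # capacity paid for but idle on average

print(f"peak={peak:.0f} W  average={average:.0f} W  stranded headroom={headroom:.0f} W")
```

In this contrived trace the rack peaks at 5,600 W but averages 3,800 W, so roughly a third of the provisioned capacity sits stranded, which is the gap Niv-AI says it wants to recover.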
“We simply cannot continue constructing data centers the way we currently do,” remarked Lior Handelsman, a partner at Grove Ventures who serves on Niv’s board.
The first phase of Niv’s strategy is gaining visibility into what is actually happening: the company is rolling out rack-level sensors that monitor power consumption at millisecond resolution on its own GPUs and those of its design partners. The goal is to understand the distinct power profiles of different deep learning workloads and develop strategies that let data centers use more of their existing capacity.
From the data it gathers, the team plans to train an AI model to forecast and coordinate power loads across a data center, functioning as a “copilot” for data center engineers.
Niv-AI anticipates having an operational system in several U.S. data centers within the next six to eight months. This prospect is appealing, as hyperscalers looking to establish new data centers encounter challenging land-use and supply chain obstacles. The founders envision their final product as an essential “intelligence layer” bridging data centers and the electrical grid.
“The grid is genuinely concerned about the data center drawing excessive power at a particular moment,” Timor shared with TechCrunch. “The issue we are addressing is one with two facets: One aspect is to assist data centers in better utilizing their GPUs and, ideally, maximizing the power they are already paying for. Conversely, there’s a chance to create much more responsible power profiles between the data centers and the grid.”

