ThinkLabs AI Raises $28M to Solve Power Grid Challenges
ThinkLabs AI just closed a $28 million Series A to tackle the electric grid's biggest problem: legacy planning tools can't keep pace with surging demand from AI data centers and EVs.

The electric grid is facing its biggest stress test in decades. U.S. electricity demand is projected to surge 25% by 2030, driven by AI data centers, electric vehicles, and building electrification. Yet utilities still use planning tools from the 1990s that take weeks or months to analyze a single grid connection scenario.
ThinkLabs AI just closed a $28 million Series A financing round led by Energy Impact Partners (EIP) to solve this bottleneck. Nvidia's venture capital arm NVentures and Edison International joined the round, signaling that infrastructure AI is moving from hype to critical necessity.
The funding marks a strategic shift in AI investment. While most headlines focus on chatbots and content generation, ThinkLabs applies AI to the physical infrastructure that powers modern life. The company builds physics-informed AI models that compress month-long grid studies into under three minutes while maintaining greater than 99.7% accuracy.
Why Does the Power Grid Need Immediate AI Solutions?
Utilities face a computational capacity crisis. When a large data center wants to connect to a substation, engineers must run power flow simulations to understand the impact. These calculations model how electricity moves through the network, identifying potential overloads or voltage issues.
Traditional software from Siemens, GE, and Schneider Electric can take weeks or months to complete these studies for a single scenario. The problem multiplies when utilities need to evaluate hundreds of possible configurations or future load scenarios.
"We are dead focused on the grid," ThinkLabs CEO Josh Wong told VentureBeat. "We do AI models to model the grid, specifically transmission and distribution power flow related modeling. We can calculate things like interconnection of large loads and understand the impact they have on the grid."
The timing couldn't be more urgent. Data centers alone are driving unprecedented electricity demand, with some facilities requiring as much power as small cities. Add electric vehicle charging infrastructure and heat pump adoption, and the grid planning challenge becomes existential.
How Does ThinkLabs Compress Month-Long Studies Into Minutes?
ThinkLabs replaces legacy bottlenecks with physics-informed AI that learns from existing engineering simulators. The platform can run 10 million scenarios in 10 minutes, according to company data.
The key difference from generative AI: no hallucinations allowed. "We're not hallucinating the heck out of things," Wong said. "We are talking about engineering calculations here. I would really compare this to a computation of fluid dynamics, or like F1 cars, or aerospace, or climate models."
ThinkLabs trains its AI on outputs from first-principles physics simulators, the same tools utilities already trust. The models then validate against those simulators to ensure accuracy and auditability. This matters because a miscalculation can cause blackouts or damage physical infrastructure.
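That train-on-the-simulator, validate-against-the-simulator loop can be sketched in a few lines. The "simulator" below is a toy polynomial standing in for a real power-flow engine, and a cubic least-squares fit stands in for the AI model; every function, name, and number here is invented for illustration, not ThinkLabs' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def physics_simulator(load_mw):
    """Toy stand-in for a first-principles solver: voltage at a bus
    sags nonlinearly as the connected load grows."""
    return 1.0 - 0.02 * load_mw - 0.001 * load_mw**2

# 1. Generate training data by querying the trusted simulator.
loads = rng.uniform(0.0, 10.0, size=500)
voltages = physics_simulator(loads)

# 2. Fit a cheap surrogate (here: a cubic polynomial via least squares).
coeffs = np.polyfit(loads, voltages, deg=3)
surrogate = np.poly1d(coeffs)

# 3. Validate the surrogate against the simulator on held-out points.
test_loads = np.linspace(0.0, 10.0, 101)
max_err = np.max(np.abs(surrogate(test_loads) - physics_simulator(test_loads)))
assert max_err < 1e-6  # surrogate must agree with the physics model
```

The validation step is the point: because the surrogate is checked against the same first-principles tool engineers already trust, its errors stay bounded and auditable rather than hallucinated.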
The platform performs full three-phase AC power flow analysis, examining every node and bus on the electric grid to determine:
- Real and reactive power levels at each connection point
- Line flows and potential congestion bottlenecks
- Voltage stability across the network
- Equipment loading and thermal constraints
- Interconnection impacts from new loads or generation
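The full three-phase AC analysis the platform performs is substantially more involved, but the linearized "DC" approximation below sketches the core computation: solve P = B'θ for bus voltage angles, then read line flows off the angle differences. The 3-bus network, susceptances, and injections are all invented for illustration.

```python
import numpy as np

# Toy 3-bus network: bus 0 is the slack; lines (0-1), (1-2), (0-2)
# with per-unit susceptances b. DC power flow: P = B' * theta.
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 5.0)]
n_bus = 3
B = np.zeros((n_bus, n_bus))
for i, j, b in lines:
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

# Net injections in per unit (generation positive, load negative);
# the slack bus supplies whatever balances the system.
P = np.array([0.0, -1.5, -0.5])

# Solve the reduced system with the slack bus angle pinned at zero.
theta = np.zeros(n_bus)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

# Line flows f_ij = b * (theta_i - theta_j), checked against limits
# to flag congestion exactly as a planning study would.
for i, j, b in lines:
    flow = b * (theta[i] - theta[j])
    print(f"line {i}-{j}: {flow:+.3f} pu")
```

A real AC study adds reactive power, voltage magnitudes, and per-phase imbalance, which is why the legacy solvers are so slow at scale and why a learned surrogate that reproduces their outputs is valuable.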
Utilities make billion-dollar capital investment decisions based on exactly these calculations. If analysis shows a proposed data center will overload a transmission line, the utility may need to build new infrastructure at enormous cost.
What Makes ThinkLabs Different From Other Grid AI Startups?
The competitive landscape for grid management AI has grown crowded. But Wong contends ThinkLabs occupies fundamentally different territory.
"As far as we know, we're the only ones actually doing AI-native grid simulation analysis," he said. "Others might be using AI for forecasting, load disaggregation, or local energy management, but fundamentally, they're not calculating a power flow."
Many competitors focus on forecasting or optimization around existing grid studies. ThinkLabs replaces the core engineering analysis itself. The platform uses reinforcement learning to generate creative solutions beyond traditional trial-and-error approaches.
"With many utilities, existing tools will basically show them all the problems, but they can only address solutions by trial and error," Wong explained. "With AI, we can use reinforcement learning to generate more creative solutions, but also very effectively weigh the pros and cons of each of these solutions."
This capability matters for capital efficiency. If AI can suggest alternative solutions like battery storage placement, load flexibility scheduling, or topology optimization, utilities can potentially avoid or defer massive infrastructure investments.
How Do Strategic Partnerships Accelerate ThinkLabs Adoption?
Nvidia's participation through NVentures signals a deeper strategic relationship. The company rarely writes venture checks, making this investment particularly noteworthy.
ThinkLabs works extensively within the Nvidia ecosystem, leveraging CUDA for GPU-accelerated computation. The company integrates Nvidia's Earth-2 climate simulation platform into its probabilistic forecasting and risk-adjusted analysis pipelines.
"We are what one utility mentioned as the only high-intensity GPU workload for the OT side, the operational technology side, that's planning and operations," Wong said.
Edison International's participation carries different strategic weight. In January 2026, ThinkLabs announced results from a collaboration with Southern California Edison that demonstrated real-world capabilities:
- AI training in minutes per circuit
- Processing a full year of hourly power-flow data in under three minutes across 100+ circuits
- Engineering reports with solution recommendations in under 90 seconds
- Work that previously required 30 to 35 days from dedicated engineers
The collaboration was built on Microsoft Azure AI Foundry, situating ThinkLabs within cloud infrastructure many large utilities already use. Microsoft hosted a webinar in mid-2025 featuring Wong alongside representatives from Southern Company and EPRI.
How Did a 20-Year Utility Career Lead to ThinkLabs?
Wong spent more than 20 years in the utility industry before founding ThinkLabs. He started at Toronto Hydro, then founded Opus One Solutions in 2012, a smart-grid software company he grew to over 100 employees across eight countries before selling to GE in 2022.
After the acquisition, Wong joined what became GE Vernova and developed the company's "grid of the future" roadmap. That thesis became the intellectual foundation for ThinkLabs.
"I was pulling together the thesis that we need to electrify, but the grid is really at the center of attention," Wong said. "The conclusion is we need to drive towards greater autonomy. We talk a lot about autonomous cars, but I would argue that autonomous grids is the much more pressing priority."
ThinkLabs was incubated inside GE Vernova and spun out as an independent company in April 2024 with a $5 million seed round. GE Vernova remains a shareholder and strategic partner. Wong is the sole founder.
Why Are Utility Sales Cycles Accelerating?
Utilities rank among the most conservative technology buyers, with procurement cycles stretching years. But Wong says the landscape is shifting rapidly.
"I have noticed sales cycles really accelerating," he said. "It's still long and depends on which utility and how big the deal is, but we have been witnessing firsthand sales cycles going from the traditional one to two years to a shortest two to three months."
Commercial traction supports this claim. ThinkLabs works with more than 10 utilities on AI-native grid simulation for planning and operations. The company doubled its customer accounts in the first quarter of 2026 alone.
The company primarily targets investor-owned utilities and system operators, the organizations that own and operate the grid. But AI is also democratizing grid simulation capabilities for smaller utilities that previously lacked engineering resources for sophisticated analyses.
Energy Impact Partners' involvement as lead investor provides strategic access. The firm is backed by more than half of North America's investor-owned utilities, giving ThinkLabs direct lines into executive suites of target customers.
What Does 99.7% Accuracy Mean for Critical Infrastructure?
Any AI application to critical infrastructure confronts the failure mode question. A chatbot hallucination is embarrassing. A grid power flow miscalculation could contribute to equipment damage or widespread outages.
Wong addressed this directly. The 99.7% accuracy figure represents an average across large-volume planning studies, specifically 8,760-hour analyses projected across three to 10 years with multiple sensitivity scenarios.
"If you look at a source of truth, the data quality is actually the biggest limiting factor, not the accuracy of these AI models," he said. "When we bring in traditional engineering analysis and actually snap it with telemetry, metering data, SCADA data, I would actually argue AI is far more accurate because it is data driven on actual measurements."
For more critical real-time applications, ThinkLabs deploys hybrid models that blend AI computation with traditional physics-based simulation. The AI handles roughly 99% of computational workload before handing off to a physics-based engine for final validation.
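One way such a hybrid split could look: a cheap surrogate screens every scenario, and only cases near an operating limit are re-run through the exact physics engine. The solver, surrogate error, voltage limit, and margin below are all invented for illustration and are not ThinkLabs' actual pipeline.

```python
import numpy as np

def exact_solver(load):
    """Stand-in for the slow physics-based engine."""
    return 1.0 - 0.02 * load - 0.001 * load**2

def fast_surrogate(load):
    """Stand-in for the AI model: accurate, with a small residual error."""
    return exact_solver(load) + 0.0005 * np.sin(load)

V_MIN = 0.90    # voltage limit in per unit
MARGIN = 0.002  # known surrogate error bound near the limit

loads = np.linspace(0.0, 10.0, 1001)
v_fast = fast_surrogate(loads)

# The AI clears scenarios comfortably inside limits; only borderline
# cases are handed to the physics engine for final validation.
borderline = np.abs(v_fast - V_MIN) < MARGIN
v_final = v_fast.copy()
v_final[borderline] = exact_solver(loads[borderline])

print(f"physics engine invoked for {borderline.mean():.1%} of scenarios")
```

Because the expensive solver only runs on the small borderline fraction, the bulk of the computation stays on the fast path while safety-critical calls remain exact.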
The company monitors for model drift and maintains strict training boundaries. "We're not like ChatGPT training the internet here," Wong said. "We're training on the possibility of grid conditions."
How Will ThinkLabs Deploy the $28 Million Series A?
The round was significantly oversubscribed. ThinkLabs initially set out to raise less than $28 million, but strong demand from strategic partners pushed it higher.
"This was way oversubscribed," Wong said. "We attracted the right ecosystem partners and the right capital partners to grow with, and that's how we ended up at $28 million."
Primary use of funds will advance the product to enterprise grade and expand the range of supported use cases. The company sees significant land-and-expand opportunity within individual utility accounts, moving from modeling small regions to training AI models across entire states or multi-state territories.
The team's composition reflects the company's dual identity. "Half of our team are power system PhDs, but the other half are the AI folks, people who have been looking at hyper-scalable AI infrastructure platforms and MLOps for other industries," Wong said. "We have really been blending the two."
Returning investors include GE Vernova, Powerhouse Ventures, Active Impact Investments, Blackhorn Ventures, and Amplify Capital, along with an unnamed large North American investor-owned utility.
Does the Value Proposition Survive If Data Center Growth Slows?
The bullish case for ThinkLabs rests heavily on projected electricity demand surge. But some analysts question whether projections are inflated, particularly if AI investment cycles cool and data center build-outs decelerate.
Wong argued the value proposition remains resilient. Even without dramatic load growth, utilities face fundamental modernization challenges. They use tools and processes from the 1990s and 2000s, and the workforce that knows how to operate those tools is retiring rapidly.
"Workforce renewal is a big factor," he said. "These AI tools not only modernize the tool itself, but also modernize culture and transformation and become major points of retention for the next generation."
Energy affordability drives adoption independent of load growth projections. If utilities continue planning based on worst-case deterministic scenarios, building enough infrastructure to cover every conceivable contingency, consumer rates will keep climbing.