
April 26, 2026

Bain Capital seeks $2bn sale of 40% stake in Bridge Data Centres
Bridge Data Centres pitched at $5bn valuation, targeting 3GW by 2030
Energy Vault cites AI-driven surge in storage orders, PG&E hydrogen backup
UC Riverside-cited 519ml of water per 100-word AI search stirs debate

Bain Capital is reportedly trying to cash out part of its bet on Bridge Data Centres — a sale of a 40%+ stake pitched at about $2bn, implying a ~$5bn valuation for the Singapore-based platform. In a market that has been oscillating between “AI changes everything” exuberance and financing reality, a chunky secondary process like this is a useful truth-teller: it tests how much investors will pay for contracted scale versus a growth plan.

The Big Stories

Bain’s move to sell at least 40% of Bridge Data Centres for around $2bn (with a reported $5bn valuation) is the clearest signal in today’s flow about where private-market pricing might settle for Asia-Pacific digital infrastructure. Bridge was founded in 2017 and is targeting 3GW by 2030 across hyperscale build-to-suit and colocation. Why it matters: if this clears at the numbers being floated, it strengthens the case that scaled regional platforms can still command premium valuations even as power, land, and delivery timelines get harder.

On the energy side, Energy Vault’s CEO says AI and data-centre load is translating into a “sharp increase” in inquiries and firm orders, with the kicker that high oil prices and electricity volatility could further tilt buyers toward storage and renewables. The company points to owning/operating two plants with two more under construction, and cites a 10.5-year PG&E contract for green-hydrogen backup. Why it matters: the infrastructure stack around data centres is widening—developers aren’t just shopping for megawatts, they’re increasingly shopping for firmness, hedges, and resilience products that make permitting and operations less brittle.

Public scrutiny of AI’s resource footprint is also getting more specific. At Gannon University, participants in a campus roundtable highlighted concerns about data centres’ energy and water use, citing UC Riverside research that a 100-word AI search can use roughly 519 milliliters of water. Why it matters: whether or not that number holds across models and locations, the rhetorical battleground is shifting from abstract “AI is power-hungry” to everyday analogies that travel fast—and that’s exactly the kind of framing that shows up later in local approvals, water permits, and political campaigns.

Behind the Headlines

Local process risk is back in the spotlight with multiple environmental studies underway ahead of a proposed data centre, including work to assess wastewater and other impacts. Even without project financials or a timeline, the subtext is familiar: data centre proposals are increasingly being treated like “heavy infrastructure” in community reviews, with water and waste streams scrutinised alongside power. For investors, the practical takeaway is that diligence can’t stop at interconnects and substations — municipal environmental review calendars are becoming as material as transformer lead times.

The broader cultural pushback is visible in commentary like Liz Pinkey’s call to develop technology that protects local resources while enabling connectivity, which ties data centre construction to wider anxieties about land use, policy shifts, and environmental stewardship. This kind of argument matters less for its specifics and more for its coalition-building: it bundles data centres with other contested developments (mining, warehouses, changes in federal land management) into a single story about community loss of control. When opponents succeed in that bundling, developers often find themselves fighting on multiple fronts—noise, water, traffic, “industrialisation”—rather than debating one clean technical issue at a time.
