May 15, 2026
Google’s planned 1GW “Project Cannoli” in Michigan just cleared a key local hurdle, and it’s a reminder that power infrastructure, not buildings, is now the real gating item for mega-scale AI campuses. In Van Buren Township, officials preliminarily approved the substation and switching station needed to feed a 282-acre development whose demand has been compared to that of roughly 800,000 homes. With its mix of grid gear, 450MW of storage, and 1.6GW of renewables, the plan reads less like a typical data center project and more like a utility-scale energy program with servers attached.
The Big Stories
Google 1GW data center clears a key approval in Van Buren, where the township planning commission granted preliminary approval for the substation and switching station that would serve Google’s 1-gigawatt Project Cannoli. DTE and ITC say the project includes 450MW of storage and 1.6GW of renewables, while developers pledged not to use eminent domain and said underground cables would run through public rights-of-way. This matters because the project shows how hyperscale is increasingly arriving as “grid-first” development: approvals and stakeholder promises around rights-of-way, land use, and local disruption are becoming as material as the data hall footprint.
DESRI and Meta sign PPAs for 850MW of clean capacity, covering 500MW in Oklahoma, 200MW in Texas, and 150MW in Mississippi, all solar-plus-battery capacity, taking Meta’s contracted capacity with DESRI to ~2,575MW across nine states. The signal here is that large AI and cloud buyers are still locking in multi-state portfolios at a pace that effectively shapes regional generation buildouts. For everyone else competing for power, these deals are also capacity reservations in all but name.
Ford launches Ford Energy BESS business for data centers, aiming to assemble systems in a repurposed Glendale, Kentucky facility and deploy at least 20GWh annually, with initial deliveries in late 2027 and ~$2 billion of planned investment over two years. If Ford executes, it’s a meaningful step toward industrial-scale battery supply aimed squarely at the utility and large-load customers now driving interconnection and reliability headaches. It also hints at a future where “who can deliver storage at volume” becomes a competitive differentiator for data center site selection.
ERCOT warns AI-driven data center demand may be overstated, flagging that a large share of proposed non-crypto AI data center load may be “paper megawatts.” In its filing, ERCOT said statewide peak demand could approach ~368GW by 2032 in the top-line view, but it applied a 49.8% realization factor in some models and signaled it may adjust that factor based on historical realization rates. The important tell: grid planners are now openly discounting speculative load, which could change how quickly (and for whom) transmission and generation investments get justified.
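To make the “realization factor” concrete, here is a minimal sketch of how a planner might discount speculative load. The ~368GW top line and the 49.8% factor come from the briefing above; the split between firm and speculative load is a hypothetical assumption for illustration, not a figure from ERCOT’s filing, and this is not ERCOT’s actual methodology.

```python
# Illustrative sketch, not ERCOT's actual model: discounting
# speculative ("paper") load with a realization factor.
top_line_peak_gw = 368.0     # ERCOT's 2032 top-line peak view (from the filing)
realization_factor = 0.498   # factor ERCOT applied in some models

# HYPOTHETICAL split between firm, already-committed load and
# speculative new large-load requests; this split is an assumption,
# not a figure from the filing.
firm_load_gw = 120.0
speculative_load_gw = top_line_peak_gw - firm_load_gw

# Count firm load in full, speculative load at the realization factor.
planning_peak_gw = firm_load_gw + realization_factor * speculative_load_gw

print(f"Top-line peak:   {top_line_peak_gw:.0f} GW")
print(f"Discounted peak: {planning_peak_gw:.0f} GW")
print(f"Removed:         {top_line_peak_gw - planning_peak_gw:.0f} GW of 'paper' load")
```

Under these assumed inputs, roughly a third of the top-line peak evaporates from the planning view, which is exactly why the choice of realization rate matters so much to transmission and generation decisions.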
NEMA forecasts U.S. electricity demand to surge 55% by 2050, projecting net consumption rising from 3,936 TWh in 2024 to 6,130 TWh by 2050. NEMA’s report also says data center demand could grow 300% over the next 10 years and account for 38% of net consumption through 2037, alongside sharp growth in electric transportation demand. Put next to Michigan’s 1GW proposal and ERCOT’s “paper MW” warning, the takeaway is uncomfortable but clear: the macro trend is real, but the forecasting and timing are messy, and that mess will shape permitting, pricing, and who gets served first.
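As a quick sanity check on those numbers (a back-of-envelope calculation, not taken from the NEMA report): the two consumption figures imply roughly 56% total growth, or about 1.7% a year compounded over the 26-year window, consistent with the ~55% headline once rounding is accounted for.

```python
# Back-of-envelope check on the NEMA figures cited above.
net_2024_twh = 3936.0   # 2024 U.S. net consumption (per the briefing)
net_2050_twh = 6130.0   # 2050 projection (per the briefing)

total_growth = net_2050_twh / net_2024_twh - 1
years = 2050 - 2024
annual_growth = (net_2050_twh / net_2024_twh) ** (1 / years) - 1

print(f"Total growth, 2024-2050: {total_growth:.1%}")          # ~55.7%
print(f"Implied compound annual growth: {annual_growth:.2%}")  # ~1.72%/yr
```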
Behind the Headlines
Bank of England: Strengthening operational and cyber resilience is a quiet but consequential read for anyone selling cloud, colocation, or critical connectivity into regulated sectors. The Bank will run SIMEX and CORST this year using a shared scenario simulating a global cloud service provider disruption, and it’s pushing firms toward clearer risk frameworks and “impact tolerances,” plus participation in stress tests. For data center and cloud operators, this isn’t just compliance theatre: it’s a push to quantify what “resilience” actually means in operational terms, which can translate into procurement requirements, audit pressure, and a higher bar for incident transparency.
Uptime finds data center outages decline despite AI risks, reporting that outage rates improved for the fifth consecutive year, which is good news, but warning that new forces could reverse the trend. The standout datapoint is economic: one in five respondents reported outage costs exceeding $1 million. Uptime’s list of emerging risks (AI-driven rack density, on-site generation complexity, grid instability, cable cuts) maps closely to what operators are building right now; the industry may be trading mature, well-understood failure modes for newer ones that aren’t yet fully “operationalized.”
T5 Services: Liquid Cooling Is an Operational Discipline, Not Just a Technology makes an argument the industry needs to internalize quickly: liquid cooling success is about monitoring, maintenance, and repeatable operating models, not just installing coolant hardware. As rack densities climb and liquid becomes normal, the winners won’t be the ones with the slickest demo; they’ll be the ones that can run the systems day in, day out with tight procedures and trained staff. In practice, that shifts competitive advantage toward operators and service providers who can standardize operations across portfolios, not just build one impressive “AI-ready” room.