AI and Energy Transformation

  • Abby Hopper

    Former President & CEO, Solar Energy Industries Association

    75,647 followers

    Something VERY cool just happened in California and… it could be the future of energy.

    On July 29, just as the sun was setting, California’s electric grid was reaching peak demand. However, instead of ramping up fossil fuel resources, the California Independent System Operator (CAISO) and local utilities decided to lean on a network of thousands of home batteries.

    More than 100,000 residential battery systems (made up primarily of Sunrun and Tesla customers) delivered about 535 megawatts of power to California’s grid right as demand peaked, visibly reducing net load.

    Now, this may not seem like a lot, but 535 megawatts is enough to power more than half of the city of San Francisco, and that can make all the difference when a grid is under stress (rough math after this post).

    This is what’s called a Virtual Power Plant, or VPP: a network of distributed energy resources that grid operators can call on in an emergency to provide greater resilience to our energy systems. Homeowners are compensated for the dispatch, grid operators gain another tool for reliability, and ratepayers are spared instability. It’s a win-win-win.

    Now, this was just a test to prepare for other need-based dispatches during heat waves in August and September. But it’s historic.

    As homeowners add more solar and storage resources, the impact of these dispatch events will become even more profound and even more necessary. This was the second time this summer that VPPs have been dispatched in California, and I expect to see even more as this technology improves.

    Shout out to Sunrun, Tesla, and all the companies that participated. Keep up the great work.
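
    For a rough sense of scale, here is a back-of-the-envelope sketch in Python. The 100,000-system and 535 MW figures are from the post above; the two-hour window is taken from the next post in this roundup, and the per-home average is derived, not a reported figure:

    ```python
    # Back-of-the-envelope math for the dispatch described above.
    # Figures from the posts: ~100,000 batteries, ~535 MW, ~2-hour evening window.

    systems = 100_000     # participating residential batteries
    dispatch_mw = 535     # aggregate output at the peak (MW)
    window_hours = 2      # approximate dispatch duration

    avg_kw_per_home = dispatch_mw * 1_000 / systems
    energy_mwh = dispatch_mw * window_hours

    print(f"Average output per battery: {avg_kw_per_home:.1f} kW")   # ~5.4 kW
    print(f"Energy delivered over the window: {energy_mwh:,} MWh")   # ~1,070 MWh
    ```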

  • John Reister

    Founder @ GoPowerEV ⚡️ | Turning Multifamily Properties into Virtual Power Plants

    2,620 followers

    Last week 100,000 home batteries operated like a mid-sized power plant.

    On July 29, California aggregated more than 100,000 residential batteries and discharged them for two hours during the evening peak. The result: 535 MW of coordinated output, comparable to a gas peaker plant, but distributed across rooftops instead of built on a single plot of land.

    These were some of the most promising outcomes:

    • Truly additive output: The batteries weren’t just doing what they normally do. Compared to the prior day’s profile, almost all 535 MW was additional discharge triggered by the event, which is clear evidence this was coordinated grid support, not incidental customer behavior (a toy illustration of this baseline comparison follows this post).

    • Stable performance: Telemetry showed steady power delivery for the full two-hour window with no noticeable drop-off. That’s the level of reliability grid planners typically expect from conventional plants.

    • Well-timed to system stress: The event aligned with CAISO’s net peak (CAISO is California’s grid operator; net peak is demand minus wind and solar output). Hitting that window matters because this is when power is most scarce and expensive, and when the “duck curve” ramps hardest.

    • Visible grid impact: Net load dropped measurably during the dispatch, demonstrating that thousands of small batteries can move the needle at the system level.

    • Program design matters: Nearly 90% of participants were enrolled in California’s Demand-Side Grid Support program, with others in the Emergency Load Reduction Program. Incentive structures like these are what make broad participation possible across multiple aggregators and OEMs.

    The takeaway is bigger than one test: virtual power plants are crossing the line from pilot to planning-grade resource. If properly integrated—through refined dispatch algorithms, better coordination with CAISO, and markets that actually value flexibility—they can defer costly peaker plants, absorb excess solar, and flatten the evening ramp without the stranded costs of centralized infrastructure.

    The technology is ready. The economics pencil out. The question now is whether market design will catch up.

    Read the full report from The Brattle Group here: https://lnkd.in/gwYbFiPz
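
    To make the “truly additive output” point concrete, here is a toy sketch of a prior-day baseline comparison. The numbers are made up for illustration; this is not the Brattle Group’s actual methodology or data:

    ```python
    # Illustrative sketch: measure event-window fleet discharge against the same
    # fleet's profile at the same clock hours on the prior day.
    # Hypothetical numbers; not data from the Brattle Group report.

    event_day_mw = [530, 535, 538, 533]   # fleet output during the window, 30-min samples
    prior_day_mw = [12, 15, 14, 13]       # same fleet, same hours, the day before

    additional_mw = [e - b for e, b in zip(event_day_mw, prior_day_mw)]
    avg_additional = sum(additional_mw) / len(additional_mw)

    print(f"Average additional discharge: {avg_additional:.0f} MW")
    # If this is close to the full event output, the discharge was coordinated
    # grid support rather than incidental customer behavior.
    ```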

  • Dr. Barry Scannell

    AI Law & Policy | Partner in Leading Irish Law Firm William Fry | Member of the Board of Irish Museum of Modern Art | PhD in AI & Copyright

    59,676 followers

    AI has gone nuclear.

    President Trump's latest Executive Order (EO) marks a decisive shift in how the US approaches the intersection of AI and national security. The order requires the deployment of advanced nuclear reactors to power both military installations and AI data centers, treating uninterrupted AI computing power as a matter of national defense.

    Significantly for commercial AI development, the Department of Energy must designate AI data centers at DOE facilities as critical defense facilities, with their nuclear power infrastructure classified as defense critical electric infrastructure. This is potentially the start of attempts to make AI a restricted technology.

    The scale of the challenge is immense. According to the IEA, global electricity consumption by data centers is set to more than double by 2030, to around 945 TWh annually. This figure equals Japan's entire annual electricity consumption and represents enough power to supply 85 million American homes for a year, or California for nearly four years (a quick sanity check of these equivalences follows this post).

    The urgency driving this policy becomes clear when examining the Stargate Project, the $500 billion AI infrastructure venture announced in January. The first Stargate site, in Abilene, Texas, will have 1.2 gigawatts of capacity when completed by mid-2026, enough to power roughly 750,000 homes. These numbers underscore why conventional energy sources cannot meet AI's demands at the required scale and speed.

    The EO explicitly frames AI as a national security imperative. It states that advanced computing infrastructure for AI at military and national security installations demands reliable, high-density power sources that cannot be disrupted by external threats or grid failures. Military applications of AI, from surveillance and intelligence processing to autonomous systems, depend on massive computing infrastructure that traditional power sources cannot reliably support.

    Congress has reinforced this federal approach to AI governance. The House just passed the "One Big Beautiful Bill Act", which includes a 10-year moratorium on State enforcement of any law regulating artificial intelligence models, systems, or automated decision-making processes.

    The administration's intolerance for impediments to AI progress became evident with the firing of the head of the US Copyright Office. Her dismissal came one day after the Copyright Office released a report stating that technology companies' use of copyrighted works to train AI may not always be protected under U.S. law, something which may hinder AI development in the USA.

    These developments signal that the US has entered a new strategic phase where AI is no longer merely a technological or economic concern but an instrument of geopolitical power. The US is treating the AI race as an arms race, with nuclear energy as its fuel, centralised federal control as its governance model, and zero tolerance for resistance, whether from states, regulators, or rights holders.
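
    The equivalences quoted above hold up arithmetically. A quick sanity check in Python; the ~11 MWh/yr household figure and ~250 TWh/yr California figure are rough public averages I am assuming, not numbers from the post:

    ```python
    # Sanity check of the 945 TWh equivalences quoted above (a sketch).

    data_center_twh = 945          # projected global data-center demand, 2030 (IEA, per the post)
    us_home_mwh_per_year = 11      # assumed typical US household electricity use
    california_twh_per_year = 250  # assumed rough annual California consumption

    homes_powered = data_center_twh * 1e6 / us_home_mwh_per_year
    years_of_california = data_center_twh / california_twh_per_year

    print(f"US homes powered for a year: {homes_powered / 1e6:.0f} million")  # ~86 million
    print(f"Years of California demand:  {years_of_california:.1f}")          # ~3.8
    ```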

  • Eddie Donmez

    Founder at Creative Capital | LinkedIn Top Voice - Finance | +275,000 Followers

    277,016 followers

    In Case You Missed It: One of The World’s Leading Consulting Firms, McKinsey & Company, Has Dramatically Increased Its Data Center Forecasts 🤯

    McKinsey now anticipates data center capacity will reach 83GW by 2030. That's a significant jump from the 56GW forecasted in their Sept 2023 model.

    What is driving this? “Data, compute and connectivity from digitalization, and cloud migration, as well as the scaling of new technologies — the most important of which is AI.”

    What is capacity? Capacity represents the total electrical power available to operate the facility, including servers, cooling systems, and other infrastructure.

    What impact does this have on consumption? They now project energy consumption will soar to 606 TWh in 2030, which represents 11-12% of total US power demand - up from 147 TWh (or 3-4%) in 2023. That's an eye-watering +312% increase in energy consumption from 2023 to 2030 (the arithmetic is sketched after this post). Goldman Sachs projects a 160% rise in data center energy demand by 2030, with data centers accounting for 8% of total US power demand.

    Key Questions:

    1) What is going to power this? The five major hyperscalers (Meta, Alphabet Inc., Amazon, Microsoft, Oracle) are forecasted to account for over $1T in CapEx from 2024 to 2027, according to S&P estimates. That needs a lot of power. Just recently:

    • Amazon & Ken Griffin of Citadel announced they would take a $500M stake in X-energy to develop small modular reactors (5GW by 2040).
    • Google has ordered six small modular reactors (SMRs) from Kairos Power, to be operational by 2030, supplying 500MW of power.
    • Microsoft struck a 20-year power agreement with Constellation Energy to restart the Unit 1 reactor at Three Mile Island in Pennsylvania, supplying 835MW of power.

    We'll be hearing a lot more about nuclear energy powering AI & Cloud over the coming years.

    2) Who is going to finance this huge infrastructure buildout? Private capital appears essential to plug that gap. Microsoft, BlackRock, Global Infrastructure Partners (GIP) and MGX recently launched a new AI partnership to invest in AI infrastructure - mobilizing up to $100B in total investment potential. Apollo Global Management, Inc. has forecasted the total addressable market (TAM) for digital infrastructure at $15 - $20T over the next 10 years.

    AI Buildout.
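
    A quick check of the McKinsey numbers quoted above: the +312% figure follows directly, and dividing 606 TWh by the hours in a year shows what the 83 GW capacity forecast implies. The utilization figure is derived here, not from the post:

    ```python
    # Sanity-checking the McKinsey forecasts quoted above (a sketch).

    capacity_gw_2030 = 83
    consumption_twh_2030 = 606
    consumption_twh_2023 = 147
    hours_per_year = 8760

    growth = (consumption_twh_2030 - consumption_twh_2023) / consumption_twh_2023
    avg_draw_gw = consumption_twh_2030 * 1000 / hours_per_year
    implied_utilization = avg_draw_gw / capacity_gw_2030

    print(f"2023 -> 2030 consumption growth: {growth:.0%}")               # ~312%
    print(f"Average draw implied by 606 TWh: {avg_draw_gw:.0f} GW")       # ~69 GW
    print(f"Implied utilization of 83 GW:    {implied_utilization:.0%}")  # ~83%
    ```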

  • Navveen Balani

    LinkedIn Top Voice | Google Cloud Fellow | Chair - Standards Working Group @ Green Software Foundation | Driving Sustainable AI Innovation & Specification | Award-winning Author | Let’s Build a Responsible Future

    12,260 followers

    The next evolution of sustainable AI isn’t just about using more efficient hardware—it’s about Autonomous AI Agents that code with sustainability in mind. These agents are designed to operate independently, learning and adapting as they go, and have the potential to transform software development by writing energy-efficient code. They don't just optimize for speed; they prioritize minimal resource consumption.

    Why This Matters for Sustainability

    Modern AI models consume massive amounts of power, yet software development still prioritizes performance over energy efficiency. Agentic AI could change that paradigm by:

    ✅ Reducing Computational Waste: AI agents could select or generate the most efficient algorithms based on real-time constraints instead of defaulting to resource-heavy models. For example, they could optimize database queries to reduce data retrieval and processing, or dynamically adjust resource allocation based on demand.

    ✅ Automating Green Software Principles: AI-driven frugal coding practices could optimize data structures, reduce redundant calculations, and minimize memory overhead. This could involve choosing the most energy-efficient programming language or framework for a specific task.

    ✅ Measuring & Optimizing in Real Time: The reward function would be clear: lower energy consumption, less latency, and reduced emissions—all while maintaining accuracy (a toy sketch of such a reward follows this post).

    ✅ Parallel & Distributed Optimization: AI agents could continuously refine codebases across thousands of cloud instances, improving sustainability at scale.

    AI-Driven Innovation Archive for Green Coding

    One of the most exciting ideas in autonomous coding is the "Green Code Archive"—an AI-generated repository of energy-efficient code snippets that could continuously improve over time. Imagine:

    🔹 Reusing optimized code instead of reinventing energy-intensive solutions.
    🔹 Carbon-aware coding suggestions for green data centers & renewable energy scheduling.
    🔹 AI-driven legacy refactoring, automating migration to sustainable architectures.

    Measuring AI’s carbon footprint after the fact isn’t enough—the goal should be AI that reduces energy use at the source. The future of sustainable tech isn’t just about efficient hardware—it’s about intelligent, autonomous software that optimizes itself for minimal environmental impact.

    While this technology is still emerging, challenges remain in areas like training complexity and robust validation. However, the potential benefits for a greener future are undeniable.

    Learn more about leading with Agentic AI and its transformative potential in my book, "Empowering Leaders with Cognitive Frameworks for Agentic AI: From Strategy to Purposeful Implementation" (link in the comments section).

    #agenticai #greenai #sustainability
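
    To make the “reward function” idea concrete, here is a minimal sketch of what an energy-aware coding agent might score candidate implementations against. The function name, weights, and numbers are all hypothetical, for illustration only:

    ```python
    # A toy reward for an energy-aware coding agent, per the post: lower energy,
    # lower latency, maintained accuracy. Weights and values are hypothetical.

    def green_reward(energy_joules: float, latency_ms: float, accuracy: float,
                     baseline_accuracy: float, w_energy: float = 0.5,
                     w_latency: float = 0.3, w_accuracy: float = 0.2) -> float:
        """Score a candidate implementation; higher is better."""
        # Penalize resource use; reward accuracy retention relative to a baseline.
        accuracy_ratio = accuracy / baseline_accuracy
        return (-w_energy * energy_joules
                - w_latency * latency_ms
                + w_accuracy * 100 * accuracy_ratio)

    # Compare two hypothetical implementations of the same task:
    fast_but_hungry = green_reward(energy_joules=120, latency_ms=8, accuracy=0.91,
                                   baseline_accuracy=0.91)
    frugal = green_reward(energy_joules=35, latency_ms=14, accuracy=0.90,
                          baseline_accuracy=0.91)
    print(frugal > fast_but_hungry)  # True: the frugal variant scores higher here
    ```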

  • Craig Scroggie

    CEO & MD, NEXTDC | AI infrastructure, energy systems, sovereignty

    44,856 followers

    AI has hit a hard limit. It is power, and the world is running out of time. Chips without energy produce nothing. Capital deployed ahead of energy is stranded.

    As Satya Nadella said recently: “The biggest issue we’re now having is not a compute glut but power… If you can’t build close to power, you’ll have a bunch of chips sitting in inventory that I can’t plug in.”

    Across many regions, AI expansion is colliding with fragmented permitting, local opposition to power and water use, and prolonged approval processes. These may be rational individually, but collectively they slow execution.

    China has taken a different path. While others debate, it is executing. China added roughly 429 GW of total new generation capacity last year across all sources, compared with about 51 GW in the United States. That gap explains why securing energy upstream is now a first-order priority in the AI infrastructure race.

    Hyperscaler behaviour has shifted to address the bottleneck.

    Google has moved energy development onto its balance sheet to reduce exposure to permitting delays and interconnection queues. Energy shifts from operating expense to strategic capital investment to secure timing and scale.

    Amazon has moved upstream into pre-FID development risk. By funding solar, storage, and nuclear projects before they are financeable, it aligns power delivery with data-centre deployment. The objective is speed and sequencing, not asset ownership.

    Meta has prioritised firm capacity. Its investment in advanced nuclear reflects a simple reality: dense AI workloads require firm electrons. Reliability and load factor now outweigh lowest-cost energy.

    Different structures. Same objective: reduce power-timing risk.

    Historically, energy risk was outsourced. Developers carried permitting risk. Utilities carried delivery risk. Infrastructure funds carried capital risk. Hyperscalers signed PPAs and focused on compute. That model worked when power was abundant and demand flexible. It fails when workloads are dense, fixed-location, and schedule-critical.

    High-density AI compresses timelines and penalises delay. A data centre without power is a stranded asset. Waiting for grid upgrades or third-party development now introduces unacceptable execution risk.

    Risk is therefore being internalised, within the limits of grid physics and regulation. Transmission, system strength, and permitting remain binding constraints. What has changed is when capital is committed: earlier, to remove uncertainty before those constraints are hit.

    The trade-off is asymmetric. The downside risk is financial and capped. The risk of not acting is strategic: delayed capacity, lost relevance, permanent displacement.

    Regions that can deliver integrated energy and compute will attract future AI capacity. Regions that cannot will not. Energy is no longer a background input. It is the dominant coordination constraint. Leading platforms are now designing energy and compute as a single system.

    #ai

  • Bhanu Harish Gurram

    Co-founder Ditto Insurance & Finshots | We are hiring!!

    168,725 followers

    While everyone is fighting over NVIDIA, AMD, and Intel, a bigger shortage is quietly forming: power.

    AI discussions are still centred on chips. Meanwhile, electricity has become the binding constraint. A single hyperscale AI data centre can draw 200–500 MW of continuous power. That’s roughly what a mid-sized city consumes. And unlike factories or offices, this demand runs 24/7, without fluctuation.

    Global AI data centre power demand is estimated to rise from ~8 TWh in 2024 to over 650 TWh by 2030. That’s an 80× increase in six years (the implied growth rate is sketched after this post).

    That gap is now being bridged with workarounds. Some AI operators are generating electricity on-site using large gas turbines and repurposed jet engines. Retired aircraft engines can be installed near data centres and brought online in days. It’s fast, flexible, and avoids waiting years for grid upgrades.

    There are clear pros to this: immediate deployment, independence from congested grids, and guaranteed uptime for critical workloads. There are also clear cons: high carbon emissions and local pollution risks. This is why jet engines aren’t a long-term fix.

    The more interesting signal is where this leads. Large tech players are already exploring small modular nuclear reactors (SMRs) for future data centres. The logic is straightforward: nuclear offers continuous, low-carbon, always-on power without the intermittency of wind or solar.

    If AI demand keeps scaling at this pace, the next strategic race won’t just be for chips. It will be for reliable power. And nuclear, whether people are comfortable with it or not, is increasingly part of that conversation.

    Finshots
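
    The 80× figure implies a steep compound growth rate. A quick sketch using only the numbers quoted in the post:

    ```python
    # What an ~80x rise over six years means as an annual growth rate.

    start_twh, end_twh, years = 8, 650, 6   # 2024 -> 2030, per the post

    multiple = end_twh / start_twh
    cagr = multiple ** (1 / years) - 1      # compound annual growth rate

    print(f"Overall multiple: {multiple:.0f}x")           # ~81x
    print(f"Implied annual growth: {cagr:.0%} per year")  # ~108% per year
    ```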

  • Saanya Ojha

    Partner at Bain Capital Ventures

    79,796 followers

    AI’s exponential energy appetite is quietly rebooting America’s nuclear industry.

    In 2024, Big Tech had a critical realization: artificial intelligence isn’t just a software revolution - it’s a thermodynamic one.

    Training a single GPT‑4‑class model consumes ~500 MWh - enough to power ~45 U.S. homes for a year (the rough math follows this post). But inference is the real sinkhole. It’s always-on, everywhere, all at once. AI server racks consume >100 kW per rack, 10x more than traditional racks.

    Renewables can’t keep up. The sun sets. The wind stalls. Batteries are expensive, and at this scale, insufficient. Clean power isn’t the same as reliable power. And for 24/7 inference, only one option checks every box: nuclear - clean, constant, controllable baseload power.

    So what do trillion-dollar firms do when they realize their business model runs on electrons? They start buying the grid.

    ▪️ Microsoft partnered with Constellation Energy to restart Three Mile Island Unit 1 by 2028, supplying 835 MW of baseload power to its AI data centers - the first large-scale restart of a decommissioned U.S. reactor. Oh, and it’s betting on fusion too: Microsoft’s backing Helion, targeting the first commercial fusion prototype by 2028. When you have Microsoft money, you can place moonshots on the sun.

    ▪️ Google is doing what Google does: building a portfolio. It inked a deal in October 2024 with Kairos Power for molten-salt SMRs (6–7 reactors by 2035, first demo 2030). Two weeks ago, it added Elementl Power - 1.8 GW of advanced nuclear capacity.

    ▪️ Amazon Web Services (AWS) locked down up to 1.9 GW from Talen Energy's Susquehanna plant and, last year, dropped $650 million to buy a nuclear-powered data center campus outright.

    ▪️ Meta finally joined the party last week, signing a 20‑year power purchase agreement with Constellation to draw 30 MW from the Clinton nuclear plant in Illinois. The capacity is modest, but it signals a strategic shift - away from carbon offsets and toward operational baseload coverage. Even Meta sees the writing on the grid.

    This isn’t a hypothetical future - it’s happening now. Three major nuclear PPAs signed within two weeks. Soaring federal support. Billions in private bets.

    What began as a GPU arms race is now an energy land grab. The next big AI breakthrough might not be a model, it might be a reactor.
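
    The rough math behind the training-run equivalence above; the ~11 MWh/yr household figure is the approximate US average and is an assumption on my part, not a number from the post:

    ```python
    # Homes-powered equivalence for one training run (a sketch).

    training_mwh = 500        # estimated energy for one GPT-4-class run (per the post)
    home_mwh_per_year = 11    # assumed annual US household electricity use

    homes_for_a_year = training_mwh / home_mwh_per_year
    print(f"Homes powered for a year: ~{homes_for_a_year:.0f}")  # ~45
    ```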

  • Andreas Horn

    Head of AIOps @ IBM || Speaker | Lecturer | Advisor

    241,769 followers

    The future of AI is... RADIOACTIVE? ☢️

    Over the last few weeks, Microsoft, AWS, Google and Oracle have all announced heavy investments in nuclear energy. These investments are driven by the need for reliable, carbon-free energy to power AI and data center operations (in fact, the combined energy consumption of all data centers worldwide would rank 17th among all countries). Power demands for AI are expected to continue to rise, and renewables do not seem able to keep up.

    Here is the full picture:

    Amazon:
    • Anchoring a $500 million investment in X-energy for small modular reactor (SMR) development.
    • Partnering with Energy Northwest to develop four SMRs in Washington state.
    • Collaborating with Dominion Energy to explore SMR development near the North Anna nuclear power station in Virginia.
    • Acquired a $650 million nuclear-powered data center in Pennsylvania.

    Google:
    • Signed a power purchase agreement last week with Kairos Power for multiple SMRs expected to be operational by 2030.
    • Aiming to bring 500 megawatts of nuclear power to the grid through this partnership.

    Microsoft:
    • Financing the revival of the decommissioned Three Mile Island nuclear power plant in Pennsylvania.
    • Committed to purchasing energy from Helion Energy, a startup aiming to establish the world's first nuclear fusion plant by 2028.

    Oracle:
    • Secured building permits for three small modular reactors (SMRs) to power a new data center (2030), which is expected to require over 1 gigawatt of electrical power.
    • The initiative reflects Oracle's strategy to use advanced nuclear technology as a reliable, carbon-neutral energy source, aiming to enhance the efficiency and sustainability of its extensive data center operations globally.

    There is one common trend among all four companies: a strong focus on Small Modular Reactor (SMR) technology as a central part of their nuclear energy strategy. SMRs are small compared to conventional nuclear power stations and are designed to be factory-built on a production line and shipped in a few truckloads per reactor. This could help companies meet climate commitments and growing energy needs at the same time. It also represents a significant shift from previous investments in wind and solar, driven by the growing energy needs of AI.

    Is a nuclear renaissance underway?

    #climatechange #AI #datacenter

  • Lindsay Rosenthal

    Founder | Creator | Strategist | Building AI, Leaders, & Ideas That Move Markets

    44,424 followers

    persistent memory in AI chats sounds amazing… but its toll on energy infrastructure will be significant.

    here’s how this plays out:

    we keep talking about models, features, agents. but a major shift is happening underneath all of that. persistent memory changes everything.

    persistent memory requires massive storage. memory means long-running agents, historical context, personalized systems, and amazing enterprise continuity. but all that data does not disappear after a prompt. it gets stored. indexed. retrieved. recomputed (a toy sketch of this lifecycle follows this post). AI starts behaving less like an app and more like infrastructure. and infrastructure is physical.

    additionally, AI workloads are not like traditional cloud workloads. they are always on, ultra compute-heavy, storage-intensive. every improvement in AI capability increases demand downstream. and better models do not reduce infrastructure needs. they accelerate them.

    how nuclear partnerships come into play: this is where it gets interesting. Google. Microsoft. Amazon. all publicly exploring or investing in nuclear energy partnerships. nuclear is stable, scalable, and long-term (carbon-conscious too!), and renewables alone cannot meet sustained AI demand at scale.

    at a certain point, AI progress stops being limited by algorithms. the new limitations will be:
    • power generation
    • grid reliability
    • cooling systems
    • land availability
    • geopolitics

    compute is now a national asset. energy independence becomes AI independence. countries that can reliably produce power at scale will:
    • train bigger models
    • run more agents
    • support enterprise adoption
    • control AI supply chains

    this means it’s no longer just a tech race. it’s an infrastructure race. the biggest bottleneck to AI will soon be (or already is) power. the companies and countries that win the next decade of AI will have the strongest grids!!
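
    To make the stored/indexed/retrieved/recomputed point concrete, here is a toy sketch of a persistent chat memory. It is hypothetical and in-memory; real systems persist this to vector stores and databases, which is exactly why the storage footprint only grows:

    ```python
    # Toy illustration of the persistent-memory lifecycle described above.
    # Hypothetical stand-in; production systems use persistent vector stores.

    from collections import defaultdict

    class ChatMemory:
        def __init__(self):
            self.entries = []                 # stored: every turn persists
            self.index = defaultdict(list)    # indexed: keyword -> entry ids

        def store(self, text: str) -> None:
            entry_id = len(self.entries)
            self.entries.append(text)
            for word in set(text.lower().split()):
                self.index[word].append(entry_id)

        def retrieve(self, query: str) -> list[str]:
            # retrieved (and recomputed into the next prompt's context)
            ids = {i for w in query.lower().split() for i in self.index.get(w, [])}
            return [self.entries[i] for i in sorted(ids)]

    memory = ChatMemory()
    memory.store("user prefers metric units")
    memory.store("project deadline is friday")
    print(memory.retrieve("what units does the user prefer"))
    # Entries never age out by default: storage, indexing, and retrieval
    # costs accumulate with every conversation.
    ```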
