When ChatGPT Learned to Count Kilowatts: A Masterclass in Converting Intelligence Into Electricity Bills

Today's Special

Ladies and gentlemen, welcome to the newest restaurant in Silicon Valley's financial district, where today's special is "AI-Powered Future" served with a side of "Holy Shit, Where's All The Electricity Going?" Our chefs have been working overtime to perfect this dish, and by working overtime, I mean they've accidentally created a monster that eats power grids for breakfast and asks for seconds.¹

The menu promised us artificial intelligence that would solve climate change, optimize everything, and probably fold our laundry. What we got instead was a digital Pac-Man that's currently munching through 20% of global data center electricity consumption—the equivalent of the Netherlands' annual usage. By late 2025, AI is projected to consume 23 gigawatts of electricity, nearly double what Bitcoin mining uses today. Yes, the same Bitcoin mining that environmentalists have been screaming about for years just got a bigger, hungrier sibling.

Ingredients

Let's examine what went into this particular recipe for electrical disaster. First, we start with a base of computational hubris—the belief that if we just make the models bigger, they'll somehow become smarter. Each AI server rack now consumes 30-100 kilowatts, compared to a traditional server rack's modest 7 kilowatts. It's like replacing your Honda Civic with a Formula 1 race car for your daily commute to Starbucks.

Add a generous helping of venture capital enthusiasm, where "growth at any cost" meets "climate commitments are just marketing copy." Sprinkle in some CEO promises about carbon neutrality while simultaneously signing power purchase agreements for small modular reactors, because apparently irony is the secret sauce in this recipe.

The kicker? Training a model like GPT-4 emits roughly 500 tons of CO₂—equivalent to 300 round-trip flights from NYC to San Francisco.
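If you want to check the menu math yourself, it's plain arithmetic. A quick back-of-envelope sketch, using only the figures quoted above (23 gigawatts of AI draw, 30-100 kW AI racks versus 7 kW traditional racks, and a 1,500 TWh data-center projection, all assumed to run continuously):

```python
# Back-of-envelope check of the article's figures.
# Assumption: the quoted power draws run continuously, year-round.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

# 23 GW of AI load, sustained for a year, converted to terawatt-hours.
ai_gw = 23
ai_twh_per_year = ai_gw * HOURS_PER_YEAR / 1000  # GW·h -> TWh, ~201 TWh

# Rack comparison: 30-100 kW AI racks vs. a 7 kW traditional rack.
ratio_low = 30 / 7    # ~4.3x a traditional rack
ratio_high = 100 / 7  # ~14.3x a traditional rack

# Share of the projected 1,500 TWh global data-center total by 2030,
# if AI's draw somehow stayed flat at today's 23 GW (it won't).
share_of_2030_projection = ai_twh_per_year / 1500  # ~13%
```

In other words, today's projected AI load alone would already eat roughly an eighth of the entire 2030 data-center budget, which is why the grid is making those concerning noises.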
But hey, at least it can write poetry about how sorry it is for melting the polar ice caps.²

Preparation Method

Step 1: Convince everyone that AI will solve all problems, including the energy crisis (while conveniently not mentioning that it will first create an energy crisis).

Step 2: Build data centers that require the electrical output of small countries. AWS, Microsoft, and Meta have already signed power purchase agreements for 5.4 gigawatts of nuclear capacity, because nothing says "disruption" like going full nuclear to power your chatbots.

Step 3: Watch uranium prices rebound 20% to $71.25 per pound in May 2025 after a 40% drop earlier in the year, driven partly by AI's nuclear energy bets and aging mine supply constraints. Somewhere, a uranium mining executive is probably getting creative with their yacht names, though they'd be wise to remember that commodity cycles are even less predictable than AI hype cycles.

Step 4: Project that global data centers will triple their power consumption to 1,500 terawatt-hours by 2030, then act surprised when the grid starts making concerning noises.

Step 5: Implement "efficiency measures" like Small Language Models that, per efficiency research, cut energy use by 80% compared to their bloated predecessors (which sounds great until you realize they're 80% more efficient than something that was already absurdly inefficient).

Simon's Kitchen Notes

[Adjusting chef's hat and checking the industrial-grade electrical meter]

Here's what's really cooking: We're witnessing the greatest bait-and-switch in energy history. The same tech companies that pledged carbon neutrality are now scrambling to secure nuclear power faster than Vietnam built motorbikes in the 1990s. The uranium market is experiencing what economists call "vertical price appreciation" and what normal people call "holy crap, why is this glowing rock getting expensive again?"

The beautiful irony?
AI is supposed to optimize energy grids, but first it needs to consume enough electricity to power small nations. It's like hiring a personal trainer who eats all your food before teaching you about nutrition. *Cái này thật là "thông minh" quá!*³

Meanwhile, FERC is trying to streamline approval processes for 2,000+ gigawatts of total energy projects (not just AI-related ones), which is regulatory speak for "we have no idea how to handle this electrical appetite monster we've created, plus all the other energy infrastructure we've been neglecting."

Regional Variations

In Southeast Asia, where I've watched governments flip-flop on cryptocurrency mining due to energy concerns, the AI revolution presents a fascinating case study in cognitive dissonance. Singapore wants to be the AI hub of Asia while having some of the highest electricity costs in the world. It's like trying to host a swimming competition in the desert.

China, ever pragmatic, is building 32 nuclear reactors while pretending this has nothing to do with its AI ambitions. Meanwhile, Indonesia is still figuring out how to keep the lights on in rural areas, but sure, let's add AI data centers to the mix.

The delicious part? Every government wants the prestige of being an "AI leader" while pretending the energy math doesn't exist. It's like wanting to be a bodybuilder without acknowledging you'll need to eat more food.

Food Poisoning Warning

Here's your final taste: The IMF projects that unabated AI growth could add 1.7 gigatons of CO₂ between 2025 and 2030 under current policies, which makes every corporate sustainability report read like fan fiction. The industry's solution? More AI to optimize the AI that's consuming all the energy. It's turtles all the way down, except the turtles are electric and very, very hungry.

For investors paying attention, the uranium play isn't just about nuclear energy—it's about betting on humanity's inability to moderate its appetite for computational power.
Analysts are bullishly forecasting $100-per-pound uranium by late 2025, partly because production costs for new mines are approaching that level, creating natural upward pressure on prices.

The real kicker? Major AI firms "rarely disclose model-specific energy data," which is corporate speak for "if we told you how much electricity this thing actually uses, you'd probably unplug it." Transparency in the age of artificial intelligence apparently doesn't extend to the electrical bill.

So here's your investment thesis: Bet on whatever powers the machines, because the machines aren't going on a diet anytime soon. Whether it's uranium, renewable infrastructure, or companies that make really, really big extension cords, the AI revolution will be electric—literally.

Bon appétit, and may your portfolio be as charged as your data centers.

¹ The technical term for this phenomenon is "computational load scaling beyond grid capacity," but "digital monster eating electricity" is more accurate and significantly more entertaining.

² For context, training GPT-4 produced roughly the same emissions as 500 average American homes generate in a year. But hey, at least it can explain why that's problematic in seventeen different languages.

³ Vietnamese for "This is really 'smart'!" (heavy sarcasm implied)