- At first glance this looks like a credible set of calculations to me. Here's the conclusion:
> So, if I wanted to analogize the energy usage of my use of coding agents, it’s something like running the dishwasher an extra time each day, keeping an extra refrigerator, or skipping one drive to the grocery store in favor of biking there.
That's for someone spending about $15-$20 a day on Claude Code, estimated as the equivalent of 4,400 "typical queries" to an LLM.
- I'm not sure I like this method of accounting for it. The critics of LLMs tend to conflate the costs of training LLMs with the cost of generation. But this makes the opposite error: it pretends that training isn't happening as a consequence of consumer demand. There are enormous resources poured into it on an ongoing basis, so it feels like it needs to be amortized on top of the per-token generation costs.
At some point, we might end up in a steady state where the models are as good as they can be and the training arms race is over, but we're not there yet.
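The amortization argument above can be made concrete with a back-of-envelope sketch. Every number here is an invented assumption for illustration (training energy, lifetime tokens served, marginal generation energy), not a measured figure:

```python
# Back-of-envelope: amortizing training energy over tokens served.
# All constants are illustrative assumptions, not measured values.

TRAIN_ENERGY_KWH = 10_000_000       # assumed total energy to train one model
TOKENS_SERVED = 1e15                # assumed lifetime tokens the model serves
GEN_ENERGY_KWH_PER_TOKEN = 3e-7     # assumed marginal generation energy/token

amortized = TRAIN_ENERGY_KWH / TOKENS_SERVED        # training share, kWh/token
blended = GEN_ENERGY_KWH_PER_TOKEN + amortized      # generation + amortization

print(f"amortized training: {amortized:.2e} kWh/token")
print(f"blended total:      {blended:.2e} kWh/token")
```

Under these made-up numbers the training share is small per token, but the point stands: the blended figure is only a steady-state estimate if lifetime token volume can actually be forecast, which it can't while the training arms race continues.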
- Only tangentially related, but today I found a repo that appears to be developed using AI assistance, and the costs for running the agents are reported in the PRs. For example, 50 USD to remove some code: https://github.com/coder/mux/pull/1658
- Had a small discussion about this on a post on Bluesky; the thread over there is somewhat interesting.
https://bsky.app/profile/simonpcouch.com/post/3mcuf3eazzs2c
by matthewfcarlson
0 subcomments
- I would like some real-world comparisons. How much power does the laptop or desktop consume during these (likely multi-hour) sessions? Assuming you’re using a large HDR monitor, 50-100 W isn’t unreasonable, and at 8 hours a day you’re talking about at least 2 days before you crack 1 kWh like his sessions do. But then a personal desktop during a gaming session can easily pull 1,000 W (CPU + GPU + peripherals), so comparing it to a gaming session seems fair.
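The arithmetic in the comment above, as a quick sketch (wattages are the comment's assumed figures, taking the monitor at the 75 W midpoint):

```python
# Rough energy comparison using the comment's assumed wattages.
monitor_w = 75            # large HDR monitor, midpoint of 50-100 W
hours_per_day = 8

monitor_kwh_per_day = monitor_w * hours_per_day / 1000   # kWh per day
days_to_1_kwh = 1.0 / monitor_kwh_per_day                # days to reach 1 kWh

gaming_w = 1000           # desktop under full gaming load
gaming_kwh_per_hour = gaming_w / 1000                    # kWh per hour

print(f"monitor: {monitor_kwh_per_day:.2f} kWh/day, ~{days_to_1_kwh:.1f} days per kWh")
print(f"gaming:  {gaming_kwh_per_hour:.1f} kWh/hour of play")
```

So an 8-hour monitor session is roughly 0.6 kWh, while one hour of heavy gaming is about 1 kWh, which is the basis for the comparison.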
by thestructuralme
0 subcomments
- A lot of the scary numbers come from agents being left in “always-on” loops: long context windows, tool calls, retries, and idle GPU time between steps. The right unit isn’t “watts per agent” but something like joules per accepted change (or per useful decision), because an agent that burns 10x energy but replaces 20 minutes of human iteration can still be a net win. What I’d love to see is a breakdown by (1) model/token cost, (2) orchestration overhead (retries, evaluation, tool latency), and (3) utilization (how much time the GPU is actually doing work vs waiting). That’s where the real waste usually hides.
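A minimal sketch of the "joules per accepted change" metric proposed above, with the three suggested buckets broken out. The function name and all numbers are hypothetical, purely to show the shape of the accounting:

```python
# Sketch: "joules per accepted change" with the three buckets named above.
# All figures are hypothetical.

def joules_per_accepted_change(model_j, orchestration_j, idle_j, accepted_changes):
    """Total session energy (J) across buckets, divided by useful output."""
    total_j = model_j + orchestration_j + idle_j
    return total_j / accepted_changes

# Hypothetical session: generation dominates, but retries and idle GPU
# time (poor utilization) add real overhead on top.
jpc = joules_per_accepted_change(
    model_j=500_000,          # (1) model/token cost
    orchestration_j=150_000,  # (2) retries, evaluation, tool latency
    idle_j=200_000,           # (3) GPU waiting between steps
    accepted_changes=10,
)
print(f"{jpc:.0f} J per accepted change")
```

The useful property of this framing is that buckets (2) and (3) are pure overhead: driving them toward zero improves the metric without touching model quality.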
by throwerxyz
1 subcomment
- "How much energy does it take to eat meat? (ignoring the cost to produce the meat into your hands)"
Do people even care about this?
How much energy does it take to download a video from YouTube versus the energy input to keep it all set up and running?
- As long as it's unaccounted for by users, it's at best an externality. I think it may demand regulation to force this cost to the surface.
Electricity and cooling incur wider costs and consequences.
- That is a pretty good article, although one factor it doesn't mention, which we see has a huge impact on energy, is batch size; that would be hard to estimate with the data he has.
We've only launched to friends and family, but I'll share this here since it's relevant: we have a service that actually optimizes and measures the energy of your AI use: https://portal.neuralwatt.com if you want to check it out. We also have a tools repo we put together with some demonstrations of surfacing energy metadata into your tools: https://github.com/neuralwatt/neuralwatt-tools/
Our underlying technology is really about OS-level energy optimization and datacenter grid flexibility, so if you are on the pay-by-kWh plan you get additional value as we continue to roll out new optimizations.
DM me with your email and I'd be happy to add some additional credits for you.
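The batch-size effect mentioned in the comment above can be sketched as a fixed per-batch overhead (weight loads, kernel launches) amortized across the sequences in the batch. Both constants below are invented for illustration:

```python
# Why batch size matters for energy per request: fixed per-batch overhead
# is shared across all sequences in the batch. Constants are illustrative.

FIXED_J_PER_BATCH = 400.0    # assumed fixed energy cost per forward pass
MARGINAL_J_PER_SEQ = 20.0    # assumed incremental energy per sequence

def joules_per_sequence(batch_size):
    return FIXED_J_PER_BATCH / batch_size + MARGINAL_J_PER_SEQ

for bs in (1, 8, 64):
    print(f"batch={bs:3d}: {joules_per_sequence(bs):.2f} J/seq")
```

Under these assumptions, per-sequence energy drops steeply from batch size 1 to 8 and then flattens toward the marginal cost, which is why serving-side batching policy matters as much as model choice and is invisible in public per-query estimates.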
- A US person does not consume 1,600 liters a day
- So less energy than a human brain uses...
by mikeaskew4
0 subcomments
- I have kids and a dishwasher (which, with kids, runs quite often), but I’m not convinced I’m doing worse at energy consumption
by renewiltord
0 subcomments
- LLMs don't use much energy at all to run; they use most of it up front in training, which is happening constantly right now.
TL;DR: this is, intentionally or not, an industry puff piece that completely misunderstands the problem.
Also, even if everyone is effectively running a dishwasher cycle every day, this is still a problem we can't just ignore; that's still a massive increase in ecological impact.