Sunday, April 13, 2025

AI power usage estimate off by 120,000 times

Excellent podcast interview with a very prominent person from the computing world,
co-creator of the RISC processor design, including the modern and very popular "open hardware" RISC-V.

But the most interesting part of the conversation was about a mistake in a scientific paper
where the energy used for AI training was overestimated by 120,000 times!

That estimate was based on public data from Google that were misinterpreted, without insight into how Google actually does AI training. David Patterson knows this well, since he now works at Google
and is deeply involved in energy optimization.

And now, with this wrong info published and cited many times, people are making all kinds of assumptions and even plans! Yes, data centers use a lot of energy, but nowhere near the levels some people present them to be.

Turing Award Special: A Conversation with David Patterson - Software Engineering Daily


Good News About the Carbon Footprint of Machine Learning Training

"Unfortunately, some ... papers misinterpreted the NAS estimate as the training cost for the model it discovered, yet emissions for this particular NAS are ~1300x larger than for training the model. These papers estimated that training the Evolved Transformer model takes two million GPU hours, costs millions of dollars, and that its carbon emissions are equivalent to five times the lifetime emissions of a car. In reality, training the Evolved Transformer model on the task examined by the UMass researchers and following the 4M best practices takes 120 TPUv2 hours, costs $40, and emits only 2.4 kg (0.00004 car lifetimes), 120,000x less. This gap is nearly as large as if one overestimated the CO2e to manufacture a car by 100x and then used that number as the CO2e for driving a car."

David Patterson (computer scientist) - Wikipedia

JavaScript V8 engine internals

very "technical" and interesting

Land ahoy: leaving the Sea of Nodes · V8  
"V8’s end-tier optimizing compiler, Turbofan, is famously one of the few large-scale production compilers to use Sea of Nodes (SoN). However, since almost 3 years ago, we’ve started to get rid of Sea of Nodes and fall back to a more traditional Control-Flow Graph (CFG) Intermediate Representation (IR), which we named Turboshaft. By now, the whole JavaScript backend of Turbofan uses Turboshaft instead, and WebAssembly uses Turboshaft throughout its whole pipeline."



Google created V8 for its Chrome browser, and both were first released in 2008.[4] The lead developer of V8 was Lars Bak, and it was named after the powerful car engine.[5] For several years, Chrome was faster than other browsers at executing JavaScript.


In 1994, he joined LongView Technologies LLC, where he designed and implemented high performance virtual machines for both Smalltalk and Java. After Sun Microsystems acquired LongView in 1997, Bak became engineering manager and technical lead in the HotSpot team at Sun's Java Software Division, where he developed a high-performance Java virtual machine.

With a team of 12 engineers, Bak coordinated the development of the V8 JavaScript interpreter for Chrome.

Bak co-developed the Dart programming language, presented at the 2011 Goto conference in Aarhus, Denmark.