The funniest part, beyond the typo itself, is the complete lack of physical intuition from the analysts who circulated this. 500,000 tons is roughly the weight of 1.5 Empire State Buildings. If your rack busbars weigh more than the structural steel of the facility housing them, you have a geotechnical engineering crisis on your hands. It is wild that we have reached a point where financial modeling is so decoupled from physical reality that nobody paused to ask whether the floor would collapse.
The world of financial analysis and modeling is broad. It’s common to hand these tasks to juniors and expect them to grind through them when the output doesn’t really matter.
In this case the output wasn’t actually used for financial modeling. If it had been, it would have been caught immediately when someone put it into a table where they calculated the price or the supply constraints or anything else.
Just to make it even more real: during COVID I added a sub-panel, and the wire (more like a sausage, given the girth) between the sub-panel and the main panel was aluminum because of cost. You just need to be a tad careful at the connection points with copper -- nothing a careful, literate person can't handle.
I wouldn't be surprised if that part was never really reviewed by an expert. They have the unit mass correct, but maybe an editor asked, "OK, but what does this look like for a GW-scale project?" It doesn't take more than 3rd grade math and a pocket calculator to do it correctly, but the journalist hasn't had to fumble that ball before. An expert knows it's all too easy for anyone to make that mistake and would second-guess their own work.
> "Tat sounds like the ultimate catalyst for the commodities market and copper has been hitting records."
"Tat" should be "That", imo.
https://en.wikipedia.org/wiki/Muphry%27s_law
With regards to the copper market: it keeps surprising me that some people seem to assume copper is a hard requirement for conducting electricity.
In reality copper is just convenient. We use it because it's easy to work with, a great conductor, and (until recently) quite affordable. But for most applications there's no reason we couldn't use something else!
For example, a 1.5mm2 copper conductor weighs 0.0134kg/m, which at current prices is $0.17/meter. A 2.4mm2 aluminum conductor has the same resistance but weighs only 0.0065kg/m, which at current prices is $0.0195/meter!
Sure, aluminum is a pain to work with, but with a price premium like that there's a massive incentive to find a way to make it work.
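If anyone wants to check those numbers, here's a rough sketch in Python (textbook resistivities and densities; the ~$12.7/kg copper and ~$3/kg aluminum prices are assumptions backed out of the figures above, so adjust to taste):

```python
# Equivalent-resistance sizing: rho_al/rho_cu ~ 1.58, so aluminum needs
# ~58% more cross-section, but it's ~3.3x less dense and far cheaper per kg.
RHO = {"Cu": 1.68e-8, "Al": 2.65e-8}     # resistivity, ohm*m
DENSITY = {"Cu": 8960.0, "Al": 2700.0}   # kg/m^3
PRICE = {"Cu": 12.7, "Al": 3.0}          # assumed spot prices, $/kg

area = {"Cu": 1.5e-6}                               # 1.5 mm^2 copper conductor
area["Al"] = area["Cu"] * RHO["Al"] / RHO["Cu"]     # same ohms/m -> ~2.4 mm^2

for m in ("Cu", "Al"):
    kg_per_m = area[m] * DENSITY[m]
    print(f"{m}: {area[m]*1e6:.2f} mm^2, {kg_per_m:.4f} kg/m, ${kg_per_m*PRICE[m]:.4f}/m")
```

That reproduces the ~9x price gap per meter quoted above.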
Copper can't get too expensive simply due to power demands, because people will just switch to aluminum. The power grid itself has been using it for decades, after all - some internal datacenter busbars should be doable as well.
> In the Earth's crust, aluminium is the most abundant metallic element (8.23% by mass[68]) and the third most abundant of all elements (after oxygen and silicon).
TIL. I thought it would be relatively expensive due to the difficulty of extracting it.
(Iron is much cheaper than I thought, too.)
The Washington Monument has an aluminum cap, which at the time was as expensive as silver. Two years later the Hall-Héroult process was invented, and as a result the price plummeted.
I am not an electricity/wiring guy so maybe you can help me understand. I thought aluminum is dangerous to wire with because it is a fire hazard (I bought a home this year and this was a prominent warning in my reading). Is that because it needs to be done very carefully? I imagine most data centers would not mess with a fire risk on such a scale.
Residential aluminum is a Really Bad Idea because DIY Dave will inevitably do something wrong - which then leads to a fire hazard. Copper is a lot more forgiving.
But a large scale datacenter, solar farm, or battery storage installation? Those will be installed and maintained by trained electricians, which means they actually know what a "torque wrench" is, and how to deal with scary words like "corrosion" and "oxidation".
Like I said: it's what's used for most of the power grid. With the right training it really isn't a big deal.
https://en.wikipedia.org/wiki/Aluminum_building_wiring
Aluminum got a bad rep due to a lot of poor installations made in the 1960s and '70s. Subsequently, new alloys for wiring (the AA-8000 series), new installation and termination procedures, etc. have been developed, and AFAIU the situation today is much better. It's still trickier than copper wiring, so probably not a good option for DIY, but for an industrial installation with competent personnel, where the savings can be substantial, it could certainly be an attractive option.
Aluminum oxide has high resistance, and if you mix aluminum wiring with copper outlets etc., the impedance mismatch is what causes fires. You need to either have special copper pigtails installed or use fixtures that are rated for aluminum wiring.
For commercial installs, it shouldn't be a problem as long as it's planned for.
It's because aluminium has a higher coefficient of thermal expansion. It expands and shrinks more as it heats and cools, and as those cycles add up it tends to loosen electrical connections. Loose connections have higher resistance, heat up, and can cause fires.
That said, there is no reason we can't design better connectors that can withstand the expansion and shrinkage cycles, like spring loaded or spring cage connectors.
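To put rough numbers on that (a sketch using typical handbook expansion coefficients; exact values vary by alloy):

```python
# Linear expansion of a 1 m conductor over a 40 K temperature swing:
# delta_L = alpha * L * delta_T. Aluminum moves ~35% more than copper,
# and that extra movement, cycle after cycle, is what loosens terminations.
ALPHA = {"Cu": 17e-6, "Al": 23e-6}   # 1/K, typical handbook values
L_M, DELTA_T = 1.0, 40.0
for metal, alpha in ALPHA.items():
    print(f"{metal}: {alpha * L_M * DELTA_T * 1e3:.2f} mm")   # Cu ~0.68 mm, Al ~0.92 mm
```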
Old aluminum wiring in your walls with cloth insulation, designed for a time when electricity consumption was a small fraction of today's electrified usage, is dangerous because you're overloading an old, unprepared system.
Aluminum bus bars (solid, often exposed) would be designed for the required power levels and installation criteria.
Old aluminum wires in your walls were designed for a time when you lit your home with 100 watt incandescent lamps rather than 12 watt LEDs.
Aluminum home wiring was from the 60s and 70s. It’s not the same as cloth covered knob and tube from earlier years. It has its own problems, but I’d take a house with knob and tube over a house with aluminum wiring.
Unless I'm mistaken, the risk with aluminum is that it can expand and contract if it gets too hot. Aluminum sized properly with the correct connectors torqued to spec would be fine; aluminum wires in a residence with a DIYer working on them can be riskier, which is why inspectors will always note it.
Aluminum wires become brittle over time (tens of years), creep under clamping pressure (which means screw terminals need occasional maintenance), and suffer galvanic corrosion when coupled with copper without special care. If the wiring was properly done and maintained, it is okay-ish.
It’s a fire hazard in residential houses where people frequently do their own wiring, because it needs more expertise to wire correctly. Copper wiring is a lot more forgiving to being hooked up by amateurs.
The biggest reason is that aluminum oxidizes, and unlike copper, the oxide layer has high resistivity. In theory that shouldn’t be an issue in datacenters hiring expert technicians.
Yea, in a data center setup the cost difference can justify things like soldering connections rather than using mechanical connectors with higher resistance.
> In reality copper is just convenient. We use it because it's easy to work with, a great conductor, and (until recently) quite affordable
It's convenient, it's easy to work with, great conductivity, and cheap enough, all at the same time... Dude, I think you just explained why copper is used instead of anything else.
Not anymore. :(
That's why my calculation example used a 1.5mm2 copper wire but a 2.4mm2 aluminum one.
Aluminum has a higher resistance, which means the same diameter will get hotter than copper. Make the cable thicker and its resistance drops, which means it gets less hot.
Want more amps at the same temperature? Ohm's law still applies: just use a thicker cable.
Look at the electrical fires of the 1950s and 1960s as an example, and that was at household levels of current.
Aluminum is used, but everything accounts for the insane coefficient of linear expansion and other annoying properties.
Each feeder can be aluminum if you put special goop on any copper connections. Breakers accept it just fine, etc.
You should avoid it for smaller wiring, though. There's special 8000 series aluminum if you're trying to be serious with Al feeders.
It depends on design, environment and maintenance. If it's well protected against oxidation and it's static, it's not really a problem. Heat issues are often at the joints, which are most vulnerable, but can be coated and encapsulated to mitigate. For some uses, aluminium is superior to copper all things considered. Obviously the diameter of the wire/cable depends on which conductor you use, AC/DC, current and service temp - for some applications, that may favour one material over the other.
If the two wires are the same gauge, yes. If you size up the aluminum to the same resistance, the same current means the same power dissipated over the length of the conductor, and the same heat.
Thicker cables or higher voltage (lower current) is the answer, which is why it's used in power distribution networks, where they can control the voltage by planning what to transform to.
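A toy illustration of the voltage tradeoff (the load and conductor resistance here are invented numbers; the 1/V^2 scaling is the point):

```python
# Delivering the same power at higher voltage means less current,
# and resistive loss falls as I^2 * R, i.e. as 1/V^2 for a fixed conductor.
P_LOAD, R_WIRE = 10_000.0, 0.05   # 10 kW load, 50 mOhm of conductor (made up)
for volts in (54, 230, 800):
    amps = P_LOAD / volts
    print(f"{volts:4d} V: {amps:6.1f} A, loss = {amps**2 * R_WIRE:7.1f} W")
```

For the same acceptable loss, the conductor cross-section can shrink roughly as 1/V^2, which is where the copper savings come from.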
I would imagine most large-scale data center construction projects will include electrical engineers to design the electrical subsystem. A rack's floor footprint is a few square feet. You can put several million dollars of hardware into that rack. A data center will have at least a few racks. It's a very reasonable investment to bring someone in to do electrical design.
This sort of mistake is easy to make when you're mixing up your units; if they kept to one system of measure, it would've been trivial to catch, before or after release.
We need to standardize on using Earth circumferences as the unit of length. Or better, football fields! (the type of football of course being implied by the website's ccTLD)
We _have_ standardized on Earth circumferences for length, only we divide by 40 million to make the numbers more sane, and got the measurement slightly wrong!
How hard would it be to fix this? Could we theoretically add or subtract enough material, or make the whole thing slightly more or less dense, to compensate?
You jest, but distance around the Earth is the actual origin of the meter. Kinda.
The history is quite interesting and well worth checking out.
I can't recommend a book on the subject, but I do heartily recommend "Longitude", which is about the challenges of inventing the first maritime chronometers for the purpose of accurately measuring longitude.
It's not the most aesthetic one, but it was at the time the most able to be measured.
With a higher voltage you can reduce your copper needs by a substantial amount. It seems like if copper cost were a concern, this is what these data centers would do.
Agreed. I was kind of surprised to see 54 VDC mentioned. I am assuming this is low enough to meet some threshold for some kind of safety regulation. In other words, it doesn't shock you the way 220 VAC would. I'm not entirely convinced of that, however, as it turns out bus bars are really dangerous in general. A 54 VDC bus bar won't shock you, but if you drop even a paperclip between the bus bar and a grounded metal part, it basically disappears instantly in a small blast of plasma. The injury from that can be far worse than any shock you'd receive.
https://developer.nvidia.com/blog/nvidia-800-v-hvdc-architec...
> If the "half a million tons" figure were accurate, a single 1 GW data center would consume 1.7% of the world's annual copper supply. If we built 30 GW of capacity—a reasonable projection for the AI build-out—that sector alone would theoretically absorb almost half of all the copper mined on Earth.
Quickly doing such "back of an envelope" calculations, and calling out things that seem outlandish, could be a useful function of an AI assistant.
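For this particular claim the envelope math is one-liner territory. A sketch, assuming roughly 29 Mt/yr of refined copper supply (about where recent years have landed):

```python
WORLD_SUPPLY_T = 29e6          # assumed annual refined copper supply, tonnes
CLAIM_T = 500_000              # the press release's "half a million tons" per GW
print(f"1 GW:  {CLAIM_T / WORLD_SUPPLY_T:.1%} of annual supply")       # ~1.7%
print(f"30 GW: {30 * CLAIM_T / WORLD_SUPPLY_T:.1%} of annual supply")  # ~52%
```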
Using your brain is so vastly more energy efficient, we might only need half of that 30 GW capacity if fewer people had these leftpad-style knee-jerk reactions.
Each person uses about 100W (2000kcal/24h=96W). Running all of humanity takes about 775GW.
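The arithmetic, for anyone who wants to poke at it:

```python
KCAL_PER_DAY = 2000
watts_per_person = KCAL_PER_DAY * 4184 / 86_400   # kcal -> joules, spread over a day
population = 8.1e9                                # rough current world population
print(f"{watts_per_person:.0f} W per person")                 # ~97 W
print(f"{watts_per_person * population / 1e9:.0f} GW total")  # ~780 GW
```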
Sure, using or not using your brain is a negligible energy difference, so if you aren't using it you really should, for energy efficiency's sake. But I don't think the claim that our brains are more energy efficient is obviously true on its own. The issue is more about induced demand from having all this external "thinking" capacity at your fingertips.
Is there an AI system with functionality at or equal to a human brain that operates on less than 100W? It's currently the most efficient model we have. You compare all of humanity's energy expenditure, but to make the comparison, you need to consider the cost of replicating all that compute with AI (assuming we had an AGI at human level in all regards, or a set of AIs that when operated together could replace all human intelligence).
Obviously not equal to a human brain, but my GPU takes about 150W and can draw an image in a minute that would take me forever to replicate.
So, this is rather complex because you can turn AI energy usage to nearly zero when not in use. Humans have this problem of needing to consume a large amount of resources for 18-24 years with very little useful output during that time, and have to be kept running 24/7 otherwise you lose your investment. And even then there is a lot of risk they are going to be gibbering idiots and represent a net loss of your resource expenditure.
For this I have a modern Modest Proposal: that we use young children as feedstock for biofuel generation before they become a resource sink. Not only do you save the child from a life of being a wage slave, you can now power your AI data center. I propose we call this the Matrix Efficiency Saving System (MESS).
No one will ever agree on when AI systems have equivalent functionality to a human brain. But lots of jobs consist of things a computer can now do for less than 100W.
Also, while a body itself uses only 100W, a normal urban lifestyle uses a few thousand watts for heat, light, cooking, and transportation.
> Also, while a body itself uses only 100W, a normal urban lifestyle uses a few thousand watts for heat, light, cooking, and transportation.
Add to that the tier-n dependencies this urban lifestyle has—massive supply chains sprawling across the planet, for example involving thousands upon thousands of people and goods involved in making your morning coffee happen.
Wikipedia quoted global primary energy production at 19.6 TW, or about 2400W/person. Which is obviously not even close to equally distributed. Per-country it gets complicated quickly, but naively taking the total from [1] brings the US to 9kW per person.
And that's ignoring sources like food from agriculture, including the food we feed our food.
https://www.eia.gov/energyexplained/us-energy-facts/
To be fair, AI servers also use a lot more energy than their raw power demand if we use the same metrics. But after accounting for everything, an American and an 8xH100 server might end up in about the same ballpark.
Which is not meant as an argument for replacing Americans with AI servers, but it puts AI power demand into context.
Obviously we don't have AGI so we can't compare many tasks. But on tasks where AI does perform at comparable levels (certain subsets of writing, greenfield coding and art) it performs fairly well. They use more power but are also much faster, and that about cancels out. There are plenty of studies that try to put numbers on the exact tradeoff, usually focused more on CO2. Plenty find AI better by some absurd degree (800 times more efficient at 3d modelling, 130 to 1500 times more efficient at writing, or 300 to 3000 times more efficient at illustrating [1]). The one I'd trust the most is [2], where GPT4 was 5-19 times less CO2 efficient than humans at solving coding challenges.
1: https://www.nature.com/articles/s41598-024-54271-x?fromPaywa...
2: https://www.nature.com/articles/s41598-025-24658-5
I did some math for this particular case by asking Google’s Gemini Pro 3 (via AI Studio) to evaluate the press release. Nvidia has since edited the release to remove the “tons of copper” claim, but it evaluated the other numbers at a reported API cost of about 3.8 cents. If the stated pricing just recovers energy cost, that implies 1500kJ of energy as a maximum (less if other costs are recovered in the pricing). A human thinking for 10 minutes would use about 6kJ of direct energy.
I agree with your point about induced demand. The “win” wouldn’t be looking at a single press release with already-suspect numbers, but rather looking at essentially all press releases of note, a task not generally valuable enough to devote people towards.
That being said, we normally consider it progress when we can use mechanical or electrical energy to replace or augment human work.
A Gemini query uses about a kilojoule. The brain runs at 20 W (though the whole human costs 100 W). So, the human is less energy if you can get it done in under 50 seconds.
For example, does it factor in the 18-24 years needed to train a human and the energy used for that?
It's almost always the analysts, MBA spreadsheet pushers, and other people removed from the physical consequences who output these mistakes, because it's way easier not to notice a misplaced decimal or incorrect value when you deal in pure numbers and merely know what they "should" be. When you're the person actually figuring out how to make it happen, the difference between needing 26666666.667 and 266666666.667 <units> of <widget> is pretty meaningful. Engineers don't output these mistakes as often as analysts because they work in organizations that invest more in catching them, not because they make them all that much less.
Whether talking weight or bulk a decimal place is approximately the difference between needing a wheelbarrow, a truck, a semi truck, a freight train and a ship.
Around here, asking "does this number make sense?" when coming across a figure is second nature, reinforced since early in engineering school. The couple of engineers from the US that I know behave similarly, which makes sense because when your job is to solve practical problems and design stuff, precision matters.
> difference between needing 26666666.667 and 266666666.667 <units> of <widget> is pretty meaningful
To be fair, that’s why we’d use 2.6666666667e7 and 2.66666666667e8, which makes it easier to think about orders of magnitude. Processes, tools and methods must be adapted to reduce the risk of making a mistake.
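For instance, in Python:

```python
# The factor of ten sits right in the exponent, instead of hiding
# in a wall of sixes you have to count by eye.
for widgets in (26_666_666.667, 266_666_666.667):
    print(f"{widgets:.4e}")   # 2.6667e+07 vs 2.6667e+08
```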
Nobody on HN is a bigger AI stan than I am -- well, maybe that SimonW guy, I guess -- but the truth is that problems involving unit conversions are among the riskiest things you can ask an LLM to handle for you.
It's not hard to imagine why, as the embedding vectors for terms like pounds/kilograms and feet/yards/meters are not going to be far from each other. Extreme caution is called for.
I edited the post with a speculation, but it's just a guess, really. In the training data, different units are going to share near-identical grammatical roles and positions in sentences. Unless some care is taken to force the embedding vectors for units like "pounds" and "kilograms" to point in different directions, their tokens may end up being sampled more or less interchangeably.
Gas-law calculations were where I first encountered this bit of scariness. It was quite a while ago, and I imagine the behavior has been RLHF'ed or otherwise tweaked to be less of a problem by now. Still, worth watching out for.
> In the training data, different units are going to share near-identical grammatical roles and positions in sentences.
Yes, but I would also expect the training data to include tons of examples of students doing unit-conversion homework, resources explaining the concept, etc. (So I would expect the embedding space to naturally include dimensions that represent some kind of metric-system-ness, because of data talking about the metric system.) And I understand the LLMs can somehow do arithmetic reasonably well (though it matters for some reason how big the numbers are, so presumably the internal logic is rather different from textbook algorithms), even without tool use.
Even among engineering fields, ones that involve routine handling of diverse and messy unit systems (e.g. chemical engineering) are relatively uncommon. If you work in one of these domains, there is a practiced discipline for detecting unit conversion mistakes. You can do it in your head well enough to notice when something seems off, but it requires encyclopedic knowledge that the average person is unlikely to have.
A common form of this is a press release that suggests a prototype process can scale up to solve some planetary problem. In many cases you can quickly estimate that planetary scale would require some part of the upstream inputs to be orders of magnitude larger than exists or is feasible. The media doesn't notice this part and runs with the "save the planet" story.
This is the industrial chemistry version of the "in mice" press releases in medicine. It is an analogue to the Gell-Mann amnesia effect.
Checking the arithmetic in every paper published seems like a good use case for LLMs. Has someone built a better version than uploading a PDF to ChatGPT and asking it to check the arithmetic?
Modern reasoning models are actually pretty good at arithmetic and almost certainly would have caught this error if asked.
Source: we benchmark this sort of stuff at my company and for the past year or so frontier models with a modest reasoning budget typically succeed at arithmetic problems (except for multiplication/division problems with many decimal places, which this isn't).
ChatGPT 5.2 has recently been churning through unsolved Erdős problems.
I think right now one is partially validated by a pro and the other one I know of is "ai-solved" but not verified. As in: we're the ones who can't quite keep up.
https://arxiv.org/abs/2601.07421
You can feed it the Hodge Conjecture for all I care; the current algorithms are a joke, and without real breakthroughs you're just generating left-to-right text with billions in hardware.
Yes, yes. We’ve all seen the same screenshots. Very funny.
Those of us who don’t base our technical understandings on memes are well aware that the tooling at the disposal of all modern reasoning models gives them the capability to do such things.
And the only reason they can't count Rs is that we don't show them Rs due to a performance optimization.
Please don't bring the culture war here.