Discussion
timmg: This tweet shows it as a percentage of US GDP: https://x.com/paulg/status/2045120274551423142

Makes it a little less dramatic. But also shows what a big **'n deal the railroads were!
therein: I really dislike the term hyperscaler. Comes off very insincere. They came up with it themselves, didn't they? What's the official definition supposed to be now? Companies that are setting up as many GPU/TPU server clusters as possible for a demand that's yet to exist?
bombcar: It always makes me think of a hyperactive toddler running around in circles, which oddly fits most thought leaders who use the term.
chromacity: But doesn't that overstate it in the other direction? Talking about investments in proportion to GDP back when any estimate of GDP probably wasn't a good measure of total economic output?

We're talking about the period before modern finance, before income taxes, back when most labor was agricultural... Did the average person shoulder the cost of railroads more than the average taxpayer today is shouldering the cost of the F-35? (That's another line in Paul's post.)
chaos_emergent: I posted just that on the Twitter feed, but then I realized that the railroads started at the beginning of an industrial revolution, when labor was a far larger portion of GDP compared to industrial production. So it kind of makes sense that the first enabling technology consumed far more GDP than current investments do, even on a marginal basis.
bombcar: That's the problem with going too far using "money" or "GDP" - you can roughly compare WWII's 45% of GDP spending with today's - https://www.davemanuel.com/us-defense-spending-history-milit... - because even by WWII much was "financialized" in such a way that it appears in GDP (though things like victory gardens, barter, etc. would explicitly NOT be included without effort - maybe they do account for this?).

As you get further and further into the past you have to start trying to measure it using human labor equivalents or similar. For example, what was the cost of a Great Pyramid? How does the cost change if you consider the theory that it was somewhat of a "make work" project to keep a mainly agricultural society employed during the "down months" and prevent starvation via centrally managed granaries?
lukeschlather: This seems like a total category error. The Railroads are the only example that actually seems comparable, in being an infrastructure build out that's mostly done by a variety of private companies. Examples of things that would be worth comparing to the datacenter boom are factory construction and utilities (electrification in the first half of the 20th century, running water, gas pipes.)
stefan_: "Infrastructure build out"? Everything put into these datacenters is worthless well before 10 years have gone by.

We aren't even getting infrastructure out of it; they are just powering it with gas turbines.
jeffbee: This isn't true and you can easily prove it to yourself by renting a Sandy Bridge CPU or a TPUv2 from Google today.
therobots927: The problem is that once built, railroads provided economic value right off the bat.

I would love to hear about the economic value being generated by these LLMs. I think a couple years is enough time for us to start putting some actual numbers to the value provided.
coffeefirst: I have concluded the entire public discourse surrounding AI has no relationship to real stuff that you can go test and point at.

There's a loop where everyone is saying stuff because everyone else is saying stuff, and it turns into a sort of reality-inspired fan fiction.

It's not just that it's wrong or imprecise - that I expect - it's that the folklore takes on a life of its own.
SpicyLemonZest: Gentle reminder that the cost of producing well-formatted graphs is much, much lower than it used to be. We grew up in a world where the mere existence of this graph would prove that someone put a great deal of effort into making it, and now it does not. I have no specific reason to doubt the information, but if you want to have reliable epistemic practices, you can no longer treat random graphs you find on social media as presumptively true.
kerblang: Adjusted for inflation?
lenerdenator: That's not fair to the toddlers; their crap tends to be safely contained in a diaper as opposed to their heads.
j-bos: As sibling comments mentioned, the comparison is deceptive as well. How about comparing as a percentage of gross energy output? https://www.sciencedirect.com/science/article/abs/pii/S09218...
dghlsakjg: The railroads and the interstate highway system arguably had the biggest and broadest impact, especially in 2nd-order effects (everything west of the Mississippi would be vastly different economically without them).

I am not an AI booster, but I would not be surprised at AI having a similar enabling effect over the long term. My caveat being that I am not sure the massive data center race going on right now will be what makes it happen.
mikrl: Superscaler sounds too much like superscalar…
JumpCrisscross: > once built, railroads provided economic value right off the bat

If they were laid on a sensible route, completed on budget and on time, and savvily operated. Many railroads went bust.
losvedir: Does anyone know what's included in "datacenter capex"? In particular, does that include spending for associated power generation? Because whether or not the AI craze pans out, if we've built a whole bunch of power plants (and especially solar, wind, hydro, etc) that would be a big win.
jeffbee: The other categorical error is that the American people paid the railroads a monumental subsidy to get the job done. We gave them almost 10% of the territory.
lenerdenator: Given the size of some of these data centers, the incentive packages that local governments often give their developers, and the impact on the electric grid that can, in some cases, raise costs for other ratepayers, I'd say the comparison could be similar.

The one Google's putting in KC North is 500 acres [0], and there were $10 billion in taxable revenue bonds put up by the Port Authority to help with the cost. This for a company that could pay for that in cash right now.

[0] https://fox4kc.com/news/google-confirms-its-behind-new-data-...
jeffbee: That's the opposite of a subsidy. KC stakes nothing of value and gets a defined revenue for the next 25 years.
lenerdenator: Then why would Google mess with the bonds at all?

Again, they have the cash to buy that land and develop it without any further consideration beyond permits and planning.
tripletao: This seems to show the railroads peaking around 9% of GDP. While that's lower than some of the other unsourced numbers I've seen, it's much higher than the numbers I was able to find support for myself at https://news.ycombinator.com/item?id=44805979

The modern concept of GDP didn't exist back then, so all these numbers are calculated in retrospect with a lot of wiggle room. It feels like there's an incentive now to report the highest possible number for the railroads, since that's the only thing that makes the datacenter investment look precedented by comparison.
helterskelter: You don't even need to go that far back to run into issues. When I read Pride and Prejudice, I figured Mr. Darcy was one of the richest people in England at around £10,000/year, but if you try to calculate his wealth in today's terms it wasn't some outrageous sum (Wikipedia is telling me ~£800,000/year). The thing is that the economy was totally different back then - labor cost practically nothing, but goods like furniture were really expensive and would be handed down for generations.

With £800K today, you may not even be able to afford the annual maintenance for his mansion and grounds. I knew somebody with a biggish yard in a small town, and the garden was ~$40K/yr to maintain - definitely not a Darcy estate either.

Thinking about it, an income of £800K is something like the interest on £10m.
somenameforme: The big change is the end of any sort of backing for money. The Minneapolis Fed calculated consumer price index levels since 1800 here [1]. Of course that comes with all the asterisks we're speaking of here for data going back that far, but their numbers are probably at least quite reasonable. They found that from 1800 to 1950 the CPI never shifted more than 25 points from the starting base of 51, so it always stayed within +/- ~50% of that baseline. That's through the Civil War, both World Wars, the Spanish Flu, and much more.

Then from 1971 (when the USD became completely unbacked) to present, it increased by more than 800 points - roughly 1600% of that baseline. And it's only increasing faster now. So the state of modern economics makes it completely incomparable to the past, because there's no precedent for what we're doing. But if you go back to just a bit before 1970, the economy would of course have grown much larger than it was in the past but still have been vaguely comparable to past centuries.

And I always find it paradoxical. In basic economic terms we should all have much more, but when you look at the things that people could afford on a basic salary, that does not seem to be the case. Somebody in the 50s going to college, picking up a used car, and then having enough money squirreled away to afford the down payment on their first home - all on the back of a part-time job - was a thing. It sounds like make-believe but it's real, and certainly a big part of the reason boomers were so out of touch with economic realities. Nowadays a part-time job wouldn't even be able to cover tuition, which makes one wonder how it could be that labor cost practically nothing in the past, as you said. Which I'm not disputing - just pointing out the paradox.

[1] https://www.minneapolisfed.org/about-us/monetary-policy/infl...
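The arithmetic behind those CPI figures can be sketched quickly. This uses the approximate numbers cited in the comment (base of 51, a 25-point band, an 800-point rise), not exact values from the Fed's series:

```python
# Back-of-envelope check of the CPI comparison above, using the
# approximate figures cited in the comment (not exact series values).
cpi_1800_base = 51        # Minneapolis Fed series baseline, ~1800
max_swing_points = 25     # largest shift from baseline, 1800-1950
rise_since_1971 = 800     # points added from 1971 to the present

# 1800-1950: worst-case swing as a share of the baseline
swing_pct = max_swing_points / cpi_1800_base * 100
print(f"1800-1950 max swing: ~{swing_pct:.0f}% of baseline")

# 1971-present: rise as a share of the same baseline
rise_pct = rise_since_1971 / cpi_1800_base * 100
print(f"post-1971 rise: ~{rise_pct:.0f}% of baseline")
```

The first figure comes out near 49%, consistent with "within +/- ~50%", and the second near 1569%, consistent with the "1600%" claim.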
cactacea: Really shows where our priorities are at as a country. SMH
topspin: The F-35 case is interesting. Lockheed Martin can, given peak rates seen in 2025, produce a new F-35 approximately every 36 hours, as it fills orders for US allies arming themselves with F-35s. US pilot training facilities are brimming with foreign pilots. It's the most successful export fighter since the F-16 and F-4, and presently the only means US allies have to obtain operational stealth combat technology.

What that means for the US is this: if the US had to fight a conventional war with a near-peer military today, it actually has the ability to replace losses. The program isn't some near-dormant, low-rate production deal that would take a year or more to ramp up: it's an operating line at full-rate production that could conceivably build a US Navy squadron every ~15 days, plus a complete logistics and training system, all on the front burner.
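The rate claim above checks out arithmetically, assuming a squadron of 10 aircraft (actual US Navy squadron sizes vary, so treat that as an assumption):

```python
# Sanity check of the rates in the comment above. The 36-hours-per-jet
# figure is the cited peak rate; a squadron of 10 aircraft is assumed.
hours_per_jet = 36
squadron_size = 10

days_per_squadron = squadron_size * hours_per_jet / 24
print(days_per_squadron)  # 15.0 - matching the "~15 days" claim
```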
throwaway27448: Is there really that much inefficiency in our distribution of goods and services such that AI could have this much impact?
throwaway27448: Further evidence that the US, for whatever reason, lacks basic ability to rationally use resources.
palmotea: > Lockheed Martin can, given peak rates seen in 2025, produce a new F-35 approximately every 36 hours ... it's an operating line at full rate production that could conceivably build a US Navy squadron every ~15 days, plus a complete logistics and training system, all on the front burner.

That's amazing. I had no idea the US was still capable of things like that.

I wonder if there's a way to get close to that for things that aren't new and don't have a lot of active orders. Like have all the equipment set up but idle at some facility, keep assembly teams ready and trained, then cycle through the weapons and activate a couple of these dormant manufacturing programs (at random!) every year, almost as a drill. So there's the capability to spin up, say, F-22 production quickly when needed.

Obviously it'd cost money. But it also costs a lot of money to have fighter jets when you're not actively fighting a war. Seems like manufacturing readiness would be something an effective military would be smart to pay for.
guywithahat: If you adjust for GDP railroads were much more expensive, and I don't think they're viewed as a mistake https://x.com/finmoorhouse/status/2044985790212583699?s=20
psychoslave: ~£800,000/year compared to the median income in the current UK? Outrageous is relative, sure, but for most people out there it should be no surprise that this would feel like an outrageously odd distribution of wealth.

https://en.wikipedia.org/wiki/Income_in_the_United_Kingdom
topspin: "I had no idea the US was still capable of things like that."

It's more than just the US, though. It's the demand from foreign customers that makes it possible. It's the careful balance between cost and capability that was achieved by the US and allies when it was designed.

Without those things, the program would peter out after the US filled its own demand, and allies went looking for cheaper solutions. The F-35 isn't exactly cheap, but allies can see the capability justifies the cost. Now there are so many of them in operation that, even after the bulk of orders are filled in the years to come, attrition will keep the line operating and healthy at some level, which fulfills the goal you have in mind.

Meanwhile, the F-35-equipped militaries of the Western world are trained to similar standards, operating similar and compatible equipment, and sharing the logistics burden. In actual conflict, those features are invaluable.

There are few peacetime US-developed weapons programs with such a record. It seems the interval between them is 20-30 years.
wisemanwillhear: For some reason this reminds me of people at work who walk up and say we did x bazillion things in n time, and then pause and expect us to express shock at how amazing that is and how much more productive they are than other teams. So what? Without a proper comparison to something equivalent, I can't evaluate whether it's exceptional. I could treat each molecule as a thing and tell people how incredibly many things I eat on average per minute, but once I explain, no one would find that exceptional.
rcxdude: Hyperscale exists as a term pre-LLM hype. It mainly describes the kind of datacenters that companies like Google and Amazon have been building for at least a decade now: very large, very highly integrated and customized hardware, with a focus on cloud deployment and management strategies. This distinguishes them from just a large datacenter built with commodity server parts from a set of vendors (i.e. the kinds of servers 99% of people will be able to lay their hands on). Another way to put it is that if you're not writing your own BIOS/BMC/etc., you're probably not hyperscaling.
0xbadcafebee: FWIW, railroads were the reason for some of the biggest bank collapses in history. The Panic of 1873 was literally called "The Great Depression" (until a greater depression hit). 20 years later was the Panic of 1893. Both were due to over-investment and a bubble bursting, and they took out tons of banks and businesses.

We're seeing exactly the same thing with AI, as there is massive investment creating a bubble without a payoff. We know that the value will fall over time because software and hardware both get more efficient and cheaper. And so far there's no evidence that all this investment has generated more profit for the users of AI. It's just a matter of time until people realize that and the bubble bursts.

And when the bubble does burst, what's going to happen? Most of the investment is from private capital, not banks. We don't know where all that private capital is coming from, so we don't know what the externalities will be when it bursts. (As just one possibility: if it takes out the balance sheets of hyperscalers and tech unicorns, and they collapse, who's standing on top of them that collapses next? About half the S&P 500 - so 30% of US households' wealth - but also every business built on top of those mega-corps, and all the people they employ.) Since it's not banks failing, they probably won't be bailed out, so the fallout will be immediate and uncushioned.
keeda: > We're seeing exactly the same thing with AI, as there is massive investment creating a bubble without a payoff. ... And so far there's no evidence that all this investment has generated more profit for the users of AI.

If you look around a bit, you will find evidence for both. Recent data finds pretty high success in GenAI adoption even as "formal ROI measurement" - i.e. not based on "vibes" - becomes common: https://knowledge.wharton.upenn.edu/special-report/2025-ai-a... (tl;dr: about 75% report positive ROI.)

The trustworthiness, salience, and nuances of this report are worth discussing, but unfortunately reports like this get no airtime in the HN and media echo chamber.

This is preliminary evidence, but given that this weird, entirely unprecedented technology is about 3+ years old and people are still figuring it out (something that report calls out), it is significant.