Isn't there a wafer-scale AI chip mainframe for data centers now that blows anything needing RAM out of the water?
I don't understand why the RAM shortage exists when companies have surpassed Nvidia.
I am working on my side project [1], where I was exploring a Rockchip part that required external memory (just 1 GB), which went from $3 to $32 and completely destroyed the economics for me. I settled on one with embedded memory and optimized my code instead :)
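To make the economics concrete, here's a toy margin calculation; the $3 and $32 DRAM prices are the ones from above, while the sale price and the rest of the BOM are made-up placeholders, not my real numbers:

```python
# Toy unit-economics sketch. The $3 -> $32 DRAM jump is from the comment above;
# the sale price and the rest of the BOM are hypothetical placeholders.
sale_price = 49.0   # hypothetical retail price per unit
other_bom = 18.0    # hypothetical cost of everything except the DRAM

for dram_cost in (3.0, 32.0):
    margin = sale_price - (other_bom + dram_cost)
    print(f"DRAM ${dram_cost:>5.2f}: gross margin ${margin:6.2f} ({margin / sale_price:6.1%})")
# DRAM $ 3.00: gross margin $ 28.00 ( 57.1%)
# DRAM $32.00: gross margin $ -1.00 ( -2.0%)
```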
I suspect game development will be similar: game companies will optimize their games, given that new consumer cards are not going to be released for a while or will be too expensive.
Resource usage has been on a hedonic treadmill at least since I came online in the 90s. Good things have come from that, of course, but there's also plenty of abstraction/waste that's permitted because "new computers can handle it."
With so many gaming devices based on the AMD Z1 Extreme platform (and its custom Valve counterparts) over the past few years, it'll be great to see that be the target/baseline for a while. It brings access to more players and staves off e-waste for longer.
I'm not sure how we got on to games as resource hogs when Teams uses 2GiB of RAM and Windows itself uses 4GiB of RAM.
I work in gamedev, so perhaps I'm a bit sensitive, and I understand that general-purpose engines aren't as light on resources as the handcrafted ones that nobody can afford to make anymore... but we're not anywhere close to the layers of waste and abstraction that present themselves when using webtech for desktop apps by default.
I think Europe should invest into manufacturing RAM. RAM isn't going anywhere, all of modern compute uses it. This would be an opportunity to create domestic supply of it.
The worry is that these high prices aren't going to last long, and by the time you've spent years building the capacity, prices will have plummeted, making your facility uneconomical to run.
Ram will always be in some demand, but that doesn't mean it's viable for everyone to start building production.
Not everyone, but a supplier in Europe would be a massive benefit long after the AI-driven demand dies off. It would free Europe from dependence on other countries for a critical resource, making chips more affordable and the supply more stable, which matters because the stability of the rest of the world is already questionable and big shocks are expected in the near future.
> I think Europe should invest into manufacturing RAM. RAM isn't going anywhere, all of modern compute uses it. This would be an opportunity to create domestic supply of it.
It's easy to build factories, but much more difficult to train the engineers required to run them... and let's not even talk about all the crazy regulations & environmental rules at the EU level that make that task even more difficult, because yes, chip factories do pollute... a lot.
Countries like South Korea and Taiwan have adapted their legislation, tax policy, and environmental regulations to allow such factories to operate easily. The EU and EU countries will never do that... better to outsource the pollution and claim they care about the planet...
I am a CAD engineer and software developer who has worked a lot in manufacturing in the UK, across various industries, on products as big as superyachts and as small as peristaltic pumps. I think that if the UK and EU are to defend their weakening and shrinking manufacturing sectors (these industries have been disappearing for my entire adult life), it is possible but difficult... In 10 to 20 years it will be impossible.
The reason is as you have described. We are getting close to the point where the people with practical experience of working in, managing, and designing things like the work processes and factory layouts for industries that build physical products have all but disappeared. We're losing a lot of capable, practical engineers with hands-on experience. We can keep the universities going, teaching the physical subjects, but those lecturers wouldn't know where to even begin on designing and building efficient factories, unfortunately.
We'd probably end up having to get Chinese and Taiwanese businesses to outsource their 'experts' back to us in order to actually do this and pay them a fortune - basically the reverse of what was happening in the manufacturing sector in the 80s and 90s!
Even the most excellent education system takes several years to educate a high-schooler to the level of a junior engineer. Then several more years are needed for the best of them to become senior engineers, with the knowledge and experience that a university alone cannot provide.
So, we're looking at a decade-long project at least, even if everything goes as planned, and crazy fast, in the technical and administrative departments.
All the more reason to start now, I guess. Putting it off isn't going to get them that knowledge and experience any sooner. If something happens over the next 10 years that eliminates our need for memory chips, things will probably be either too messed up or too wonderful for anyone to cry over the years they needlessly spent trying to secure a domestic source of RAM.
> Doesn’t the EU have an excellent education system?
Excellent universities, overall. But results from primary and secondary schools are nose diving at a more than alarming rate in several EU countries. Literacy rates are falling, math grades are falling. There's IMO only so much time before universities begin to be affected as well.
> Doesn’t the EU have an excellent education system?
Well, the EU has not manufactured a whole lot of chips in the last 30 years, so where do you get the people with the professional experience to teach new engineers? Oh, you mean you have to import the teachers from South Asia too? /s And it takes, what, 5 years at minimum to train an engineer? France and the UK used to produce entire home computers... in the '80s...
Come on: STM, Nordic, Infineon, and NXP are all European. There are a bunch of chip-making installations in Dresden, Germany (GlobalFoundries, Bosch, etc.), and there's Intel's Fab 34 in Ireland. BTW, TSMC is planning to open a production facility in Europe in 2027.
This is not comparable to Taiwan or the Shenzhen area, but it's definitely not nothing. Some local expertise exists, even though it may not be the most cutting-edge.
Only a matter of time before you hear about shipping trucks going missing or being stolen. China is opening up more production, but I don't see any relief coming soon.
This is a fairly odd statement, given that BOMs are managed in manufacturing systems, and for accounting and engineering purposes, in multiple different ways. They can be for anything from sales data for a client to the guys on the factory floor to the accountants. There are sales BOMs, manufacturing BOMs, procurement BOMs, nested BOMs, etc., all for different parts of the business process... you could have BOMs within the organisation where memory was probably nearly 70% of the cost, and others where it was 0%!
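As a toy illustration (all part names and costs below are invented), the same DRAM line item can be roughly 70% of one BOM and 0% of another, depending on which BOM you roll up:

```python
# Hypothetical BOMs, purely to show that "RAM is X% of the BOM" depends on
# which BOM you are looking at.
memory_module_bom = {"DRAM die": 28.0, "PCB + passives": 7.0, "Assembly/test": 5.0}
laptop_sales_bom = {
    "Memory module": sum(memory_module_bom.values()),
    "SoC": 180.0,
    "Display": 120.0,
    "Chassis/battery/etc.": 160.0,
}
chassis_bom = {"Aluminium shell": 35.0, "Hinges": 4.0, "Fasteners": 1.0}

def share(bom, item):
    # Fraction of the BOM's total cost contributed by one line item.
    return bom.get(item, 0.0) / sum(bom.values())

print(f"DRAM share of the memory-module BOM: {share(memory_module_bom, 'DRAM die'):.0%}")       # 70%
print(f"Memory share of the laptop sales BOM: {share(laptop_sales_bom, 'Memory module'):.0%}")  # 8%
print(f"DRAM share of the chassis BOM: {share(chassis_bom, 'DRAM die'):.0%}")                   # 0%
```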
The joke is that Apple's RAM pricing is now close to market level; they still have margin in there even at market prices, and they are notorious for supply-chain management and locking in contracts/prices ahead of time. So I doubt Apple will change anything here short term.
On the flip side, if you're buying a new computer in 2026 it's going to be even harder to justify not getting a MacBook: the chips are already two years ahead of PC, the price of the base models was super competitive, and now that RAM is super expensive even the upgraded versions are competitive with the PC market. Oh, and Windows is turning into an even larger pile of shit on a daily basis.
I think we're at the peak, or close to it, for these memory shenanigans. OpenAI, which is largely responsible for the shortage, just doesn't have the capital to pay for it. It's only a matter of time before the chickens come home to roost and the bill is due. OpenAI is promising hundreds of billions in capex but has nowhere near that cash on hand, and its cash flow is abysmal considering the spend.
Unless there is a true breakthrough, beyond AGI into superintelligence, on existing or near-term hardware, I just don't see how "trust me bro" can keep the spending party going. Competition is incredibly stiff, and it's pretty likely we're at the point of diminishing returns without an absolute breakthrough.
The end result is going to be RAM prices tanking in 18-24 months. The only upside will be for consumers, who will likely gain the ability to run much larger open-source models locally.
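For a rough sense of what "much larger models locally" means in RAM terms, weight memory is roughly parameter count times bytes per parameter; the model sizes and quantization levels below are just illustrative, and KV cache and runtime overhead are ignored:

```python
# Back-of-the-envelope weight-memory estimate: parameters * bytes per parameter.
# Parameter counts and bit widths are illustrative examples only.
def weight_gb(params_billion, bits_per_param):
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

for params in (8, 32, 70):
    for bits in (16, 8, 4):
        print(f"{params:>3}B params @ {bits:>2}-bit ~= {weight_gb(params, bits):6.1f} GB of weights")
# e.g. 70B params @ 4-bit ~= 35.0 GB of weights
```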
Connectix was a big deal in its day. RAM Doubler was considered essential software.
They also marketed the first webcam, and made emulators mainstream. Their PlayStation emulator is the basis for the case law that says emulators are fair use, decided as a result of a suit from Sony.
[1] https://x.com/_asadmemon/status/1989417143398797424
Arguably the connotation has changed slightly, but AI slop caught on because it fit so well.
It's uncommon, and associated with old timey prisons and orphanages.
The word itself has existed for hundreds of years.
I'd buy a Mac in a sec otherwise.
So what you're saying is that it could be worse, but not by much?