This would be HUGE!!!!!

Would be big for budget builders.

6 Likes

They should remake the 5000 series on the AM4 platform :slight_smile:

This would be MASSIVE!

1 Like

Keep your fingers crossed. Multiple outlets are now reporting it.

If this does happen, I think it will be because the big players in the market feel the RAM issue is going to be around for some time. More than just the end of this year.

That would be CRAZY if AMD did that for AM4 CPUs.

Isn't this going to make people just use AM3 for AI builds too?

It isn't really that we need CPU support, it's that RAM is being bought up for AI. They did it to AM4 too. What is stopping them with AM3?

I feel like they need to hard-code RAM or CPUs to just reject AI applications.

We saw manufacturers attempt to make GPUs deny crypto mining when it was an issue with the 3000 series, and it would halve or even quarter the hash rates, but people found a workaround within a month.

If there's more push for it, and they're successful, I think this will be the sweet spot.

AM4 uses cheaper/slower DDR4 RAM. The AI problem is not retail usage. It's LARGE commercial data centers gobbling up DDR5 RAM, and DDR4 is too slow for them. On top of that, there is absolutely no incentive for RAM manufacturers to do what you suggested. There is way more money in catering to the AI companies who pay top dollar than in selling to individual consumers. The example being Crucial pulling out of the retail market and only selling to the commercial sector now.

Hmmm, valuable information. I was really under the impression DDR4 was being used by AI too.

I guess there's just hope we see large-scale AM4 memory produced.

What is the main difference internally between AM4 and AM5 memory? We're seeing double the speeds, but why exactly?

Half correct, half wrong. The AI data centers want fast RAM, and by that I mean they want the RAM chips themselves, not the assembled DDR sticks (or at least a special kind of memory unit). But you are correct that AI data centers want fast RAM, so the slower chips that DDR4 uses are almost unwanted (to my knowledge), even though the DDR4 chips have a faster response time. I think they will sacrifice a little latency for more speed.

What AI data centers really want is high-bandwidth memory (HBM), because modern AI workloads are bandwidth-limited. HBM delivers way more throughput than DDR5 by stacking memory close to the compute die, but supply is extremely tight. Most HBM production is locked up years in advance by the biggest players, like OpenAI and major cloud providers, sometimes reportedly at the wafer or die level, which leaves very little available on the open market. This forces the smaller players in the AI sector to move down a step and buy as much DDR5 as possible, locking us retail consumers out. :confused: That's basically what's going on, in a nutshell.
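To put some rough numbers on the bandwidth gap: theoretical peak bandwidth is just transfer rate times bus width. The figures below are a back-of-envelope sketch assuming common configurations (DDR4-3200 and DDR5-6400 on a single 64-bit channel, and an HBM3 stack with a 1024-bit interface at 6.4 Gb/s per pin), not exact real-world numbers.

```python
# Back-of-envelope peak bandwidth: (transfers/sec) * (bus width in bytes).
def peak_bandwidth_gbs(mega_transfers_per_sec: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return mega_transfers_per_sec * 1e6 * (bus_width_bits / 8) / 1e9

ddr4 = peak_bandwidth_gbs(3200, 64)    # DDR4-3200, one 64-bit channel -> 25.6 GB/s
ddr5 = peak_bandwidth_gbs(6400, 64)    # DDR5-6400, one 64-bit channel -> 51.2 GB/s
hbm3 = peak_bandwidth_gbs(6400, 1024)  # one HBM3 stack, 1024-bit bus  -> 819.2 GB/s

print(ddr4, ddr5, hbm3)
```

So even a single HBM stack is roughly an order of magnitude ahead of a DDR5 channel, which is why bandwidth-starved AI accelerators want HBM first and fall back to DDR5 only when they can't get it.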

Correct, as far as I know. But I'm pretty sure the AI companies can get the HBM; it just takes a LOT of the actual RAM chips, and that's why DDR5 sticks are really expensive. They need the higher-bandwidth chips, so the companies that make the RAM chips are being bought out by the AI companies. Basically, they say to the companies that make the DDR sticks: "If you really want the RAM, then you have to give us more money to prove that you want it." Or something along those lines. But let's just hope that the GPU company Sapphire is correct and that this ends soon.

Noted. Thank you for the information. I think it's time to go look into how computer memory works and what the differences are.

I wish @ArsenalPC was here too. I'd love their input and in-depth knowledge.

Yeah, he’s got a lot of solid technical knowledge. Most of what I know comes from trial and error and what I’ve picked up as a hobbyist. He’s coming at it from real professional, hands-on experience.

My presence has been requested? Here I am.

The reason why RAM prices have increased so substantially, in an over-arching answer, is AI. Why is that the case, when all the AI-related cards use HBM and not DDR5? Additionally, the relevant GPUs that don’t use HBM are using GDDR7 and GDDR6X, which is not DDR5. So what’s the deal?

The deal is wafers. OpenAI has contracted approximately 40% of the entire world’s RAM wafer supply. This has an immediate consequence of fewer wafers being available for other types of RAM, even if they are not HBM.

It affects DDR5 more directly, as home users who are running AI will require a large amount of fast RAM. DDR4 production had already been slowed down considerably as demand was fairly low, so the existence of fewer overall wafers affects DDR4 pricing less.

What did AMD actually mean when they said they were doing everything they could to resume production of AM4 CPUs?
It’s important to note the seemingly intentional omission of mentioning any particular CPU series. They have a few options on the table here:

  • Start producing more existing models of Zen3, Ryzen 5000 series CPUs.
  • Start producing some new models of Zen3 CPUs they haven’t made before, like perhaps a new bin. Think something like a Ryzen 7 5850X which would be a higher-binned 5800X.
  • Create a new CPU based on a backport of Zen3+ (which is currently mobile-only) or a backport of Zen4, using the currently-unused name of Ryzen 6000 series on desktop. (Right now, Ryzen 6000 only represents Zen3+ on mobile.)

We’ll have to wait and see. What you should NOT do right now is purchase a $350-500 Ryzen 7 5700X3D or 5800X3D on the used market, knowing AMD has pledged continued AM4 support. It’s just not worthwhile. If you’re currently on AM4, and considering upgrade paths from an older or mid-range chip like an R5 3600 or R5 5500, just wait. AMD might pleasantly surprise you.


2 Likes

That is so damn cool… Thank you so much for that information.

When I finish reading these forum articles on POD Termination, bank grouping, what it means that memory controllers are on CPUs, and where split dimming comes into play (and whether it can be applied to the older tech), I will definitely ping you again for a better understanding.

For now, if anyone has time or is interested, one of my favorite channels to watch (even if I don't understand electronics) is Branch Education. They have a video on how memory works. Not trying to promote them or advertise; they're just a nice tech channel that does really in-depth modeling and explanations of PC parts.

I agree with @TheZacAttack, I think it's interesting how the RAM shortage is really playing out. But one question: I currently have a Ryzen 7 5700. Should I wait on a GPU upgrade to see what AMD might pull out? What I have is sufficing greatly for me, so I'm currently in no need to upgrade; I just want to think about the future and what I should do. So do you have any suggestions?

What is your GPU?
My first instinct is to say: if something is working and you're fine with it for the next 2-3 years, there's no sense in upgrading it. Games do get more demanding with time, but if you aren't dropping to medium settings or scaling down past 70%, you're pretty set to keep using it for the generation.

To be clear Zac, we’re not PC hardware engineers who design memory communication interfaces. We’re a computer store that sells on Newegg and Amazon.

So we have our "finger on the pulse" for what's going on in the market, which parts are compatible with which other parts, thermals, boost behavior. Things that matter to the average PC builder. We are NOT familiar with whatever the heck you mentioned (POD Termination? Split dimming?). Never even heard of these terms, and we've been in the custom PC business for over 20 years.

Not saying you won’t find people interested in this topic, or even find people who might be able to explain what those terms are. But it’s not us. Want to know the pros and cons of using a 15 year old power supply in a PC? We understand that like the backs of our hands.

I have a GTX 1070 and I play Minecraft and Farming Simulator 25. I can currently play Minecraft just fine at 1080p, and I can play FS25 (FS stands for Farming Simulator, if you were wondering) at the high settings preset in 1080p. But GIANTS Software (the company that makes FS) made FS25 a really graphically demanding game; from FS22 it was a BIG jump in my opinion, and I don't think they're done improving graphics yet. I think they will improve it one more time before they just improve features in the game. And I do eventually want to play at 1440p, not just 1080p. I hope that answers all of your questions.