I purchased my RX 580 in early 2018 and used it through late 2024.

I am critical of AMD for not fully supporting all GPUs based on RDNA1 and RDNA2. While longer backwards compatibility is always better for the consumer, the RX 580 was a lightly updated RX 480, which came out in 2016. Yes, ROCm technically came out in 2016 as well, but I don't mind acknowledging that supporting the GCN architecture is a different beast from supporting the RDNA/CDNA generations that followed (Vega feels like it is off on an island of its own, and I don't even know what to say about it).

As cool as it would be to repurpose my RX 580, I am not at all surprised that GCN GPUs are not supported for new library versions in 2026.

I would be MUCH more annoyed if I had any RDNA1 GPU, or one of the poorly-supported RDNA2 GPUs.

reply
ROCm usually only supports two generations of consumer GPUs, and sometimes the latest generation is slow to gain support. Currently only RDNA 3 and RDNA 4 (RX 7000 and 9000) are supported: https://rocm.docs.amd.com/projects/install-on-linux/en/lates...

It's not ideal. CUDA, by comparison, still supports Turing (two years older than RDNA 2), and if you drop down one version to CUDA 12 there is some support for Maxwell (~2014).
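The support windows described above can be sketched as a small lookup keyed on the LLVM gfx targets that tools like rocminfo report. The table and `describe` helper are illustrative, not any official API, and the entries are a subset I'm reasonably sure of:

```python
# Sketch: map LLVM gfx targets (as reported by rocminfo) to GPU generations
# and whether the latest ROCm release officially supports them.
# Illustrative subset only, not an exhaustive or authoritative list.
GFX_GENERATIONS = {
    "gfx803":  ("GCN 4 (Polaris, e.g. RX 580)", False),
    "gfx900":  ("GCN 5 (Vega)", False),
    "gfx1010": ("RDNA 1 (e.g. RX 5700)", False),
    "gfx1030": ("RDNA 2 (e.g. RX 6800/6900 XT)", False),
    "gfx1100": ("RDNA 3 (e.g. RX 7900 XTX)", True),
    "gfx1201": ("RDNA 4 (e.g. RX 9070 XT)", True),
}

def describe(gfx: str) -> str:
    """Return a human-readable support note for a gfx target."""
    name, supported = GFX_GENERATIONS.get(gfx, ("unknown architecture", False))
    status = ("supported by latest ROCm" if supported
              else "not on the official support list")
    return f"{gfx}: {name} -- {status}"

print(describe("gfx803"))
print(describe("gfx1100"))
```

Always double-check a specific card against AMD's compatibility matrix for the ROCm release you plan to use, since the list shifts from release to release.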

reply
Worse, RDNA3 and RDNA4 aren't fully supported either, and probably won't be, since AMD only focuses on the chips that make it the most money. If we didn't have Vulkan, every nerd in the world would demand either a Mac or an Intel machine with an Nvidia chip. AMD keeps leaving money on the table.
reply
Up until recently they didn't even properly support their cash cow, the Ryzen AI Max+ 395. I don't know about the argument that they only care about certain chips.
reply
If you are on an unsupported AMD GPU, why would you ever consider switching to a newer AMD GPU, considering you know that it will reach the same sorry state as your current GPU?

Especially when, as you say, the latest generation is slow to gain support while they are simultaneously dropping old generations, leaving you with a 1-2 year window of support.

reply
It's pretty crazy that the 6900 XT/6950 XT aren't supported.
reply
Eh, YMMV. I was using ROCm for minor AI things as far back as 2023 on an "unsupported" 6750 XT [0]. Even trained some LoRAs. Mostly the issues were how many libraries were CUDA-only.

[0] https://news.ycombinator.com/item?id=43207015

reply
I have an RX 6700 XT, damn. AMD is shooting itself in the foot.
reply
Try it before you give up; I got plenty of AI stuff working on my 6750 XT years ago.
reply
I have the same experience with my RX 5700. The last ROCm version that supports it is too old to get Ollama running.

Ollama's Vulkan backend works fine for me, but it took a year or two for them to support it officially.

reply
Vulkan backends work just fine, provided one is willing to be constrained by the Vulkan developer experience: no first-class support for C++, Fortran, or Python JIT kernels, no IDE integration, no graphical debugging, and few libraries.
reply
Did it use to be different?

A few years ago I thought I had used the ROCm drivers/libraries with hashcat on an RX 580.

Now it's obsolete?

reply
The RX 580 is a GCN 4 GPU. I'm pretty sure the bare minimum for ROCm is GCN 5 (Vega).
reply
Among consumer cards, the latest ROCm supports only RDNA 3 and RDNA 4 (RX 7000 and RX 9000 series). Most stuff will still run on a slightly older release for now, so you can get away with RDNA 2 (6000 series).
reply
Huh, I just saw that. Huge bummer.

I have a Radeon RX 6800, and on my system I use ROCm's OpenCL for some tasks and HIP for Blender Cycles rendering. Losing ROCm support for this card would really sting.

reply