AMD Open Source Driver for Vulkan project is discontinued (github.com)
123 points by haunter 14 hours ago | 37 comments
tracker1 3 minutes ago [-]
I've been running an RX 9070XT since close to release... I've also been running PopOS Cosmic Alpha for the past few months and, for better game compatibility, have been sticking close to the latest mainline kernel.

Just yesterday, I tried getting ROCm working to see if I could use StableDiffusion. Well, in the end kernel 6.16 is currently unsupported, and after a few hours of failure I managed to get the in-box kernel module working again and gave up. It is genuinely nice that many/most games now run through Mesa/Vulkan+Proton without issue... but it would be nice to actually be able to use some of the vaunted AI features of AMD's top current card, on the leading-edge Linux kernel release, with their platform.
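
For anyone else poking at this: before bringing Stable Diffusion into the picture, a minimal sanity check of whether the HIP runtime can even enumerate the card can save a lot of time. Just a sketch, assuming a working ROCm install with hipcc on the PATH (the file name is hypothetical):

    // Sketch: ask the HIP/ROCm runtime whether it can see the GPU at all.
    // Assumes ROCm is installed; build with: hipcc rocm_check.cpp -o rocm_check
    #include <cstdio>
    #include <hip/hip_runtime.h>

    int main() {
        int count = 0;
        hipError_t err = hipGetDeviceCount(&count);
        if (err != hipSuccess) {
            // On an unsupported kernel this typically fails right here.
            std::printf("HIP runtime error: %s\n", hipGetErrorString(err));
            return 1;
        }
        std::printf("HIP devices: %d\n", count);
        for (int i = 0; i < count; ++i) {
            hipDeviceProp_t prop;
            if (hipGetDeviceProperties(&prop, i) == hipSuccess)
                std::printf("  [%d] %s (%s)\n", i, prop.name, prop.gcnArchName);
        }
        return 0;
    }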

Hopefully sooner rather than later this will all mostly "just work" and won't be such an exercise in frustration for someone who hasn't been actively following the AI scene. I could create a partition for a prior distro/kernel or revert back, but I probably shouldn't have to; in general I expect leading-edge releases to work in the Linux ecosystem, or at least to be patched up relatively quickly.

greatgib 7 hours ago [-]

   AMD is unifying its Linux Vulkan driver strategy and has decided to discontinue the AMDVLK open-source project, throwing our full support behind the RADV driver as the officially supported open-source Vulkan driver for Radeon™ graphics adapters.
Scary title but good news in the end I think.
willvarfar 7 hours ago [-]
What level of support will they give RADV? Or is it just that AMD ultimately do less?
account42 6 hours ago [-]
They have done pretty well with the open source OpenGL drivers that were also initially developed outside AMD.

AMDVLK was always a weird regression in the openness of the development model compared to that. It's maybe understandable that the bean counters wanted to share the effort between the Windows and Linux drivers, but throwing away the community aspect in order to achieve that made the approach doomed from the start IMO. The initial release being incredibly late (even though Vulkan was modeled after AMD's own Mantle) was the cherry on top that allowed RADV to secure the winning seat, but it probably only accelerated the inevitable.

pjmlp 3 hours ago [-]
So well that my Asus Netbook went from OpenGL 4.1 down to OpenGL 3.3, and when it finally got OpenGL 4.1 back, several years later, it died a couple of months later.
account42 2 hours ago [-]
Yes exactly, they (or someone else) did eventually add OpenGL 4.1 support for your GPU to the open source drivers which never had it before.

That you were "forced" to switch away from the old proprietary driver for some reason does not reflect negatively on AMD's contribution to the open source drivers.

pjmlp 2 hours ago [-]
The reason being that the old proprietary drivers were dropped from Linux distros without feature parity, and given how great Linux drivers work across kernel versions, everyone got a downgraded experience for several years.
tonyhart7 5 hours ago [-]
Why do we have 2 projects anyway??? What is the history???

I thought Mesa was always the default, since I use Fedora KDE.

account42 4 hours ago [-]
AMD developed their closed source Vulkan driver for Windows based on the proprietary shader compiler from their existing proprietary OpenGL driver (amdgpu-pro). They promised to release this driver as open source but didn't want to release the shader compiler for who knows what reason, so this took them a while. Meanwhile David Airlie (Red Hat) and Bas Nieuwenhuizen (a student at the time) didn't want to wait for that, or were just looking for a challenge, and wrote their own open source Vulkan driver (radv), which got pretty good results from the start. Linux distributions prefer open source drivers, so this one quickly became the default.

Once AMD released the open-source version of their driver (amdvlk), it was faster than radv in some games but not decidedly so. It was also not an open project but rather just an open source release of their proprietary driver with a different shader compiler. So there wasn't really any reason for the open source developers to abandon their work on radv and switch to amdvlk. But they could and did use amdvlk to learn from it and improve radv, so it was still useful. When Valve decided to contribute directly to Linux graphics drivers, radv was already winning, so they backed that one as well.

Note that this is only about the user-space portion of the driver; the kernel part of the Linux drivers is shared by all of these as well as the OpenGL drivers. There used to be a proprietary kernel driver from AMD as well, but that was abandoned with the switch to the "amdgpu-pro" package.
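
If you want to check which user-space Vulkan driver is actually active on a given machine, the cleanest way I know is to query VkPhysicalDeviceDriverProperties (core since Vulkan 1.2). A minimal sketch, assuming the Vulkan headers and loader are installed; RADV reports itself as "radv", while the AMD drivers report their own names:

    // Sketch: print which user-space Vulkan driver backs each GPU.
    // Build (assumption): g++ which_driver.cpp -lvulkan
    #include <cstdio>
    #include <vector>
    #include <vulkan/vulkan.h>

    int main() {
        VkApplicationInfo app{};
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.apiVersion = VK_API_VERSION_1_2;

        VkInstanceCreateInfo ici{};
        ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        ici.pApplicationInfo = &app;

        VkInstance inst;
        if (vkCreateInstance(&ici, nullptr, &inst) != VK_SUCCESS) return 1;

        uint32_t count = 0;
        vkEnumeratePhysicalDevices(inst, &count, nullptr);
        std::vector<VkPhysicalDevice> devs(count);
        vkEnumeratePhysicalDevices(inst, &count, devs.data());

        for (VkPhysicalDevice dev : devs) {
            // Chain the driver-properties struct into the properties2 query.
            VkPhysicalDeviceDriverProperties drv{};
            drv.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_DRIVER_PROPERTIES;
            VkPhysicalDeviceProperties2 props{};
            props.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_PROPERTIES_2;
            props.pNext = &drv;
            vkGetPhysicalDeviceProperties2(dev, &props);
            std::printf("%s: %s (%s)\n", props.properties.deviceName,
                        drv.driverName, drv.driverInfo);
        }
        vkDestroyInstance(inst, nullptr);
        return 0;
    }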

tonyhart7 3 hours ago [-]
I thought Mesa was a Linux-only driver.

So they did use that for Windows as well now, right?

So Valve and the OSS community made a better driver than AMD themselves??? Shit, that's a new low.

giancarlostoro 41 minutes ago [-]
Idk but Mesa never worked for me, ever. Any time I installed a distro to try, if Mesa was running, I basically had a non-functioning desktop. I think part of it may have been Wayland related, which is frustrating, but these days it's gotten drastically better.
arghwhat 6 hours ago [-]
They already work on radv, which is the better Vulkan driver.

This is a matter of AMD no longer wasting time on a pointless duplicate project no-one is really interested in. They can allocate more resources to amdgpu and radv, and ultimately do less overall by getting rid of the redundant project.

Win-win.

sylware 4 hours ago [-]
It is dangerous for RADV which already has its own issues. And when you look at AMDVLK, you don't want those devs anywhere near RADV.
CBLT 11 hours ago [-]
https://www.phoronix.com/news/AMDVLK-Discontinued

> This is a good but long overdue decision by AMD. RADV has long been more popular with gamers/enthusiasts on Linux than their own official driver. Thanks to Valve, Google, Red Hat, and others, RADV has evolved very nicely.

andy_ppp 4 hours ago [-]
I always think just open sourcing the whole software stack for graphics cards would be an excellent thing for hardware manufacturers; in the end these are free pieces of software, and I'm certain there would be a big community contributing loads of cool things for free. AMD (say) would also sell a load more hardware as enthusiast features would be added by the community.

Maybe I'm just naive but the downsides of doing this seem absolutely minimal and the upsides quite large.

fidotron 10 minutes ago [-]
> I always think just open sourcing the whole software stack for graphics cards would be an excellent thing for hardware manufacturers,

> Maybe I'm just naive

Yep.

There are things hidden in the design of very widely used hardware that would make people's heads explode from how out there they are. They are trade secrets, used to maintain a moat in which people can make money (as opposed to patents, which require publishing publicly).

If you live in open source land you cannot make money from selling software. If there is no special sauce in the hardware you won't be able to make money from that either. Then we can all act surprised that the entire tech landscape is taken over by ads and fails to meaningfully advance.

Symmetry 17 minutes ago [-]
Those are essentially the reasons that Intel has always(?) had open source GPU drivers and AMD has been supporting open source since around 2009. As a result I think most people would recommend AMD cards for people interested in gaming on Linux; the experience can be a lot smoother than using Nvidia's closed source drivers.
ChocolateGod 36 minutes ago [-]
That's easier said than done: AMD and Nvidia probably have licensed and patented code etc. in their closed-source drivers, which would make them difficult to open source, whereas a project that is open source from the get-go won't have these issues.

Nvidia got around this on their kernel driver by moving most of it to the card's firmware.

giancarlostoro 40 minutes ago [-]
I still do not understand why they don't; it makes their hardware basically good for life, since you can then run it on any OS if you really want to put in the effort to wire it all up.
dcan 33 minutes ago [-]
Patents and licensing, usually

https://news.ycombinator.com/item?id=39543291

giancarlostoro 10 minutes ago [-]
Ah, so HDMI is one part of it; that's really unfortunate. Thank you for this insight.
potwinkle 8 hours ago [-]
This is great news for RADV development; I'm hoping someday we can even use ROCm on the open source stack.
account42 6 hours ago [-]
The kernel level of the stack was already open though; this only changes the Vulkan front end, which AFAIK is irrelevant to ROCm.
suprjami 3 hours ago [-]
Depending on what you want to do, you already can.

llama.cpp and other inference servers work fine on the kernel driver.

shmerl 11 hours ago [-]
What will AMD do with the Windows Vulkan driver? Didn't they use amdvlk there? There was a radv-on-Windows experiment; it would be cool if AMD would use that.
trynumber9 11 hours ago [-]
No, it was a third driver.

Per AMD

>Notably, AMD's closed-source Vulkan driver currently uses a different pipeline compiler, which is the major difference between AMD's open-source and closed-source Vulkan drivers.

kimixa 7 hours ago [-]
The Windows driver has 2 paths: the internal compiler, and the same LLVM as in the open source amdvlk release (though there might be things like not-yet-upstreamed changes, experimental new hardware support, etc. that differ from the public version, it was fundamentally the same codebase). The same goes for DX12 (and any other driver that might use their PAL layer). If you want to confirm, you can see all the LLVM symbols in the driver's amdvlk{32,64}.dll and amdxc{32,64}.dll files. From what I remember, the internal compiler path is just stripped out for the open source amdvlk releases.

I believe the intent was to slowly deprecate the internal closed compiler, and leave it more as a fallback for older hardware, with most new development happening on LLVM. Though my info is a few months out of date now, I'd be surprised if the trajectory changed that quickly.

account42 6 hours ago [-]
AFAIK the closed source shader compiler was/is also available for Linux in the amdgpu-pro package, just not in the open source releases.
shmerl 11 hours ago [-]
Why are they using different compilers?
account42 6 hours ago [-]
Either licensing issues (maybe they don't own all parts of the closed source shader compiler), or fears that Nvidia/Intel could find out things about the hardware that AMD wants to keep secret (the fears being unfounded doesn't make the possibility of them being a reason any less likely). Or alternatively they considered it not worth releasing (legal review isn't free) because the LLVM back-end was supposed to replace it anyway.
AnthonyMouse 5 hours ago [-]
> or fears that Nvidia/Intel could find out things about the hardware that AMD wants to keep secret (the fears being unfounded doesn't make the possibility of them being a reason any less likely)

When the fears are unfounded the reason isn't "Nvidia/Intel could find out things about the hardware", it's "incompetence rooted in believing something that isn't true". Which is an entirely different thing because in one case they would have a proper dilemma and in the other they would need only extricate their cranium from their rectum.

mschuster91 4 hours ago [-]
> When the fears are unfounded the reason isn't "Nvidia/Intel could find out things about the hardware"

Good luck trying to explain that to Legal. The core problem with everything FOSS is the patent and patent-licensing minefield. With hardware patents it's already risky enough to get torched by some "submarine patent" troll, and the US adds software patents to that mix. And even if you think you have all the licenses you need, the licensing terms might ban you from developing FOSS drivers/software implementing the patent, or you might have a situation like HDMI2/HDCP where the DRM <insert derogatory term here> insist on keeping their shit secret, or regulatory requirements on RF emissions.

And unless you have backing from someone very high up the chain, Corporate Legal will default to denying your request for FOSS work if there is even a slight chance it might pose a legal risk for the company.

jacquesm 7 hours ago [-]
Bluntly: because they don't get software and never did. The hardware is actually pretty good but the software has always been terrible and it is a serious problem because NV sure could use some real competition.
AnthonyMouse 5 hours ago [-]
I wish hardware vendors would just stop trying to write software. The vast majority of them are terrible at it and even within the tiny minority that can ship something that doesn't non-deterministically implode during normal operation, the vast majority of those are a hostile lock-in play.

Hardware vendors: Stop writing software. Instead write and publish hardware documentation sufficient for others to write the code. If you want to publish a reference implementation that's fine, but your assumption should be that its primary purpose is as a form of documentation for the people who are going to make a better one. Focus on making good hardware with good documentation.

Intel had great success for many years by doing that well and have recently stumbled not because the strategy doesn't work but because they stopped fulfilling the "make good hardware" part of it relative to TSMC.

exDM69 4 hours ago [-]
> I wish hardware vendors would just stop trying to write software.

How would/should this work? Release hardware that doesn't have drivers on day one and then wait until someone volunteers to do it?

> Intel had great success for many years by doing that well

Not sure what you're referring to but Intel's open source GPU drivers are mostly written by Intel employees.

adrian_b 2 hours ago [-]
The documentation can be published in advance of the product launch.

Intel and AMD did this in the past for their CPUs and accompanying chipsets, when any instruction set extensions or I/O chipset specifications were published some years in advance, giving time to the software developers to update their programs.

Intel still somewhat does it for CPUs, but for GPUs their documentation is delayed a lot in comparison with the product launch.

AMD now has significant delays in publishing the features actually supported by their new CPUs, even longer than for their new GPUs.

In order to have hardware that works on day one, most companies still have to provide specifications for their hardware products to various companies that must design parts of the hardware or software that are required for a complete system that works.

The difference between now and how this was done a few decades ago, is that then the advance specifications were public, which was excellent for competition, even if that meant that there were frequently delays between the launch of a product and the existence of complete systems that worked with it.

Now, these advance specifications are given under NDA to a select group of very big companies, which design companion products. This ensures that now it is extremely difficult for any new company to compete with the incumbents, because they would never obtain access to product documentation before the official product launch, and frequently not even after that.

mschuster91 4 hours ago [-]
The problem is, making hardware is hard. Screw something up and, in the best case, you can fix it in ucode; if you're not that lucky you can get away with a new stepping; but in the worst case you have to do a recall and deal not just with your own wasted effort but also with the wasted downstream efforts and rework costs.

So a lot of the complexity of what the hardware is doing gets relegated to firmware, as that is easier to patch and (especially relevant for wifi hardware shipped before the specs get finalized) to extend/adapt later on.

The problem with that, in turn, is patents and trade secrets. What used to be hideable in the ASIC masks is now computer code that's more or less trivial to disassemble or reverse engineer (see e.g. nouveau for older Nvidia cards and Alyssa's work on Apple), and if you want true FOSS support, you sometimes can't fulfill other requirements at the same time (see the drama surrounding HDMI2/HDCP support for AMD on Linux).

And for anything RF you get the FCC throwing rocks on top of that. For a few years now, the unique combination of RF device (wifi, bt, 4G/5G), antenna and OS-side driver has had to be certified. That's why you get Lenovo devices refusing to boot when a non-Lenovo USB network adapter is attached at boot time, or when you swap the Sierra Wireless modem for an identical modem from a Dell (that only has a different VID/PID), and why you need old, long-outdated Lenovo/Dell/HP/... drivers for RF devices while the "official" manufacturer ones will not work without patching.

I would love a world in which everyone in the ecosystem were forced to provide interface documentation, datasheets, errata and ucode/firmware blobs with source for all their devices, but unfortunately, DRM, anti-cheat, anti-fraud and overeager RF regulatory authorities have a lot of influence over lawmakers, way more than FOSS advocates.