
>There is a very high likelihood that when a user complains about Wayland support for Nvidia they mean this

As the actual person who receives these complaints... this isn't true. Once it's explained, though, the users still stick around and get angry because they made dumb, uninformed choices as a consumer and think it's our fault.

>If you're representing Wayland, this problem is your problem whether you want it or not.

It's not. We can just choose not to solve it. Use X and buy smarter when the next hardware upgrade comes around, or wait until your hardware is supported by nouveau if you don't want to upgrade any time soon.

>What is the current UX for screenshotting, screencapture, and screencapture with audio for the most featureful DE that uses Wayland atm? Is there a program out there that offers push-button access for these three features (choosing the most sensible default format in each case)?

I don't really know what the situation is for more noob-friendly DEs like GNOME, but on sway this tool is the i3-equivalent of push-button simplicity:

https://wayland.emersion.fr/grim/

Push-button screen capture isn't there yet.
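For the curious, here's roughly what wiring that up looks like, as a minimal sketch in Python (assuming grim and its usual companion region-selector, slurp, are installed; the helper name and output directory are made up for illustration):

    # Minimal sway screenshot helper, assuming the grim and slurp
    # binaries are on PATH. slurp prints the selected geometry
    # ("x,y WxH") to stdout; grim -g captures exactly that region.
    import subprocess
    from datetime import datetime
    from pathlib import Path

    def screenshot_region(out_dir: str = "~/Pictures") -> Path:
        region = subprocess.run(
            ["slurp"], capture_output=True, text=True, check=True
        ).stdout.strip()
        out = Path(out_dir).expanduser() / f"shot-{datetime.now():%Y%m%d-%H%M%S}.png"
        subprocess.run(["grim", "-g", region, str(out)], check=True)
        return out

    if __name__ == "__main__":
        print(screenshot_region())

In practice you'd bind something like this (or the plain grim -g "$(slurp)" one-liner) to a key in your sway config to get the push-button feel.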



"the users still stick around and get angry because they made dumb, uninformed choices as a consumer and think it's our fault."

If this is the attitude of Wayland developers, it goes a long way towards explaining Wayland's ten year road of non-adoption.


You phrase that as though there is a real choice here. There is not. The open source community has had experience interacting with closed source drivers since Linux 0.01 in 1991, and that experience screams "don't do it". Xorg has been a technical disaster for more than a decade; I have sympathy for the people who want to use Nvidia cards, but they are going to have to use X.org and like it anyway, because that is all Nvidia supports.

Attempting to run a project based on the opinions of a group of Nvidia devs who (1) don't care about your project and (2) don't care about your goals is technical madness. It took the kernel more than a decade of stubbornness before all the other device driver writers caved and supported decent design. As far as I recall, Nvidia is literally the last major device driver vendor that refuses to play ball with open source. If Nvidia users can't use Wayland for driver reasons, that is because of Nvidia's choices, and there isn't anything the Wayland devs can do without repeating all the mistakes of X.org.

If it takes another decade before Nvidia caves and behaves like a good corporate citizen, then that is a decade well spent by the Wayland devs. Long term maintainability will eventually trump short term user issues, just like it did for wifi et al. It is unfortunate Nvidia is making their users' lives hard, but the writing has been on the wall since AMD started open sourcing their driver stack around 2008.


The problem here isn't the interaction with the Nvidia devs. Like you said, flatly refusing to work with their binary drivers is working as intended. The problem here is the interaction with Wayland's users and potential users. It is perfectly possible to post exactly what you said here, along with other useful information about why and how people should expect their Nvidia cards to misbehave, instead of saying things like "users are idiots and will blame us no matter what we do". Communicating with users better will not only reduce anger at Wayland, accelerate its adoption, and improve its quality, it'll direct at least some of that anger at Nvidia, thereby directly reinforcing the no-cooperation strategy you describe here.


Android and ChromeOS show it is possible to have it the other way when FOSS religion doesn't stand in the way.


Android shows exactly what that would look like: you get a blob-like kernel/stdlib/driver package from your vendor, then build your operating system around that and never speak about it again, because it's a brittle mess of "good enough".

The only reason Google and a few big players can ever ship updates is by playing hard with those vendors, and negotiating upgrade paths in advance. Nobody is interested in doing that for the desktop.


Exactly the example of the FOSS religion I was talking about.

Game devs just want to put shiny pixels on the screen; no one cares if it is a blob or not.

AMD's driver is open source and there is hardly any advantage: X routinely crashes, there's no support for the GPUs that were dropped from the rebooted driver, and it's stuck with fewer capabilities than fglrx.

Hollywood studios are quite happy with NVidia drivers.


Android and ChromeOS show how any device support is abandoned as soon as it ships.

My phone is running kernel 3.10 (the same as rhel7, except without backported fixes), and it isn't going to get anything newer.


AMD also decided to abandon my Radeon, dropping DX11 class features and video hardware decoding on their rebooted open source driver, so where is the difference?


Yes. Even if the analysis is exactly correct, this is not an acceptable way to think about or communicate the results. It leads thought in anti-user directions and shuts down communication.

Compare to something like this:

"We need to provide potential users of Wayland with better information, both to better calibrate their expectations and ensure that anger is properly directed and to help them make more informed buying decisions in the future."

Of fucking course you have a communication problem when your relationship with your userbase is founded on logic like "doing anything other than calling them idiots and ignoring them is useless and we shouldn't surrender to idiocy".


In general, users of $foo have no problem researching their hardware purchases to make sure that they are compatible with $foo. ...unless foo=FOSS. When buying a printer, many users have no problem researching the printer to make sure it works with their specific model of iPad, then turn around and say that it's their GNU/Linux distro's fault that it doesn't work with that printer.


If, as a consumer of software and hardware, you presume that it's incumbent upon people expending their own energy for free to cater to your personal situation and/or purchasing habits, you'll keep taking whatever you're given and I don't feel bad when creators ignore you. "Market adoption" is not a driving force here.


By the same token, if, as a developer of software, you presume that it’s incumbent upon people spending their own money to buy hardware that caters to your software design, one should not feel bad if users ignore your project and use alternatives that are compatible instead – namely X.

If Wayland developers truly do not care about adoption of their software, then that’s fine. But it means that UI frameworks which do care about adoption will have to continue to support both X and Wayland as backends indefinitely, increasing complexity. That’s okay for now, since removing X support would be a long way off in any case. But if enough years go by without at least a plausible route towards being able to remove it in the future, they may eventually decide to address the complexity issue by dropping Wayland instead.


Here's to another 10 years of non-adoption, if Wayland means "we don't care about popular (even dominant) use cases, now piss off".


Yep, that's the trade-off. Volunteer projects just aren't going to do whatever it takes, because market share doesn't override all other considerations.

But maybe that's okay. With a viable alternative, we can be patient.


The alternative is that the volunteers cave in to strangers who bought anti-FOSS hardware, making the volunteers' software much crummier over a long period of agonizing work. I think expecting volunteers who want clean FOSS to do that shows a negative, selfish attitude and a lack of cooperation on the part of the NVIDIA consumers making those demands, not the developers.

To illustrate with my own example, I bought a Linux-compatible laptop when I wanted to run Linux on a desktop. That rewarded the seller for the effort they put in, reused the FOSS work already performed, and required no demands of volunteers. These things are a two-way street, so I met them in the middle. I did it again, getting a ThinkPad because I want to try the BSDs later this year. Supported all of them.

If trying to maximize adoption, you have to tolerate people's indifference and selfishness. These developers don't want to maximize sales for an anti-FOSS company. So, they're telling that company's customers about the issue, suggesting a switch, and keeping something waiting for AMD buyers.


You can't complain about adoption while taking a holier-than-thou stance towards users.

Since when has FOSS meant that I _can't_ work with proprietary bits? As a user, this is my choice. If software is going to dictate where I spend my money, then I'm less inclined to adopt it. That's not freedom.


I have distinct memories of people buying PC hardware specifically because it came with a "compatible with Windows Vista" sticker, and they were thinking of switching.

I didn't read any "holier-than-thou" sentiment in the parent comment. It's not unreasonable to expect users to think about the things they want to be compatible with when buying hardware.

"I want to be able to use the next version of Windows"... buy hardware that's known to be compatible with the next version of Windows. "I want to use this printer from my iPad"... buy a printer known to be compatible with that model of iPad. "I want to use this printer from GNU/Linux"... buy a printer known to be compatible with GNU/Linux. "I want to use OpenBSD"... buy hardware that's known to be compatible with OpenBSD. "I want to use Ubuntu 17.10"... buy hardware that's known to be compatible with Wayland.


The “holier-than-thou” part of the parent comment was calling Nvidia “anti-FOSS.”

Nvidia is going to do what Nvidia is going to do, for their own reasons. Unless you have some hidden inside information you shouldn’t assume they’re doing it in order to hurt Linux or Open Source.

They’ve probably done some evaluation of what they would get out of putting their drivers into the kernel tree versus keeping them to themselves, and decided that it wasn’t worth the work, expense, risk to their IP, etc.


Your memory involves a sticker on the box with the information you needed. You could draw an analogy if there were a "compatible with linux" sticker on some hardware.

People generally don't understand what linux is, or what a distribution is, or what versions of any of the above are new or upcoming. None of these problems apply to Microsoft, since there are many fewer versions, less fragmentation, and a marketing budget.


That's an unfair comment on 2 counts.

Firstly, it's refuting a different point than the one I was making. If the problem were "it's too difficult to determine if hardware is compatible with the distro I want to use" (which is a real problem), then the comment would at least be relevant. But the comment I was replying to said "If software is going to dictate where I spend my money [what hardware I buy], ..."; they were rejecting the validity of the claim that they should consider which software they use when making a hardware purchase, which is true for no OS, and which no reasonable user expects to be true.

Secondly, you are demanding an impossible task. Because GNU/Linux distros have less marketing budget, and not enough dominance, and can't convince hardware vendors to put a sticker on the box... they need to have better hardware compatibility than Microsoft, and just be compatible with all the hardware? Microsoft has vastly more resources to dedicate to hardware compatibility than just about any other organization, and hardware vendors themselves test their hardware with Windows. Expecting a less-popular desktop operating system to work with more arbitrary hardware than Windows does is unreasonable.

I get that "being able to try it out with the hardware I already own" is a hugely powerful thing. But most users who want to make any other switch accept that they might need to buy some new hardware when doing it. A user switching from a Windows laptop to an iPad as their daily-driver accepts that they may need to get a new printer that's compatible. They may even realize that they have an older iPad, and research the printer they want to make sure it's compatible with their model, and not blindly trust the AirPrint badge on the box. Few users would think that level of due-diligence is unreasonable. Saying "that research is difficult for GNU/Linux distros" is very different than saying "expecting any level of research at all is unreasonable".


"You can't complain about adoption while taking a holier-than-thou stance towards users. "

He was saying they're getting plenty of adoption. They're just not supporting users on specific hardware whose vendor is trying to block that support. It still is freedom; it just doesn't cover that specific hardware.

"Since when has FOSS meant that I _can't_ work with proprietary bits? As a user, this is my choice. "

It really isn't if you're just a user. It's the developers' choices that dictate what software you can run on which hardware. Once their choices are made, you choose between what each offers. In this case...

"If software is going to dictate where I spend my money, then I'm less inclined to adopt it."

Nvidia is spending money on software that tries to block you from easily using free software with it. The volunteers developing one of those free packages refuse to build support for an anti-FOSS company that puts up obstacles. Instead, they're putting their work into companies that help them a bit or at least don't obstruct their work.

As a user, it would be weird for you to claim to want free software while buying a piece of hardware whose developers are working against that goal. They'll also use your dollars from that purchase to fund more activity that reduces your freedom as a user. You're free to choose to buy that hardware, but there's no reason for volunteers, much less those maximizing free software, to be forced to do painful work to support your choice. It's reasonable to say you're on your own if your choices create unnecessary obstacles.


> As the actual person who receives these complaints... this isn't true. Once it's explained, though, the users still stick around and get angry because they made dumb, uninformed choices as a consumer and think it's our fault.

In terms of sheer bang for your buck, Nvidia's cards have been on top for a while now (there's a good reason cryptocurrency miners have been using them), and on Linux their proprietary drivers are the most stable in games. Unless your only goal is to run Wayland, it's not exactly a "dumb, uninformed choice".

This seems like a clear problem with the Wayland developers' mentality, if they expect consumers to make purchasing decisions based on their platform alone.


>In terms of sheer bang for your buck, Nvidia's cards have been on top for a while now

But they have not been on top for a long enough period for that to be decisive: cards bought back when they first were are generally supported by nouveau by now, and cards bought more recently should be AMD. On top of this, the reality is that most people don't need top of the line GPUs. Last year's GPUs will run next year's games at max specs.

>on Linux their proprietary drivers are the most stable in games

were


> Last year's GPUs will run next year's games at max specs.

If you want to run at 1080p@60Hz, perhaps, but many people now are going to 1440p, 2160p, 144Hz, or some combination thereof, which can easily strain even the latest GPUs.


You can get high end AMD cards for these, so it's not an issue if you want to use open drivers. Last year was a bit tight due to the cryptocurrency mining rush (AMD GPUs were hard to buy, prices were crazily inflated, and so on). This year should be better for gaming with open drivers, especially once Navi comes out.


Last I heard, AMD drivers, both proprietary and open, had severe stability issues on Linux. Is that no longer the case?


Not anymore for the open drivers. Mesa developers have made a big push to fix gaming-related bugs. If you know of games that are still broken, please open a bug on the Mesa bug tracker and also list them here:

https://www.gamingonlinux.com/wiki/Games_broken_on_Mesa

This list is monitored by Mesa developers.

These days AMD explicitly recommend using open drivers for gaming.


I bought an AMD Radeon RX 580 for gaming on my desktop, and it's been working great so far with the default open source drivers that come with Ubuntu (18.04 onwards). I didn't have to do anything.




> on Linux their proprietary drivers are the most stable in games

Not anymore, as long as you are using the latest Mesa. Also, Nvidia usage on Linux is gradually dropping. See: https://www.gamingonlinux.com/index.php?module=statistics&vi...

So I agree with the position that compositor developers should not waste their time on supporting the blob. If Nvidia are so eager, let them contribute the support themselves, like they proposed for KWin.


From the site:

> This is currently in BETA. There are 8338 registered users on GamingOnLinux. These statistics are gathered using their manually entered data on their profiles. Please update your profile here to make this as accurate as possible.

> As with any survey, this won't be 100% accurate and should be taken with a pinch of salt.

8K users on a site about gaming, and it doesn't include RHEL/CentOS in its distribution chart. That's way too myopic a sample to declare that NVIDIA usage on Linux is dropping. For all we know, it could be that new users who don't have NVIDIA hardware are signing up; it doesn't show a shift away from NVIDIA.

I know that your comment is about games, but over in the professional CG/VFX and AI/ML spaces, AMD is rarely touched. For both workstations and server racks.


I was talking about gaming. RHEL/CentOS are server distros, not gaming oriented by any means. Fedora is more like it in such cases, if you prefer that family of distros.

> it doesn’t show a shift away from NVIDIA

It does, as confirmed by many users who choose AMD for their next upgrades. Their choice of Nvidia was due to better performing drivers in the past, which is a nonexistent difference these days. So they drop Nvidia and get AMD because of the much better Linux integration (from the kernel to the whole graphics stack and DEs), which is a consequence of AMD's drivers being open.

> but over in the professional CG/VFX and AI/ML spaces, AMD is rarely touched

Not from what I've heard. AMD is better for GPGPU due to real asynchronous compute, which Nvidia lacks. If anything, AMD hardware is more compute oriented, and Nvidia's is more gaming oriented.

AMD plan to address it with their next architecture (so-called "super-SIMD"). But compute hardware support is already good there.

You might be referring to the CUDA lock-in, and some higher end libraries for AI being CUDA only. That's surely a problem, but not with the hardware itself. AMD are working on addressing that as well (the ROCm project).
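For what it's worth, part of ROCm's pitch is keeping a CUDA-shaped API surface so application code stays vendor-neutral. A minimal sketch, assuming a PyTorch build with either CUDA or ROCm support (on ROCm builds the torch.cuda namespace is backed by HIP, so the same code path runs on AMD hardware):

    # Vendor-neutral GPU dispatch in PyTorch. Assumption: a CUDA- or
    # ROCm-enabled build; ROCm builds answer torch.cuda.is_available()
    # too, since HIP presents itself through the cuda namespace.
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    x = torch.randn(1024, 1024, device=device)
    y = x @ x  # dispatched to whichever backend is actually present
    print(device, y.shape)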


> RHEL/CentOS are server distros, not gaming oriented by any means. Fedora is more like it in such cases.

That's why I ended by saying that I know the parent comment and site are about gaming, but I wanted to add a bit of extra context. And these 'server' distros are the primary workstation hosts, both bare-metal and PCoIP, for pretty much the entire CG industry.

> It does, as confirmed by many users who choose AMD for their next upgrades.

It doesn't at all. This graph only shows a very generic view at best. If you wanted to see a shift, you would need a graph of users changing their profiles from NVIDIA hardware to AMD hardware, and a separate graph of new users coming in with AMD gear. This is just a conglomerate of information in a single graph; there are way too many inferences that can be made from it that simultaneously can't be proven without more specific data.

> Not from what I've heard. AMD is better for GPGPU due to real asynchronous compute, which Nvidia lacks. If anything, AMD hardware is more compute oriented, and Nvidia's is more gaming oriented.

Source? Working over in CG/VFX myself, NVIDIA is pretty much the only hardware to choose. For rendering, everything is CUDA based. Application support for AMD is not stellar. NVIDIA offers the most stable platform for our use cases so far. Release notes of software literally point out that AMD support is not as extensive as NVIDIA's and is subject to being unstable.

> You might refer to CUDA lock-in, and some higher end libraries for AI being CUDA only. That's surely a problem, but not with hardware itself.

There doesn't have to be a problem with the hardware; adoption will be zero as long as alternative OpenCL/Vulkan compute based libraries don't exist. I know there's been a long term project of getting OpenCL in as an engine for TensorFlow; it's been a bit since I checked its progress. The OpenCL stack for AMD also doesn't have a stellar reputation for producing consistently competitive results. Phoronix released a strict OpenCL comparison adding the Radeon VII to the mix with ROCm 2.0, and as a developer, it wouldn't give me confidence that, compared to CUDA, I'm going to be getting the results I'm looking for. NVIDIA has a very mature stack and support with CUDA and its associated libraries. AMD is going to need to be able to supply that for people to wean themselves off of the green train.

Granted, for more AI-based work, one might look toward the Instinct lineup.

https://www.phoronix.com/scan.php?page=article&item=radeon-v...


> It doesn't at all.

If you didn't follow this, it doesn't to you. It does, to those who pay attention and constantly see people switching to AMD and commenting about their reasons. And these reasons are quite obvious, really. Nvidia did nothing to improve their integration with Linux (the main issue is their unwillingness to upstream their kernel driver), while AMD did a lot to improve their drivers. It's only natural to expect Nvidia's Linux gaming usage to continuously drop as a result of the above.

> Source?

https://www.techpowerup.com/215663/lack-of-async-compute-on-...

AMD hardware is known to be better for GPGPU for a long time already.

> There doesn't have to be a problem with the hardware, adoption will be zero as long as alternative OpenCL/Vulkan compute based libraries don't exist.

If developers are using some closed source libraries, they are at the mercy of those vendors, who might be in the pocket of Nvidia. And nothing stops open source libraries from supporting more than CUDA and using Vulkan and OpenCL for compute needs. Given Vulkan is still rather new, it can take time for libraries to pick it up for compute scenarios, but it will happen either way. Overall, I don't see a good future for CUDA if it remains Nvidia only. Same as with gaming: open solutions will catch up, and Nvidia's lock-in will crumble.
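To make "using OpenCL for compute needs" concrete, here's a minimal sketch of vendor-neutral compute with pyopencl (assuming the pyopencl and numpy packages and any installed OpenCL runtime; the kernel here is a made-up toy):

    # The same OpenCL kernel runs on any conformant runtime, whether the
    # device underneath is AMD, Nvidia, Intel, or a CPU implementation.
    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()   # pick an available platform/device
    queue = cl.CommandQueue(ctx)

    a = np.arange(16, dtype=np.float32)
    buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                    hostbuf=a)

    prog = cl.Program(ctx, """
    __kernel void square(__global float *v) {
        int i = get_global_id(0);
        v[i] = v[i] * v[i];
    }
    """).build()

    prog.square(queue, a.shape, None, buf)
    cl.enqueue_copy(queue, a, buf)   # read the result back to the host
    print(a)  # squared values, regardless of GPU vendor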


> If you didn't follow this, it doesn't to you. It does, to those who pay attention and constantly see people switching to AMD and commenting about their reasons.

I'm talking about this purely from a data point of view. I'm referring to the graph, and it has too many holes to support a sweeping generalization about the market and Linux users as a whole.

To be clear, I'm not arguing for or against NVIDIA/AMD. I'm just trying to point out the issues with using that graph as definitive evidence of anything.

> Nvidia did nothing to improve their integration with Linux, while AMD did a lot to improve their drivers.

And I applaud AMD for that. They've come a long way. However (and this is personal opinion based on experience), I don't see Wayland as being a particularly massive reason, even though Wayland is what this thread is about. There are still plenty of people using Xorg who are happy with it and don't run into any issues, myself included. That's not to say I don't support Wayland development, or that I hope NVIDIA never adds support for it in the future, but it's such a 'young' piece of tech that it still requires some more maturing.

> AMD hardware is known to be better for GPGPU for a long time already.

Raw power is only a piece of the pie. Some people take issue with that statement, but it's reality.

To the point in your link, Pascal fixes the async problem present in Maxwell.

https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-...

> And nothing stops open source libraries from supporting more than CUDA and using Vulkan and OpenCL for compute needs.

And yet they primarily are CUDA only. It's a conscious and intentional decision by the developers of those libraries to start with CUDA and stick to it. And the community not adding OpenCL support shows that AMD hardware isn't even in the space for that application.

I think that we could go back and forth with this, providing counter examples for each point the other brings up. I merely wanted to offer up an enterprise oriented view of the picture, when most people are looking at individual usage.


> I'm referring to the graph

The graph makes sense if you analyze the context. It's as I said above: an increasing number of users are switching to AMD. Just check any thread in Linux gaming forums on the topic of "what should my next GPU be".

> I don't see Wayland as being a particularly massive reason. <...> don't run into any issues,

It's one of the reasons. There are many issues with Nvidia. Abysmal integration due to the lack of an upstream driver, such as broken vsync, no standard hardware sensors, no PRIME support, and so on and so forth; these are all well known Nvidia problems. To address them, Nvidia should either open up their driver or support Nouveau to begin with. So far they have shown no interest in either.

> It's a conscious and intentional decision by the developers of those libraries to start with CUDA and stick to it.

That's too bad. Avoid libraries which proliferate lock-in. If their developers don't know any better, find those who do and support their efforts instead. It should be in the interest of the actual developers who use said libraries not to be locked into a single hardware vendor.


> AMD hardware is known to be better for GPGPU for a long time already.

Maybe, maybe not, but the async compute argument you are making only applies to graphics applications that use compute shaders on Maxwell; it doesn't apply to pure GPGPU without graphics.


Cryptocurrency miners preferred AMD cards, actually. And not the latest ones, but Polaris (4x0, 5x0), which provided the best efficiency wrt power, performance and price.

It's gamers that prefer Nvidia. Nvidia has a better optimized driver stack, although they also sometimes play fast and loose.


Originally, yes; in recent years, however, that changed. Look at the cards used for ZCash, Monero, etc.


Miners are increasingly moving to specialized ASICs anyway. Which is good for gamers who need GPUs :)


And anyone doing deep learning.


It is an uninformed choice if you want to run Wayland. That doesn't mean there aren't other factors at play. For example, back when I was shopping for a laptop two years ago, AMD GPUs weren't even available from most vendors. Even now, Linux vendors such as System76 and Entroware don't ship AMD cards in their laptops. CUDA dominating the deep learning space doesn't help either. That said, my next hardware upgrade will definitely have a GPU with open source support, or I'm just not buying.


>It is an uninformed choice if you want to run Wayland

Creating a product that only works for people who already a) know about it and b) buy their hardware specifically to support it doesn't seem like a good strategy for achieving widespread adoption, which requires converting people who don't currently use or know about Wayland.


Nvidia goes out of their way not to play nice with Linux. It is no wonder that the Linux ecosystem will not accommodate their special requirements.

When you are purchasing Nvidia, you are giving money to Nvidia. If you have a problem with some software not working with a product you purchased, contact the people you gave money to for support.


> In terms of sheer bang for your buck, Nvidia's cards have been on top for a while now

https://www.phoronix.com/ has a recent benchmark of absolute card performance[1], and sometimes does performance-per-dollar, as here[2] last year with OpenCL (reporting AMD's FLOPS as cheaper... when ROCm satisfices). I've found the site helpful when tracking bleeding-edge Linux support for high-end graphics cards.

[1] https://www.phoronix.com/scan.php?page=article&item=radeon-v... [2] https://www.phoronix.com/scan.php?page=article&item=nvidia-r...


> I don't really know what the situation is for more noob-friendly DEs like GNOME, but on sway this tool is the i3-equivalent of push-button simplicity:

On GNOME you can bind shortcuts that save to a file, or copy to the clipboard, the whole screen, the current window, or a rectangular screen region. In the latter case, you can drag a rectangle with the mouse to choose the area you want to screenshot. There is a visual indication of the copied region (it blends a white rectangle over it), and the clipboard works with all applications (both Wayland and XWayland).
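Those bindings are just settings-daemon media-keys entries, so they can be scripted as well. A minimal sketch, assuming the GNOME 3.x schema and key names such as area-screenshot-clip (run gsettings list-keys org.gnome.settings-daemon.plugins.media-keys on your system to check, since these have changed between releases):

    # Rebind GNOME's "copy an area screenshot to the clipboard" shortcut.
    # Assumption: the media-keys schema still exposes string-typed
    # screenshot keys, as it did in GNOME 3.x.
    import subprocess

    SCHEMA = "org.gnome.settings-daemon.plugins.media-keys"

    def bind(key: str, shortcut: str) -> None:
        subprocess.run(["gsettings", "set", SCHEMA, key, shortcut], check=True)

    bind("area-screenshot-clip", "<Super>Print")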

For screen recording I've used Peek, which is pretty nice, too.



