Gaming on Linux is the death knell. The masses are moving in. It is going to get enshittified. It is probably too late to do anything, but we should try to gatekeep against those who call for unreasonable change. I’m an old man shouting in the wilderness. So long and thanks for all the penguins.
I’ve been using GNU/Linux full-time for about 12 years now and tinkered around with it on the side for close to 20 years. The last Windows OS I used on my own machine was XP. I’ve done tech support for friends and family in the meantime, but my exposure to modern Windows systems is very minimal. I’ve even got my Dad running Manjaro (because I use Arch). I don’t think I am an expert, but I am probably a high-end power user.
I say all this to establish my position as someone who has been around the block but maybe not been here since the beginning. A lot of things have changed in the last couple of decades with regard to Linux. For the most part I would say that things have gotten better on almost every front. Something feels different now, though, and I’d like to make the case that it isn’t good.
With the enshittification of Windows accelerating in recent years, coupled with a seemingly tireless minority advocating for Linux adoption, there has been a significant influx of new Linux users. Another (I think very large) reason is that a younger gaming audience has been tapped via the development of Proton. I myself was a gamer in my teens and have recently (in the last 4-5 years) returned, 100% because of Proton. While you could get a lot of games working with Wine and some tinkering, Proton plus Steam makes it seamless for the vast majority of games (games with online anti-cheat being the exception).
This influx of users might look positive at first glance. While I only occasionally lurk in a few gaming/Linux forums, I see so much cheerleading with every conversion. The year of the Linux desktop indeed. But I have a warning to offer.
Windows 10 going EOL will accelerate the conversion
If Linux (in the most general sense, and in particular the kernel itself) does not gatekeep, it will sooner or later become enshittified in the same way everything else has tended to.
I want to be clear: these are not complaints I have per se; I’m fairly well insulated because of the way I’ve configured my system. For anyone that cares, I run a very bare-bones Arch. No login manager. No bars. No DE. Only Openbox. With the exception of a few major programs like a browser and LibreOffice, 95% of what I run is from a terminal. Again, I say this simply to show that whatever bullshit is happening with the distro of the week (currently Bazzite, it seems), the clash between systemd and init, PipeWire vs. PulseAudio, Wayland vs. X11, I simply don’t care. Whatever y’all do, I’ll be running my decades-old setup happily until the day I die. My argument is more from a conservationist standpoint.
Ok, so what is my argument in a nutshell?
Because of the influx of users who are not familiar with things like the Unix philosophy but are familiar with their Windows systems, we already see a large increase in requests to make Linux more Windows-like. This will inevitably lead to a critical mass where developers are “forced” to cater to the lowest-common-denominator user. Whereas Linux was generally a system for those with a DIY attitude (what I’ll refer to as technical users), it will find itself trying to please the incoming non-technical users, who will inevitably outnumber the technical users.
While this isn’t necessarily a bad thing in and of itself (non-technical people are as important as technical people), the question I have is: why do we need everything to be accessible to the non-technical person? Windows and Mac already exist. They are, as far as I can tell, pretty well streamlined for non-technical users. Linux was the technical user’s OS. That seems like a reasonable diversity of choice. Sure, there are more non-technical users and there are more non-technical OSes, but at least the technical user has a (highly customizable) option.
And, again, I understand that the argument here will be that the technical user will be capable of building their system however they want. They can avoid most of the changes they want to avoid. But I counter: for how long?
Ask a college computer science professor about their incoming students. They will shake their heads at how utterly disconnected the freshmen are. Smartphones and the “hiding” of anything and everything technical have become so much the norm that incoming computer science students don’t even know what a file system is. The pictures are “in” the camera app. The music is “in” the music app. These are CS students. Joe Average will be ten times as clueless.
Which is OK. Not everyone needs to be a CS whiz. But some people do. Someone needs to be able to understand what the heck is going on in the computer. You might argue that we don’t teach Assembly anymore. Heck, we don’t even teach Lisp (or even Scheme). More advanced students will learn C, but for the most part it is Python and Java.
Again, this is OK. You can have a great career without touching anything low-level and staying in high-level abstraction land. But, again, someone has to be able to do that low-level magic. Someone will have to be all the way deep in the weeds developing RISC-VI.
All this is to say that if Linux follows the Windows/Mac path and acquiesces to this new non-technical crowd with their requests to “upgrade” and “improve” everything (read: simplify), there will be less opportunity for those future wizards to get exposure to a technical OS. How many times have you heard a request/complaint that goes something like this:
“I don’t ever want to have to open a terminal. It is current year! There should be a GUI for that. XYZ should configure automatically.”
I simply don’t understand these people. They want to have their cake and eat it too. If you don’t want to use a terminal then don’t use an OS that is built around a terminal. It’s OK to use Windows/Mac. Why must you change Linux to suit your needs?
Here is a perfect example of all the new users coming in and wanting things changed because… reasons.
He literally wants VLC (although he at least understands this is a him issue) installed by default on all distros. Why? Presumably because he uses VLC and fuck everyone else. And (proprietary) codecs should all be installed by default as well. He is clearly a user who can install this on his own, but he is bellyaching because he wants it his way. And mark my words, he will get his way. If for no other reason than that he (and more and more people like him every day) is vocal.
And I am sure this guy thinks he is fighting the good fight. I don’t have any issue with him personally and I even watch his videos from time to time to get a different perspective. This is simply one example of many of how the newer users of Linux are trying to shape it without really understanding the (Unix) philosophy behind how it was built.
Don’t believe me?
Let’s look at a few examples of things that turned to shit as they transitioned from niche to mainstream.
Reddit in the late 2000s was a niche site. The communities were small and you could find a real diversity of opinions that were, amazingly, still civil in their disagreements. As the site grew I noticed that at around 50k users, a sub would become “too big”. You lost the community vibe. Trolls showed up. Moderation had to become heavier. Dissenting opinions were run off. In short, as the masses showed up, Reddit turned to shit.
Another example that dovetails perfectly here is video games. Way back before my beard was grey, video games were niche. Video games were for nerds. And they were pretty great. Video games were basically made by people who loved video games. Every generation was met with excitement. Atari to NES to SNES to PSX to Dreamcast to Xbox. It was a wild ride. Then, after what many consider the golden age of about 1995-2005, the masses showed up. Larger markets led to more profits. Video games were no longer made by gamers, but by suits (aside: indie games are where it’s at these days). Long story short, whether they asked for it or not, the “modern audience” became the target market and gamers were left with the dog shit in a post-GamerGate world.
I think the video game example is quite apropos. There is a lot of overlap between the new Linux users and video gamers. Video games were made “more accessible” to appeal to a wider audience. Linux is following suit. The one major difference I can see is that even though Dragon Age: The Veilguard was hot trash, it doesn’t mean the next Dragon Age game can’t be a banger. Every video game is a blank slate, whereas Linux (and much modern software) continually builds on itself. You can’t simply scrap some bad decision from the last release like you can with a game.
Further, the whole idea of the “modern audience” is kind of bullshit in and of itself. There is an old saying: “if you try to make something for everyone, you make something for no one.” With the trend of making games for as large an audience as possible, you end up alienating your core market. The Dark Souls games are a glaring example of how there is a huge audience that doesn’t want their hands held. They don’t want easy games. These games are not my cup of tea, but I would never ask for them to be changed to suit my preferences. There were something in the range of 18,000 games released on Steam last year. Not all of them have to be for everyone. Every other industry looks for product-market fit; the video game industry (and the operating system and software industries in general) might do well to take note.
Blood in the water. That is what the masses are to the shark corporations. Where Linux, like video games, was built and maintained by people that did it because they were passionate, Windows/Mac (and anything else you can think of) are made for profit. They are made to appeal to the broadest audience possible, the lowest common denominator, the masses. That is where the money is.
Long before Linux gains significant desktop market share, you can bet your bottom dollar that Microsoft is watching. There is even a joke that Poettering was secretly working for Microsoft when he built systemd. In recent years Microsoft has been much more open about working with Linux and open source software in general. Trust me when I say Microsoft doesn’t give a damn about anything other than profit. Which is fine, but the profit motive of Microsoft and the passion motive of Linux are incompatible. And profit will win.
The more users Linux garners the more corporate sharks show up.
Here is Google cozying up to the Linux Foundation. Not to mention Google’s control of Android.
Here are Google and Samsung, 12 years ago, being some of the top contributors.
And here is The Linux Foundation’s page on setting up corporate contribution.
This isn’t to say corporations shouldn’t be allowed to contribute, but it is another cautionary tale. A fox in the henhouse so to speak. Maybe the fox won’t eat the hens, but it would be imprudent to not at least keep a close eye on the fox.
Linus Torvalds is the “Benevolent Dictator for Life”. He is often harsh and runs a pretty tight ship from what I can tell. He is a gatekeeper par excellence. While I can’t speak for him, I’d wager he understands the critical nature of not letting any bullshit into a project as large as the Linux kernel. It has taken years to get Rust into the kernel. I’m not knowledgeable about the technical aspects of this addition, but I can guess that Linus’s caution is well warranted. While so much of modern software development embraces the “move fast and break things” motto, it stands to reason that the kernel is not one of the things that should break.
For 30 years he has guided the development of Linux with great success. The fact that I am writing this warning about getting too popular is testament to his way of doing things. When he retires what will happen? I’d guess the sharks move in. Can you imagine how Google or Apple or Microsoft would be salivating at the prospect of controlling Linux’s development?
Gabe Newell was already discussing Linux as the future of gaming back in 2013. While the Steam Machine failed, Valve pressed forward, developing Proton. I’ve made it clear that gaming via Proton is the number one reason we are seeing the influx of new Linux users. And while I am no fanboy of Gabe, it is hard to deny that Valve is the absolute juggernaut in PC gaming today. So much so that the word monopoly is not unmerited. In a recent lawsuit it was disclosed that Valve’s profit per employee is roughly $19MM. Almost certainly one of, if not the, highest profit-per-employee figures of any company in the world.
Yes, I was harping about corporate sharks only a few paragraphs ago, but… I stand by what I said, and Valve is no different. While Valve has contributed greatly to Linux, it would be foolhardy not to be skeptical of their intentions. And those intentions are almost certainly to make as much money as they can.
The thing is, Gabe seems to be one of those rare breeds of people who have an actual passion for video games. He has, similar to Linus, run a tight ship to get where he is today. And like Linus I wonder, what happens to Valve when Gabe retires?
Richard Stallman. Beyond putting the GNU in GNU/Linux, he has been a free software advocate seemingly his whole life. While he probably isn’t as well known as Linus or Gabe, and he doesn’t have anything like the Linux Foundation or Steam behind him, he certainly has made quite the impact on the FOSS community via his Free Software Foundation. Love him or hate him, he is a real OG gatekeeper (against proprietary software), no doubt.
Now that I’ve shit all over the new users I want to make it clear that I have no animosity towards anyone jumping ship to Linux. Linux is great and if it works for you, use it. My real point is that these new users should simply not be allowed to dictate where Linux goes in the future. You don’t like editing config files? Fuck right off back to Windows. You don’t like the terminal? Fuck right off back to Windows. You don’t go into someone else’s house and start rearranging things to be more like your house.
On the other hand, if you want to actually learn to use Linux, there are ample resources (the Arch Wiki as one example) where you can find just about anything you want. And contrary to popular belief, there are tons of people out there willing to help. Sure, you might get a cold RTFM now and again, but that is probably because you asked a question that has been asked a million times before. If you can’t search the ’net and/or read the man page, you probably should head on back to Windows/Mac. It isn’t a slight, but Linux is a technical user’s OS, and a technical user knows how to solve their own problems. Or at least make a reasonable attempt to. If you post a question and list the things you’ve tried, you’ll get a lot better help. You’ve shown you want to learn and given more context about your problem: win-win.
I ended up switching to Wayland 3 or 4 years ago precisely because X11 was so shit about remembering my monitor positions. I had to run an xrandr script every time it booted or otherwise decided to shit itself.
Why is running an xrandr script at startup such a bad thing? Oh, that’s right, you are a non-technical user and want it to “just work”.
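For context, the dreaded “xrandr script” is a couple of lines of shell. Here is a minimal sketch of what one might look like; the output names and resolutions are placeholders, so run `xrandr` with no arguments to see what your hardware actually reports:

```sh
#!/bin/sh
# Illustrative only: pin two 1080p monitors side by side.
# DP-1 and HDMI-1 are assumed output names; check `xrandr` for yours.
xrandr --output DP-1   --mode 1920x1080 --pos 0x0    --primary \
       --output HDMI-1 --mode 1920x1080 --pos 1920x0
```

Save it somewhere like ~/bin/fix-monitors.sh (a path I just made up), mark it executable, and call it from whatever startup file you prefer. Written once, forgotten forever.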
we’re back to basically having to write .xinit scripts. Because that’s what little so far wayland offers: less than xinit.
Why are .xinit scripts a bad thing? Oh, that’s right, you don’t want to actually configure your machine to your liking… the machine should know how you want it to act without you instructing it.
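And to show how little “configuring your machine to your liking” actually involves, here is roughly what a minimal ~/.xinitrc looks like for a setup like mine. This is a sketch under my own assumptions (Openbox, plus the hypothetical monitor script above), not a prescription:

```sh
# ~/.xinitrc -- read by startx; a sketch, adapt to taste.
# Fix the monitor layout first (the made-up script from earlier).
~/bin/fix-monitors.sh
# Launch any background odds and ends you actually want here, then
# hand the whole session over to the window manager.
exec openbox-session
```

Two meaningful lines. That is the terrible burden of xinit.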
So far I’ve only talked about the end user side of things. This is where I am and I think this is where 99% of the people will see the changes happening.
That said, there is currently a lot of internal drama in the world of the kernel.
I can’t personally say I have any experience with this and I am only an amateur programmer, so I am not opinionated as to the technical side of things. I do, however, see the same patterns emerging that I have discussed up to this point.
There have been a few “high profile” maintainers that have recently left the project. One such example is Karol Herbst. In their own departure letter, they focused on a very specific and non-technical issue they had. Many people have chimed in and said this was the straw that broke the camel’s back and that there were other issues brewing under the surface. While this may or may not be the case, the final straw was in my opinion a clear cut example of identity politics and exactly the kind of thing I am arguing to gatekeep.
However, there is one thing I can’t stand and it’s hurting me the most. I’m convinced, no, my core believe is, that inclusivity and respect, working with others as equals, no power plays involved, is how we should work together within the Free and Open Source community.
I can understand maintainers needing to learn, being concerned on technical points. Everybody deserves the time to understand and learn. It is my true belief that most people are capable of change eventually. I truly believe this community can change from within, however this doesn’t mean it’s going to be a smooth process.
The moment I made up my mind about this was reading the following words written by a maintainer within the kernel community:
“we are the thin blue line”
This isn’t okay. This isn’t creating an inclusive environment. This isn’t okay with the current political situation especially in the US. A maintainer speaking those words can’t be kept. No matter how important or critical or relevant they are. They need to be removed until they learn. Learn what those words mean for a lot of marginalized people. Learn about what horrors it evokes in their minds.
I can’t in good faith remain to be part of a project and its community where those words are tolerated. Those words are not technical, they are a political statement. Even if unintentionally, such words carry power, they carry meanings one needs to be aware of. They do cause an immense amount of harm.
source:: https://www.phoronix.com/news/Karol-Herbst-Nouveau-No
First of all, the offending phrase was a turn of phrase expressing that the kernel maintainers can only stop new (presumably bad) code from making its way in. To wit:
The maintainers are not “all-powerful”. We are the “thin blue line” that is trying to keep the code to be maintainable and high quality. Like most leaders of volunteer organization, whether it is the Internet Engineering Task Force (the standards body for the Internet), we actually have very little power. We can not command people to work on retiring technical debt, or to improve testing infrastructure, or work on some particular feature that we’d very like for our users.
All we can do is stop things from being accepted (either in our subsystem, or the imprimatur of the IETF). Hopefully, we’re only stopping bad things from progressing, but that really is all we can actually do.
source:: https://lore.kernel.org/lkml/20250208204416.GL1130956@mit.edu/
Karol somehow takes offense at this, drawing some vague connection between police (the phrase is often used to mean the police are the “thin blue line” between order and chaos) and some kind of oppressive and non-inclusive environment.
The whole thing reeks of activism, and I hope you can see that it has no place in a project like the Linux kernel (or any professional environment for that matter). I do not think people like Karol are acting in good faith; I think they are feigning offense to gain some kind of social credibility and to further their ability to tone-police people.
This is, IMHO, exactly how activists infiltrate and co-opt projects and organizations. A few of them get brought in. They push their beliefs until other people react. Then they themselves claim victimhood. The people who criticized them for bringing in (often) political views are painted as bullies and are either fired or quit. This opens up more positions and signals that the activists are welcome. Eventually all the “good” people leave and the project collapses under its own weight. This is a textbook example of The Long March Through The Institutions, a 1960s socialist strategy to implement radical change in government.
The response to this has been bifurcated. Some people are suggesting that this “good old boys club” attitude is off-putting to younger programmers that want to contribute. I have seen no evidence of this.
Here are a few interesting quotes I found on the Phoronix forum before the thread was removed.
This dude was literally a maintainer for 10 years. It’s more that the Linux community has shifted into being more politics driven than technically driven. Politics around languages, and actual governmental politics, determines more about how you and your code is treated than the quality of the code or your personal behavior.
Top Linux maintainers seem to be very conservative in both politics and views on programming languages, and are actively hostile to anybody who disagrees with them regardless of code quality and it’s causing a lot of long time Kernel developers and fellow maintainers to leave. The Linux kernel is going to completely die out if nothing changes especially with Linus and GKH looking toward retirement.
Am I missing something? Karol is the one who stirred the pot, making the loose connection between the “thin blue line” and their perceived oppression. The kernel is also not going to die out. At least not any time soon. Linux runs the web, and the more likely scenario is that corps take over. Hell, many of the kernel developers are paid employees of places like Microsoft and Google. I am FAR more concerned with corporate control weaseling deeper into Linux than with running out of volunteers because of drama.
The number of far right nut jobs in this forum is insane. Like, stop shoving your quite frankly ridiculous political beliefs into every crevice and faucet of the internet and accusing people of falsely being woke when that’s not even what woke means, you far right lunatics made up a whole different meaning because you can’t handle different beliefs and opinions to yours.
Anyways, it was good of Karol to leave; I wouldn’t want to touch the Linux kernel either with how toxic the kernel’s leadership team is.
keep calling everyone that disagree with you about the woke agenda far-right exactly like the democrats did for the past 4 years, and see what the future will hold for your kind when you cannot read the room. Hint: it won’t swing back in 4 years unless your kind swing your behavior away from woke.
While this one is aimed more at the other forum posters, it is quite frankly ridiculous not to see that the political beliefs were brought in by a far-left nut job. No “normal” person is offended by the phrase “thin blue line”.
The response is correct. If the people on the political left can’t see the sea change, they are going to get swept away. Like it or not, social and political sentiment seems to ebb and flow every few generations. My Grandpa was conservative. My Dad and his siblings were more progressive. The younger generation is clearly shifting to the right (although it is still more progressive overall).
And lastly:
being offended is the most powerful modern weapon: the one who gets offended the most (or early) wins.
I think this sums it up perfectly. We’ve given the so-called offended people such privilege that you can’t be surprised when more people try to use offense to get what they want.
There is another, longer-running drama in the kernel concerning the adoption of Rust. While I don’t see it as being on par with the blatant identity politics of the Karol affair, it is still a symptom of decay. Before you say “no, Rust is about injecting new life”, hear me out.
First of all, Rust very well may be a good addition to the kernel. Again, I don’t have the expertise to comment on the technical side of things. My naive take is that there seems to be so much contention, and enough people evangelizing in favor of Rust, that a hard fork might be the best approach. Or at least the Rust evangelists should build their own “Rustix” to show how much greener those grasses are.
I say “decay” because I feel like many people today are so accustomed to change seemingly for the sake of change that they are missing the bigger picture. The people who built Linux have been around the block. The people who created C have been around the block. Are Linux and C perfect? Of course not. But those grey beards were sharp as tacks. They were working with systems that are dwarfed in every way by our modern watches and phones, let alone PCs and servers. They came from an era where there were real hardware limitations. Today you rarely have such limitations outside of the largest and most cutting-edge applications.
If you want to write a new program today, you barely have to concern yourself with memory. Who cares if there is a leak when Joe Average has a computer with 8 or 16 BILLION bytes of memory as opposed to the 1,024 bits in the 1970s Intel 1103? The same case can be made for CPU speeds. We have so much raw horsepower today that efficiency is not only not required, it often isn’t even an afterthought.
Here is a wonderful short video by Jonathan Blow on how fast our modern computers are and how garbage modern software is.
But this leads to bad places. Bloat. What happens if we collectively continue down this road? Maybe the quantum computers will have even more horsepower and we can continue writing inefficient code. Maybe we will reach some kind of hard limit on computational power and have to go back and start wringing every ounce of efficiency out of old systems.
Adopting Rust simply because it seems good on paper might lead to some disastrous effects 10 years down the line. Of course, it might also lead to wonderful effects. I’m not arguing for or against Rust; I am arguing that the Linux kernel has come a long way in the last 35-odd years. It has taken over the world, as Linus liked to joke. It is in almost all of our devices. Over half of the world’s phones. The smart appliances. The server world. It is dominant to a laughable degree everywhere except the desktop.
Should we really try a new approach because it might be better eventually? Should we really be trying out new things in such a mission critical piece of software like the Linux kernel? Are the old guys stuck in their ways for good reasons or bad? Do the youngsters not understand the importance of stability at the cost of progress?
Some have said that C is a legacy language and dying out. This is patently false. Some are arguing that Rust is the future. This may or may not be the case. Granting that Rust is able to deliver, I’m sure it will be adopted. None of that is the point though.
The point is that new people show up and make (in the case of developers) or request (in the case of users) changes that might seem reasonable to them in the moment but that might have far-reaching second- and third-order consequences that have not even been explored, let alone understood.
When this happens to some newfangled application and it turns out to be a disaster, it is fairly easy to drop those changes in a future version. My example of video games is an extreme example of this. Every release of a game is a fresh slate. You can retain good changes and discard bad ones. With something like Windows you have a similar, but longer, iteration process. The Linux kernel and the operating system components surrounding it don’t work quite the same way. The changes we make today will be with us for decades. There is no “hard reset” every now and then.
Personally I would prefer, again in all my naivety, that the kernel move slowly. Glacially. It works great today. C isn’t in any danger of dying out. If Rust or anything else is proven to be a right fit, so be it.
As for the desktop, I would prefer basically the same thing. Move slowly. Don’t make changes for the sake of bringing in new users. Don’t “dumb down” anything without a good reason. I’m not completely averse to change, but I simply want a damn good reason to change.