

So what you’re saying is that there are more wankers on Linux nowadays!???
I had the exact same experience: been doing Linux since the 90s, both for fun and professionally - the latter mainly in pure server configurations - finally got around to moving my home PC (which is mainly for gaming) to Linux (using Pop!_OS, since I have an Nvidia graphics card and it just supports it out of the box) and it just worked.
Only problem I have with it is that on startup of X I usually get a blank screen and have to switch my monitor OFF and back ON again.
Oh, and startup times are a fraction of Windows startup times (my Windows 10 work machine literally takes longer to wake up from hibernation than my home Linux PC takes to cold boot, and they have equivalent SSDs).
I think I get more hassle with Windows than I do with Linux.
Funnily, sometimes the pirate version of a game works whilst the official one does not…
Yeah, but at least we knew how to switch consoles.
I bet that most Linux users nowadays don’t even know the CTRL+ALT+Fx shortcuts to switch console.
Can’t say that the old days were really “good” compared to what we have now, but there was definitely a lot of satisfaction in getting the system to work, step by step.
Stories from the “good” old days running Linux on a 386 machine with 4 MB or less of memory aside, in the present day it’s still perfectly normal to run Linux on a much weaker machine as a server - you can just rent the cheapest VPS you can find (which nowadays will have 128 MB, maybe 256 MB, and definitely only give you a single core) and install it there.
Of course, it won’t be something with X-Windows or Wayland, much less stuff like LibreOffice.
I think the server distribution of Ubuntu might fit such a VPS, though there are server-specific Linux distros that will for sure fit, and if all else fails TinyCore Linux will fit in a potato.
I currently have a server like that using AlmaLinux on a VPS with less than 1 GB of memory, which is used only as a Git repository, and that machine is overkill for it (it’s the lowest-end VPS with enough storage space for a Git repository big enough for the projects I’m working on, so judging by the server management interface and Linux’s meminfo, that machine’s CPU power and memory are in practice far more than needed).
If you’re willing to live with a command line interface, you can run Linux on $50 worth of hardware.
Similar story, but I just installed Slackware on one of the University PCs (they just had a handful of PCs in the general computer room for the students and nobody actually watched over us) since I did not have a PC yet (only had a ZX Spectrum back then).
Trying to get X-Windows to work in Slackware was interesting, to say the least: back then you had to manually create your own video timings configuration file to get the graphics to work - which means defining the video mode at a very low level, such as configuring the number of video clock cycles between end-of-line-drawing and horizontal-retrace - and fortunately I didn’t actually blow up any monitor (which was possible if you got the configuration wrong).
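For anyone who never had the pleasure, a modeline in the XF86Config of that era looked something like the sketch below - the numbers are roughly the standard VESA 1024x768 @ 75 Hz timings from memory, so treat them as illustrative rather than something to blindly feed to a real CRT:

```
# Format: Modeline "name"  dotclock(MHz)  hdisp hsyncstart hsyncend htotal  vdisp vsyncstart vsyncend vtotal  flags
Modeline "1024x768"  78.75  1024 1040 1136 1312  768 769 772 800  +hsync +vsync
```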
At least we had some access to the Internet (most things were blocked but we had Usenet and e-mail, and one could use FTPmail gateways to download stuff from remote servers) via Ethernet, so that part was easy.
Anyways, my first reaction looking at the OP’s post was like: yeah, if they’re running X it’s probably too powerful a machine.
It’s a great game with great gameplay that is surprisingly replayable in survival mode if you go a year or two between plays.
It’s also one which, IMHO, doesn’t need visual enhancements - their choice of visual style was masterful (it works and is a lot cheaper in terms of 3D modelling costs than something more realistic would have been) and it’s the gameplay (which is pretty much all emergent gameplay in survival mode with no fixed set-pieces, though on a fixed game map) that makes it a great game.
I think that once one goes into software development professionally, mucking about with Linux configuration stops being something one does as a fun learning hobby and becomes something one does for work and hence can’t be arsed to also do at home during one’s free time.
Certainly that’s how it goes for me: all I want from my Linux machine at home is that it delivers the least hindrance possible to my web-browsing, gaming, 3D printing and so on, whilst still protecting my privacy and letting me do a little bit of playing around with its more powerful features, but only when I feel like it, not as a requirement to use it.
The same also applies to other techie stuff, by the way: I’m no early adopter of the latest and greatest because I don’t want to be somebody’s beta tester, since I have enough hassle already testing and fixing my own code (where I can actually deploy good practices to reduce the amount of bugs and hence frustration, unlike the vast amounts of amateur-hour crap out there being shipped as final products that are just beta tests that never end).
/RANT
Even if AI is an actual tool that improves the software development speed of human developers (rather than something where the time spent reviewing, correcting and debugging the AI-generated code ends up eating the time savings from having it write the code automatically), it’s been my experience in almost 30 years of my career as a Software Engineer that every single tooling improvement that makes us capable of doing more in the same amount of time is eaten up by increasing demands on the capabilities of the software we make.
Thirty years ago user interfaces were either CLI or pretty simple with no animations. A software system was just a software application - it ran on a single machine with inputs and outputs on that machine - not a multi-tiered octopus involving a bunch of back end data stores, then control and data retrieval middle tiers, then another tier doing UI generation using a bunch of intermediate page definition languages, and a frontend rendering those pages to a user and getting user input, probably with some local code thrown into the mix. Ditto for how cars are now mostly multiple programs running on various microcontrollers with one or more microprocessors in the mix, all talking over a dedicated protocol. Ditto for how your frigging “smart” washing machine talking to its dedicated smartphone app probably involves a 3rd machine in the form of some server from the manufacturer, and the whole thing is running over TCP/IP and using the Internet (hence depending on a lot more machines with their dedicated software, such as routers and DNS servers) rather than some point-to-point direct protocol (such as Serial) like in the old days.
Anyways, the point being that even if AI actually delivers more upsides than downsides as a tool to improve programmer output, that stuff is going to be eaten up by increasing demands on the complexity of the software we make, same as the benefits of better programming languages were, the benefits of better IDEs were, of the widespread availability of pre-made libraries for just about everything were, of templating were, of the ease of finding solutions to the problem one is facing from other people on the Internet were, of better software development processes were, of source control were, of collaborative development tools were and so on.
Funnily enough, for all those things there were always people claiming they would make the life of programmers easier, when in fact all they did was make the expectations on the software being implemented go up, often just in terms of bullshit that’s not really useful (the “smart” washing machine using networking to talk to a smartphone app so that the machine manufacturers can save a few dollars by not putting as many physical controls in it is probably a good example).
Yeah, I was in a very similar situation as you some months ago (decades of using Linux on and off for fun or at work, mainly via the command line), made the jump on my gaming PC and, because my games are mainly from GOG, went down the path of Lutris as a launcher for those games and am very happy with it, especially since it’s both integrated with GOG so it can fetch your games from them AND it can handle the offline installers (you just do install from EXE and then choose the GOG script for that game to configure it).
Overall, the rate of failure or even just the rate of hassle (having to go and tweak stuff myself with Winetricks) is very low for GOG games, as Lutris already comes with scripts for the vast majority of them that do the necessary Winetricks configurations automatically at the end of the install. Plus, in my experience it’s the DRM in games that generally screws Wine compatibility (to the point that at least one of my Steam games won’t work at all in Linux, but the pirated version of the same game works just fine).
There are also benefits like being able to run the games wrapped in a firejail sandbox that disables networking and access to a bunch of other system features for security and privacy - something you don’t get either with Steam or in Windows.
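For example - hypothetical path, and assuming the game has a plain launcher script or binary you can invoke directly - something along these lines runs it with no network access at all and a throwaway home directory:

```
# --net=none : no network interfaces inside the sandbox (the game can't phone home)
# --private  : fresh, temporary home directory, so it can't rummage through your real files
firejail --net=none --private /path/to/game/start.sh
```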
No idea how it handles WoW though, it’s been maybe a decade since last I used it.
I play Cult Of The Lamb with mouse and keyboard and the Crusades (basically sequences of fighting arenas) tend to be kinda insane in terms of the intensity with which you have to use them, so it makes total sense.
By the way, thanks for the very complete explanation.
My gut feeling told me it was some kind of memory leak (because those things tend to manifest themselves after some time of using the software, with some randomness on how long it takes for it to happen) but when I looked around I couldn’t find an explanation of its mechanism.
By the way, the suggested workaround of adding LD_PRELOAD="" in the Launch Options for the game seems to work.
Maybe the way those input capture layers work is by putting their input handling methods ahead of the default ones via LD_PRELOAD, and forcing LD_PRELOAD to be empty means they’re not in the input processing pathway anymore?! (In all fairness, I can’t be arsed to dive into that codebase ;))
Maybe that’s what I’ve been getting when I’m playing Cult Of The Lamb from Steam in Linux, which is a 2D game so hardly taxing my machine.
Didn’t notice anything like that on my GOG games started from Lutris.
If it’s in your systems in an open format it’s yours; if it’s outside your systems or wrapped in some kind of locked format that forces you to go through somebody else’s software, it’s de facto theirs.
Due to my own experience in software development with 3rd party solutions from way back, I never signed up for streaming solutions (even though I was tempted) and always stuck to getting my entertainment in a media format I controlled (legitimately for as long as I could, not so much once even physical media started having DRM), because I was aware that it’s risky to outsource so much control over one aspect of what you do (in this case entertainment) to an entity which, frankly, sees you as nothing more than a microscopic fraction of their bottom line.
(The funny bit is that if Netflix sold me their series in an open file format that I could download, and at a reasonable price, I would have sent lots of money their way, same as I spent lots of money on DVDs and even VHS tapes back in the day. In fact, all throughout that period I was doing something like that for games: as soon as I discovered GOG with their DRM-free downloadable installers, I started acquiring all my games by buying them from GOG.)
In the fullness of time, my caution seems to have been proven right.
That’s pretty much the self-made home media system I’ve upgraded to some months ago, only mine has an N100 CPU (which is nicer from a power consumption point of view for an always on system since its TDP is 15W).
It’s wired to my TV, runs Kodi in the foreground and qBittorrent in the background over an always-on VPN, and serves as my home NAS.
From AliExpress I got a wireless remote that lets me control Kodi as if it were a TV box, so from my sofa I handle it as a TV box, whilst from my PC I can ssh into it and do any computer kind of management.
Probably one of my best purchases ever.
I vaguely remember reading that Germany made Copyright Violation, even for personal use, a Crime, rather than merely a Civil Law affair like it is in most countries.
Mind you, I might be wrong on the countries or on the details (i.e. maybe it’s only a Crime if it’s for profit).
Edit: So I searched for it and from here I got that:
Are there criminal copyright provisions? What are they?
Copyright infringements under German law also constitute criminal acts, which are punishable by fines or up to three years’ imprisonment. If the infringement is done on a commercial basis, the maximum punishment is five years in prison.
According to German copyright law, unlawful exploitation of copyrighted works, unlawful affixing of the designation of an author and the infringement of related rights are subject to imprisonment of not more than three years or a fine. In addition, any attempt shall be punishable.
The unlawful exploitation of copyrighted works on a commercial scale is subject to imprisonment of not more than five years or a fine.
The infringement of technological measures and rights management information is subject to imprisonment of not more than one year or a fine.
As I said, in most countries copyright infringement is not a Crime, just a Civil Law matter (i.e. you can be sued by the owners of the Copyright for damages, but you won’t be prosecuted by the State, fined or even jailed for it). Frankly, judging by what it says there, German law is very draconian on this.
Copyright if elements of the game such as 3D models, images and code have been copied.
Trademark if the name of the game is used (i.e. “Stardew Valley Romance Sims”).
Patents for game mechanics.
As a side note, personally I think that game mechanics shouldn’t be patentable at all.
Look, I’m extrapolating from the general rule to the specific case of torrenting.
The general rule is that, because the IP protocol requires numerical addresses to connect to a remote machine, if what you have is a site name you have to translate that name into a numerical address before you can actually establish a connection, and a DNS query is how you translate site names into their numerical IP addresses.
Now, if you look at the tracker list in a torrent, what you see are not numerical addresses but site names, so those must be translated into numerical addresses before your client can connect to those trackers, hence DNS queries are done to do that translation.
Meanwhile, if you look at the “peers” section of an active torrent in your torrenting program, you see that they all have numerical IP addresses, not site names. This makes sense for two reasons: almost no peer will actually have a site name, and it would be low performance and pointless to pass around site names that then have to be resolved into numerical addresses when the peer itself already knows its numerical address and can directly provide it.
Hence my conclusion is that the torrenting protocol itself only deals with site names (which require DNS queries before network connections can be made to them) at the entrance into the protocol (i.e. starting up and connecting to the trackers) and then deals with everything else using numerical IP addresses only. Certainly that’s how I would design it.
Now, since I didn’t actually read the protocol or log the network connections on a machine while torrenting to see what’s going on, I’m not absolutely certain there are no DNS queries at all after the initial resolution of a torrent’s trackers. I am however confident that it is so, because that makes sense from a programming point of view.
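As an illustration of that “entrance into the protocol” step, here’s a tiny sketch in Python (with a made-up tracker hostname) of the kind of name-to-address translation a client has to do before it can even talk to a tracker:

```python
import socket
from urllib.parse import urlparse

# Hypothetical announce URL of the kind you find listed in a torrent:
# note that it is a hostname, not a numerical IP address.
announce_url = "udp://tracker.example.org:1337/announce"
parsed = urlparse(announce_url)

# This is the DNS query: translating the tracker's hostname into numerical IP
# addresses. It has to happen before the client can contact the tracker at all,
# and it is exactly the kind of query that can leak outside a VPN tunnel.
try:
    for *_ignored, sockaddr in socket.getaddrinfo(parsed.hostname, parsed.port):
        print(f"{parsed.hostname} -> {sockaddr[0]}")
except socket.gaierror as err:
    print(f"DNS resolution failed for {parsed.hostname}: {err}")

# Peers, by contrast, come back from the tracker already as numerical IP addresses,
# so connecting to them needs no further DNS queries.
```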
Well, if the trackers are specified as names (and a quick peek at some random torrent shows that most if not all are), those do have to be resolved to IP addresses, and if that DNS query is happening outside the VPN then your ISP, as well as the DNS server being queried, can see your interest in those names (and it wouldn’t be hard to determine with a high probability that you are indeed torrenting something, though WHAT you are torrenting can’t really be determined from you merely accessing certain servers which have torrent trackers active, unless a specific server only tracks a single torrent, which would be pretty weird).
Things like peers aren’t DNS resolved since they already come as IP addresses.
So when it comes to torrenting as far as I know all that the DNS can leak is the information that you ARE torrenting but not specifically WHAT you are torrenting.
It’s more in things where you’re constantly doing DNS queries, such as browsing, that DNS leaking can endanger your privacy: if for example somebody is going to “hotsheepbestialityporn.com”, somebody at their ISP could determine that person’s very specific sexual tastes from seeing the DNS queries for hotsheepbestialityporn.com coming in the open from their connection.
It might be a DNS problem.
I vaguely remember that Mullvad has a setting to make sure that DNS queries go via the VPN but maybe that’s not enabled in your environment?!
Another possibility is that Mullvad going down and then back up along with your physical connection when your ISP forces a renewal of the DHCP is somehow crapping up the DNS client on your side.
If you have the numerical IP address of a site, you can try to access the site by name in your browser when you have problems in the morning and then try it by numerical IP address - if it doesn’t work by name but it does by numerical IP, it’s probably a DNS issue.
PS: you can just run the “ping” command from the command line to see if your machine can reach a remote machine (i.e. “ping lemmy.dbzer0.com”) and don’t need to use a browser (in fact, for checking if you can reach machines without a webserver, the browser won’t work but the ping command will).
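If you’d rather script that “by name vs by numerical IP” check instead of poking at a browser, here’s a quick and dirty sketch (the IP below is a documentation placeholder - substitute the real address you noted down while things were working):

```python
import socket

SITE_NAME = "lemmy.dbzer0.com"
KNOWN_IP = "203.0.113.10"   # placeholder: replace with the site's real IP, noted beforehand
PORT = 443

def can_connect(host: str) -> bool:
    """Try to open a TCP connection to host:PORT; True means reachable."""
    try:
        with socket.create_connection((host, PORT), timeout=5):
            return True
    except OSError:
        return False

by_name = can_connect(SITE_NAME)  # needs a DNS lookup first
by_ip = can_connect(KNOWN_IP)     # skips DNS entirely

if by_ip and not by_name:
    print("Reachable by IP but not by name: most likely a DNS problem.")
else:
    print(f"by name: {by_name}, by IP: {by_ip}")
```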
Well, been * ahem * told that a friend of a friend didn’t find any videos there where “Your friendly neighborhood geek goes into the house of a hot milf to upgrade her Windows 10 machine to Arch and she shows them how hot she found their Linux install skills and how thankful she is”, so that seems unlikely.