It’s the objectively correct choice, but it might draw the ire of Fedora stans.
Nixos is the objectively correct choice if anything.
I don’t think you understand the meaning of that word
Oh god, no. Are you trying to drive people away from using Linux?
For me it was the objectively incorrect choice. Sound issues, display issues, slow. Whatever is up with mint, it absolutely doesn’t work with my hardware.
It’s possible to install a newer kernel in Mint using the Update Manager. This might have solved your hardware issues. Admittedly, though, this option is not very easy to find if you’re not aware of it.
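If you want to see which kernel you’re currently running before trying a newer one, you can check the booted release string (this just reports whatever kernel is running; the exact string depends on your install):

```python
import platform

# platform.release() returns the running kernel's release string,
# e.g. something in the shape of "5.15.0-91-generic" on a Mint install.
print(f"Running kernel: {platform.release()}")
```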
It’s also kinda antithetical to what people are saying, which is that mint is great out of the box.
Mint noobs : Haha entry level distro go brr
Fedora wearers : Noooo milady you have to use a version of an OS based off corporate bs
RHEL is downstream from Fedora. They both descend from the original Red Hat Linux, but Red Hat Linux ≠ RHEL
yeah, corporate bs with all the corporate bs taken out.
also, LMDE exists.
Me: At least my neighbors aren’t those inbred hicks over there
You: Actually, those people have only practiced cousin marriage, which has been socially acceptable for centuries
Me: Yeah, the point is more that they are hicks.
What the fuck? Not sure whether that’s more strawman or red herring, but either way it’s a really cheap deflection that provides a window into your mind, not mine.
Is that how you respond to everyone who points out why you’re wrong?
idk look through my history and find out.
The only reason I dislike mint is because the developers kept postponing the Wayland transition so insanely long. Once it does HDR, variable refresh rate, and fractional scaling on heterogeneous displays correctly, I’ll start recommending it again.
HDR is like the only reason I’m on Windows 11. Already switched my entire home lab over to proxmox.
What would you recommend today if I wanted good HDR support and gaming with a Radeon GPU?
Or should I just wait for Mint to get those features?
Only?!
Are we pretending 2-year-old software is a good thing?
We are not pretending. Debugged and working reliably is better than new. Being new is not automatically useful value.
Enough posts about Wayland are complaints that I don’t care if it ever happens. Mint works fucking great just as it is.
For some variety, I recently switched from x11 to Wayland and it fixed some problems I was having with game input.
It’s better to delay it and release an immediately usable product than to break the desktop when an unexpected bug is encountered and make the computer unusable. I’ve never transitioned a desktop environment and framework to an entirely different display system, but I don’t imagine it’s as simple as flipping a switch.
It definitely takes time, and stable distros should exist. Wayland has been the clear choice moving forward for 7 years though. It feels like Mint & a few others are just stalling at this point.
Wayland: the display protocol, the layer that puts the visual stuff on screen. Wayland = new, with more features (explained below). X11 = old but stable, and transitioning away from it without bugs takes time.
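If you’re not sure which of the two your desktop is running, the session type is usually exposed in an environment variable (a convention on most modern login managers, so treat it as a best-effort check, not a guarantee):

```python
import os

# XDG_SESSION_TYPE is set by most display/login managers to "wayland" or "x11".
# It can be missing entirely (e.g. on a bare TTY or inside a container),
# so fall back to "unknown" rather than crashing.
session = os.environ.get("XDG_SESSION_TYPE", "unknown")
print(f"Display session type: {session}")
```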
HDR: high dynamic range. If you have a really nice TV or monitor, this gives you better color accuracy and contrast. Make sure your brightness is cranked up to a good level, or it will counterintuitively look worse, like the brown-filter PS3 era of video games.
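“Dynamic range” here is just the ratio between the brightest and darkest levels a panel can show, often counted in stops (doublings of brightness). A quick illustrative calculation, with made-up but plausible panel numbers (these are not from any real spec sheet):

```python
import math

def stops(peak_nits, black_nits):
    """Dynamic range in stops: how many times brightness doubles from black to peak."""
    return math.log2(peak_nits / black_nits)

# Hypothetical panels, numbers chosen only to show the scale of the difference:
print(f"SDR-ish panel: {stops(300, 0.30):.1f} stops")   # ~10.0 stops
print(f"HDR panel:     {stops(1000, 0.05):.1f} stops")  # ~14.3 stops
```

Each extra stop doubles the contrast between the brightest highlight and the darkest shadow the screen can show at once, which is why HDR highlights pop without crushing shadow detail.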
VRR: variable refresh rate. When you run a game, some parts are harder to render than others due to increased detail and more things happening on screen. So your frame rate dips, making a noticeable jittery effect, especially on a high-refresh-rate monitor. My monitor redraws 165 times per second, and if the frame rate drops from 140 frames generated to 90, that is very noticeable. VRR syncs your monitor’s refresh rate to the GPU’s output, so the display knows exactly when each frame will arrive. My monitor will refresh 90 times for that second of the frame drop instead of 165, which drastically reduces the jittery effect of the dropped frames. You can still kind of tell, but it is smoother and more responsive in terms of what is happening on screen.
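To get a feel for the numbers behind that jitter, here’s a rough sketch under a simplified model I’m assuming (vsync on a fixed-rate display, ignoring driver latency and compositing): without VRR, a finished frame has to wait for the next refresh tick, so presentation times get rounded up to multiples of the refresh interval; with VRR the frame is shown as soon as it’s ready.

```python
import math

REFRESH_HZ = 165
TICK_MS = 1000 / REFRESH_HZ  # ~6.06 ms between refresh ticks

def fixed_rate_present_ms(frame_ms):
    """Without VRR, a frame is shown on the next refresh tick after it finishes,
    so its effective display time rounds up to a whole number of ticks."""
    return math.ceil(frame_ms / TICK_MS) * TICK_MS

for fps in (140, 90, 70):
    frame_ms = 1000 / fps
    fixed = fixed_rate_present_ms(frame_ms)
    # With VRR, the display simply waits frame_ms; nothing rounds up.
    print(f"{fps} fps: frame takes {frame_ms:.2f} ms, "
          f"fixed 165 Hz shows it after {fixed:.2f} ms, VRR after {frame_ms:.2f} ms")
# Note that 140 fps and 90 fps frames both land on the 2nd tick (~12.12 ms):
# the fixed refresh quantizes everything, which is exactly the stutter VRR removes.
```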
Heterogeneous displays: monitors of different resolutions.
Fractional scaling: this lets you set display zoom at different percentages on different monitors, as well as set non-integer scales (integer steps are 100% or 200%; fractional steps are values like 125% or 150%). This is important because 100% scaling is often too small at high resolutions, and 200% is comically large. It also matters for the multi-monitor scenario: most people run a new monitor with their old one as the secondary. For example, 4K will require at least 150% scaling to be readable at most screen sizes, while 1080p will look too zoomed in above 100% and not match the look of the other monitor.
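The mismatch is easy to see if you compute the “logical” resolution each monitor ends up with, i.e. physical pixels divided by the scale factor (a simplified model of how scaled desktops lay out UI, ignoring per-toolkit quirks):

```python
def logical_size(width_px, height_px, scale):
    """UI is laid out in logical pixels: physical pixels divided by the scale factor."""
    return width_px / scale, height_px / scale

# A 4K monitor at 150% next to a 1080p monitor at 100%:
print(logical_size(3840, 2160, 1.5))  # (2560.0, 1440.0) of logical working space
print(logical_size(1920, 1080, 1.0))  # (1920.0, 1080.0)
# At 150%, the 4K screen still offers more working area than the 1080p one,
# while text renders at a comparable physical size on both.
```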
In summary, most of this only matters if you are a gamer or watch HDR content like movies on your computer. Having monitors that look consistent despite non-matching resolutions is pretty nice, though. But if you have matching monitors, or just one, it doesn’t matter either. Hence, Mint is not a good choice for a gaming or home-theater setup, but its hyper-focus on stability makes everyone else like it more, because the developers never change anything unless it is for sure going to work. At this point, though, most distros are using Wayland with no issues.
Love seeing Clone High memes in the wild.
Was just reminiscing about the theme song a couple days back.
It’s the reason that while Joe Flaherty is a legend in his own right, the first thing that comes to mind when I hear his name is this:
what do you recommend now?