It does, you may choose between 100% and 200%.
It doesn't have software.
>m-m-muh gaymes
adults are talking, sweaty.
go over and argue with the other manchildren here. Wrong board.
Games? I'm talking about FPGA synthesis.
it doesn't even have windows or gpu drivers in linux
you cannot even connect a computer mouse to it
Does it need to support scaling when everything is designed to run on exactly three screen sizes (MacBook Air/Pro, Mac Pro)? Same reason iOS doesn't need to scale anything or handle weird screen configurations.
>Same reason iOS doesn't need to scale anything or handle weird screen configurations.
This makes me curious, who told you that?
Logical deduction? iOS won't run on non-Apple hardware, so there's only like six devices that you need to support at any given time, all with known screen sizes and ratios.
But anon. iOS has text and GUI scaling. I don't remember how many steps each one has, but it was about 5 or 6 different scalings for each. So that makes about 6 text sizes, 5 GUI scalings, 6 devices.
6*6*5=180 different configurations
you have to program for at least 180 different configs. go ahead, write a ruler app with Xcode and see for yourself.
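If you want to sanity-check the device count yourself before writing that ruler app, Xcode's simulator CLI can enumerate every device type it knows about. Rough sketch, assumes Xcode is installed; the grep is just to filter to iPhones and iPads:
# count the iPhone/iPad device types the current Xcode ships simulators for
xcrun simctl list devicetypes | grep -E 'iPhone|iPad' | wc -l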
retarded fucking nagger
you know apple supports plugging external monitors, right? such as the APPLE PRO DISPLAY XDR? Or literally any other hdmi/dp monitor in existence?
The Pro Display XDR is one more screen size to account for. As for any other external monitor, if you complain to Apple about the scaling being off they'll probably tell you to buy a Pro Display XDR.
That's exactly the point, mac os is shit on purpose unless you use an apple screen, an apple touchpad and an apple keyboard.
If you're already dumb enough to buy Aplel products you should buy their displays as well and pay your idiocy tax
The idiocy tax is included in the price of their products already, sir.
Subpixel AA for text is ideologically so umpire, I can see why they go with just oversampling even if I disagree.
>umpire
what?
*Impure
common bot L ignore and carry on
>common bot L ignore and carry on
what?
> I can see why they go with just oversampling
I do this in Windows using Nvidia's DSR feature. I love how it makes fonts look (that smooth slightly fuzzy aesthetic similar to MacOS).
I've never seen anyone else talk about doing this ever, it seems I'm the only one on the planet.
Why not simply use Apple's font smoothing? There are patches available for Windows.
Protip: Instead of improving GPU performance in order to smear text around faster, improve your screen's PPI and turn off AA/font smoothing completely.
It's quite idiotic to render something sharp, only to then render some more and make it unsharp again. It's a garbage concept.
Did you know? Android and iOS had no AA for a very long time, as OpenGL didn't even support it. Did somebody complain? No, the very high PPI of the retina and HDPI displays prevents those stair-like artifacts before they even begin being noticeable.
I'm still mad as fuck I didn't buy that TV from my supermarket in 2016 or something. It had 4k resolution at 32" for just €300. Fuck my life.
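For what it's worth, the arithmetic behind the "high PPI makes AA unnecessary" claim roughly checks out: at a 30 cm viewing distance, one arcminute (about the eye's resolution limit) spans 30 cm × tan(1/60°) ≈ 0.087 mm, and 25.4 mm / 0.087 mm ≈ 290 PPI, which is about where Apple drew the "Retina" line for handhelds.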
>Why not simply use Apple's font smoothing? There are patches available for Windows.
You mean Mactype? It's good but not the same as just using DSR to supersample the whole desktop. I don't find it to be a huge performance hit, my GPU still sits at single digit usage percentages and low temps while browsing etc.
I don't recall its name, soz. I used it on Windows 7 over 10 years ago, lol. And concerning the rest of your post: sure, but I felt the urge to explain the principle. It's better if we move to HDPI screens instead of fucking around with "better smearing" (which, like I said, even wastes cycles).
Oh, it's "HiDPI" and not HDPI.
>Android and iOS had no AA for a very long time, as OpenGL didn't even support it
??? Text AA has been in Android since literally the very first versions. Apple handhelds have had text AA since the iPod Photo in 2004. 3rd gen and beyond iPod Nanos (2007) had 200 PPI screens and text AA. OpenGL also does not have _any_ text rendering functionality on its own, and never has.
I meant actual anti-aliasing in 3D applications. It's kind of the same technology (making edges seem less sharp by blurring them), but in 3D and not for text. So I mentioned it, as it's relevant.
>try doing it using X11, XrandR, on KDE and break the entire fucking desktop
God I hate linux.
How can I set my desktop to run at 4k and downscale it to my 1080p display at 2X scaling?
Guy you replied to here. Yeah I dunno either, I tried to figure out a way to do desktop supersampling in Linux and failed just like you did.
It's what keeps me from switching, I just love the smooth look you get from rendering your desktop at a resolution higher than the monitor resolution
(me)
Lmao I just figured it out how to do desktop supersampling in Linux and it was actually incredibly easy (in X11, if you're a Waylandfag get fucked)
get the name of your currently used monitor output by running xrandr, it'll be the one that says "connected primary", for me it was called 'HDMI-0'
now run:
xrandr --output HDMI-0 --mode 2560x1440 --panning 3840x2160 --scale-from 3840x2160
replace HDMI-0 with the name of your monitor output you obtained from the previous step
make --mode's value your actual monitor resolution
make --panning and --scale-from your desired supersampled resolution, this will be the resolution your GPU renders the desktop at internally before it is downscaled to fit your monitor
so as you can see I am rendering at 4k and downscaling to fit my 1440p monitor
can't believe it was this easy lmao, I'm going to abandon Windows now since the ability to do this for nicer fonts was the only thing holding me back
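For anyone copy-pasting: here's the whole thing plus an undo line, so you don't have to restart X if it looks wrong. The revert values are my assumption from standard xrandr usage (pointing --panning/--scale-from back at the native mode), untested beyond the setup above:
# apply: GPU renders the desktop at 4k, xrandr downscales it to the native 1440p mode
xrandr --output HDMI-0 --mode 2560x1440 --panning 3840x2160 --scale-from 3840x2160
# revert: set panning and scale-from back to the native resolution
xrandr --output HDMI-0 --mode 2560x1440 --panning 2560x1440 --scale-from 2560x1440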
ahahaha wayland btfo once again
huh, I can't believe it was this easy
I'd like to know if there's a way to do it in Wayland too even though I don't use Wayland, just for when they inevitably try to force us to switch
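Not a Wayland user either, but the closest equivalent I know of is a fractional output scale below 1 on wlroots compositors, which makes the logical resolution larger than the physical one. Untested sketch; the output name is an example, and whether scales under 1.0 are actually accepted depends on your compositor and version:
# sway config: scale 0.5 makes a 1440p panel behave like a 5120x2880 logical desktop
output HDMI-A-1 scale 0.5
# or one-shot with wlr-randr
wlr-randr --output HDMI-A-1 --scale 0.5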
why didnt windows just do that? holy shit microsoft is bunch of naggers
I think it's partially because a lot of nerd/dev types hate the smooth/fuzzy font aesthetic, and genuinely believe (and I think they are retards for feeling this way, but they appear to be sincere) that ultrasharp, jagged-looking fonts with visibly pixelated edges are aesthetically superior and more 'readable'.
>blurry fonts are actually good because uhh clear text is for nerds
I shouldn't be surprised any more by how vapid macfags are but I am
Why do you want to suck his dick?
to pay it forward, if you want your dick sucked, you must suck some dick
performance, intel igpu shits itself when doing 4k hidpi rendering
It also doesn't have subpixel font rendering. It's fucking awful. This OS is at least 20 years out of date.
Are there legit reasons for owning a mac other than having the ability to develop for iOS?
unix that just works unlike linux
>just works
Not true see this thread
UNIX is a server OS.
No one's using OS X for servers.
it has
This is just a lower resolution that is then upscaled.
nope, it renders at 2x and then downscales it
1920x1200 is actually 3840x2400
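You can verify that on macOS with the third-party displayplacer CLI (brew install displayplacer): modes listed with scaling:on are the HiDPI ones, where the framebuffer is rendered at 2x the "looks like" resolution before being scaled to the panel. Sketch from memory; the id below is a placeholder you'd get from the list command:
# show connected displays, their persistent ids and available modes
displayplacer list
# set a HiDPI "looks like 1920x1200" mode (internally rendered at 3840x2400)
displayplacer "id:<display-id-from-list> res:1920x1200 hz:60 color_depth:8 scaling:on origin:(0,0) degree:0"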
it's playing a trick on the entire screen at once, not good for editing photos etc.
UI scaling is not the same as display scaling.
What the fuck are you talking about?
>Larger text
>More space
You can't make this shit up, lmao
There is actually one way: BetterDisplay. Using the Configuration Override option you can scale at any percentage.
>10.14 and up
into the bin it goes
written in swiftshit
kek, a 2-button mouse is "confusing" the user and then they pull off this scaling shit
>ackchully you have 318972310x281730783291pixels, but it's only HD ready, but you can also choose subpixel FullHD with retina pro©
>fag who never used anything Apple makes thread about Apple
many such cases
>does not deny or dispute OP's claim
Both Windows and the Linux DEs do a better job at this (unironically).
Linux DEs are kind of a mess with this, but at least they try to support it, which is more than can be said for applel
apple is anti le workaround bloat
Wayland does (actually did) not have fractional scaling
>Hahaha wayland btfo half cooked project
Mac os does not have ui scaling
>You don't actually need it, it was made that way and it's right
Lightyears ahead.
https://demo.os-js.org/
^
the fonts don't even look good at 100%
Is there a windows laptop that doesn't have all this fuckery but is as good and doesn't eat battery like there's no tomorrow?
Have you been paying attention in the last 5+ years?
AMD laptops in particular have great battery life these days.
I have a ThinkPad T14s (AMD 6850U) and an MBP14 (M1 Pro). There's literally no contest in battery life.
The T14s gives me 9-10 hours on a charge, which is good and I have no issues with it. The MBP regularly gives me 15-16 hours. That's with Windows scaling back the processor to get longer life, and the Mac giving me the same performance without the energy saver setting.
While I love working on the T14s, there's just no contest.
Moreover the webcam and mic are sort of embarrassing, while the Mac's are almost lifelike when I'm on a call
unironically it just works
gay ass windows niggas don't even into high dpi
lincope
works good enough on my Huawei 4k+ screen
It does
it does not
yea that's why it looks 10000x better than Windows' mix of blurry and super clean pixel perfect lines
you can shit on macos for a million reasons, their scaling is not one of them. they objectively made the right choice here
is there any linux distro that does supersampling like MacOS?
What do you mean? They've had UI scaling far longer than anyone else. Their UI scaling is the only one that isn't shit. It has been perfect since it came out.
why do you care about an OS you can't afford, lincel?