10 years ago, 2GB VRAM was enough for 1080p
>now you need 12 GB minimum
What went wrong
why optimize your game when the consoomer can buy more memory
>Checked
Yes, this whole lack of limitations is the cancer killing modern programming. Do yourself a solid and go look up some of the crazy shit programmers did for Mega Man 2 or Summer Carnival '92 - Recca. No creativity anymore when you can just malloc a few more GBs.
>killing modern programming.
modern programming is better than ever. wtf are you talking about?
>Mega Man 2
how many pixels do you have in that game? 160000?
Literally this
There's likely millions of transparencies in that scene for no fucking reason. Globs of particles using a single shader would run a thousand times faster. As each particle gets close to another it just gets grouped up into a single one with others and the shader adjusts the sprite. A random offset of time decides if the particle splits. Holy shit don't give these idiots any money.
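The globbing idea described above can be sketched roughly. This is a hypothetical illustration, not anyone's actual engine code: particles that drift within some radius get folded into a single bigger particle (one sprite, one draw), with a size-weighted average position; the random time-offset splitting the post mentions would just be the reverse operation. All names and thresholds here are made up.

```python
MERGE_RADIUS = 0.5  # merge particles closer than this (arbitrary units)

def merge_particles(particles):
    """particles: list of dicts with 'x', 'y', 'size'. Returns globbed list."""
    merged = []
    for p in particles:
        for m in merged:
            dx, dy = p["x"] - m["x"], p["y"] - m["y"]
            if dx * dx + dy * dy < MERGE_RADIUS ** 2:
                # Fold p into the existing glob: size-weighted average
                # position, summed size; the shader would scale the sprite.
                total = m["size"] + p["size"]
                m["x"] = (m["x"] * m["size"] + p["x"] * p["size"]) / total
                m["y"] = (m["y"] * m["size"] + p["y"] * p["size"]) / total
                m["size"] = total
                break
        else:
            merged.append(dict(p))
    return merged
```

With thousands of overlapping smoke particles, most collapse into a handful of globs, so overdraw (the stacked transparencies complained about above) drops accordingly.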
Is this the power of Nvidia(tm) RTX(r) Technolology(c)?
That's the power of garbage collection running overtime.
that's the power of Unity
Both games use unity.
I know, your point?
Crazy how a ragtag team of third worlders can create a better optimised game than a billion dollar American corporation.
didn't the first game go down hill pretty hard? I stopped playing once they started the data collection
Data collection? Explain this bullshit. It's a fucking video game.
That's been going on for a decade or more. I remember Mass Effect 3 having a data collection setting. All supposedly to help the devs improve the game, but if you believe that...
Well for ksp they specifically state their intentions to sell any data they collect for targeted advertisements.
>several years old game has better optimization than a game in beta
WOWWWWWW BIG REVELATION KSP2 DEVELOPERS FUCKING GAY AND CANT MAKE GAME
>bro it's just a beta
I have never seen a game that was released in early access/beta that fixed the issues found during that time. Maybe some exist, but they almost always release in the same state
they updated their eula several years ago saying that they can collect a bunch of data about the player and share/sell it with advertisers
Then fuck 'em, why we even talking about them then?
ahahahahaha oh wow
and that video doesn't show how many times you have to reload the take-off because the rocket keeps wobbling in place
framerate issues aside, the one on the right looks way better
As a KSP1 player, I know the left side is bullshit.
That's pretty typical performance for KSP1 with no mods. That rocket is big but only has about 30-40 parts, assuming there's nothing crazy inside the fairing.
KSP2 is just super unfinished. In particular there are problems with the fuel flow calculations and engine exhaust rendering. The framerate will double when the engines are turned off, and double again (if not triple or more) when away from the high LOD terrain.
It's also devouring VRAM regardless of the settings, which seems to be bottlenecking it hard on lower end cards.
Presumably they were forced to release by the higher ups, because it's definitely not ready, but I think they'll get it fixed up in time.
Buying it in the current state is essentially a pre-order.
Look at the hecking smoke
That's raytracing powered by Nvidia RTX(tm).
>Colombian engineering program got a sequel
i wasn't even paying attention. no videos even recommended to me about it on youtube.
there was some kind of disastrous youtuber event where it ran like ass and had a million bugs, but supposedly it wasn't that bad because they'd fix a lot of things
then it had a disastrous release where it ran like ass and had a million bugs
I saw this pop up on steam
The ones on the left and right both look like they've had their full 8 hours of sleep. I wonder who's the one doing all the programming.
>that photo
another leftist chud photo to archive into my collection of mentally ill, deranged leftists.
please upload a mega
I'm not sure about the other two, but the planet on the left is a community manager or whatever, doesn't actually work on the game.
>community manager or whatever
they get paid to post on twitter?
have you been living under a rock for the last 9 years?
why do you need to pay a hambeast to post on twitter?
sadly corporations think internet points matter, and bots can't come up with clever replies for fake brand wars just yet, which is exactly why they're all investing in it
But why pay a hambeast though? Why not pay a cute intern?
HR is full of women
But why a hambeast?
the other women feel prettier when she's around
because the hambeast lets them virtue signal without doing any actual good
virtue signal what? that they are into hambeasts? how is that a good signal?
are you a retard? they pay a hambeast because of "muh inclusivity"
why not pay a cute asian intern then? maybe a hapa even? why a hambeast?
More oppression points per person -> less money spent
pretty sure you have to pay a hambeast the same due to anti-discrimination laws; you would be sued otherwise
White dwarf crushed by (non)binary ham planet system.
lel
It was supposed to get released in 2020 or something.
But they just did an "early release" that's full of bugs, crashes all the time and only has a fraction of the gameplay it's supposed to get....eventually.
Basically you can pay them money to become an alpha tester, or wait another 3 years or so for the beta version if they ever get to that.
that scott manley guy keeps having an orgasm over it.
What? I just checked his latest video and he advises new players to just buy KSP1 instead. Especially since it's on sale and isn't full of bugs.
Not only this, but devs are relying on DLSS to optimize their games for them.
DLSS became like screen resolution on consoles. "The PS3 is a 1080p machine, buy it", but games ended up with substandard resolutions in order to make higher fidelity possible. DLSS is now similarly only a marketing term to make up for other things.
Game programmers should be forced to use old shit boxes.
4K. Also
because if you can make games for poorfags you have more potential customers
fpbp
not like they have a choice with the amount of diversity hires they are required to hire by law
nobody even writes their own engines anymore.
i think games would be a lot more optimized if they did.
unity and unreal have been a disaster for gamerkind.
I still use 4gb ram
devs forgot about texture streaming/texture compression
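Rough arithmetic on what block compression alone buys, to back the post above. Assumed figures: a 4096x4096 texture at RGBA8 is 4 bytes per pixel, while BC1/DXT1 packs each 4x4 block into 8 bytes (0.5 bytes per pixel), and a full mip chain adds roughly a third on top.

```python
def texture_bytes(width, height, bytes_per_pixel, mips=True):
    """VRAM footprint of one texture; mips=True adds the ~1/3 mip-chain overhead."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mips else int(base)

raw = texture_bytes(4096, 4096, 4)     # uncompressed RGBA8
bc1 = texture_bytes(4096, 4096, 0.5)   # BC1/DXT1 compressed
print(raw // 2**20, "MiB vs", bc1 // 2**20, "MiB")  # 85 MiB vs 10 MiB
```

An 8x cut per texture, and BC formats stay compressed in VRAM (unlike PNG/JPEG, which must be decoded on load), which is why skipping them shows up so brutally in usage numbers.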
Lazy devs and javascript kiddies.
>I want to be the change I want to see and make an impressive looking game within intentional spec limitations. First crippling factor, Godot 4.
I am also severely retarded and my body riddled with autism all over.
Stop falling for the VRAM allocation meme.
Reminder that a 3060 Ti still mogs the 3060.
I have a 3060ti, all mogging stops at 1440p
I have a 3060ti and I only ever intend to play 1080p. I went out and bought a really nice 27" 144hz 1080p monitor and I plan on playing everything at max settings for the next 3 years. Money well spent.
enjoy your 2014 experience bro
holy cope
you don't need more than 1080p
they don't even fix game-breaking bugs
actually they don't even finish their games nowadays, what makes you think they'll optimize
covid hysteria made everyone retarded, sorry what was the question?
>2003: GeForce FX 5700 256 MB
>2013: GeForce GTX 770 2048 MB (doubled 3 times)
>2023: GeForce RTX 4070 Ti 12288 MB (doubled 2.5 times)
Progress is slowing down if anything.
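The doubling arithmetic in that post is easy to check: the number of doublings over a decade is just log2 of the capacity ratio.

```python
from math import log2

# Doublings per decade = log2(new_vram / old_vram)
print(log2(2048 / 256))    # 2003 -> 2013: 3 doublings
print(log2(12288 / 2048))  # 2013 -> 2023: ~2.58 doublings
# If the FX 5700 base were taken as 128 MB instead,
# the first decade would be log2(2048 / 128) = 4 doublings.
```

Either way the second decade comes out slower than the first, which is the post's point.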
The standard FX 5700 had 128MB, not 256MB.
Game developers are at least 4 times LESS competent than they were 10 years ago. More and higher fidelity assets being displayed at once also play a part, but we're already at a point where it's pretty obvious devs don't even try to get a game to run well, and just tell people to spend close to a grand on a new GPU if they want better performance. I actually want the economy to crash hard, so buying a new GPU simply won't be an option for most people and studios will have to relearn how to optimize to push sales. AI will eventually help as well.
Don't forget upscaling and frame generation being used to make up for all this.
Woah... The console experience of the last 25 years, now on your $2,000 pc! Everyone is killing PC gaming from every angle, and then they wonder why GPU sales are at a 20 year low.
Buy cheap Tesla P40 from ebay. 24gb ecc vram buckos.
Square is disgustingly incompetent at PC ports but people keep giving them money.
FF7R was a bit janky at first but it scales damn well and the graphical effects and fidelity are off the fucking charts though.
>the graphical effects and fidelity are off the fucking charts though
lol
It looks like absolute shit at times. Especially the scenery and ESPECIALLY anything round like pipes or tyres, which look like something out of a PS2 game. The only thing that looks decent are the character models, and even then the hair has tons of aliasing.
FF7 has the most retarded and easily-impressed fanbase in video games.
Do I need super fidelity on tires? No, that's called optimizations. Just like your favourite Minecraft game, as long as you understand the representation of a thing and can recognize it that's "good enough" which is the enemy of perfect.
>Yes, the door to the apartment is shitty.
Holy fuck that looks so bad. Just cobble some medium detailed models together and call it a day.
10 years ago i had to imagine my waifu
now she's basically real
Recently I played a game called "In Sound Mind" and the textures were absolute shit. It looked like a PS3 game, yet it used 4GB on High, and even the "Streaming Textures" setting could barely keep it under 3GB.
After I checked, it turned out to be a Unity game.
10 years ago games had fantastic gameplay.
Now they have fantastic graphics but shit gameplay.
nah nowadays we have neither of those things
unity and unreal engine
Common-Core math was a mistake. Programmers these days don't understand efficiency.
you need more vram for higher resolution textures, and sparse acceleration structures
games also tend to load as much as possible into vram to minimize loading, good games will make this configurable
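The configurable caching described above is basically an LRU cache with a VRAM budget. A minimal sketch, with all names invented for illustration: textures stay resident until a new load would blow the budget, then the least recently used ones get evicted.

```python
from collections import OrderedDict

class TextureCache:
    """Keep textures resident up to budget_bytes, evicting least recently used."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # name -> size, oldest first

    def request(self, name, size):
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return "hit"
        # Evict oldest entries until the new texture fits.
        while self.used + size > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        self.resident[name] = size
        self.used += size
        return "miss"
```

A "load everything to minimize loading" game is just this with an effectively unlimited budget; exposing the budget as a settings slider is the configurable version the post asks for.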
you can ignore any answer that involves complaining about diversity hires, minorities, trannies, or lazy developers
Reddit spacing. Pure bullshit. Inexperience.
>Checks out.
Why are you mad at progress? In fact I would argue the VRAM amounts on new cards have been stagnant since the 1000 series. If anything the 4060 should feature at least 24GB of VRAM, if not more.
>WAAHHHH my 750ti can no longer run new games WAAHHHH
Well, why shouldn’t it be able to?
>what is pixel density
imagine thinking 1080p in 2013 is the same thing as 1080p in 2023
nice bait
Poly counts are a lot higher now, also games were rendered at 1080p back then but most of the time textures were much smaller, compressed, and multi-purpose. Modern games have assets out the ass and have at least 1024x1024 textures for everything, if not much larger.
your screen's resolution means jack shit when it comes to the size of textures, number of polygons, and shaders. devs could certainly do a better job of optimization but the graphics of a modern AAA game and one from 10 years ago are incomparable
you may as well ask why windows 11 requires 4gb of ram when windows 7 requires only 1, as tech develops people expect it to do more and more things, which requires more and better hardware, which creates more expectations, etc.
It's a feedback loop and it isn't going to stop until it's literally physically impossible to do so
it works on my PS4
I have a 3080 10gb and can gayme fine at 1440p let alone 1080p
I have a 1660 and can game at 1440p 90fps+
Web browsers were faster in 1998 than they are now.
sexo
Jesus christ every post in here reads like they don't know what the fuck goes on in the outside world. Do any of you actually understand why GPUs and CPUs are at a bottleneck right now? Do you know why the materials for them are "scarce?" Do you do anything else besides being a bitch on a monkey herding yokel-yodelling forum?
>Do any of you actually understand why GPUs and CPUs are at a bottleneck right now?
>multithreading very hard
>hardware specific optimization literally impossible due to causing exponential code inflation, only slight general usecase optimization possible
>industry doesn't pay well enough to hire actually good graphics programmers due to them being snapped up by related fields/disciplines (in particular low level AI would probably be the hot shit right now)
>overuse of "generic" engines that perform worse across the board due to programmers and other developers being more familiar with their asset workflow and it being more practical to hire for them
unreal specifically comes to mind, apparently they only JUST started optimizing their level loader to properly chunk things for open world games
>Do you know why the materials for them are "scarce?"
communist china being either literally retarded or pretending to be retarded to hurt the west + price gouging?
i don't see what that has to do with anything
t. actual graphics programmer
i have 24gb