Falling into your wing while paragliding is called 'gift wrapping' and turns you into a dirt torpedo pic.twitter.com/oQFKsVISkI— Mental Videos (@MentalVids) March 15, 2023
is your image your distended pooper?
noice... would be a shame if something happened to TMSC in Taiwan, G-d forbid.
israelite are less than 2% of the planet
Somehow in every country on earth
Especially western ones
make it relevant for the board you linus tech tipping nagger
it's actually Linus Tips Nigs
>In 2008, Pat Gelsinger, at the time serving as Intel's Chief Technology Officer, said that Intel saw a 'clear way' towards the 10 nm node.
lol this. They were stuck on 14nm for fucking ages. going down more is even more challenging, so yields are no doubt garbage, and you won't see sub 10nm chips for a long time.
Everything you actually see on the market is 22nm.
Si atom is like .2nm, iirc
even at 22nm, that's stupidly small scale
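To put that scale in perspective, a minimal back-of-envelope sketch (the ~0.22 nm covalent diameter for a silicon atom is a rough public figure, not a vendor spec):

```python
# Back-of-envelope: how many silicon atoms span a given feature size.
# These are approximate textbook values, purely for intuition.
SI_ATOM_DIAMETER_NM = 0.22  # covalent diameter of a Si atom, roughly

def atoms_across(feature_nm: float) -> int:
    """Approximate number of Si atoms spanning a feature of this size."""
    return round(feature_nm / SI_ATOM_DIAMETER_NM)

for node in (22, 10, 2):
    print(f"{node} nm feature is roughly {atoms_across(node)} Si atoms across")
```

Even taking the labels at face value, a "2nm" feature would be under ten atoms wide, which is why the labels stopped being literal long ago.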
based.. America will be #1 again in fab tech and we will use it to destroy the demonic shitholes of Russia and China
u are the demons john
Every single innovation in the IT industry in the past 20 years have been primarily used to make the lives of your country's citizens even more hellish. More surveillance, more censorship, more psyops, more social engineering, more data harvesting, more rent-seeking, more alienation, more propaganda, more brainwashing. Unfortunately, like all other cancers, the tech dystopia described above also spilled out into Europe.
Don't worry, he knows.
Death to JudeoAmerica.
Guess again, chump.
>Wang Rui, president and chairman of Intel China, said at an event that the company had finalized the development of its Intel 18A (18 angstroms-class) and Intel 20A (20 angstroms-class) fabrication processes.
I guess fab tech stays with the asiatics, just not the same ones
I remember getting 100% performance boost from upgrading pentium 133 to amd k6 266. Now my PC from 2014 is enough for everything. Is current cpu just goytech?
You need latest Nvidia for making deeplearning shit.
what about. RAY TRACING goy?
you dont want the RAY TRACINGGGG?
i can do deep learning on my ti-84 wtf are you on about
So useless, thanks.
Developers in 2023 use 50 GB of RAM and ship 100 GB updates every two days for games with Morrowind-class graphics.
Moore's Law end
yeah more Meltdown or Spectre BS
I thought Intel just said they weren't able to keep up with Nvidia or AMD, and now they're saying they have something that blows them both out of the water by an order of magnitude?
that's for on-chip memory... the headline is just PR bullshit about making a transistor at a small size... nothing to do with actually executing.
intel is run by glownaggers, anything serious they do is classified
Can someone translate dork into chad for me.
I skate too much and wear my hat too backwards to know what geeksquad here is talking about.
no you're just a retarded homosexual, anyone with half a brain cell can google what a node process is or watch a cbs video on chip manufacturing
not spoonfeeding you, get filtered
Hahaha I've never triggered someone so much.
Doesn't tell me much. *Kick flips*
Kinda don't get half of this
Guess it's some gay science shit that doesn't matter, catch ya later nerds.
*Sick guitar riff*
the next two marketing names for some bullshit computer maker have been planned
aka get ready to buy more shit in the next 2 years if you want the next latest thing
further refinement of microchip manufacturing. A smaller process allows the same speed at less power, or more speed at the same power, plus more components in a given area, etc.
the numbers themselves are meaningless and are more of a tech or brand name, as you'd be hard pressed to find any but the most arbitrary features on a chip that actually measure 2nm, or even 4nm, 7nm, etc. but it is still progress.
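As a sketch of the "progress" part: if the node number were a true linear dimension (which, as noted above, it is not), density would scale with the square of the shrink:

```python
def density_gain(old_nm: float, new_nm: float) -> float:
    """Idealized area scaling: transistor density grows with the square
    of the linear shrink. Real nodes deviate substantially from this,
    since the 'nm' figure is a marketing label, not a measured pitch."""
    return (old_nm / new_nm) ** 2

# Taking the labels at face value, '7nm' -> '2nm' would be ~12x denser:
print(density_gain(7, 2))
```

Actual announced density gains between nodes are far smaller than this idealized number, which is exactly why the labels are brand names rather than measurements.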
you can't spell transistor without trans
How is my trans sister doing?
IBM did 2nm two years ago retard
not at manufacturing scale
this has all the materials and processes, along with expectations, in place to move to large-scale output
except euv machines that samsung and tsmc are buying out
Crazy how these companies sit here and say they produce sub-10nm chips, yet... technically speaking... there's a lot of reasons they're 'making this up' and it's 'not truly a 7nm CPU'.
>after intel delays 7nm for 5 years
Yeah exactly. Supposedly they got it FIGURED OUT. And by 2025, 2nm will be in production/shipping. But that's it. There is not going to be a 1nm CPU. We are at the physical limits of atom-scale wire traces. Anyone who's studied this understands that at that level the electrons start having quantum effects and cannot be reliable with our current designs.
>technology is dead stand still
>make up ray tracing nagger shit to reset GPU ladder / knock all AAA games from 300fps to 20fps for absolutely no visual improvement beyond a little lighting.
Most of the gains for computers going forward is going to be from firing programmers that don't use c or c++
The future is analog computing. The kind of shit IBM is doing.
Yeah hybrid traditional, analogue, quantum, and photonic transistors each playing to their strengths running optimized code is the only way were ever going to see big leaps in performance again. The biggest hurdle is optimized code though. Programmers who actually give a shit about clean, elegant, efficient code are going extinct and everything moving towards shit like "low code" and "no code" is only making it worse. Soon chat bots will probably be writing the bulk of the individual objects, classes, functions, etc and only a handful pajeets who barely understand how it works will be needed to stitch it all together, eventually also with the help of AI. It'll be cheap and fast to make software but it'll all run like shit and well be making faster hardware just to mine more data in the background and make shit code run almost as well as decent code could run on 15+ year old hardware.
>dunning your kruger this hard
Damn, now theres an accomplishment.
"traditional" transistors are analog, btw.
Physically analog, not logically analog. If you want to be an autistic homosexual you can argue everything is technically physically analogue. The earliest attempts at making transistors for processors used a base 10 system but it wasn't reliable so we eventually settled for base 2. We eventually got base 3 working later too but they never took off. Analogue processors aren't logically base anything though, they just use raw fuzzy values. These could theoretically work really well for anything with fuzzy logic like audio, video, lossy encoding/decoding and AI applications, but again the hard part would be getting people coding to take advantage of the new hardware.
You are hilariously conflating numerous ideas here because you don't know what they are. In a desperate attempt to save face you've dug an even deeper hole.
Transistors are *physically* analog, you poor sap. A transistor has no true off state, it has no singular definite on state, there is endless granularity by way of the v/f curve. The output is only interpreted logically, and only in binary because its convenient. Binary and trinary computing do not require magically different transistors, a specific state can be set to any arbitrary point.
Thanks for the laughs though.
Nah you just have actual full blown autism. The only thing I said that wasn't accurate was saying "transistors" instead of "processors" in my first post that made you start with your autistic screeching. When a normal person with a functional brain notices a single imperfect or out of place word or letter in an otherwise perfectly coherent sentence, they substitute for the obvious automatically. You instead fixate on the minor detail like the true autist you are congratulate yourself for being blind to what is obvious to everyone else around you. I hope for your sake that you don't do the same shit in real life. Good luck keeping a job or any friends, ever.
>oh shit I'm completely found out!
>what do I do?!?
Sasuga, homosexual. Just stop talking out of your ass in the future. Its a bad habit. There will always be someone out there who can see through your bullshit, someone can always spot the LARPer, someone will always know more than you and can instantly see a low IQ bluff for what it is.
If you just found a healthier way to get attention you wouldn't have to get into these humiliating situations.
You don't have any argument or anything to add to the conversation other than "haha you made a typo". Autism.
You didn't make a typo, LARPfag. You made it abundantly clear across several of your posts that you were a bloviating retard using words you weren't actually familiar with. I called you out, and you doubled down, offering up a retarded attempt at saving face which spelled out explicitly that you didn't even know how a transistor operated.
You're really still going to keep bluffing? Is this just humiliating fetish that you and a bunch of other discord troons are in on? Why do you little retards just go on the internet to tell obvious lies and then die on that hill when you're called on it by anyone in the know? What makes you behave this way?
How do they plan on cooling that?
>How do they plan on cooling that?
Maybe submerging it in liquid.
That sounds dangerous, Wayne
>That sounds dangerous, Wayne
3M has a liquid for just that.
a 12-foot container with a sub-zero nitrogen chamber
But can it run Crysis?
Cool and all, but not really interested in spending another $450 for a motherboard just because it uses a completely different socket, again.
Apple is already manufacturing 3nm. It's still a pipe dream for Intel anon. By the time intel actually makes a 1.8nm chip Apple will be making pico scale chips.
>apple will defy quantum mechanics
>Apple is already manufacturing 3nm
apple isn't manufacturing company. They don't manufacture anything
Still running on my I7-2700k @5.2Ghz lol why do I need a new rig, this fucking beast has been going for over 11 years.
>Don't ask questions.
>Just consume product and get excited for next product.
i sold mine upgraded to a i5 6600k and i have regrets.
just built an i9-13900k system with an rtx 4090 to replace my 5 year old dying i7. hoping this expensive piece of shit lasts me at least 10 years.
How much did that cost you in total?
roughly $4000; also has 64GB DDR5-5200, AIO cooling, 2x 2TB NVMe.
6400 G.Skill kits are like $120 right now
Water cooling is a meme and you are a tool.
better than having a gigantic heatsink and fan taking up half the case and sounding like a jet engine every time i unzip a large file or fire up a game.
better than your AiO failing because the pump stopped working or sprung a leak, and all your shit short circuits because de-ionized water wasn't used.
>sounding like a jet engine every time i unzip a large file or fire up a game.
nice meme, wew lad. also, 4090's have a habit of melting power plugs, enjoy your paperweight brick.
seems like you're making a lot of assumptions
Well lets break it down.
>Gigantic heatsink and fan taking up half the case
If you're spending 99% of your attention span looking at your screens, then why would it matter?
I want a functioning computer that can keep up with today's demand, not a christmas tree.
>Sounds like a jet engine when opening up a zip file
Either you bought yourself a shitty heatsink+fan combo from wish.com, or you have a bigger problem than you're noticing. did you forget to remove the plastic film before applying thermal paste? probably overclocked, didn't you and didn't study voltages all that well. I don't know because it's not my box anyway.
>And sounding like a jet engine every time i fire up a game
It is the job of your RTX 4090's cooling fans, is it not? Or again, did you get scammed inadvertently, or overclocked?
It's not assuming, it's refuting dumb arguments.
No, you're assuming.
>AiO failing because the pump stopped working or sprung a leak, and all your shit short circuits because de-ionized water wasn't used
Kind of a bold assumption because AiOs use deionized water and you wouldn't buy the one with terrible reviews because it leaks
>4090's have a habit of melting power plugs, enjoy your paperweight brick
implying you wouldn't notice your plugs melting
i use a pretty expensive APC UPS that will sound an alarm and eventually shut down if it starts overloading. i'm not too concerned.
previous system was an i7 8700k with a coolermaster hyper 212. was not overclocking because i don't overclock. games can be just as CPU intensive as they are GPU. i also do video/3d rendering as a hobby which would demolish my machine. like 5 seconds into a render and my machine sounded like a 747 getting ready to take flight. i've been building computers since the late 90s, i somewhat know what i'm doing. this was my first time using an AIO. i know some fan/heatsinks can do very well, even better than liquid, but i didn't get some bullshit aliexpress AIO, if this thing magically springs a leak then the manufacturer is buying me a new machine.
>if this thing magically springs a leak then the manufacturer is buying me a new machine.
lmao sure thing buddy
to be honest, been tossing the idea back and forth of grabbing a Corsair H150i (3-fan radiator) the next computer i build up. there's probably a better pick out there, maybe.
>what even is a fan curve?
Tinker with your shit or buy a better case. R7 3700x/2060 super here in a fractal designs XL R2 and I never hear a peep from my system, not even under 100% load.
waaaay overkill dude
good choice, amd has really dropped the ball on this round of cpus
even the x3d ones arent great because they only stacked the caches on one of the nodes instead of the whole thing
what a grand, and intoxicating innocence
i had an rtx 2060 before and honestly i was expecting to really see some big improvements to blow my mind by getting the best card out, and to be honest it's been rather underwhelming. my 2060 could play anything at near max settings at 60 fps. without raytracing that is. first game i tried with 4090 was cyberpunk at full raytracing and i can't say that it blew my mind. if anything the added smoothness and hyperrealism of the lighting was a bit disorienting. then i realized that was the only fucking game i had that i actually play that benefited from the upgrade. but hey, at least i'm pretty much guaranteed to be able to play whatever comes out a decade from now. if the world even has a decade left that is.
I recently got a 3080 because it was cheap. ($450 new on ebay) It's pretty underwhelming. NO game runs at 60fps with ray tracing on, even at 1080p with optimized medium-high settings (forget ultra, forget 1440p or 4k). I thought ray tracing was their entire marketing gimmick and the 3080 was the flagship until recently? I don't really care about it but if the 3080 can't do it, then what the fuck can? The $2000 4090 that only just came out, that almost no one owns? Also the nvidia control panel is absolute fucking dogshit compared to radeon. The only things nvidia has going for it are DLSS (which virtually no games support), marginally better adaptive sync features, and ray tracing that runs like shit and is barely even noticeable. New radeon cards are actually beating nvidia not only at price/performance but also at thermals and energy efficiency. Why does anyone still buy new nvidia cards? I've always just bought whatever had the best available performance for the price at the time near the flagship level, but I might go to radeon for good after this. Nvidia's homosexual drivers requiring an account is some of the more israeli shit they've ever pulled and even then they're a bitch to navigate, everything is split into two guis (fucking why?) and they still don't have half the features radeon drivers do.
>No game runs at 60fps with ray tracing on, even at 1080p with optimized medium-high settings
You probably configured something wrong. My 2060 super can push 60+ frames at max settings (albeit with DLSS set to quality) in a bunch of games.
I doubt it. I've been building PCs, modding games, and doing obscure performance tweaks to make shit run on hardware that shouldn't be capable for 20 years. I also have background in compsci and I'm a senior sysadmin. I've always noticed that FPS in reality is usually 10-25% lower than what NPCs claim online though. It's like the typical pathologically obedient mentality behind the NPC retail wage, salesman, fanboy, or fake social media timeline. NPCs are delusionally optimistic to the point of complete dishonesty, always pretend things are better than they are, and I'll never understand it.
I dunno what to tell you bro. I'm currently playing Deathloop maxed out with ray tracing and dlss on and getting 100 fps @ 1080p on average with 1% lows around 75. Again that's on a 2060 super. Your 3080 should be doing better than you report.
>i'm pretty much guaranteed to be able to play whatever comes out a decade from now
oh my sweet summer child
pajeet coding will throw a stick into your wheels
all new games are garbage anyway
Bruh, my R3 1200AF 4.1GHz steamrolls everything I throw at it. You had a fine CPU
how did the sandy bridge do it, anons?
My i5-2500k was great for 9 years, then I upgraded to an i7-2600K and it eats any new game I throw at it. Nothing compares to Sandy Bridge. I thought the Athlon 3000+ was great but this blew it away.
Yeah, I've noticed CPUs have hardly gotten any faster in real-world gaming performance since around the 2000 series. Idgaf about benchmarks, I care about how games actually run. My wife's PC is running some shitty office-tier 4000 series i7 I pulled from a decommissioned PC at work, which I had been using until recently, and it hasn't bottlenecked either of us in anything yet. I only upgraded to a 5600x because I had most of the parts to build a new PC for her with spares, but it's not really any different. I guess if you do autistic shit like encoding video all day it might matter, but not in everyday or gaming use.
single core performance has pretty much peaked as far as noticeable to the average consumer. all they can really do at this point is just add a shit load of purpose driven cores and baked in software enhancing functionality that most apps won't even take advantage of. i've had this system since last friday and with just regular desktop use i don't notice any difference from my 8700k as far as responsiveness and app load times. last machine was running off a fast nvme too though. its crazy how you can increase the performance/lifespan of older machines by just swapping in a super fast SSD.
My mobo just went bad can’t find another one not from china makes me sad
A fucking shame gaming has stagnated. I haven't upgraded in like 10 years and my desktop still plays high graphics at 1080p
I think it's a case of the 1080 generation being so widespread, which means companies aim to make their games playable on it to reach the widest audience. Until the 1080s start expiring on their own.
I've downloaded Hogwarts Legacy
That shit is so NOT optimised it's crazy. It starts leaking memory about 1-2 minutes in. Literally a console port.
T. 2 gb vram mustard race
Please explain like I'm short.
Yeah but at what cost? The problem isn't an inability to go smaller, although that's part of it. The real problem is that the economics are making it not worth doing at all.
ive been buying intel stocks for months now mainly because everytime I opened my stock app on my phone, their was an article written by a israelite (the best was a named Noah MosesMan lol) saying why intel was a bad buy
even yesterday morning the top intel stock article was "sell intel, their business makes no sense, says top analyst"
you can make a fortune just doing the opposite of what the "experts" tell you
IMO biggest thing holding intel in the game is their GPU
I wish this were true but economics be damned in all of history..... Shitty economics never stopped us before..
The problem is that the government does not invest in or actually order anything anymore because the entire business is co-opted at this point. There is no investment into private companies to produce new technology. There will only be advancements in areas of improvement of humans.
the a770 is sick, especially for like multiple screen data recovery programs
but im using 13900k rn and it is beyond epic
isn't it crazy seeing the 32 threads in task manager. kind of overwhelming.
reckon i like me them graphs, simple as
Intel just cut their dividend by 66% lmao
Intel are great at announcing “exciting prospects” that never get realised at full-scale production, so I will hold off.
OMG IMG ONNA
I HECKIN LOVE 1.8NM AND 2NM PRODCUTION NODES AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
gubment wont let any good tech out. they think brown ppl will out smart them.
ironic, i know.
>Intel's market cap today: $105 billion
>TSMC's market cap today: $437 billion
>Intel's market cap in 2030 when their Intel Foundry Services is in full swing and the AI revolution is on us: $1 trillion
>TSMC's market cap when China invades Taiwan before 2030 and the bombed out fabs becomes controlled by the heavily sanctioned CCP: $0
Now add what that looks like when INTC chips are guiding personal sized guided missiles up the asshole of every Chink General who tested the US?
Little late to party homosexuals.
The tartarians had this a very long time ago.
Cool, can't way to play shitty open world at some unnoticeably higher FPS while providing ten times more data to the occupying government.
All I do is read and watch videos. My 2009 Thinkpad with an i5 is adequate. Maybe we don’t need more technology than web 1.0 and dumbphones.
intel is zog pwned hardware beyond gen 5.
REMINDER THAT NIGTEL IS BACKDOORED.
THEIR CPUS RUN THEIR OWN SMALL OPERATING SYSTEM CALLED MINIX.
IT CAN CONTOL YOUR ENTIRE SYSTEM.
still not using amd haha soorry just ughhhhh i know but haha nooooo
>THEIR CPUS RUN THEIR OWN SMALL OPERATING SYS
intel core xeon has this backdoors?
NigTel is backdoored AND it has the added problem of being manufactured in Pissrael.
However, look into AMD PSP, which was incorporated around the same time as the Intel ME (2013) - nothing is safe. Mass surveillance projects are the norm not the exception.
The safest method is air gapping, but that won't stop glownaggers coming and pointing an antenna at your house or using the electricity lines to tunnel into your shit. Having said that, if you are serious, all of their intrusion methods are detectable one way or the other.
webm related indexes only a fraction of some of the programs. the ones for phones are insane (see WARRIOR PRIDE).
Also if anyone is interested, music that went with this WEBM is Rob Dougan - Clubbed to Death (Kurayamino Variation)
Post your INTC positions then, surely you believe what you're saying and want to make easy money off of it?
I am broke anon, I own no positions.
that's ok nvm then
Intel is the white man's choice
AMD is garbage that degrades IN STOCK
a long, long time ago i got my first and only AMD CPU, it was nothing but trouble because of incompatibilities, heating, etc. i know they've obviously improved since then, but the safe bet was always intel for me after that. i don't buy a new system every other year and i'm tired of wasting time troubleshooting avoidable problems. thus, intel/nvidia for life.
AMD is a steaming pile of crap
think about this, Ryzens have working voltages of OVER 1.4 fucking volts in 5-7nm process chip! They are literally designed to fucking die on you
and nvidia cards don't even fit in computer cases.
Well NVIDIA at least makes better quality chips than AMD
although with the current pricing, not sure if I ever get one, for my games, iGPU is enough
They're called APUs you dumb nagger. But yeah, the Ryzen g series APUs are all you really need for 1080p gaming.
reddit is right over there
/pol/chuds choose based Intel, a white man's choice
Summer '07 truly never ended.
>NOOOOOO YOU MUST BUY RYZEN!!!! LINUS TECH TIPS SAID SO!!!!!!
I literally do not care what you buy you stupid homosexual.
S O Y
my rtx 4090 had to be bolted to the chassis with a strong metal elbow bracket and 8 screws to keep it from ripping the pci express port out. and yeah, if you go to a build-your-own-pc site and pick a 4090 you lose like 75% of case options. the thing is fucking massive.
I fit my 3080ti into a tiny little form-factor MicroATX case. It's just gotta be built for it.
they have laptops with 4080s coming out soon, not sure how the fuck that's possible. unless that 4080 runs like complete shit in comparison to the desktop version
Laptops always run much worse than their desktop equivalents.
>Intel stock price eats shit
>Articles about them having new CPU tech pop up
Switch out the bullshit mobile cores and stop changing sockets every year and I'll consider switching from AMD.
Fuck both of the modern 100C housefires though.
My 5950x barely cracks 35C in normal use, where as the current offerings are built to run as hot as possible to gain an edge over the competition.
>he bought a Ryzen CPU
PFFFFFFFTTT AHAHAHAHAHAHAHAHAH WHAT A RETARD
Easily the most efficient CPU around and fast as fuck.
Of course you can hold onto the broscience idea about high voltages that some reddit cuck came up with around the launch.
I'll patiently wait for this to surely die on me.
yeah I still got the 2950x, it runs like a champ. Giant oversized fuckin die. Lots of cache too. MMIO split down right the middle too(can optimize VM p100)
lol AMDnagger, no worries, your "processor product" will serve you a couple of years for sure
my 1600x is still going strong. Honestly at this point I've had plenty of use out of it for the price and still feel no need to upgrade.
Ryzen 1st and 3rd gen ain't bad, it's the 2nd gen that are degrading
I knew the news was nonsense when people said how behind Intel was; it's because gaming has been dead for a decade, and this technology will be used for much more important purposes before it is ever in the hands of a consumer.
That said, I am still running an i7-7700K.
This. High-end turbo-charged graphics cards are practically obsolete for any game worth your time, barely anyone plays AAA stuff anymore, and a good graphics card bought in like 2018-2019 can run de-facto everything you'd really wanna play, even if you'd have to put graphics on average to get that 60FPS. Still, computing is the future.
you can play cyberpunk at quality settings with a 1080ti, and that card came out 6 years ago. like i mentioned earlier, RTX is a meme. Only a handful of decent games currently support it, and why would developers waste time adding functionality/assets only half their users will gain a slight visual benefit from?
i got a 1070ti and i don't foresee myself buying another graphics card tbh
Oh boy i can't wait for the latest Incel Cope 9001 with israelite cum under the lid
more like israeli nanometers
Intel is not using real scales, those are just some marketing scales
heck, Intel can't even do 10nm, only 10jnm
do you feel those nanometers up your taco smelling arse, Jose?
>Wang Rui, president and chairman of Intel China, said at an event that the company had finalized the development of its Intel 18A (18 angstroms-class) and Intel 20A (20 angstroms-class) fabrication processes.
HUE HUE HUE
CHINA NUMBA ONE
X nm doesn't actually refer to any physical measurement in CPUs
It's a marketing term
Look it up, the distance between transistors is still in the 40 nanometer range
It's measuring distance between transistors? I always thought it was the thickness of the board. God I am dumb.
>measuring distance between transistors
Moore's Law states that the number of transistors on a microchip doubles every two years. The law claims that we can expect the speed and capability of our computers to increase every two years because of this, yet we will pay less for them.
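The doubling claim is easy to sketch numerically (a toy projection, not a statement about any real product line):

```python
def transistors_after(initial: float, years: float, doubling_period: float = 2.0) -> float:
    """Moore's-law projection: the count doubles every `doubling_period` years."""
    return initial * 2 ** (years / doubling_period)

# A hypothetical 1-billion-transistor chip, projected ten years out:
print(transistors_after(1e9, 10))  # five doublings -> 32 billion
```

Note that the observation is about transistor *count*, not clock speed; the "speed and capability" part of the popular phrasing stopped tracking the transistor curve once Dennard scaling broke down.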
I'm not typing all this out again.
As always these threads are full of blithering retards.
>It's measuring distance between transistors?
No. Distance between devices is gate pitch.
so what's the point of even doing EUV with mirrors and all that expensive as fuck shit?
stick to PRE-EUV processes, and just create a chip lasagna. Infinite transistors per nm^2
imagine how electromagnetically fragile and easy to fry this shit must be, compared to say Sandybridge?
>Intel Completes Development of 1.8nm and 2nm Production Nodes
>will now charge $1400 for an entry level CPU
Fuck these greedy and monopolistic chip makers. They should all be broken up into a thousand tiny companies.
>nm means nothing
>ghz means nothing
what matters is to make it seem as if they are making faster chips, by slowing down old chips
>8 gb ram
Still can into most of the everyday tasks like web browsing, watching movies, doing some Python stuff with ML and AI, and playing CS GO or Genshin. I guess silicon is dead.
there are just little to no new games that are worth the upgrade
Although Squad and Hell Let Loose are OK
This is the point I'm trying to express: all those nanometers are useless since no one uses them efficiently. The difference between 32 nm and 10 nm is so small that an ordinary PC user won't notice it. Aside from scientists and graphic designers, only some synthetic tests could show how powerful your PC is.
>The difference between 32 nm and 10 nm is
The difference is that the newer chip will not live as long and is more susceptible to damage from voltage and temperatures.
Try to edit video on a quadcore 32nm Sandy Bridge i7 2600k vs a 10nm Alder Lake i9. The difference is utterly incomprehensible. We can do things in minutes now that used to take days.
get on my level
Sorry, bro. I do not use Ubuntu. I'm on RPM side.
I need to change though idk what to use.
I've tried several Linux-based OSes so far. The most stable and reliable is Debian. Less stable but more up to date is Fedora. For dudes who want real privacy and safety, a custom-built Gentoo is an option.
Zorin OS my friend, been using it for 7 years. Best one so far
I have never heard about it, bro.
oh, so how many more years until they release chips that aren't so warm it's completely retarded?
won't even talk about HW encoders with free drivers, or ECC support
Unless power consumption (and thereby efficiency) starts to massively outpace area scaling, thermal density will continue to increase, and chips will be harder and harder to keep cool.
IHS is still a huge limiting factor. Even now the power hungry intel gen 13 and AMD Zen4 top SKUs are easy to keep at very moderate temps on a naked die.
The advantage of EUV over older DUV is better patterning: the definition and consistency of the structures being laid down. EUV facilitates a reduction in total mask count and brings down total exposures as a result. You can pattern in one pass of EUV and get better definition than you can using SAQP with a DUV light source.
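The pitch arithmetic behind that trade-off can be sketched as follows (the ~76 nm DUV immersion single-exposure pitch is a rough public figure, used purely for illustration):

```python
def effective_pitch(litho_pitch_nm: float, spacer_doublings: int) -> float:
    """Self-aligned multi-patterning: each spacer pass halves the printed
    pitch. SADP is one doubling, SAQP is two; every extra pass adds
    deposition and etch steps, and with them cost and variability."""
    return litho_pitch_nm / (2 ** spacer_doublings)

print(effective_pitch(76, 0))  # single DUV immersion exposure: 76 nm
print(effective_pitch(76, 2))  # SAQP: 19 nm, at the cost of extra steps
```

A single EUV exposure can reach comparable pitches directly, which is where the mask-count and exposure savings come from.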
a total mask count reduction doesnt mean SHIT
if it is going to be a gorillion dollars, and if your process is stochastic as fuck, rendering useless wafers on those trial n errors israeli cycles
>look Goyim, on the good days the process can yield you what our israeli specs are telling you
>oh no, you didn't have a good production day, expend more money with us so we can help you out sort of what the problem might be
Holy shit, now my phone is gonna be 5.75 mm thick as opposed to 5.8, and weigh .0005 grams less, and Twitter is gonna load 0.000001 seconds faster, oh God oh God IM COOONSSOOOOOMMMING
how did they fix/avoid tunneling?
>have metal stack with proper capacitances
>have proper insulators and source and drain wells
>have a channel with good characteristics
>don't have design technicians on team educated by reddit clickbait articles constantly foretelling doom and always being wrong
Its just that simple
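For intuition on why tunneling is the headache in the first place, a WKB-style sketch (barrier heights and widths here are illustrative toy numbers, not real gate-oxide parameters):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron rest mass, kg
EV = 1.602176634e-19     # joules per electron-volt

def wkb_transmission(barrier_ev: float, energy_ev: float, width_nm: float) -> float:
    """WKB estimate of the probability that an electron tunnels through a
    rectangular barrier of the given height and width."""
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_nm * 1e-9)

# Thinning an illustrative 2 nm barrier to 1 nm raises the tunneling
# probability by orders of magnitude -- the core reason leakage explodes
# as insulators shrink:
print(wkb_transmission(3.0, 1.0, 2.0))
print(wkb_transmission(3.0, 1.0, 1.0))
```

The exponential dependence on width is the point: there is no gradual degradation, leakage blows up fast once barriers get thin enough.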
ok but when do they actually start producing? and at what cost...
lets be real, intel has gone to shit, they are basically done as a company. Too many parasites latched on.
Intel's 13th gen Raptor Lake parts are still on Intel 7 (formerly called 10nm Enhanced SuperFin)
They've yet to commercialize Intel 4 and Intel 3, both of which will be some years off for actual market penetration.
>a total mask count reduction doesnt mean SHIT
Yes, actually, it means a lot. Mask count can increase complexity exponentially: you need more exposures, there's more potential for misalignments, and high-mask-count processes have the worst yields. It's why we've seen partial-EUV processes ramp up faster than all others in recent history.
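quick sketch of why exposures compound into yield loss — the 99% per-exposure figure and the layer counts are made up, the point is the multiplication:

```python
# Defect chances compound per exposure, so yield falls off
# exponentially with exposure count. Figures are illustrative.

def compound_yield(per_exposure_yield, n_exposures):
    """Probability a die survives every exposure unscathed."""
    return per_exposure_yield ** n_exposures

per_step = 0.99
# SAQP with DUV needs ~4 exposures per critical layer vs 1 for EUV
duv_saqp = compound_yield(per_step, 4 * 10)   # 10 critical layers
euv      = compound_yield(per_step, 1 * 10)

print(f"DUV SAQP: {duv_saqp:.1%}, single-pass EUV: {euv:.1%}")
```

even with identical per-exposure quality, the 4x exposure count alone eats a big chunk of your wafers.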
2 more weeks
How small is the NSA backdoor?
Interdasting, very interdasting. Hmmmm...
I think the most pertinent point to contemplate is, having weighed the judgements of all anons assembled thusly, will the corresponding performance improvements tempt the master creators of the greatest computer game ever, Thing On a Spring, obviously, goes without saying really, into coming out of retirement and delivering us a new epic?! Cassette loaders at the ready, gentlemen.
>Thing On a Spring
Who left grandpa alone with the computer again?
Hey!!! I’m gonna update to an Amiga when the second hand prices become reasonable. I’m not that out of touch.
YAY NOW WE CAN HAVE EVEN FASTER PROPAGANDA GOVERNMENT SPYING DEVICES IN OUR POCKETS
fuck this autistic cringe thread
GTFO Wrong board
why is no one punching her?
Just built a system with an E5-2696 v3 (Haswell-EP, 18-core HCC die with 45MB L3, 3.8 GHz max turbo on 22nm, 662mm^2 die size, quad channel DDR4) and an R9 295X2 (dual GPU card — basically 2x 290X 4GB with 500W factory water cooling) into a Lian-Li A05, which is the smallest mini tower that fits full-size boards. Reverse airflow too, and the card faces up (reverse-ATX). All-aluminum case.
Don't care. Still buying an AMD CPU
i dont care. im never buying intel ever again
I'll stick to my apple m2 chip pal.
INTCel bros we're back!
Fuck. Last week I sold INTC and bought AMAT.
STILL USING MY AMD FX 8350 AND PLAYING CYBERBOONK WITH 35FPS AND W7
tho, im thinking about upgrading at the end of this month to an intel i5 13600KF. what do you think anons, should i do it?
yeah it's not a bad CPU
ryzen is a big jump itself.
i had an fx9370 overclocked to 5GHz and switching to a 3600xt was night and day
read reviews, read buyers' reviews, and mostly watch videos of ppl using said hardware
and using benchmark programs
i have very tight budget, unfortunately.
i read the i5 kf is on par with the 400€ amd cpus, but has heat problems. oh well, it will be fun to debug while putting the machine together
i dont trust reviews.
>i dont trust reviews
sorry, i meant "reports on one's experiences".
I literally read hundreds of reports on people's experiences with hardware or generally huge investments; take the median/average of them and, ofc, read only the negatives first and use my God-given discernment before I go in for the buy.
>if you're looking at "raw performance" then look at the cinebench leaderboard
Floating point engineering for me. Will check it out.
>Don't assume that "bigger number = better"
Yeah, I learned this when I found some i5s beat some i7s.
How do you anons benchmark these processors? Or what sites reliably report benchmarks?
It depends: single core, multi core, memory speed, cache, instruction sets (AVX-512, etc). if you're looking at "raw performance" then look at the cinebench leaderboard, 7-zip compression/decompression tests, encoding tests, and gaming fps (although be more careful with those, since the game version, GPU, updates, etc will affect performance and thus fps numbers).
All depends on what you're intending to use the CPU for. Don't assume that "bigger number = better", ie clock speed, model number, number of cores, etc
If you have a big budget, consider one of the 3D V-Cache (it's not a meme feature) processors from AMD. They're top of the line right now for gaymen and productivity due to the tons of cache available.
If you're on a budget then yeah, a recent i5 or r5 will be more than enough for a modern computer.
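if you want to sanity-check reviews on your own workload, a bare-bones timing harness is enough to compare two CPUs — sketch below, the workload function is just a made-up integer-loop stand-in, and absolute numbers are machine-dependent so only compare relative results:

```python
# Tiny DIY benchmark harness in the spirit of "test it yourself".
import time

def bench(fn, *args, repeats=5):
    """Run fn several times, return the best wall-clock time in seconds."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - t0)
    return best

def compress_like_workload(n=200_000):
    # stand-in for a 7-zip-style integer-heavy single-core loop
    acc = 0
    for i in range(n):
        acc = (acc * 31 + i) & 0xFFFFFFFF
    return acc

print(f"best of 5: {bench(compress_like_workload) * 1e3:.1f} ms")
```

taking the best of several runs filters out background-process noise better than averaging does.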
More like intel-aviv. Practically a israeli company.
is that a mercury converter? or something called like that?