Are you saying metric units for measurement of data were invented to deceive computer storage customers?
decimal prefixes yes
well, you're obviously wrong
hello victim of corporate scam
This isn't metric. However the metric system was invented in the same light. Base 10 anything is a fucking travesty.
>Base 10 anything is a fucking travesty.
Seething American detected
>haha look at the presumed american being mad that systems of measurement can be mangled very easily, specifically colliding with base 10 haha at least im proud of being ripped off
Buy a rope and make yourself a swingset
seeth
No. They're just being used for that purpose in this specific case, as there's a terminology conflict between two different systems of measurement. It's not hard to understand
but where is the deception? it's not like there is any manufacturer that uses binary-based prefixes, storage labeled 64GB always yields 59.7GiB
>64GB always yields 59.7
>where is the deception
kek
>implying there has ever been a situation where this has led to more HDDs sold
this led to corporations selling you less storage for the same money
it's like going to buy milk and finding out your favorite 1L bottle is now 950ml
>less storage for the same money
actually the price of storage/money is continuously falling
well maybe add $1 to the hdd price instead of changing fucking terminology that was used for the past 30 years in computer science
how about we just preclude any possibility of ambiguity by just using iec prefixes
how about not introducing the ambiguity in the first place?
the ambiguity was introduced the first time someone used k to mean 1024, iec was created to fix that
there was no ambiguity until hdd companies started deceiving their customers, not to mention normies don't care about the difference between megabytes and mebibytes
or do you work for seagate or something?
no it existed pretty much from the start
https://dl.acm.org/doi/10.1145/362929.362962
>Tera is equivalent to a trillion
>HDD Corporations: here's a 1 Terabyte drive, it contains one trillion bytes exactly!
>Microsoft: no, it has less than a trillion* bytes (*trillion is our special unique kind of trillion that does not mean trillion)
>OMG HDD CORPORATION WHY YOU DECEIVE ME??????????????????????????? SAVE ME MICROSOFT WINDOWS 11 IS SO GOOD
Did Linux adapt to the Microsoft standard?
thanks to microsoft for being based and using the real prefixes
HDD manufacturers declare the capacity as
2.000.000.000.000 Byte
There is nothing wrong with that. If clueless normies, who have no reason to buy hardware when they don't know shit about it, conclude that that must mean the HDD has 2TB of capacity and then shit themselves when Windows shows them 1.86TB instead of 1.86TiB or 2.00TB, how is that the fault of the HDD manufacturer?
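For what it's worth, the conversion being argued over here is easy to check directly (a minimal sketch; note that 2 × 10^12 bytes works out to roughly 1.82 TiB):

```python
# Convert a decimal-labeled drive capacity to binary (IEC) units,
# illustrating why a "2 TB" drive shows up smaller in an OS file manager.
def decimal_tb_to_tib(tb: float) -> float:
    bytes_total = tb * 10**12      # manufacturer's terabyte: 10^12 bytes
    return bytes_total / 2**40     # file manager's tebibyte: 2^40 bytes

print(round(decimal_tb_to_tib(2.0), 2))  # 1.82
```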
linux shows 1.8 tb too
you got fooled by corporations into believing it while getting scammed out of storage
Google says that 2 trillion bytes are 2 terabytes
well maybe don't believe everything you see on google
google also says trannies are real woman
they don't write 2.000.000.000.000 bytes, except maybe on the back of the box in 5pt transparent font
>1.86TiB
cringe, it's 1.86TB
Wouldn't it be 2TiB
>2TiB
TiB does not exist, stop it
The bit-byte scam ISPs do is far worse
well, at least they don't try to hijack terminology
Not the same
Retard here, why is this a thing? Why don't they just report the amount of memory I can use to store shit? Why they gotta be passive aggressive about this. Hard drives I don't really care about, but for flash drives they fucked me over, cause something that is 64 gigs is really 58, which is just a little too small to keep my collection of roms
because fucking capitalists
Base 2 vs base 10
The computer will interpret 1MB as 1024KB (because a computer addresses everything in binary, which is base 2), but base 10 (the decimal system that humanfags use) says 1MB = 1000KB, which clashes with that.
The standard manufacturers go by is the latter, it's a scam lol.
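The two competing definitions can be put side by side (a sketch; prefix values per the SI and IEC conventions):

```python
# Decimal (SI) vs binary (IEC) prefix values, in bytes.
SI  = {"kB": 10**3, "MB": 10**6, "GB": 10**9, "TB": 10**12}
IEC = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30, "TiB": 2**40}

for (si, dec), (iec, binv) in zip(SI.items(), IEC.items()):
    # show how far apart the two meanings drift at each magnitude
    print(f"1 {si} = {dec:>14,} B   1 {iec} = {binv:>14,} B   ratio {binv / dec:.3f}")
```

The gap widens with each prefix: about 2.4% at the kilo level but nearly 10% at the tera level, which is why the discrepancy is most visible on large drives.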
> because a computer reads binary code, which is base 2
jesus
So are they using the 1024 because it is easier to write it down in binary code?
lol what are you even doing on LULZ
retard
it's literally 1111111111 - it's computer base10 of ten "down" bits.
>1024 == 1111111111b
off by one kek
it's 0x400, 4*0x100, 4*256.
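Since the thread is correcting itself on this point, here is 1024 in the various notations, checkable at any Python prompt (ten one-bits is 1023, hence the off-by-one):

```python
# 1024 is a 1 followed by ten zero bits; ten one-bits is 1023, one less.
assert 2**10 == 1024 == 0b10000000000 == 0x400 == 4 * 256
assert 0b1111111111 == 2**10 - 1 == 1023

print(bin(1024), hex(1024))  # 0b10000000000 0x400
```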
10^0=1
get smart friend!
0^0=0
owned
0^0=1 but okay
What if the file is 35 Bytes in size
International standards are stupid inefficient trash, which is why no person with a brain will ever use them.
you send your comment using the HTTP/1.1 standard, dummy
This, binary math is unreal to most boomers/normies; schools rarely even taught them.
Based base 2 chad, reject disk storage scams
I was always entertained by the fact that the 1.44MB floppy took a bet either way: its "MB" is 1000×1024 bytes, so it was exactly 1,440KiB, 1,474,560 bytes, 1.47MB, or 1.41MiB.
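The floppy's hybrid unit is easy to verify (a sketch; "1.44 MB" historically meant 1440 × 1024 bytes):

```python
# The "1.44 MB" floppy uses a mixed megabyte: 1000 * 1024 bytes.
kib = 1024
floppy_bytes = 1440 * kib           # 1440 "KB" (really KiB)
print(floppy_bytes)                 # 1474560
print(floppy_bytes / 10**6)         # 1.47456 decimal MB
print(floppy_bytes / 2**20)         # 1.40625 MiB
```

So the labeled figure is neither the decimal nor the binary value: it is 1,474,560 / (1000 × 1024).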
Least retarded LULZ thread
>standards are BAD!
the world would be better off without "people" like you
The reason everything about computers is weirder than anything else we supposedly invented is that in this case, we didn't.
>so like uhhh the high level language is written in a lower level one and then the machine magically "understands" the machine code
>yeah and then electricity goes bzzz and the circuits made of the magic mineral answer our questions
>an alien spacecraft land in your back yard
>you reverse engineer, patent and start producing them within a few months
OK
>ask computer scientists how the silicon magic really operates or is produced
>no one has a clue
OK
I prefer IEC prefixes because they're unambiguous. "5 GB" could mean either 5,000,000,000 or 5,368,709,120 bytes, but "5 GiB" will always mean the latter.
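A sketch of a small parser that keeps the two prefix families distinct (the function name and regex here are made up for illustration):

```python
import re

# SI prefixes are powers of 1000; IEC prefixes are powers of 1024.
UNITS = {"KB": 1000**1, "MB": 1000**2, "GB": 1000**3, "TB": 1000**4,
         "KiB": 1024**1, "MiB": 1024**2, "GiB": 1024**3, "TiB": 1024**4}

def to_bytes(size: str) -> int:
    """Parse a size string like '5 GB' or '5 GiB' into an exact byte count."""
    value, unit = re.fullmatch(r"([\d.]+)\s*([KMGT]i?B)", size).groups()
    return int(float(value) * UNITS[unit])

print(to_bytes("5 GB"))   # 5000000000
print(to_bytes("5 GiB"))  # 5368709120
```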
The standard makes it easier to NOT be deceived
Only leftists, crooks and globohomo troons would concoct such reframing of a decades old established standard to disrupt the norm.
1024KB is 1MB. Anything less isn’t.
>decades old established standard
>pretending to be retarded
>my delusion of the importance of my nerdy niche trumps reality
Many such cases.
8 bits = 1 byte
Computers run on base2
This is why base10 isn’t used by people who are knowledgeable and competent in computer science.
TL;DR: you're a plebeian if you use base10 for calculating storage in computers.
The one thing I have a bone to pick with is internet providers. The fucking bit/byte speed distinction can go shit itself.
With any other kind of transfer you don't have to recalculate in your head how long a file takes to go through.
Fucking why?
And I'm not talking about IT needs, I'm talking about the shit marketing they blast at normies. Oh, the number is bigger so it must be better!
What is so hard about it? Just divide by 10, it's roughly what you are getting.
It's divide by 8 if I recall. Gives me a headache when dealing with anything about it.
Especially when I have to set speed limits for every smart shit my family discovers.
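The divide-by-8 the posts are going back and forth about, sketched out (assuming link speed is given in megabits per second, ignoring protocol overhead):

```python
# ISPs advertise megabits per second; file sizes are in bytes.
# 8 bits = 1 byte, so divide the advertised rate by 8.
def download_seconds(file_mb: float, link_mbit_s: float) -> float:
    link_mb_s = link_mbit_s / 8        # Mbit/s -> MB/s
    return file_mb / link_mb_s

# A 100 Mbit/s line moves 12.5 MB/s, so a 1000 MB file takes 80 s (ideal).
print(download_seconds(1000, 100))  # 80.0
```

The rough divide-by-10 rule mentioned above works because real links lose a bit of throughput to overhead anyway.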
Why did they make 1024 Bytes = 1 KiB?
It must be due to the binary counter (and not the actual storage). The binary counter is easy to read with the binary counting system.
0000000000010000000000 - 1024 Bytes (1 KiB)
0000000000010000000001 - 1025
0000000000011111010000 - 2000
0000000000100000000000 - 2048
0011110100001001000000 - 1000000
0100000000000000000000 - 1048576 (1 MiB)
1000000000000000000000 - 2097152 (2 MiB)
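The listing above can be regenerated with a short loop (a sketch using 22-bit zero-padded binary formatting):

```python
# Reproduce the binary-counter listing: 22-bit representations of
# round binary values next to round decimal values.
for n in (1024, 1025, 2000, 2048, 1_000_000, 2**20, 2**21):
    print(f"{n:022b} - {n}")
```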
This is the stupidest thread on LULZ
Any bit or byte metric about a computer device or protocol is de facto in the context of computing i.e. base 2 , supplanting it with a base 10 version is always a lie or a mistake
If you disagree you don't understand what a lie is
>Any bit or byte metric about a computer device or protocol is de facto in the context of computing i.e. base 2
The actual bytes count up normally in a file
6, 7, 8, 9, 10, 11 bytes
The file is not either 2, 4 or 8 bytes
Congratulations, you have discovered the idea of counting
Why are we counting like this - 998, 999, 1000, 1001, 1002, ..., 1021, 1022, 1023, 1,000
Nobody but you is
3 in binary is 11
When a microcontroller advertises 4K flash memory, how many bytes is that?
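A hedged answer to the closing question: memory (RAM, flash) is addressed in binary, so "K" in a memory size conventionally means 1024, unlike disk marketing:

```python
# For memory, "4K" conventionally means 4 * 1024 bytes, not 4 * 1000.
flash_bytes = 4 * 1024
print(flash_bytes)  # 4096
```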