Now that the dust has settled, is M2 the best out there?!

  1. 4 weeks ago
    Anonymous

    >no AV1 support
    >no hardware ray tracing
    >in 2022

    Good luck, Apple.

    • 4 weeks ago
      Anonymous

      qualcomm doesn't support that garden gnomegle shit codec either

      VVC is coming, AV1=DOA

      • 4 weeks ago
        Anonymous

        >VVC is coming

        >proprietary, royalty-encumbered, locked down codec

        Nah, I can't do it.

        • 4 weeks ago
          Anonymous

          >proprietary
          the reference encoder and decoder are open source

      • 4 weeks ago
        Anonymous

VVC is worse than AV1, but okay

        >No AV1 support
        >No VVC support
        >No hardware ray tracing
        >In 2022

        • 4 weeks ago
          Anonymous

          nah

        • 4 weeks ago
          Anonymous

          kek

  2. 4 weeks ago
    Anonymous

    lmao
wake me up when it's compatible with x86

    • 4 weeks ago
      Anonymous

      Wake up, Rosetta 2 exists.
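For anyone who wants to check for themselves: macOS exposes whether the current process is being translated by Rosetta 2 through the `sysctl.proc_translated` key. A rough sketch that shells out to `sysctl` and returns None anywhere the key doesn't exist (non-macOS systems, or Intel Macs):

```python
import subprocess

def rosetta_status():
    """1 if this process runs under Rosetta 2 translation, 0 if native
    arm64, None where sysctl.proc_translated is unavailable."""
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, timeout=5,
        )
    except (OSError, subprocess.TimeoutExpired):
        return None          # no sysctl binary available at all
    if out.returncode != 0:
        return None          # key missing: Intel Mac or non-macOS sysctl
    try:
        return int(out.stdout.strip())
    except ValueError:
        return None

print(rosetta_status())
```

On an M1/M2 Mac a native terminal prints 0, and the same script run under `arch -x86_64` prints 1.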

    • 4 weeks ago
      Anonymous

Isn't this the kind of thing that ultimately holds back technology? If every new standard has to support the old stuff too, how will things ever advance beyond the previous generation?

      • 4 weeks ago
        Anonymous

        It is needed.
"congratulations, you have to rebuy every piece of software you ever had and only some exist" is a death sentence to any new technology.

        • 4 weeks ago
          Anonymous

          just don't be poor

  3. 4 weeks ago
    Anonymous

    i prefer the M3

    • 4 weeks ago
      Anonymous

      Based LULZ-tist

  4. 4 weeks ago
    Anonymous

    • 4 weeks ago
      Anonymous

It's useless anyway, maybe only for gaymers, and gaymes are still shit.

      • 4 weeks ago
        Anonymous

        3D artists use RTX to speed up rendering.

        https://code.blender.org/2019/07/accelerating-cycles-using-nvidia-rtx/

        • 4 weeks ago
          Anonymous

Are you a 3D artist? No. Do you even have any artsy skill? No.
          ok

          • 4 weeks ago
            Anonymous

Why does he have to be a 3D artist to be able to critique Apple not having RT cores? It's a genuine miss, especially with Apple doing AR/VR and investing in Blender. Without any sort of RT, Apple will keep losing to Nvidia laptop RTX 3050s in rendering speed even with Metal and the M1 Ultra.

          • 4 weeks ago
            Anonymous

            RTX is a godsend for me

            • 4 weeks ago
              Anonymous

              sauce on image pls

            • 4 weeks ago
              Anonymous

              looks more like a satansend to me
              touch grass

            • 4 weeks ago
              Anonymous

              touch grass

      • 4 weeks ago
        Anonymous

        RTX hardware is basically an "is this line intersecting those triangles" accelerator.
        There's a FUCKTON of science that can be done with that.
        But scientists only use PCs anyway, so it's not of concern.

        • 4 weeks ago
          Anonymous

          ray-tri is only one piece of the puzzle. There's a lot more including ray-box, BVH traversal, bucket sort RT, motion blur acceleration (kinda a stupid feature but it exists on RTX 3000), and that's just what we know about and exists right now
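To make the "is this line intersecting those triangles" point concrete: in software, the standard form of that query is the Möller–Trumbore test. A rough Python sketch of the single leaf-level operation the fixed-function hardware accelerates (as the anon above says, real RT units also handle ray-box tests and the BVH traversal around it):

```python
# Möller–Trumbore ray/triangle intersection: a software sketch of the one
# query RT hardware answers in fixed function, millions of times per frame.

def ray_triangle(orig, dirn, v0, v1, v2, eps=1e-9):
    """Return the hit distance t along the ray, or None on a miss."""
    sub = lambda a, b: (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    dot = lambda a, b: a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    cross = lambda a, b: (a[1] * b[2] - a[2] * b[1],
                          a[2] * b[0] - a[0] * b[2],
                          a[0] * b[1] - a[1] * b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)   # triangle edge vectors
    h = cross(dirn, e2)
    a = dot(e1, h)
    if abs(a) < eps:                    # ray parallel to triangle plane
        return None
    f = 1.0 / a
    s = sub(orig, v0)
    u = f * dot(s, h)                   # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(dirn, q)                # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)                  # distance along the ray
    return t if t > eps else None

# Ray pointing down +z from below the triangle hits it one unit away.
print(ray_triangle((0, 0, -1), (0, 0, 1), (-1, -1, 0), (1, -1, 0), (0, 1, 0)))  # 1.0
```

A GPU without RT cores runs the equivalent of this on its shader ALUs; dedicated hardware answers it in a handful of cycles per ray, which is where the rendering-speed gap comes from.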

  5. 4 weeks ago
    Anonymous

    >releasing two new laptops in 2022 that can only have 1 external display.

    • 4 weeks ago
      Anonymous

      >pretending laptops are relevant when we have desktops and tablets
      laptop is a poorfag compromise

      • 4 weeks ago
        Anonymous

        >LULZ supporting tablets

        Haven’t been here in years and holy shit you homosexuals went full retard

        • 4 weeks ago
          Anonymous

Most likely immigrants from other boards. Give it a week, happens every apple event.

        • 4 weeks ago
          Anonymous

          >Haven’t been here in years
          And it shows

          • 4 weeks ago
            Anonymous

I think I’m better for it if being here means you turn into an apple dick sucking tablet user.

            • 4 weeks ago
              Anonymous

              nothing wrong with sucking dicks

              • 4 weeks ago
                Anonymous

                Agreed but sucking corporate dick is bad.

              • 4 weeks ago
                Anonymous

                Sucking the dick of people that don't care about you is bad

  6. 4 weeks ago
    Anonymous

    >no AV-1 decoding
>but 8K H.264, H.265 and ProRes encode and decode
    How can apple say with a straight face they are trying to relentlessly pursue efficient performance while doing this? What does a sub-1440p ultrabook need the ability to encode and decode 8K for? But I can very easily give a good reason to have HW decode for AV-1, pic related

    also no ray tracing HW despite metal having ray tracing support for years now, what the fuck?

    all in all there's a lot to like but also a lot of "what the fuck were they smoking?"

    • 4 weeks ago
      Anonymous

>also no ray tracing HW despite metal having ray tracing support for years now, what the fuck?
>all in all there's a lot to like but also a lot of "what the fuck were they smoking?"
      Not defending apple because honestly I don't give a fuck about tech companies but could apple have not included their own tensor cores because they're inefficient performance per watt? Don't nvidia chips chew through watts?

      • 4 weeks ago
        Anonymous

I think this is reasonable, unlike only having one external display.

      • 4 weeks ago
        Anonymous

        >but could apple have not included their own tensor cores because they're inefficient performance per watt? Don't nvidia chips chew through watts?
        Apple DOES include their own "tensor cores", and they have for a while. They call it the "Neural Engine"
        Nvidia's performance/watt woes are mostly from using GDDR6X and their aggressive boosting behavior

      • 4 weeks ago
        Anonymous

        >Don't nvidia chips chew through watts?
Yes, but you're getting just as much raw performance out of it. Apple's M1 Max only matches a 3080 in Adobe Premiere, and that's not entirely because of GPU performance.

      • 4 weeks ago
        Anonymous

        NVIDIA's core is actually quite efficient they just overvolt the shit out of it. Same with RDNA2. The Steam Deck proved that. RDNA2's perf/watt is actually pretty insane.

  7. 4 weeks ago
    Anonymous

    Modest upgrade over the original M1.

    Very Modest.

    1080p webcams finally.

    Still limited to 2 displays total.
    RAM can now go to 24GB instead of the original ceiling of 16GB.

    It's criminal that an 8GB option still exists and is the default.
    It's criminal that you still only get ONE external display.
    24GB RAM ceiling is a joke. 32GB should have been an option.
    You can't convince me that a Pro laptop shouldn't be able to have 32GB of RAM. In 2022.

    The new MacBook Pro 13 is a lie. It is not a Pro. They should have renamed it to simply MacBook. Or the MacBook Air Fans Edition. Because that is what it is: a MacBook Air with Fans.

    They didn't even change or update the industrial design of the Pro. The new design of the Air is at least a change. Those borders around the Pro screen are criminally large compared to the Air. Jony Ive's ghost still haunts Apple it seems.

    Like the original base M1, the base M2 is a hobbled chip that belongs in an iPad. Stay away.

Wait for the upgraded M2-derived chips, like M2 Pro/Max/Ultra. Or don't wait and just buy an M1 Pro/Max/Ultra machine that has features you need like larger RAM or more displays.

    • 4 weeks ago
      Anonymous

      Base M chips are Apple’s Celeron, the real deal will be M2 Pro and beyond

      • 4 weeks ago
        Anonymous

        no, more like
        base M = i3
        Pro = i5
        Max = i7
        Ultra = i9

    • 4 weeks ago
      Anonymous

      >Can't support more than 1 external
      >8k h.264 for some reason
      >No raytracing

      In any case I'm glad I didn't wait for m2. 19% cpu speed improvement after two years really isn't all that, I don't honestly even really care that much about CPU speed anymore. It has no appreciable impact on my quality of experience or my productivity. Honestly computers have just gotten to the point I really just have no need to care about the latest and greatest.

      Seems mediocre as expected.

      I honestly wasn't expecting them to break the 16gb limit and 24gb is quite a bit for an SoC that can only power one external (less VRAM usage). They probably did it for the sake of 8k encoding. I think by the way that's pretty much a confirmation that the iPhone 14 pro records in 8k.

      Honestly I think it makes nerds angrier than it actually matters, incredibly few users can significantly benefit from >24gb memory and they also sell the MBP. The chip is fine at what it's trying to be, 24gb memory limit is the least of its problems compared to being ARM or supporting a single fucking external.

      >What does a sub-1440p ultrabook need the ability to encode and decode 8K for?

iPhone 14 Pro

>Slower than any industry grade X86 machine. Doesn't run any industry standard software. Congratulations applefags, if your shit wasn't linux tier before it is now.
>As always, anyone with work to do uses a windows machine. Apple is a lifestyle company and nothing more. No one has a real reason to use apple shit over windows (the official OS of having a job) unless forced to by iOS development requirements.

I have no idea what "Industry grade" means and the machine is as fast as anything on the market at its intended purpose, browsing Facebook. This is a consumer computer, I'm buying one for my stepfather. Technically it's marginally faster, but not to a degree that anybody sane would care. I have a machine filled with "industry standard" software and I have yet to run into any of my tools not working.

      You're being vague, unspecific, and lazy because you're more concerned about blowing out crapple than spending 5 seconds critically analysing the system.

>This is M1 Plus. Not M2. Same 5nm process as iPhone 12>13, 15% better performance or same performance and 15% more efficient, and they took the performance gains with the mac and slapped on an additional memory controller to allow for more ram, iPhone took the efficiency gains and got two day battery life. This advanced process allowed for more gpu power but not substantially more gpu power. These MacBook's will most likely also have worse battery life than the m1 machines they're replacing. The latest rumors look to be true: M2 actually debuts next year via M2 Pro/M2 Max/M2 Ultra with an advanced 3nm process in the new MacBook pro, iMac pro, and mac mini/studio/pro. The original apple silicon rumors have been 99% correct, and M2 was supposed to have a 12 core gpu, allowing for dual display support. Now another year will go by where you cannot buy a mac for less than $2K that out of the box supports more than 1 display. A crime, thank god displaylink exists.

      True, this is kind of the shit generation between the m1 and the m3.

      • 4 weeks ago
        Anonymous

        $1200 minimum to browse facebook? get a grip

  8. 4 weeks ago
    ukojpi

    Still only 1 external display

  9. 4 weeks ago
    Anonymous

    Is it armv9?

    • 4 weeks ago
      Anonymous

      Don't know but it's SUuuuuUuuuPeeeEerrr efficient for those intense web browsing sessions.

      • 4 weeks ago
        Anonymous

        until you try to watch youtube or netflix and get buttfucked with software AV-1 decode

        • 4 weeks ago
          Anonymous

          Na it's alright. AV1 runs fine on a dual core i5 skylake.

  10. 4 weeks ago
    Anonymous

    >still not socketable
    >still can't upgrade ram or storage
    >still no pci lanes
    it does not matter how it performs, it's garbage totalitarian unusable trash

    • 4 weeks ago
      Anonymous

>Upgrade
But then apple can't overcharge for the higher tier models
>Replace parts
Apple already has this problem with some well made generations of MBPs and iPhones working 12 years later. They have to blacklist them in OS updates and installers to stop midwits and retards from using them as long as they work and robbing apple of sales.

  11. 4 weeks ago
    Anonymous

    I’ll wait for the benchmarks but the 20% bump in price and the base model only having 8GB ram is effectively pushing me toward a base M1 Pro

  12. 4 weeks ago
    Anonymous

    I dread a Mac mini with a base M2.

    Still not enough RAM.
    Still only 2 displays.

    • 4 weeks ago
      Anonymous

      >I dread a Mac mini with a base M2.
      >Still not enough RAM.
      >Still only 2 displays.

      For that matter an iMac with a base M2 is also a nightmare.

      2024 is now looking like the year when Apple might un-hobble the base Apple Silicon. M3 might finally give 3 displays and 32GB of RAM.

  13. 4 weeks ago
    Anonymous

    If this pattern keeps up:
    Tick-Tock evolution might exist for Apple Silicon.
    2020 Tick. Original M1
    2021 Tock. Evolved M1 (Pro/Max/Ultra)
    2022 Tick. Original M2.

  14. 4 weeks ago
    Anonymous

    [deleted post]

    >not ready for desktop
    huurrrffffdurrrrfff

  15. 4 weeks ago
    Anonymous

    no

    >Apple’s new M2 processor is mostly an update to the M1, rather than a successor. That mainly comes down to the manufacturing process M2 is built on. Chipmaker TSMC is behind manufacturing for the M1 and M2, and Apple says the M2 comes with a “second-generation 5nm” node.

    >For TSMC, which is by far the world’s largest semiconductor company, a full node improvement is what you’re looking for between CPU generations. That means shrinking the manufacturing process to fit more transistors on the chip while improving efficiency. The problem is that TSMC delayed its next-gen node in 2021, and it appeared to be a prime candidate for Apple’s M2.

    >The M1 is built on TSMC’s N5 node, and the M2 will almost certainly use the N5P node. The true next-gen node is N3, which is a 3nm process that delivers up to 15% higher performance and 30% lower power draw versus N5. By comparison, N5P is a 7% improvement with 15% less power draw.

    https://www.digitaltrends.com/computing/apple-m2-not-next-gen/
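Putting the article's two claims side by side: if you read each one as "+X% performance at -Y% power" relative to plain N5, the implied perf-per-watt multiplier is (1+X)/(1-Y). Foundry marketing usually means either-or (same performance at lower power, OR higher performance at the same power), so treat this as an optimistic upper bound, not vendor data; the only inputs are the numbers quoted above.

```python
# Perf-per-watt implied by the figures quoted above, relative to plain N5:
#   N5P (what M2 shipped on): +7% performance, -15% power draw
#   N3  (the delayed node):   +15% performance, -30% power draw
# Illustrative arithmetic only, assuming both claims held at once.

def perf_per_watt_gain(perf_up, power_down):
    """Multiplier vs the N5 baseline."""
    return (1 + perf_up) / (1 - power_down)

n5p = perf_per_watt_gain(0.07, 0.15)
n3 = perf_per_watt_gain(0.15, 0.30)
print(f"N5P: {n5p:.2f}x  N3: {n3:.2f}x")
```

Even on this generous reading, N5P lands around a 1.26x efficiency bump versus roughly 1.64x for N3, which is the gap the "M1 Plus" complaints in this thread are pointing at.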

  16. 4 weeks ago
    Anonymous

Didn't 3DO make the M2 back in the 90s?

  17. 4 weeks ago
    Anonymous

Depends on how well it does on the GTA V benchmark. M1 didn't do too well.

    • 4 weeks ago
      Anonymous

      >running x86 emulation on top of windows emulation on a 35W part vs a 500W+ desktop
      nvidiots everyone

      • 4 weeks ago
        Anonymous

The point is x86 emulation performance for things that don't play nice with Rosetta 2 (i.e. mostly everything). You can think of GTA V as a baseline to compare everything to.

        • 4 weeks ago
          Anonymous

          People fucking rub their cocks fucking raw to GTAV benchmarks as if Apple users spend all their time playing Video games, things Apple users are well known for doing, buying their Macs to play video games on and all.

          The bigger issue is that rosetta is so ungodly buggy that having a SINGLE rosetta II process can fuck the audio subsystem into disrepair before rebooting. Of course LULZ doesn't know what the fuck they're talking about so they don't talk about actual problems Macs have, they just think that Apple users use high performance software that isn't apple silicon native all day and drool onto their keyboards going "WHY THIS SLOWW??!!" and don't have a realistic impression of who uses these fucking computers. Nobody is playing GTAV on these fucking computers, they're running some shitware driver or enterprise crapware that is rosetta only and it causes the Mac to become so fucking unstable you have no choice but to reboot to resolve the massive memory leak or whatever the fuck happens because Apple is a piece of shit fucking shit company.

          • 4 weeks ago
            Anonymous

            It's not the game, it's the x86 emulation performance. You DO realize you can't just slap the rosetta 2 band aid to everything and the majority of devs would rather drag their nutsacks over a mile of broken glass before porting their x86 software to arm when x86 emulation exists, right?

            • 4 weeks ago
              Anonymous

              Yes numbnuts, I understand your obsession
              "emulation is SSLOWWWW! MAChomosexualS BTFFOO!!!!"

              You guys have been repeating it endlessly for two fucking years and literally cannot comprehend anybody, ANYBODY who was running any application remotely performance sensitive either waited for the apple silicon port before buying the Mac or just bought Windows because people aren't nearly as retarded as LULZ users who project their own retardation onto other computer users. What a profound point numbnuts, you shouldn't buy a computer to run performance sensitive software if you're going to be emulating it all the time, no fucking shit!

              Anybody who actually bought crapple who doesn't have severe brain damage is running basically all native software all the fucking time and maybe like 5 fucking rosetta 2 processes in the background that use like 0.1% CPU but still manage to destroy your computer somehow. The majority of devs would rather drag their nutsacks over a mile of broken glass instead of port?

              Look at the shit that has been ported.
              https://isapplesiliconready.com/for/productivity
              IDEs? Mostly ported.
              Music production? Mostly ported.
              Video production? Mostly ported.
              Photo editing? Mostly ported.
              Random productivity tools? Mostly ported, and many of them run in safari anyways, and they're not performance sensitive.

              I'm using exactly 5 processes that are not apple silicon native, 2 of them were ported already but I'm too lazy to reinstall to update, the remaining ones are Logitech shitware they will never update, some enterprise cloud sync shitware I am actively working to replace since it's redundant and terrible, and Toggl track. These cumulatively use 1% of the CPU. I swear to god reading LULZ you would think my computer is fucking bursting into flame fucking straining under the load of the massive amounts of rosetta emulated heavy duty software, the porting effort having been a complete and utter failure.

              • 4 weeks ago
                Anonymous

                A lot of those ports are genuinely SLOWER than x86 emulation.

                >t. returned my m1 air a week later

              • 4 weeks ago
                Anonymous

                I don't see GTA IV let alone GTA V anywhere in there.

  18. 4 weeks ago
    Anonymous

    Slower than any industry grade X86 machine. Doesn't run any industry standard software. Congratulations applefags, if your shit wasn't linux tier before it is now.

    As always, anyone with work to do uses a windows machine. Apple is a lifestyle company and nothing more. No one has a real reason to use apple shit over windows (the official OS of having a job) unless forced to by iOS development requirements.

  19. 4 weeks ago
    Anonymous

This is M1 Plus. Not M2. Same 5nm process as iPhone 12>13, 15% better performance or same performance and 15% more efficient, and they took the performance gains with the mac and slapped on an additional memory controller to allow for more ram, iPhone took the efficiency gains and got two day battery life. This advanced process allowed for more gpu power but not substantially more gpu power. These MacBooks will most likely also have worse battery life than the m1 machines they're replacing. The latest rumors look to be true: M2 actually debuts next year via M2 Pro/M2 Max/M2 Ultra with an advanced 3nm process in the new MacBook pro, iMac pro, and mac mini/studio/pro. The original apple silicon rumors have been 99% correct, and M2 was supposed to have a 12 core gpu, allowing for dual display support. Now another year will go by where you cannot buy a mac for less than $2K that out of the box supports more than 1 display. A crime, thank god displaylink exists.

  20. 4 weeks ago
    Anonymous

    >20 HOUR BATTERY LIFE WATCHING VIDEOS, WINDOWS BTFO*
    *only when playing VP9 since AV1 isn't hardware accelerated

  21. 4 weeks ago
    Anonymous

    Aren't they dropping Rosetta 2 support soon?

  22. 4 weeks ago
    Anonymous

>is M2 the best out there?
    yes if you can get by with a raspberry pi with a bunch of specialized coprocessors built in
