What comes next after reaching the physical limits of silicon?

  1. 4 weeks ago
    Anonymous

    Apple's M2 Silicon.

    • 4 weeks ago
      Anonymous

      autist

  2. 4 weeks ago
    Anonymous

    Specialized chips obviously.
    Already seeing it in raytracing, ai, stuff like that. Also software optimizations.

    • 4 weeks ago
      Anonymous

      I do think we may see FPGA cards in addition to our CPUs and GPUs. What CUDA does for compute, FPGAs can do to CUDA.

      Obviously we need a lot of software before FPGA cards for consumers become sensible, but the benefits could be huge.

      • 4 weeks ago
        Anonymous

        Yeah, I see the potential for FPGA units in consumer use, but not for a long time. It's still far from being economical or applicable to daily use.

        • 4 weeks ago
          Anonymous

          Actual FPGA chips aren't that expensive. I'm having a harder time coming up with applications though. The best I can think of is stuff like decoding video formats that are invented after the chip is made. But GPGPU seems more applicable for that.

      • 4 weeks ago
        Anonymous

        >Yeah I see the potential for FPGA units in consumer use but not for a long time

        FPGAs are dead in the water for general purpose compute with how absolute shit and vendor/version specific the tooling is.

        • 4 weeks ago
          Anonymous

          >Yeah I see the potential for FPGA units in consumer use but not for a long time

          Yeah, FPGAs today are basically where GPUs were in the mid 90s. They existed, they kicked ass, but you had to write your software specifically for one model. Corporations did some pretty impressive stuff with the GPUs of the time, just like corporations today do some pretty impressive stuff with FPGAs (Microsoft replaced entire racks of Bing's ranking servers with FPGA cards).
          With some standardisation and common libraries, the way OpenGL and DirectX did for GPUs, I could see them becoming more viable.

          • 4 weeks ago
            Anonymous

            Spot on. They have potential but we're 10 or 20 years out.

      • 4 weeks ago
        Anonymous

        >FPGA cards
        I hope not.
        I wouldn't mind an FPGA socket on the MB, but in my opinion the expansion card model is pretty dated; I'd rather have a GPU socket too. You could have a new motherboard standard with the CPU socket on top, the GPU socket in the middle, and the FPGA socket on the bottom, and end up with much slimmer and more efficient computers than we have today, with effective server-like front-intake/rear-exhaust ventilation using standard heatsinks. The benefit of ATX is easy expansion with cards, but the only cards 99% of people use today are GPUs anyway. It made more sense back when a computer wasn't usable with fewer than five cards. Better to move that to the MB. Most don't even use SATA any more; two M.2s and a high-speed ethernet port are enough thanks to NAS.

      • 4 weeks ago
        Anonymous

        >FPGA cards
        It's doubtful that another AIB besides GPUs will ever enter the consumer market.
        There has, however, been a push towards making some parts of the CPU reconfigurable (Intel CSA) to reduce power usage and to speed up some algorithms.
        Besides that, FPGAs are being coupled with memory in servers for in-situ processing, no discrete board needed.

      • 4 weeks ago
        Anonymous

        I dunno what FPGAs will do considering the MiSTer can only really emulate a PSX/Saturn.

        • 4 weeks ago
          Anonymous

          FPGAs can be programmed to mimic any kind of accelerator circuit, which means the most common ones like encryption and raytracing will remain on our CPUs and GPUs, while more niche ones can go on the FPGA. Horrendously inefficient code can be AI-optimised into a weird circuit, and FPGAs can be used to implement it immediately. For example: "Hey Siri, find all my photos of clowns" uses the AI accelerator on your GPU to process the natural language, and an FPGA circuit to classify and filter out the clowns in your images extremely rapidly. FPGAs are CUDA on steroids.
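
          The "mimic any circuit" claim comes down to programmable lookup tables (LUTs): an FPGA cell stores a truth table and can therefore be configured into any small boolean function, and fabrics of them compose into arbitrary logic. A toy Python sketch of that idea (illustrative only, not real HDL):

```python
# Toy model of an FPGA lookup table (LUT): any n-input boolean
# function is just a 2^n-entry truth table loaded at configuration time.

def make_lut(truth_table):
    """Return a 'gate' that evaluates its input bits against a configured truth table."""
    def gate(*inputs):
        index = 0
        for bit in inputs:          # pack the input bits into a table index
            index = (index << 1) | bit
        return truth_table[index]
    return gate

# "Reconfigure" the same hardware cell into two different circuits:
xor2 = make_lut([0, 1, 1, 0])       # 2-input XOR
and3 = make_lut([0] * 7 + [1])      # 3-input AND

print(xor2(1, 0))     # 1
print(and3(1, 1, 1))  # 1
```

          A real FPGA does the same thing in silicon, with thousands of LUTs plus a programmable routing fabric between them, which is why one chip can become a video decoder today and an image classifier tomorrow.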

  3. 4 weeks ago
    Anonymous

    they’re going to encode software functionality directly into the hardware, completely undoing the philosophical progress we’ve made towards general computing. rather than abandoning the transistor model and taking heed of their own propagandism that “computer science isn’t about computers”, they’ll continue to stack SIMD processors on SIMD processors until every liter of oil and every kg of coal in NA is being used to compute dot products.

    • 4 weeks ago
      Anonymous

      this... is so true

  4. 4 weeks ago
    Anonymous

    biochips

    • 4 weeks ago
      Anonymous

      How did they cut it open without accidentally cutting into the brain? Also, no iron filings from the cutting either.

  5. 4 weeks ago
    Anonymous

    Maybe MESO?

  6. 4 weeks ago
    Anonymous

    no more electron apps

    • 4 weeks ago
      Anonymous

      sorry. Baseddevs are everywhere and they are too addicted.

  7. 4 weeks ago
    Anonymous

    Huge-scale cloud computing farms and 5G will remove the need for smaller, more powerful silicon

    • 4 weeks ago
      Anonymous

      Only correct answer in this thread. Everyone else is retarded

      • 4 weeks ago
        Anonymous

        There will always be applications for owned hardware

        • 4 weeks ago
          Anonymous

          Like what? I want to believe but I don't see it.

          • 4 weeks ago
            Anonymous

            Video games immediately come to mind
            Also on the enterprise scale, companies just aren't going to want to depend on a provider for everything. They just aren't. If that were the case we wouldn't still have salesmen, accountants, legal teams, etc. They'd all be contracted out

            • 4 weeks ago
              Anonymous

              >Video games immediately come to mind
              Idiots aren't going to care about the horrible latency.
              One thing that brings me hope though is how utterly unprofitable these services are. I believe Spotify cut the revenue of the music industry in half.

              • 4 weeks ago
                Anonymous

                >Idiots aren't going to care about the horrible latency
                Yes, this is actually why stadia managed to completely shut out Sony and Microsoft and why Google is the biggest name in gaming

              • 4 weeks ago
                Anonymous

                RDR2 on PS4 already has something like 350 milliseconds of lag. Stadia failed because it's a shit service; the real threat is things like Game Pass.
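
                The latency argument can be made concrete with a back-of-envelope budget: streaming adds encode, network, and decode stages on top of everything a local machine already does. All numbers below are illustrative assumptions for the arithmetic, not measurements of any actual service:

```python
# Rough, illustrative input-to-photon latency budget (milliseconds).
# Every figure here is an assumption, not a measurement.

local_ms = {
    "input sampling": 8,
    "game simulation + render (60 fps)": 17,
    "display scanout": 8,
}

# Cloud gaming pays everything local gaming does, plus streaming overhead:
cloud_ms = dict(local_ms)
cloud_ms.update({
    "video encode": 8,
    "network round trip": 30,
    "video decode": 5,
})

print("local:", sum(local_ms.values()), "ms")   # 33 ms
print("cloud:", sum(cloud_ms.values()), "ms")   # 76 ms
```

                With these assumed figures the stream more than doubles the latency, and unlike encode/decode time, the network round trip has a hard floor set by distance to the datacenter.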

            • 4 weeks ago
              Anonymous

              >companies just aren't going to want to depend on a provider for everything

            • 4 weeks ago
              Anonymous

              >he says as every company switches over to cloud computing
              It happened at my megacorporation job. It'll happen to yours.

              • 4 weeks ago
                Anonymous

                What does your company do?

              • 4 weeks ago
                Anonymous

                Retail. All computer units in over 5000 stores nationwide were forced to switch to cloud point-of-sale units overnight. Corporate now micromanages every store's time sheets and everything else, since it's all tracked by them now. Formerly it was all up to the discretion of an individual franchise.
                Also they fucking suck but that goes without saying. We went like 2 weeks with no ability to print anything at all.

              • 4 weeks ago
                Anonymous

                That must have been a hell of an on-call week.

    • 4 weeks ago
      Anonymous

      >input latency and bitstarved video streams are the future
      i hate the antichrist

  8. 4 weeks ago
    Anonymous

    your mom

  9. 4 weeks ago
    Anonymous

    molecular processors
    optical/photonic processors

  10. 4 weeks ago
    Anonymous

    don't worry, they'll find a way to make it faster while the code runs less efficiently, like it has ever since the 2000s

  11. 4 weeks ago
    Anonymous

    silicon 2
    or maybe silicon B

  12. 4 weeks ago
    Anonymous

    Silicone

  13. 4 weeks ago
    Anonymous

    Step 1: Instead of using electricity between components of a CPU, use light. Use light switching rather than transistors.
    Step 2: Remove the solids in the CPU, since light travels about 1.5x slower in a physical medium than in a vacuum. This also avoids the excess heat.
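
    The 1.5x figure in Step 2 is just the refractive index: in a medium with index n, light propagates at c/n, and typical solid optical materials have n ≈ 1.5. A quick sanity check of what that costs across a hypothetical 2 cm die (illustrative numbers):

```python
# Back-of-envelope: signal propagation time across a 2 cm die,
# light in vacuum vs light in a solid medium with refractive index n ≈ 1.5.

C = 299_792_458      # speed of light in vacuum, m/s
die = 0.02           # die width: 2 cm, in metres (assumed for illustration)
n = 1.5              # typical refractive index of a solid optical medium

t_vacuum = die / C          # ≈ 66.7 ps
t_medium = die / (C / n)    # ≈ 100.1 ps

print(f"vacuum: {t_vacuum * 1e12:.1f} ps")
print(f"medium: {t_medium * 1e12:.1f} ps")
```

    So the vacuum saves tens of picoseconds per die crossing at this scale; whether that justifies evacuating the package is another question.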

  14. 4 weeks ago
    Gon

    Rocks.

  15. 4 weeks ago
    Anonymous

    a fuckton of optimisation in both hardware and software
    and bloat, a lot of bloat
    hopefully the need for a good backend for all the bloat will lead to a rise of open source projects, but I wouldn't count on that
