GRT bros...it's over...

The Graph is hardly generating any fees...even projects you've never heard of are doing better...it's so fucking over

  1. 3 weeks ago
    Anonymous

    i was betting on 2025 being the year this finally makes some fees, but it's getting really hard to hang in here. i have only 30k delegated. i wish there were still anons here who knew how to do the math. they could give us the predictions. i can't do those calculations with my brain. we need quality hopium

    • 3 weeks ago
      Anonymous

      why do you need someone to do the calcs every week? they have already been done for a 100k stack and 350 trillion(?) queries, idk the exact numbers anymore, but the numbers in that post will be hit between now and 2025. Just find the post and check the numbers there if u need hopium. I got roughly the same stack as you and wish i had enough to get more, but im accumulating other stuff first right now.

      • 3 weeks ago
        Anonymous

        >i was betting on 2025 being the year this finally makes some fees, but it's getting really hard to hang in here. i have only 30k delegated. i wish there were still anons here who knew how to do the math. they could give us the predictions. i can't do those calculations with my brain. we need quality hopium

        You guys are going to make like twenty cents a year. This was a scam.

    • 3 weeks ago
      Anonymous

      If you check Framework Ventures' wallet movements, they have been buying grt daily for months now. I can't figure out why, any insights?

      • 3 weeks ago
        Anonymous

        Firehose / Substreams powered subgraphs.

        • 3 weeks ago
          Anonymous

          What the fuck is that? How bullish is it for the graph?

  2. 3 weeks ago
    Anonymous

    Yea its a dead coin now. lol. 🙁

  3. 3 weeks ago
    Anonymous

    It's over

  4. 3 weeks ago
    Anonymous

    It's over
    t. 200k delegated
    Thank you Tegan

  5. 3 weeks ago
    Anonymous

    Wasn't it at like $200 last year? That's good growth. Might buy a bag once it goes below 1c.

  6. 3 weeks ago
    Anonymous

    Tegan married, Yaniv not gay. So much has changed since TheGraph started!
    >being thankful to be part of that!

  7. 3 weeks ago
    Anonymous

    I don't have any investment since I'm all in LINK like a true biztard, but I can tell you the last company I worked for was using the graph. We needed more powerful indexing for NFTs and built a better version for our needs within a month. A year later it still didn't work perfectly, but it was better than the graph. That might not be true for other needs or use cases, but it's enough to make me think that, whatever they've built, most companies will probably be better off building their own specialized indexer, since it's relatively cheap and simple.

    • 3 weeks ago
      Anonymous

      I know you're just fudding, but running an ethereum archive node is not simple.
      Big established projects might get away with running one, and then having someone dedicated to making sure it works, and then developing, maintaining and running their own indexer on top of it and hoping it works.
      But that shit costs a lot of money, and no way in hell does "just write your own indexer bro" hold any water for projects that target ethereum, especially for smaller teams/projects... and I'm not going to delve into verifiable indexing, aka making sure your shit actually reflects what's on chain, even though that's also pretty fucking important in a crypto project: you're not going to get it if you run your own indexer, you will eventually fuck up along the way, and it will cost the project more than if you'd been using the graph from the start... and you were probably paying waaaaaay more anyway, because you were spending all your money on running a fucking archive node and all that entails.
      So yeah. Pretty comfy overall, thanks for reminding me why I bought this piece of shit.

      >What the fuck is that? How bullish is it for the graph?

      Speed was a major concern, but firehose/substreams tech gave it a major boost, so you can index all contracts in real time without worrying about lagging behind during intense activity. Good luck developing something like that on your own.

      • 3 weeks ago
        Anonymous

        Why are the fees so low tho? I know the hosted service hasn't ended but I would think the decentralized mainnet would be bringing in more fees than that. Also do you think once the hosted service ends GRT will be bringing millions in fees per day?

        • 3 weeks ago
          Anonymous

          I expect it will grow considerably after the shutdown, but how much I can only guess.
          I am betting on the very long term for things to go critical.
          Consider that most web3 tutorials start with "create a subgraph". If Web3 becomes a thing and subgraphs become THE development standard, then it would mean you'd earn money from each HTTP request of every user, as every click can trigger a query, possibly even thousands of queries. How would that reflect on the price? Plus the network effect of "everyone uses it, therefore I am forced to use it too".
          But there's lots of ifs in the way, especially "if someone doesn't steal their lunch", as the anon who posted
          >It's gonna be rekt by space and time soon anyway, just abandon ship.
          said.
          I'm just a bagholder, and this is nothing but copium and plenty of hopium.
          It's actually over.

      • 3 weeks ago
        Anonymous

        My employer was very interested in TheGraph a year or so ago, since some of the cutting-edge research announced at Graph Day would have supercharged what we were building at the time.

        There have been no further public mentions of that technology since Graph Day, and we've since changed course, bulked up in-house research to fill the gaps, and are shipping our own in-house solution. It doesn't directly compete with the core of what TheGraph does, but it does potentially encroach on some use cases that TheGraph has flirted with but not aggressively pursued R&D for, in what is a massive emerging market in the sector that would have been ideal territory for TheGraph's dominance. It also demonstrates how the markets and businesses adapt to complex needs when the established players don't deliver in a timely manner.

        With that said, the grande armee of archive nodes is what really sets The Graph apart. Management wants our technology to play well with full nodes, but the in-house EVM experts know damn well that within a decentralized context, we're going to need robust, credibly neutral, high-availability, multichain archive nodes for some of the products and services we're shipping. The Graph's EIP-4444 initiative could potentially fill the gaps here down the road.

        The Graph can still win in scalable decentralized offchain data aggregation, which the world at large desperately needs provided it's developer-friendly and consumer-friendly, and it still has a shot at being first to the punch on the value subgraph composability provides, given its computation and storage firepower, but it really needs to move faster in the R&D department.

      • 3 weeks ago
        Anonymous

        >especially for smaller teams/projects
        Smaller projects will simply run a lightweight client w/ pruning turned on and accept the tradeoff of not even needing to query historical data beyond some date, and plug any gaps with something like google BigQuery (who hosts a full archive node). Otherwise just generate the indexed archive data through a bigquery process as an init, then run a lightweight node locally. There is absolutely no need to run an archive node once that data has been indexed.
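
        If you've never touched it, the BigQuery side really is a few lines. A rough sketch, assuming the public bigquery-public-data.crypto_ethereum dataset and the google-cloud-bigquery client; the contract address and cutoff date are placeholders, and the rows would feed whatever local index sits next to the pruned node:

        ```python
        # Sketch of "backfill history from BigQuery, serve recent data from a pruned node".
        # Assumes: `pip install google-cloud-bigquery`, GCP credentials configured, and the
        # public bigquery-public-data.crypto_ethereum dataset. CONTRACT and CUTOFF are
        # hypothetical placeholders, not anything from this thread.
        from datetime import datetime, timezone

        from google.cloud import bigquery

        CONTRACT = "0x0000000000000000000000000000000000000000"  # placeholder target contract
        CUTOFF = datetime(2023, 1, 1, tzinfo=timezone.utc)       # history older than this comes from BigQuery


        def backfill_transactions(contract: str, cutoff: datetime):
            """Yield historical transactions sent to `contract` before `cutoff`."""
            client = bigquery.Client()
            sql = """
                SELECT block_number, block_timestamp, from_address, to_address, value
                FROM `bigquery-public-data.crypto_ethereum.transactions`
                WHERE to_address = @contract
                  AND block_timestamp < @cutoff
                ORDER BY block_number
            """
            job = client.query(
                sql,
                job_config=bigquery.QueryJobConfig(
                    query_parameters=[
                        bigquery.ScalarQueryParameter("contract", "STRING", contract),
                        bigquery.ScalarQueryParameter("cutoff", "TIMESTAMP", cutoff),
                    ]
                ),
            )
            for row in job.result():
                yield dict(row)  # hand these to whatever local index/db the pruned node is paired with


        if __name__ == "__main__":
            for tx in backfill_transactions(CONTRACT, CUTOFF):
                print(tx["block_number"], tx["from_address"], "->", tx["to_address"], tx["value"])
        ```
        Past the initial backfill, the pruned node handles anything newer than the cutoff.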

        • 3 weeks ago
          Anonymous

          >I was pricing different crypto APIs last week to get an idea of how much it would cost to power a crypto analytics website. One of them was GraphQL, as far as I could tell they offered a single fucking endpoint for any given application with fuck all data. Absolutely useless, I would sooner run and query my own nodes locally than pay for such a shit service

          Ummmmm can someone refute these points??? Fuck it's over isn't it

        • 3 weeks ago
          Anonymous

          >light client
          Ethereum's dirty little secret is that nobody likes peering with light clients. You'll be waiting hours for some good Samaritan to start providing you data to sync.

          The full node is the sweet spot.

          >Smaller projects will simply run a lightweight client w/ pruning turned on and accept the tradeoff of not even needing to query historical data beyond some date
          Full nodes drop historical state data after 25-51 minutes depending on how their client build is configured. You really can't do anything with a light client, and they're also inherently too economically vulnerable to be a source of truth for any large protocol.

          >Otherwise just generate the indexed archive data through a bigquery process as an init, then run a lightweight node locally. There is absolutely no need to run an archive node once that data has been indexed.
          Doesn't that completely defeat the purpose of Web3? You're not just relying on a centralized data provider, but trusting it to aggregate data in a manner which has no plans to be verifiable.

          • 3 weeks ago
            Anonymous

            >Doesn't that completely defeat the purpose of Web3?
            I guess that just depends on what level of decentralization you are willing to call decentralized. Block explorers like etherscan are centralized, and much of the data they show is only possible through indexing. The most efficient way of doing almost anything at scale is going to be centrally managed execution, indexing included. Mining pools and liquid staking services centralize voting power. The trend toward centralization is a natural result of pursuing efficiency. Decentralized structures will have a certain degree of redundancy, which trades efficiency for reliability, security, etc.

            Crypto brands itself as decentralized, and to an extent it is. But it only leans decentralized, and less so as time goes on, it seems, because designing something that is both highly decentralized and efficient at scale is so difficult it's practically a paradox. In most cases I'm not sure it matters, crypto included. Intent from a centralized entity to be decentralized is a moot point. What will actually make something highly decentralized and efficient is rapid adoption and development by a fuckton of unrelated people. That's the only way I can see it happening. Like globalized economies developing insanely complicated supply chains slowly over time: it took trillions of social interactions for people to reconfigure their behaviours in a way that made something so complex possible. Similar concept. What's it going to take for that to happen here? Crypto needs to be more than a gambling den. It needs to deliver on the benefits it has promised. Sorry for the tangent, but I'm pretty disillusioned after years of mere speculation and time spent (wasted?) "studying" crypto projects and what exactly their supposed breakthrough is.

            • 3 weeks ago
              Anonymous

              I am not gonna read all of that, buy PRQ instead

            • 3 weeks ago
              Anonymous

              You can rely on centralization for anything that is not mission critical and does not have externalities of the GIGO variety. For anything else, you need uptime, some degree of redundancy, and verifiability, never mind censorship resistance and security. Otherwise the reliability of what you're building is at the mercy of the next cloud service outage, arbitrary corporate decisions, targeting for shutdown due to the cultural/ideological climate, or data manipulation to serve a political or personal agenda. Traditional business could get away with it, but it still takes a massive toll on economic activity.

              Centralization in decentralized ecosystems is a tradeoff - participants trade robustness for reliable yield. But it's still a systemic risk, especially when your competitive edge over legacy industry is supposed to be minimizing exactly these risks. Centralization of voting power in PoS protocols lets well-funded malicious agents rig the governance decisions of the protocols themselves, as some of the culprits in the contagion of 2022 demonstrated. Decentralized protocols relying heavily on centralized infra end up losing all the benefits of decentralization and inheriting the censorship decisions of their sources.

              The pitfalls of decentralization largely stem from latency and massively redundant resource expenditures, but next generation protocol architectures seem to be aiming to use verifiability more extensively - including in execution - to reduce overhead of decentralization without sacrificing integrity through reliance on centralization.

              The breakthrough of Bitcoin was distributed version control of state. The breakthrough of Ethereum was distributed version control of Turing-complete operations on state. The breakthrough of oracles and indexers was distributed I/O and distributed operations on I/O.

              • 3 weeks ago
                Anonymous

                I'd also concur that the evolution of a complex and robust macro ecosystem makes or breaks this sector. As time goes on, I believe there will continue to be centralization, but that the ecosystem will develop more sophisticated risk management with those potential pain points in mind, and deploy more verifiability tech which will gradually supplant traditional centralization.

                You can feed end-users garbage on their dashboards, but anything happening onchain - which necessarily depends on state integrity - must strive for a pristine environment.

                But anyway, The Graph's original sin was not baking end-to-end verifiability into its protocol from day one. It was understandable: the circumstances in which it emerged, got funded, and developed meant there was a lot less liquidity at the time, only enough to fund an MVP with a more traditional form of governance / risk management, with stated plans to make up for the verifiability gaps later. But five years and hundreds of millions of dollars later, it seems like there should be more to show for it. Nevertheless, it's a master class in high-availability decentralization.

                You sound smart. Is 500k GRT enough to make it?

              • 3 weeks ago
                Anonymous

                Not even 1M. This shitcoin is not going above 10c ever again.

            • 3 weeks ago
              Anonymous

              I'd also concur that the evolution of a complex and robust macro ecosystem makes or breaks this sector. As time goes on, I believe there will continue to be centralization, but that the ecosystem will develop more sophisticated risk management with those potential pain points in mind, and deploy more verifiability tech which will gradually supplant traditional centralization.

              You can feed end-users garbage on their dashboards, but anything happening onchain - which necessarily depends on state integrity - must strive for a pristine environment.

              But anyway, The Graph's original sin was not baking end-to-end verifiability into its protocol from day one. It was understandable: the circumstances in which it emerged, got funded, and developed meant there was a lot less liquidity at the time, only enough to fund an MVP with a more traditional form of governance / risk management, with stated plans to make up for the verifiability gaps later. But five years and hundreds of millions of dollars later, it seems like there should be more to show for it. Nevertheless, it's a master class in high-availability decentralization.

    • 3 weeks ago
      Anonymous

      Companies can develop their own in-house solutions, and the fact that they have needs The Graph cannot meet right now hurts query fee potential, but those solutions will always be manifestations of the very problem The Graph intends to resolve: data silos.

      The Graph aims to be the perennial commons of public and permissionless data and all desired permutations of that data. No other project has really publicly embraced such an ambitious vision.

  8. 3 weeks ago
    Anonymous

    It's gonna be rekt by space and time soon anyway, just abandon ship.

  9. 3 weeks ago
    Anonymous

    Just use Parsiq. Stop obsessing over GRT. It's over

    • 3 weeks ago
      Anonymous

      it is somehow the only coin that is consistently doing worse than GRT in my shitfolio

    • 3 weeks ago
      Anonymous

      Parsiq is barely noticed by the prime movers in this sector. Too little manpower, too few resources, too little focus relative to that manpower and those resources, and a lot of its products have weak moats that do not warrant vendor lock-in. Also, the lion's share of its services is backed by a centralized company, with decentralization talk only as a concession to their exit liquidity, and it's fighting a continuous uphill battle against much larger and comparatively more versatile players on both the centralized and decentralized fronts.

      >My employer was very interested in TheGraph a year or so ago, since some of the cutting-edge research announced at Graph Day would have supercharged what we were building at the time.

      >There have been no further public mentions of that technology since Graph Day, and we've since changed course, bulked up in-house research to fill the gaps, and are shipping our own in-house solution. It doesn't directly compete with the core of what TheGraph does, but it does potentially encroach on some use cases that TheGraph has flirted with but not aggressively pursued R&D for, in what is a massive emerging market in the sector that would have been ideal territory for TheGraph's dominance. It also demonstrates how the markets and businesses adapt to complex needs when the established players don't deliver in a timely manner.

      >With that said, the grande armee of archive nodes is what really sets The Graph apart. Management wants our technology to play well with full nodes, but the in-house EVM experts know damn well that within a decentralized context, we're going to need robust, credibly neutral, high-availability, multichain archive nodes for some of the products and services we're shipping. The Graph's EIP-4444 initiative could potentially fill the gaps here down the road.

      >The Graph can still win in scalable decentralized offchain data aggregation, which the world at large desperately needs provided it's developer-friendly and consumer-friendly, and it still has a shot at being first to the punch on the value subgraph composability provides, given its computation and storage firepower, but it really needs to move faster in the R&D department.

      As far as specific teams go, StreamingFast and GraphOps need free rein to go crazy with their product development. Messari and Semiotic need more money thrown at them: Messari to vastly expand its subgraph development bandwidth, and Semiotic to expand its research team. Also, at this point, the board should just authorize Tegan to go on a reign of terror with the migration, which looks to be quietly unfolding internally.

      >I've been blowing my allowance on this, 758k grt tokens so far. How fucked am i?

      If Tegan's mainnet query volume target is hit, expect usd 22k-29k/yr at current hosted service utilization. If you're more inclined to bearish expectations, expect something like usd 11-12k/yr. Double that for what you might expect in a bull market year, assuming TheGraph has similar or greater developer traction to what it enjoyed at the end of 2021.

      • 3 weeks ago
        Anonymous

        And what is tegan's mainnet volume target? How many queries is this bitch expecting? Just wondering how you got those numbers

        • 3 weeks ago
          Anonymous

          >Last report on the hosted service
          37B/mo

          >Tegan's target
          80% of the hosted service, or 29.6B/mo

          >Estimated average cost per query
          $1/5k

          >Current annual GRT issuance
          238M. Used 300M in calculations, as was the case in 2022, but that's since changed.

          >Total GRT stake
          2.46B. Used the historical 2.7B in the calculations, but it seems 240M staked GRT has exited the protocol since then.

          >Bobo coefficient
          0.1x
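
          If you want to sanity-check the ballpark yourself, here's a rough back-of-envelope using only those inputs. It's a naive pro-rata model (query fees only, no indexer cut, no curation, no pricing of issuance rewards), so treat it as a sketch rather than the exact spreadsheet:

          ```python
          # Back-of-envelope for the delegator yield quoted upthread, using the parameters above.
          # The pro-rata split (ignoring indexer cut, curation, and issuance rewards) is a
          # simplifying assumption, not the exact model behind the 22-29k figure.

          QUERIES_PER_MONTH_HOSTED = 37e9      # last hosted service report
          MAINNET_TARGET_SHARE = 0.80          # Tegan's target: 80% of hosted volume
          PRICE_PER_QUERY_USD = 1 / 5_000      # estimated average cost per query
          TOTAL_STAKED_GRT = 2.7e9             # historical stake figure used in the calcs
          DELEGATED_GRT = 758_000              # the bag being asked about
          BOBO_COEFFICIENT = 0.1               # the joke haircut from the list above


          def network_query_fees_usd_per_year() -> float:
              """Network-wide query revenue per year if the mainnet volume target is hit."""
              monthly_queries = QUERIES_PER_MONTH_HOSTED * MAINNET_TARGET_SHARE
              return monthly_queries * 12 * PRICE_PER_QUERY_USD


          def delegator_share_usd_per_year(delegation_grt: float) -> float:
              """Naive pro-rata share of those fees for a given delegation."""
              return network_query_fees_usd_per_year() * (delegation_grt / TOTAL_STAKED_GRT)


          if __name__ == "__main__":
              fees = network_query_fees_usd_per_year()
              share = delegator_share_usd_per_year(DELEGATED_GRT)
              print(f"network query fees/yr at target: ${fees:,.0f}")
              print(f"758k GRT pro-rata share/yr:      ${share:,.0f}")
              print(f"same with bobo coefficient:      ${share * BOBO_COEFFICIENT:,.0f}")
          ```
          This naive version lands around $20k/yr for the 758k stack; the indexer cut, issuance rewards, and whether you plug in the 2.46B or 2.7B stake figure are presumably what spread the estimate out across the quoted range.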

          • 3 weeks ago
            Anonymous

            tldr $40 eoy is fud
            findom me harder mommy tegan aaaaaaa im goorting aaaaaaaaaaaa

    • 3 weeks ago
      Anonymous

      Why struggle over choosing when you can have both, anon? GRT has earned me more gains so far, but now I also diversify into alts like TOMO, DUA, and RNDR for long-term gains.

  10. 3 weeks ago
    Anonymous

    the largest part of my folio, im fucked if it doesnt succeed

    its over bros...

  11. 3 weeks ago
    Anonymous

    I've been blowing my allowance on this, 758k grt tokens so far. How fucked am i?

    • 3 weeks ago
      Anonymous

      I mean, it's just, what, 60k? If shit goes wrong, you should be able to earn back your 50k in a relatively short time, just work your ass off for a year tops until you get back your 40k, or just wait until your 30k pumps back to its initial price... it should happen before 2100 according to Tegan, but it will likely be worth 20k first. If that doesn't happen, well, it's just 10k.
      It's over.

  12. 3 weeks ago
    Anonymous

    is that the one with the smiley face logo, where the guy making it had some fucking horrible stutter so he decided "hey let ME, the stutterer, pitch my coin live at a web3 summit"?

  13. 3 weeks ago
    Anonymous

    >that dump
    It hasn't even started
    See you below 3c gents

  14. 3 weeks ago
    Anonymous

    Assuming we get another bull run in 2024/2025, is it possible that grt can get to trillions of queries per month?

  15. 3 weeks ago
    Anonymous

    Tegan just bought a new diamond plated dildo with everyones money and is currently shoving it in her fat pussy. Thanks for playing lads.

  16. 3 weeks ago
    Anonymous

    I was pricing different crypto APIs last week to get an idea of how much it would cost to power a crypto analytics website. One of them was GraphQL, as far as I could tell they offered a single fucking endpoint for any given application with fuck all data. Absolutely useless, I would sooner run and query my own nodes locally than pay for such a shit service

    • 3 weeks ago
      Anonymous

      Are you sure it was The Graph? It has only a couple primary gateways as far as I know, but it has around 30-40k endpoints as of 2022. The Graph's query interface is GraphQL-based, but that doesn't mean all GraphQL APIs are The Graph or that all GraphQL APIs are created equal for that matter.

      I've always found its data pretty good for what it is; my only complaints are that it's limited to 1,000 rows per query with a skip limit of 5,000 (again, as of 2022), and that mostly serves as DoS prevention on the free service.
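
      For reference, this is roughly what working inside those limits looks like. A minimal sketch with a placeholder endpoint and a made-up `transfers` entity; past the skip ceiling the usual trick is to cursor on a field like id instead of paging with skip.

      ```python
      # Paging through a subgraph under the first/skip limits mentioned above
      # (first capped at 1,000, skip capped at 5,000 as of 2022).
      # The endpoint URL and the `transfers` entity are placeholders -- substitute
      # whatever subgraph and entity you actually care about.
      import requests

      SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/name/<org>/<subgraph>"  # placeholder

      QUERY = """
      query ($first: Int!, $skip: Int!) {
        transfers(first: $first, skip: $skip, orderBy: id) {
          id
          from
          to
          value
        }
      }
      """


      def fetch_pages(page_size: int = 1000, max_skip: int = 5000):
          """Yield rows page by page until an empty page or the skip ceiling is hit."""
          skip = 0
          while skip <= max_skip:
              resp = requests.post(
                  SUBGRAPH_URL,
                  json={"query": QUERY, "variables": {"first": page_size, "skip": skip}},
                  timeout=30,
              )
              resp.raise_for_status()
              rows = resp.json().get("data", {}).get("transfers", [])
              if not rows:
                  break
              yield from rows
              skip += page_size


      if __name__ == "__main__":
          for i, row in enumerate(fetch_pages()):
              print(i, row["id"])
      ```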

  17. 3 weeks ago
    Anonymous

    >I was pricing different crypto APIs last week to get an idea of how much it would cost to power a crypto analytics website. One of them was GraphQL, as far as I could tell they offered a single fucking endpoint for any given application with fuck all data. Absolutely useless, I would sooner run and query my own nodes locally than pay for such a shit service

  18. 3 weeks ago
    Anonymous

    Thread reminder.

    • 3 weeks ago
      Anonymous

      >twitter screenshot by literallywho

      Regardless, Alameda backed Covalent but not The Graph.

  19. 3 weeks ago
    Anonymous

    Umm GRT sisters? Why did Tegan dump on us again?

  20. 3 weeks ago
    Anonymous

    i pumped my grt up to 50k delegated. i hope this is enough in 2 years
