756 points dagurp | 19 comments
1. bee_rider ◴[] No.36882605[source]
As noted in the article, Google comes up with a scheme like this every couple months. They also can’t seem to identify good sites anymore, based on their search results.

So… fuck it. Let them DRM their part of the internet. It is mostly shit nowadays anyway. They can index Reddit, X, and a bunch of sites that are GPT SEO trash.

We’re never getting the 201X internet back anyway, so let Google and friends do their thing, and everybody who doesn’t want anything to do with it can go back to the 200X internet. It was kind of disorganized, but it's better than fighting them on DRM over and over again.

replies(2): >>36882868 #>>36882902 #
2. lambic ◴[] No.36882868[source]
What are 200X and 201X internets?
replies(3): >>36882964 #>>36882993 #>>36883554 #
3. pptr ◴[] No.36882902[source]
If you can identify bots more accurately, you get less "GPT SEO trash".
replies(3): >>36883165 #>>36883343 #>>36883353 #
4. zls ◴[] No.36882964[source]
decades :)

If we had known how fleeting the glory of the early 2010s internet would be, with everything ad-free and SEO still comparatively rudimentary, would that have made it easier or harder to watch it die?

replies(1): >>36885525 #
5. ◴[] No.36882993[source]
6. callalex ◴[] No.36883165[source]
There is approximately a 0% chance that people won’t figure out how to make their bots “verified” if this goes through.
7. square_usual ◴[] No.36883343[source]
That's not how it works, because the GPT SEO trash is being generated server-side by the people running the sites.
replies(1): >>36883622 #
8. webstrand ◴[] No.36883353[source]
This proposal does not affect bots producing web content, only (potentially) bots browsing web content.
replies(1): >>36883692 #
9. bee_rider ◴[] No.36883554[source]
As the other comment said, the decades. I could have used 2010s I guess, it is just hard to refer to the first decade of the millennium that way.
10. pptr ◴[] No.36883622{3}[source]
Well, there is that, and there are also users who post GPT SEO trash to Reddit et al., which is what the attestation API could help with.
replies(1): >>36886164 #
11. pptr ◴[] No.36883692{3}[source]
It does affect bots creating social media content.
replies(1): >>36885365 #
12. hellojesus ◴[] No.36885365{4}[source]
Not necessarily. Even with WEI, spammers could farm legit tokens and then set up their own API that hands one out to their bot whenever one is needed.
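
A rough sketch of what that token-handout service might look like (Python, purely hypothetical — none of these names come from the actual WEI proposal):

    # Hypothetical "token broker": farmed devices that pass attestation
    # legitimately deposit tokens; bots lease one whenever they need to
    # look attested to some origin. All names here are made up.
    import queue

    class TokenBroker:
        def __init__(self) -> None:
            self._tokens = queue.Queue()

        def deposit(self, token: str) -> None:
            # called by a farmed phone/browser that passed attestation
            self._tokens.put(token)

        def lease(self) -> str:
            # called by a bot that needs to present a token to an origin
            return self._tokens.get()

    broker = TokenBroker()
    broker.deposit("token-minted-on-a-real-device")
    print(broker.lease())  # the bot now hands this to the origin
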
replies(1): >>36887467 #
13. nfw2 ◴[] No.36885525{3}[source]
Everything was free because interest rates were nothing, and every startup could use investor capital to cover their costs.
replies(1): >>36894281 #
14. blibble ◴[] No.36886164{4}[source]
the spammers are quite capable of buying several hundred old phones with valid attestation certificates to pump out crap
replies(1): >>36887561 #
15. pptr ◴[] No.36887467{5}[source]
My understanding is that you can't reuse tokens, because the system uses challenge response.
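
A minimal sketch of why that matters (this is just my mental model, not the actual WEI API; HMAC stands in for the attester's real signature):

    # The origin binds each token to a fresh nonce, so a token minted for an
    # earlier challenge fails verification against a later one.
    import hmac, hashlib, secrets

    ATTESTER_KEY = b"hypothetical-attester-key"  # stand-in for the attester's signing key

    def issue_challenge() -> bytes:
        return secrets.token_bytes(16)  # origin generates a fresh nonce per request

    def attest(nonce: bytes) -> bytes:
        # runs on a genuine device: "signs" the origin's nonce
        return hmac.new(ATTESTER_KEY, nonce, hashlib.sha256).digest()

    def verify(nonce: bytes, token: bytes) -> bool:
        # a real origin would check the attester's public-key signature instead
        expected = hmac.new(ATTESTER_KEY, nonce, hashlib.sha256).digest()
        return hmac.compare_digest(expected, token)

    stockpiled = attest(issue_challenge())         # farmed earlier, saved for later
    live_nonce = issue_challenge()                 # challenge the bot actually receives
    print(verify(live_nonce, stockpiled))          # False: stored token doesn't match
    print(verify(live_nonce, attest(live_nonce)))  # True: needs a live attester
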
replies(1): >>36888424 #
16. pptr ◴[] No.36887561{5}[source]
Which is orders of magnitude more expensive than deploying the bot to a cloud or botnet.

I don't know how much bot spam pays these days. Maybe it's still worth it.

replies(1): >>36891599 #
17. hellojesus ◴[] No.36888424{6}[source]
But can you get a token and then not send it, saving it for later? That's more what I was thinking: not replay attacks, but gathering a bunch of tokens that are valid but never submitted to the origin, and then providing them via API to those who need one to use unauthorized devices with that origin.
18. blibble ◴[] No.36891599{6}[source]
the attestation requirement increases the cost, but it also increases the value of the spam, as the spammer's competitors are put out of business
19. lambic ◴[] No.36894281{4}[source]
And everything was simpler: you could throw something up on a $10/month shared host. Now you need a full stack of services running in the cloud, charged by the minute.