
Claude in Chrome

(claude.com)
278 points by ianrahman
dmix No.46340843
Web devs are going to have to get used to robots consuming our web apps.

We'll have to start documenting everything we deploy in detail, or else design it in a form that's easy for an automated browser to parse; a sketch of what that could look like is below.
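
One hedged sketch of what "easy to parse" could mean in practice: serve the same facts the UI renders as schema.org JSON-LD alongside the markup, so an automated browser doesn't have to reverse-engineer the DOM. The names here (renderEventPage, the EventData shape) are made up for illustration, not from any real framework.

    // TypeScript sketch: embed machine-readable JSON-LD next to the
    // human-facing markup so agents can parse the page reliably.
    interface EventData {
      name: string;
      startDate: string;   // ISO 8601
      ticketUrl: string;
      price: number;
    }

    function renderEventPage(event: EventData): string {
      // schema.org "Event" vocabulary, which crawlers and agents already know
      const jsonLd = {
        "@context": "https://schema.org",
        "@type": "Event",
        name: event.name,
        startDate: event.startDate,
        offers: {
          "@type": "Offer",
          url: event.ticketUrl,
          price: event.price,
          priceCurrency: "USD",
        },
      };

      return `<!doctype html>
    <html>
      <head>
        <title>${event.name}</title>
        <script type="application/ld+json">${JSON.stringify(jsonLd)}</script>
      </head>
      <body>
        <h1>${event.name}</h1>
        <a href="${event.ticketUrl}">Buy tickets</a>
      </body>
    </html>`;
    }

    console.log(renderEventPage({
      name: "Example Concert",
      startDate: "2026-07-01T19:00:00Z",
      ticketUrl: "https://example.com/tickets/123",
      price: 45,
    }));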

replies(3): >>46340986 #>>46341078 #>>46341165 #
jclulow No.46340986
Actually, you don't need to do anything of the sort! Nobody is owed an easy ride to other people's stuff.

Plus, if the magic technology is indeed so incredible, why would we need to do anything differently? Surely it will just be able to consume whatever a human could use, without issue.

replies(4): >>46341119 #>>46341263 #>>46341483 #>>46341615 #
dmix No.46341119
> Nobody is owed an easy ride to other people's stuff.

If your website doesn't have a relevant profit model or competition, then sure. But if you run a SaaS business and your customer wants to do some of their own analytics or automation with a model, it's going to be hard to say no in the future. If you're selling tickets on a website and block robots, you'll lose money. And so on.

If this is something people learn to use in Excel or Google Docs, they'll start expecting some way to do the same with their company data in your SaaS products, or you'd better build a chat interface with equivalent capabilities. Either way, documentation helps.
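
To make that concrete, here's a rough sketch (Express, with a hypothetical route and fields, not any real product's API) of the kind of documented, machine-readable export a customer's spreadsheet, script, or agent could hit instead of scraping the UI:

    // TypeScript + Express sketch of a documented data-export endpoint.
    import express from "express";

    const app = express();

    // GET /api/v1/exports/orders?since=2025-01-01
    // Returns the caller's order history as plain JSON. Document the shape
    // once and both humans and automated clients can rely on it.
    app.get("/api/v1/exports/orders", (req, res) => {
      const since = String(req.query.since ?? "1970-01-01");

      // Stand-in data; a real service would run a tenant-scoped query here.
      const orders = [
        { id: "ord_1", placedAt: "2025-02-03", total: 129.0, currency: "USD" },
        { id: "ord_2", placedAt: "2025-03-17", total: 59.5, currency: "USD" },
      ].filter((o) => o.placedAt >= since);

      res.json({ since, count: orders.length, orders });
    });

    app.listen(3000, () => console.log("export API listening on :3000"));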