    325 points davidbarker | 15 comments
    1. WXLCKNO No.44380250
    The tiniest step towards a future where AI eats all apps.

    No persistent storage and other limitations make it just a toy for now, but we can imagine people creating their own todo apps, gym-logging apps, and whatever other simple things.

    There's no external API access currently, but once that's available, or once app users can communicate with other app users, some virality becomes possible for people who make the best tiny apps.

    replies(8): >>44380475 #>>44380603 #>>44380854 #>>44381271 #>>44381441 #>>44381833 #>>44385578 #>>44385787 #
    2. handfuloflight No.44380475
    > No persistent storage

    What stops you from wiring it up to your endpoints that handle that?

    replies(1): >>44380559 #
    3. js4ever No.44380559
    Current limitations: no external API calls (yet), no persistent storage.
    4. jofla_net No.44380603
    Great, 1% of the competition that we have today. Can't wait to see the wasteland when all apps will effectively come from a couple of companies. /s
    5. headcanon No.44380854
    One thing I've learned is that no matter how easy it is to create stuff, most users will still favor the one-click app install, even if they don't get full control over the workflow.

    With that said, I'm sure there are a lot of power users who are loving the lower barrier to creation.

    replies(2): >>44384192 #>>44387489 #
    6. meistertigran No.44381271
    Actually, implementing persistent storage for simple apps isn't that hard, especially for a big corp. Personally, I was using LLMs' coding capabilities to create custom single-file HTML apps that would work offline with localStorage. It's not that there aren't good options out there, but you can't really customize them to work exactly how you want. Also, it takes like half an hour to get what you want.

    The only downside was not being able to access the apps from other devices, so I ended up creating a tool to make them accessible online and sync the data, while using the same localStorage API. It's actually pretty neat.

    replies(1): >>44384759 #
    7. sharemywin No.44381441
    I've used the interface in ChatGPT to click a button and talk back and forth with an AI, and I could see this being a pretty good interface for a lot of "apps":

    weather, todo list, shopping list, research tasks, email someone, summarize email, get latest customized news, RSS feed summary, track health stats, etc.

    replies(1): >>44381510 #
    8. SonomaSays No.44381510
    You could have a hybrid business model:

    Build a thing that does a complex thing elegantly (some deep research task) that is non-trivial for others to set up, but that many people want.

    Charge for direct access in the traditional sense [$5 per project] -- but then have the customer link their API to the execution cost, so they are basically paying for:

    "Go here and pay HN $5 to output this TASK, charge my API to get_it_done." This could be a seriously powerful tool for the digital consulting services industry.

    (I mean, that is what this model is for.)

    So this raises the question: will Anthropic build in a payments mechanism for this to happen?

    9. throwaway7783 No.44381833
    Matter of time. It is trivial to overcome the current limitations.
    10. wombatpm No.44384192
    You can build lots of cool stuff. Getting corporate IT to allow API access is like pulling teeth. We have Outlook and Teams, which have APIs that can do things. But no one has the ability to access them. So much for automating your workflows.

    Reminds me of Lotus Notes back in the day. It could do anything and had great potential, but only three developers had access, in a company of 50k employees.

    11. hucklebuckle No.44384759
    Which tools did you make?
    replies(1): >>44390041 #
    12. amelius No.44385578
    > The tiniest step towards a future where AI eats all apps.

    I wouldn't be surprised if, at some point, we'll see nVidia starting an "AI AppStore" and charging Anthropic 30%.

    13. revskill No.44385787
    Human beings are funny. They should just sit at home doing nothing instead.
    14. Workaccount2 No.44387489
    A huge difference between bespoke AI-generated apps and one-click download apps is that the friction drops dramatically.

    Downloading the app may be one click...but then create an account. Attach a CC. Follow a tutorial. Figure out the app even more.

    With LLM apps, there is none of that. You created the app, so you pretty much know a priori how to use it. If you are unsure, the model knows, and you can just ask. If you want it to do something different, the model can just change it for you.

    The modern software paradigm is building software that covers as massive a solution space as possible, so that as many users as possible have a problem covered by that space. You end up making lots of compromises and adding unintuitive steps to cover all the bases.

    LLM apps cover your problem space pretty much perfectly, without anything more.

    15. meistertigran No.44390041
    At first, I created a simple API in Python that basically mimicked the localStorage API. It would accept getItem and setItem requests and write them to a file. In the HTML file, I automatically replaced localStorage calls with calls to the API, via a script that did a simple search-and-replace on the file. I also assigned a custom subdomain via nginx to each HTML file, so that each one could have the maximum 10MB of storage that localStorage allows.
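
    The file-backed store behind such an API could look roughly like this. This is a minimal sketch, assuming one JSON file per app; the class and file names are illustrative, and the HTTP layer and nginx subdomain routing are omitted:

```python
import json
from pathlib import Path


class FileBackedStorage:
    """Mimics the localStorage getItem/setItem API, persisting to a JSON file.

    One file per app mirrors how localStorage scopes data per origin, and
    lets each app have its own quota (illustrative sketch, not the actual
    htmlsync.io implementation).
    """

    MAX_BYTES = 10 * 1024 * 1024  # localStorage's typical ~10 MB cap

    def __init__(self, path):
        self.path = Path(path)
        # Load existing data if the backing file is already present
        self.data = json.loads(self.path.read_text()) if self.path.exists() else {}

    def getItem(self, key):
        # localStorage returns null (None here) for missing keys, not an error
        return self.data.get(key)

    def setItem(self, key, value):
        # localStorage coerces all values to strings
        self.data[key] = str(value)
        serialized = json.dumps(self.data)
        if len(serialized.encode("utf-8")) > self.MAX_BYTES:
            raise ValueError("quota exceeded")
        self.path.write_text(serialized)

    def removeItem(self, key):
        self.data.pop(key, None)
        self.path.write_text(json.dumps(self.data))
```

    A thin HTTP wrapper would then route each request to the store for the subdomain's app, which is what lets the client-side search-and-replace shim swap localStorage calls for network calls without changing the app's logic.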

    After a while it became clunky doing things with separate scripts, so I ended up creating htmlsync.io. It's still pre-alpha, but registrations for the free tier are open.