
419 points serjester | 3 comments
extr ◴[] No.43538417[source]
The problem I find in many cases is that people are restrained by their imagination of what's possible, so they target existing workflows for AI. But existing workflows exist for a reason: someone already wanted to do that, and there have been countless man-hours put into the optimization of the UX/UI. And by definition they were possible before AI, so using AI for them is a bit of a solution in search of a problem.

Flights are a good example, but I often cite Uber as a good one too. Nobody wants to tell their assistant to book them an Uber - the UX/UI is so streamlined and easy, it's almost always easier to just do it yourself (or if you are too important for that, you probably have a private driver already). Basically anything you can do with an iPhone and the top 20 apps is in this category. You are literally competing against hundreds of engineers/product designers who had no other goal than to build the best possible experience for accomplishing X. Even if LLMs would have been helpful a priori, they aren't once every edge case has already been enumerated and planned for.

replies(2): >>43538507 #>>43538886 #
lolinder ◴[] No.43538507[source]
> You are literally competing against hundreds of engineers/product designers who had no other goal than to build the best possible experience for accomplishing X.

I think part of what's been happening here is that the hubris of the AI startups is really showing through.

People working on these startups are by definition much more likely than average to have bought the AI hype. And what's the AI hype? That AI will replace humans at somewhere between "a lot" and "all" tasks.

Given that we're filtering for people who believe that, it's unsurprising that they consciously or unconsciously devalue all the human effort that went into the designs of the apps they're looking to replace and think that an LLM could do better.

replies(1): >>43538967 #
1. arionhardison ◴[] No.43538967[source]
> I think part of what's been happening here is that the hubris of the AI startups is really showing through.

I think it is somewhat reductive to assign this "hubris" to "AI startups". I would posit that this hubris is more akin to the superiority we feel as human beings.

I have heard people say several times that they "treat AI like a Jr. employee". I think that, within the context of a project, AI should be treated based on its level of contribution. If AI is the expert, I am not going to approach it as if I am an SME who knows exactly what to ask. I am going to focus on the thing I know best, and ask questions around that to discover and learn the best approach. Obviously there is nuance here that is outside the scope of this discussion, but these two fundamentally different approaches have yielded materially different outcomes in my experience.

replies(1): >>43541238 #
2. hexasquid ◴[] No.43541238[source]
Treat AI like a junior employee?

Absolutely not. When giving tasks to an AI, we supply it with context, examples of what to do, examples of what not to do, and we clarify its role and job. We stay with it as it works and redirect it when something goes wrong.

I've no idea what would happen if we treated a junior developer like that.

replies(1): >>43543093 #
3. aledalgrande ◴[] No.43543093[source]
They would become a senior developer? lol ;)