
112 points | favoboa | 1 comment
bryant ◴[] No.44431158[source]
A few weeks ago, I processed a product refund with Amazon via agent. It was simple and straightforward, and from the way it responded to my frustration at being asked tons of questions, it was surprisingly obvious it was backed by a language model. But in the end, it processed my refund without ever connecting me with a human being.

I don't know whether Amazon relies on LLMs or SLMs for this and for similar interactions, but it makes tons of financial sense to use SLMs for narrowly scoped agents. In use cases like customer service, the intelligence behind LLMs is all wasted on the task the agents are trained for.

Wouldn't surprise me if down the road we start recommending role-specific SLMs over general LLMs as a mitigation for both ethics and security risks, too.
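
To illustrate what I mean by a narrowly scoped agent, here's a rough Python sketch. The `small_model.generate()` call and the action names are made up, not any real vendor's API; the point is just that a role-specific model can be boxed into a tiny, auditable set of actions and fall back to a human for anything else.

    import json

    ALLOWED_ACTIONS = {"ask_clarifying_question", "issue_refund", "escalate_to_human"}

    SYSTEM_PROMPT = (
        "You handle product refund requests only. Reply with a JSON object "
        '{"action": ..., "message": ...} where action is one of: '
        + ", ".join(sorted(ALLOWED_ACTIONS))
    )

    def handle_turn(small_model, conversation):
        """One turn of the refund flow against a small, role-specific model."""
        # `small_model` is a hypothetical wrapper around whatever SLM you run.
        raw = small_model.generate(system=SYSTEM_PROMPT, messages=conversation)
        try:
            decision = json.loads(raw)
        except json.JSONDecodeError:
            # A narrow agent can fail closed: anything malformed goes to a person.
            return {"action": "escalate_to_human", "message": "Connecting you to an agent."}
        if decision.get("action") not in ALLOWED_ACTIONS:
            return {"action": "escalate_to_human", "message": "Connecting you to an agent."}
        return decision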

replies(5): >>44431884 #>>44431916 #>>44432173 #>>44433836 #>>44441923 #
1. anon7000 ◴[] No.44441923[source]
The problem is that so many executives in charge of customer support see dollar signs with AI and then implement the shittiest possible version of an AI chatbot, which makes support interactions significantly worse. For example, I had particularly bad interactions with DoorDash, Lyft, and Chipotle “support” in the past year. The Chipotle one, a few days ago: they gave me a completely different item than the one I ordered in the app, and the bot wanted me to talk to the store in person about it. Why even have an app with delivery options if that’s your answer?

If anyone working on support bots reads this, you should realize it’s very easy for a bot to tarnish your company’s reputation, and that damage adds up over time. You MUST have a way to tell when interactions are deeply frustrating for the humans involved. Since people are used to being patient with humans, we have a higher tolerance before we get frustrated with a support person. (And support peeps know even that isn’t all that true.) That tolerance disappears the moment we know we’re talking to a bot.
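
Even something crude would be better than nothing. Here's a rough Python sketch of the kind of check I mean; the keyword list and thresholds are made up, and a real system would presumably use a proper classifier and its own escalation policy:

    FRUSTRATION_MARKERS = ("useless", "ridiculous", "speak to a human",
                           "real person", "wtf", "!!")

    def frustration_score(user_messages):
        """Crude lexical score: share of the last few user messages with a marker."""
        recent = user_messages[-5:]
        if not recent:
            return 0.0
        hits = sum(any(m in msg.lower() for m in FRUSTRATION_MARKERS) for msg in recent)
        return hits / len(recent)

    def should_escalate(user_messages, unresolved_turns):
        # Hand off when the user sounds fed up or the bot has stalled too long.
        return frustration_score(user_messages) >= 0.4 or unresolved_turns >= 4

The details don't matter; what matters is that the escalation decision lives outside the bot, so a frustrated person is never stuck arguing with it.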

Seriously, fuck all the executives who bought into chatbots for support without bothering to make it a thoughtful experience. (Which, in my experience so far, is all of them. Maybe Amazon is better, but they’ve had easy refunds with a simple form for years and years!)