
112 points by favoboa | 7 comments
bryant ◴[] No.44431158[source]
A few weeks ago, I processed a product refund with Amazon via agent. It was simple, straightforward, and surprisingly obvious that it was backed by a language model based on how it responded to my frustration about it asking tons of questions. But in the end, it processed my refund without ever connecting me with a human being.

I don't know whether Amazon relies on LLMs or SLMs for this and for similar interactions, but it makes tons of financial sense to use SLMs for narrowly scoped agents. In use cases like customer service, the intelligence behind LLMs is all wasted on the task the agents are trained for.

Wouldn't surprise me if down the road we start recommending role-specific SLMs rather than general LLMs as both an ethics- and a security-risk mitigation.

replies(5): >>44431884 #>>44431916 #>>44432173 #>>44433836 #>>44441923 #
1. torginus ◴[] No.44431916[source]
I just had my first experience with a customer service LLM. I needed to get my account details changed, and for that I needed to use the customer support chat.

The LLM told me what sort of information they needed and what the process was, after which I followed through with the whole thing.

After I went through the whole thing, it reassured me that everything was in order and my request was being processed.

For two weeks, nothing happened. I emailed the (human) support staff, and they responded that they could see no such request in their system. Turns out the LLM had hallucinated the entire customer flow and was just spewing BS at me.

replies(6): >>44431940 #>>44431999 #>>44432155 #>>44432498 #>>44432522 #>>44433879 #
2. exe34 ◴[] No.44431940[source]
That's why I take screenshots of anything that I don't get an email confirmation for.
3. ttctciyf ◴[] No.44431999[source]
There really should be some comeback for this type of enshAItification.

We're supposed to think "oh it's an LLM, well, that's ok then"? A question we'll be asking more frequently as time goes on, I suspect.

4. dotancohen ◴[] No.44432155[source]
This is reason number two why I always request the service ticket number.

Reason number one being that when the rep feels you're going to hold them accountable, to the point of requesting such a number, you might not be the type of client to pull shenanigans with. Maybe they suspect me of being a corporate QC agent? Either way, requesting such a number demonstrably reduces friction.

5. koakuma-chan ◴[] No.44432498[source]
You basically always have to set tool_choice="required", or the LLM will derail.
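
A minimal sketch of what koakuma-chan means, assuming the OpenAI Python SDK (v1.x) and a hypothetical update_account tool: with tool_choice="required" the model must emit a structured tool call for the backend to execute, instead of replying in free-form prose that everything has been handled.

    # Sketch: force a tool call so the model can't just claim success.
    # The update_account tool and its fields are hypothetical stand-ins.
    from openai import OpenAI

    client = OpenAI()

    tools = [{
        "type": "function",
        "function": {
            "name": "update_account",
            "description": "Submit an account-details change to the backend.",
            "parameters": {
                "type": "object",
                "properties": {
                    "account_id": {"type": "string"},
                    "field": {"type": "string"},
                    "new_value": {"type": "string"},
                },
                "required": ["account_id", "field", "new_value"],
            },
        },
    }]

    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user",
                   "content": "Please change the email on account 123."}],
        tools=tools,
        tool_choice="required",  # model must call a tool; no prose-only answers
    )

    # The backend executes the call and returns a real confirmation;
    # the model never gets to invent one on its own.
    call = resp.choices[0].message.tool_calls[0]
    print(call.function.name, call.function.arguments)
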
6. thatjoeoverthr ◴[] No.44432522[source]
The LLM is a smoke bomb they shot in your face :)
7. scarface_74 ◴[] No.44433879[source]
That has nothing to do with an LLM. Any chat-based system, whether LLM or not, is going to interpret the human input and convert it to a standardized request for backend processing. This is just a badly written system.
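
A sketch of that hand-off, with hypothetical names throughout: whichever layer parses the chat (LLM or not), the confirmation shown to the user should only be produced after the backend acknowledges the standardized request, e.g. with a ticket ID the customer can quote later.

    # Sketch of a chat layer handing a standardized request to a backend.
    # All names and the in-memory store are illustrative only.
    import uuid
    from dataclasses import dataclass

    @dataclass
    class ChangeRequest:
        account_id: str
        field: str
        new_value: str

    _TICKETS: dict[str, ChangeRequest] = {}  # stand-in for the real backend

    def submit_to_backend(req: ChangeRequest) -> str:
        """File the request and return a ticket ID the customer can keep."""
        ticket_id = uuid.uuid4().hex[:8]
        _TICKETS[ticket_id] = req
        return ticket_id

    def reply_to_customer(parsed: ChangeRequest | None) -> str:
        # Confirm only after the backend acknowledges; the chat layer must
        # never invent a confirmation on its own.
        if parsed is None:
            return "Sorry, I couldn't turn that into a request. Could you rephrase?"
        return f"Done: filed as ticket {submit_to_backend(parsed)}."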