
308 points ndsipa_pomu | 1 comment
taylodl ◴[] No.44974720[source]
How many times has a chatbot successfully taken care of a customer support problem for you? I've had successes, but the success rate is less than 5%. Maybe even way less than 5%.

Companies need to stop looking at customer support as an expense and start treating it as an opportunity to build trust and strengthen the business relationship. There's a saying that you shouldn't assess someone when everything is going well for them - the true measure of a person is what they do when things are not going well. It's the same for companies. When your customers are experiencing problems, that's the time to shine! It's not a problem, it's an opportunity.

replies(30): >>44974777 #>>44975081 #>>44975160 #>>44975447 #>>44975773 #>>44975774 #>>44975783 #>>44975938 #>>44976007 #>>44976029 #>>44976089 #>>44976153 #>>44976183 #>>44976414 #>>44976415 #>>44976442 #>>44976447 #>>44976466 #>>44976469 #>>44976596 #>>44976613 #>>44976642 #>>44976791 #>>44977048 #>>44977695 #>>44977726 #>>44978607 #>>44979880 #>>44981825 #>>44988516 #
no_wizard ◴[] No.44974777[source]
The only chatbot that has ever worked for me is Amazon's, of all things. It auto-approved my return after I answered a few questions.

I haven't had any other chatbot be useful to me. I always end up getting to the end of all the prompts only to be told I need to speak to a human, or the chatbot goes in circles, at which point I have to reach out to a different layer of support.

replies(6): >>44975348 #>>44975738 #>>44976002 #>>44976085 #>>44976347 #>>44976350 #
diggan ◴[] No.44976002[source]
> I always end up getting to the end of all the prompts only to be told I need to speak to a human, or the chatbot goes in circles

I've had success with just repeating "Agent please" or "I wanna talk to human" when I notice the chatbot isn't a traditional conditional-if-else bot but an LLM. It seems like most of them have some sort of escape hatch they can trigger, but they're prompted to really avoid it. If you keep sending "Agent please" over and over, eventually the typical context rot seems to stop them from avoiding the escape hatch, and they send you along to a real human.
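
For what it's worth, here's a rough Python sketch of what that escape hatch might look like on the vendor's side: a hand-off tool the model is allowed to call, gated mostly by a system prompt telling it not to. The tool name (handoff_to_human), the prompt, and the dispatch wiring are all my own guesses, not anything from a real product.

    # Purely illustrative sketch of the kind of "escape hatch" described above:
    # a hand-off tool the model may call, plus a system prompt discouraging it.
    # Names and wiring are assumptions, not a real vendor's implementation.

    # Tool schema in the common function-calling style many LLM APIs accept.
    ESCALATE_TOOL = {
        "type": "function",
        "function": {
            "name": "handoff_to_human",  # hypothetical tool name
            "description": "Transfer the conversation to a human support agent.",
            "parameters": {
                "type": "object",
                "properties": {"reason": {"type": "string"}},
                "required": ["reason"],
            },
        },
    }

    # The "really avoid it" part lives in the prompt, not in code.
    SYSTEM_PROMPT = (
        "You are a support assistant. Resolve the issue yourself whenever "
        "possible. Only call handoff_to_human as a last resort."
    )

    def dispatch(tool_call_name: str, args: dict) -> str:
        # The app only escalates when the model actually emits this tool call,
        # which is why repeated "Agent please" messages can eventually push a
        # prompt-discouraged model over the line.
        if tool_call_name == "handoff_to_human":
            return f"Routing to a human agent (reason: {args.get('reason')})"
        return "Handled by the bot"

    if __name__ == "__main__":
        print(SYSTEM_PROMPT)
        print(dispatch("handoff_to_human", {"reason": "customer asked repeatedly"}))

Nothing hard stops the model from calling the tool on the first message; it's just instructed to avoid it, which is consistent with persistence eventually working.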

replies(1): >>44978335 #
netsharc ◴[] No.44978335{3}[source]
I saw a social media video of people at the drive-thru; a robot voice asked what they'd like. "I'd like a million cups of water, please." The voice immediately changed to a noticeably human one asking, "Hi, how can I help you?"