
156 points Sean-Der | 2 comments

Alt link: https://mrchristmas.com/products/santas-magical-telephone

Video demo: https://www.youtube.com/watch?v=0z7QJxZWFQg

The first time I talked with AI Santa and it responded with a joke, I was HOOKED. The fun/nonsense doesn't click until you try it yourself. What's even more exciting is that you can build it yourself:

libpeer: https://github.com/sepfy/libpeer

pion: https://github.com/pion/webrtc

Then do all your fun logic in your Pion server: connect to any voice AI provider, or roll your own with open source. Anything is possible.
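
To give a feel for the Pion side, here is a minimal sketch of such a server. This is not the code running in the toy: the /offer endpoint name, the Google STUN server, and the stubbed-out AI hookup are placeholders for illustration. It answers an SDP offer POSTed by the device, receives the incoming audio track, and exposes an outbound track where a synthesized reply would be written.

    // Minimal Pion signaling endpoint: accept an SDP offer over HTTP,
    // answer it, and wire the incoming audio to whatever "Santa" logic
    // you like. The voice AI part is deliberately left as a stub.
    package main

    import (
    	"encoding/json"
    	"log"
    	"net/http"

    	"github.com/pion/webrtc/v3"
    )

    func main() {
    	http.HandleFunc("/offer", func(w http.ResponseWriter, r *http.Request) {
    		var offer webrtc.SessionDescription
    		if err := json.NewDecoder(r.Body).Decode(&offer); err != nil {
    			http.Error(w, err.Error(), http.StatusBadRequest)
    			return
    		}

    		// One PeerConnection per phone/toy that connects.
    		pc, err := webrtc.NewPeerConnection(webrtc.Configuration{
    			ICEServers: []webrtc.ICEServer{{URLs: []string{"stun:stun.l.google.com:19302"}}},
    		})
    		if err != nil {
    			http.Error(w, err.Error(), http.StatusInternalServerError)
    			return
    		}

    		// Outbound audio track: Santa's synthesized replies get written here.
    		santaTrack, err := webrtc.NewTrackLocalStaticSample(
    			webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeOpus}, "audio", "santa")
    		if err != nil {
    			http.Error(w, err.Error(), http.StatusInternalServerError)
    			return
    		}
    		if _, err = pc.AddTrack(santaTrack); err != nil {
    			http.Error(w, err.Error(), http.StatusInternalServerError)
    			return
    		}

    		// Inbound audio from the device: forward it to your speech-to-text /
    		// voice AI pipeline here, then write the reply into santaTrack
    		// with santaTrack.WriteSample(...).
    		pc.OnTrack(func(track *webrtc.TrackRemote, _ *webrtc.RTPReceiver) {
    			for {
    				pkt, _, readErr := track.ReadRTP()
    				if readErr != nil {
    					return
    				}
    				_ = pkt // placeholder: hand pkt.Payload to your AI pipeline
    			}
    		})

    		// Standard offer/answer dance; wait for ICE gathering so the answer
    		// already contains all candidates (no trickle needed on tiny devices).
    		if err = pc.SetRemoteDescription(offer); err != nil {
    			http.Error(w, err.Error(), http.StatusInternalServerError)
    			return
    		}
    		answer, err := pc.CreateAnswer(nil)
    		if err != nil {
    			http.Error(w, err.Error(), http.StatusInternalServerError)
    			return
    		}
    		gatherDone := webrtc.GatheringCompletePromise(pc)
    		if err = pc.SetLocalDescription(answer); err != nil {
    			http.Error(w, err.Error(), http.StatusInternalServerError)
    			return
    		}
    		<-gatherDone

    		w.Header().Set("Content-Type", "application/json")
    		_ = json.NewEncoder(w).Encode(pc.LocalDescription())
    	})

    	log.Println("listening on :8080 (POST SDP offers to /offer)")
    	log.Fatal(http.ListenAndServe(":8080", nil))
    }

However you choose to exchange the SDPs between the device and this server, the shape is the same: once the offer/answer dance is done, every audio packet flows through this process, and that is where a voice AI provider or your own open-source pipeline plugs in.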

If you have questions or hit any roadblocks I would love to help you. I have lots of hardware snippets on my GitHub: https://github.com/sean-der.

andrepd No.45574474
Am I the only one who thinks this is very unwholesome? Giving a simulacrum of human interaction to children who are presumably way too young to understand [1] that they're talking to a novelty device. It's possible I'm being a Luddite, but then again, perhaps people really need to stop trying to achieve 100% completion in turning Black Mirror episodes into reality.

[1] Which even many adults apparently don't understand!

replies(3): >>45574513 >>45574518 >>45575499
1. Sean-Der No.45575499
My 5-year-old (4 at the time) always understood. We made a game of ‘making new toys’, and she would tell me what it should say.

I would cut open toys and shove microcontrollers in them.

I think if you lied and told a kid it’s a real person, that would be damaging. My kid has fun role-playing; she really suspends disbelief. When we’re done, she thinks it’s funny, not confused.

replies(1): >>45576026
2. ianstormtaylor No.45576026
Then why does the product description continually reiterate how “real” the conversations are?