
1901 points l2silver | 2 comments

Maybe you've created your own AR program for wearables that shows the definition of a word when you highlight it IRL, or you've built a personal calendar app for your family to display on a monitor in the kitchen. Whatever it is, I'd love to hear it.
hermannj314 No.35739680
I hooked up an analog phone to Whisper, ChatGPT, and TTS. I used one of those old-timey candlestick phones you'd see in a 1920s gangster movie. Initially this was a prop for a murder mystery party I was hosting (ChatGPT would give clues if you said certain words), but now I use it as a silly distraction here and there. Ask ChatGPT a question by picking up a phone like it's last century! I think it is fun.
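
Roughly, one question/answer round trip looks like the sketch below. This is a minimal sketch assuming the OpenAI Python client for Whisper and the chat model, with espeak standing in for the TTS; the file names, model, and prompt are placeholders rather than my exact setup.

    # One round trip: recorded audio -> Whisper -> ChatGPT -> TTS.
    # Assumes the OpenAI Python client (v1.x) and espeak; paths, model,
    # and the prompt are placeholders.
    import subprocess
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # 1. Transcribe the caller's question (recorded to a wav file by Asterisk).
    with open("question.wav", "rb") as audio:
        transcript = client.audio.transcriptions.create(model="whisper-1", file=audio)

    # 2. Ask ChatGPT, with a system prompt that sets the character it role-plays.
    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a 1920s switchboard operator. Keep answers short."},
            {"role": "user", "content": transcript.text},
        ],
    )
    answer = reply.choices[0].message.content

    # 3. Synthesize the answer to a wav file that Asterisk can play back.
    subprocess.run(["espeak", "-w", "answer.wav", answer], check=True)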

I am running Asterisk on Debian, which calls my Python script. The analog telephone adapter auto-dials as soon as the receiver goes off hook, because rotary dialing sucks that much, and the answering extension is ChatGPT role-playing different characters based on prompting.
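
The extension itself just answers and hands the call to the script via AGI. The glue on the Python side is along these lines, a sketch only, with placeholder paths and timings and the Whisper/ChatGPT/TTS step left as a comment:

    #!/usr/bin/env python3
    # AGI glue: Asterisk's AGI() application runs this script and talks to it
    # over stdin/stdout. Paths and timings are placeholders.
    import sys

    def agi(command):
        """Send one AGI command and return Asterisk's '200 result=...' line."""
        sys.stdout.write(command + "\n")
        sys.stdout.flush()
        return sys.stdin.readline().strip()

    # Asterisk sends its AGI environment first, terminated by a blank line.
    while sys.stdin.readline().strip():
        pass

    agi("ANSWER")
    # Record the caller's question: beep, stop on '#' or after 3 s of silence.
    agi('RECORD FILE /tmp/question wav "#" 30000 0 BEEP s=3')
    # ...run the Whisper -> ChatGPT -> TTS step here, writing the synthesized
    # reply to /tmp/answer.wav...
    agi('STREAM FILE /tmp/answer ""')
    agi("HANGUP")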

I think it is neat. I still need to work on better voice synthesis and bring the latency down a bit, but it is a nice toy.

replies(2): >>35740169 #>>35751437 #
1. awestroke No.35751437
How long is the delay between question and answer?
replies(1): >>35752735 #
2. hermannj314 No.35752735
It is currently highly variable. Often it is a few hundred milliseconds and the conversation feels natural, but other times the Whisper API will take something like 10 seconds to return a response.

I did create some static filler phrases, stored locally ("ummm", "let me think", "good question"), to break the silence.
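
The trick is just to run the API call in a background thread and play a canned phrase whenever it is taking too long. A sketch, where the aplay playback and the timings are placeholders, since in the real thing the audio goes back out through Asterisk:

    import random
    import subprocess
    import threading

    FILLERS = ["umm.wav", "let-me-think.wav", "good-question.wav"]  # pre-recorded

    def play(wav_path):
        # Placeholder playback; the real setup streams the file into the call.
        subprocess.run(["aplay", wav_path], check=False)

    def ask_with_filler(ask_fn, question, patience=1.0):
        """Run ask_fn(question) in the background and play a filler phrase
        whenever the answer is more than `patience` seconds away."""
        result = {}
        worker = threading.Thread(target=lambda: result.update(answer=ask_fn(question)))
        worker.start()
        worker.join(timeout=patience)
        while worker.is_alive():
            play(random.choice(FILLERS))  # break the silence
            worker.join(timeout=patience)
        return result["answer"]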