
858 points | cryptophreak | 1 comment
croes
Natural language isn’t made to be precise; that’s why we use a precise subset of it in programming languages.

So if you use AI, you either need lots of extra text to remove the ambiguity of natural language, or you need a special precise subset to communicate with the AI, and that’s just programming with extra steps.
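To make that concrete, here is a toy TypeScript sketch (the names and example are mine, not from the comment): the questions you have to answer to de-ambiguate "sort the users by name" are exactly the ones a typed signature forces you to answer anyway.

    // Ambiguous natural-language request: "Sort the users by name."
    // Making it precise means answering the same questions a signature asks:
    // which name, which direction, how to treat case.

    type User = { firstName: string; lastName: string };

    // The "precise subset" version: every ambiguity becomes an explicit decision.
    function sortUsers(
      users: User[],
      key: "firstName" | "lastName" = "lastName",
      order: "asc" | "desc" = "asc",
    ): User[] {
      const dir = order === "asc" ? 1 : -1;
      return [...users].sort((a, b) =>
        dir * a[key].localeCompare(b[key], undefined, { sensitivity: "base" })
      );
    }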

kokanee
> or you need a special precise subset to communicate with AI

haha, I just imagined sending TypeScript to ChatGPT and having it spit my TypeScript back to me. "See guys, if you just use Turing-complete logically unambiguous input, you get perfect output!"

charlieyu1
I guess we could have an LLM translate natural language into some precise subset, get it processed, and then translate the output back to natural language.
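A minimal sketch of that round trip, assuming a hypothetical callLLM() helper and a toy JSON command language (neither comes from the thread); the middle step is ordinary deterministic code:

    // Hypothetical stand-in for a real model call; wire up your provider's client here.
    async function callLLM(prompt: string): Promise<string> {
      throw new Error("not implemented: " + prompt);
    }

    // The "precise subset": a tiny command language a plain interpreter can execute.
    type Command =
      | { op: "sum"; values: number[] }
      | { op: "mean"; values: number[] };

    // Natural language -> precise subset (via the LLM).
    async function toCommand(request: string): Promise<Command> {
      const raw = await callLLM(
        `Translate into JSON matching {op: "sum" | "mean", values: number[]}: ${request}`
      );
      return JSON.parse(raw) as Command; // real code would validate the shape
    }

    // Deterministic processing: no ambiguity once we're inside the subset.
    function run(cmd: Command): number {
      const total = cmd.values.reduce((a, b) => a + b, 0);
      return cmd.op === "sum" ? total : total / cmd.values.length;
    }

    // Precise result -> natural language (via the LLM again).
    async function answer(request: string): Promise<string> {
      const result = run(await toCommand(request));
      return callLLM(`State this result in one plain-English sentence: ${result}`);
    }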