
Let's talk about AI and end-to-end encryption

(blog.cryptographyengineering.com)
174 points | chmaynard | 1 comment
lowbatt No.42742349
Maybe a little off topic, but is there a way for a distributed app to connect to one of the LLM companies (OpenAI, etc.) without the unencrypted data hitting an in-between proxy server?

An app I'm building uses LLMs to process messages. I don't want the unencrypted message to hit my server, and ideally I wouldn't have the ability to decrypt it. But I can't communicate directly from the client to the LLM service without leaking the API key.

replies(3): >>42742683 #>>42742892 #>>42743194 #
1. whyage No.42743194
Check out https://www.opaque.co/