
526 points by noperator | 2 comments
1. Alifatisk ◴[] No.44492494[source]
I did something similar, but for groupchats. You had to export a groupchat conversation to text and feed it to the program. The program would then use a local LLM to profile each user in the groupchat based on what they said.

Like, it built up knowledge of every user in the groupchat, noting their thoughts on different things, their opinions, and just basic knowledge of how they are. You could also ask the LLM questions about each user.

It's not perfect: sometimes the inference gets something wrong, or less precise embeddings get picked up, which creates hallucinations or just nonsense, but it works somewhat!
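
For anyone curious, here's a minimal sketch of how a pipeline like that could be wired up. This is only my guess at the shape of it, not the parent's code: it assumes an Ollama-style local endpoint on localhost:11434, a model called llama3, and an export format of one "Name: message" per line, and it skips the embedding retrieval the parent mentions in favour of just prompting over each user's recent messages.

    # Sketch: profile groupchat users with a local LLM.
    # Assumes an Ollama-style endpoint and a "Name: message" export format.
    import re
    from collections import defaultdict

    import requests

    OLLAMA_URL = "http://localhost:11434/api/generate"  # assumed local endpoint
    MODEL = "llama3"  # any locally available model

    def parse_export(path):
        """Group lines like 'Alice: message text' by sender."""
        messages = defaultdict(list)
        with open(path, encoding="utf-8") as f:
            for line in f:
                m = re.match(r"^(\w+):\s*(.+)$", line.strip())
                if m:
                    messages[m.group(1)].append(m.group(2))
        return messages

    def ask(prompt):
        """Send a single non-streaming generation request to the local model."""
        resp = requests.post(
            OLLAMA_URL,
            json={"model": MODEL, "prompt": prompt, "stream": False},
            timeout=300,
        )
        resp.raise_for_status()
        return resp.json()["response"]

    def profile_user(name, lines):
        """Summarize one user's opinions and personality from their messages."""
        prompt = (
            f"Here are messages written by {name} in a group chat:\n\n"
            + "\n".join(lines[-200:])  # cap context to the most recent messages
            + "\n\nSummarize this person's opinions, interests, and personality."
        )
        return ask(prompt)

    if __name__ == "__main__":
        chat = parse_export("groupchat.txt")  # hypothetical export file
        profiles = {name: profile_user(name, msgs) for name, msgs in chat.items()}
        # Answer questions about a user by grounding the model in their profile.
        question = "What does Alice think about remote work?"
        print(ask(f"Profile of Alice:\n{profiles.get('Alice', '')}\n\nQuestion: {question}"))

A fuller version would chunk and embed each user's messages so questions can retrieve relevant context instead of relying on a fixed summary, which is where the imprecise-embedding problem above comes in.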

I would love to improve on this, or to hear if anyone else has done something similar.

replies(1): >>44493337 #
2. AJ007 ◴[] No.44493337[source]
There are other good use cases here like documenting recurring bugs or problems in software/projects.

This is a good illustration of why e2e encryption is more important than it's ever been. What were once innocuous and boring conversations are now very valuable when combined with phishing and voice cloning.

OpenAI is going to use all of your ChatGPT history to target ads to you, and will probably offer the choice to pay for everything instead. Meta is trying really hard too, and is already applying generative AI extensively for advertisers' creative production.

Ultra-targeted advertising, where the message is crafted to perfectly fit the viewer, means that devices running operating systems incapable of 100% blocking ads should be considered malware. Hopefully local LLMs will be able to do a good job with that.