684 points prettyblocks | 2 comments

I mean anything in the 0.5B-3B range that's available on Ollama (for example). Have you built any cool tooling that uses these models as part of your work flow?
mettamage ◴[] No.42784724[source]
I simply use it to anonymize code before I send it to Claude, and to de-anonymize the answer that comes back.

Maybe I should write a plugin for it (open source):

1. Put all your work-related questions into the plugin; a local LLM rewrites each one as an abstract question, which you can preview before sending

2. Then you get the answer back with all the original data restored

E.g. df["cookie_company_name"] becomes df["a"], and back.
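The round trip could be sketched roughly like this (hypothetical names; a hand-written mapping stands in for whatever aliases the local model would actually pick):

```python
import re

def pseudonymize(text: str, mapping: dict[str, str]) -> str:
    """Swap each sensitive identifier for its short alias."""
    for real, alias in mapping.items():
        text = re.sub(rf"\b{re.escape(real)}\b", alias, text)
    return text

def restore(text: str, mapping: dict[str, str]) -> str:
    """Reverse the substitution on the model's answer."""
    for real, alias in mapping.items():
        text = re.sub(rf"\b{re.escape(alias)}\b", real, text)
    return text

# Hypothetical sensitive column name and query.
mapping = {"cookie_company_name": "a"}
query = 'df["cookie_company_name"].value_counts()'

anon = pseudonymize(query, mapping)
print(anon)                    # df["a"].value_counts()
print(restore(anon, mapping))  # df["cookie_company_name"].value_counts()
```

Word boundaries (`\b`) keep the single-letter alias from matching inside other identifiers when restoring.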

replies(4): >>42784789 #>>42785696 #>>42785808 #>>42788777 #
1. sitkack ◴[] No.42785696[source]
So you are using a local small model to remove identifying information and make the question generic, which is then sent to a larger model? Is that understanding correct?

I think this would have an additional benefit: it avoids confusing the larger model with facts it doesn't need to know about. By erasing information, you allow its attention heads to focus on the pieces that matter.

Requires further study.

replies(1): >>42790194 #
2. mettamage ◴[] No.42790194[source]
> So you are using a local small model to remove identifying information and make the question generic, which is then sent to a larger model? Is that understanding correct?

Yep, that's it.