←back to thread

745 points melded | 1 comments | | HN request time: 0.363s | source
Show context
richstokes ◴[] No.45946953[source]
Is there a way to use this on models downloaded locally with ollama?
replies(2): >>45947557 #>>45949300 #
1. int_19h ◴[] No.45949300[source]
If you're running a local model, in most cases, jailbreaking it is as easy as prefilling the response with something like, "Sure, I'm happy to answer your question!" and then having the model complete the rest. Most local LLM UIs have this option.