
544 points tosh | 5 comments
simonw ◴[] No.43464227[source]
Big day for open source Chinese model releases - DeepSeek-v3-0324 came out today too, an updated version of DeepSeek v3 now under an MIT license (previously it was a custom DeepSeek license). https://simonwillison.net/2025/Mar/24/deepseek/
replies(5): >>43464375 #>>43464498 #>>43464686 #>>43465383 #>>43467111 #
chaosprint ◴[] No.43464375[source]
It seems that this free version "may use your prompts and completions to train new models":

https://openrouter.ai/deepseek/deepseek-chat-v3-0324:free

Do you think this needs attention?
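
For what it's worth, that note sits on the ":free" variant's listing; the paid deepseek/deepseek-chat-v3-0324 slug is a separate listing with per-provider data policies. A minimal sketch (mine, not from the page) of hitting the paid slug through OpenRouter's OpenAI-compatible API with the openai Python client:

    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
        api_key="sk-or-...",                      # your OpenRouter key
    )

    resp = client.chat.completions.create(
        # no ":free" suffix: this routes to the paid listing rather than
        # the free one carrying the prompt-training note
        model="deepseek/deepseek-chat-v3-0324",
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(resp.choices[0].message.content)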

replies(7): >>43464399 #>>43464480 #>>43464512 #>>43464616 #>>43464961 #>>43468548 #>>43470210 #
huijzer ◴[] No.43464512[source]
Since we're on HN, I can highly recommend open-webui with an OpenAI-compatible provider. I've been running it with Deep Infra for more than a year now and am very happy. New models are usually available within a day or two of release. Some friends of mine also use the service almost daily.
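
To make "OpenAI-compatible" concrete: open-webui only needs a base URL and an API key for such a provider, and the same pair works from code. A hedged sketch with the openai Python client; the Deep Infra base URL and model id below are assumptions from their docs, not something stated in this comment:

    from openai import OpenAI

    # the same base_url/key pair can be entered in open-webui's
    # Connections settings as an OpenAI-compatible provider
    client = OpenAI(
        base_url="https://api.deepinfra.com/v1/openai",  # assumed Deep Infra endpoint
        api_key="...",                                   # your Deep Infra API key
    )

    resp = client.chat.completions.create(
        model="deepseek-ai/DeepSeek-V3-0324",  # assumed model id on Deep Infra
        messages=[{"role": "user", "content": "ping"}],
    )
    print(resp.choices[0].message.content)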
replies(7): >>43464718 #>>43465081 #>>43466430 #>>43466464 #>>43466949 #>>43469369 #>>43473139 #
1. totetsu ◴[] No.43466464[source]
And it's quite easy to set up a Cloudflare tunnel to make your open-webui instance accessible online too, just to you.
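
A quick tunnel is a single command; this sketch assumes open-webui on port 3000 (the usual Docker mapping). Note that a quick tunnel's URL is public, so locking it down to "just you" means a named tunnel plus a Cloudflare Access policy on top:

    # prints a random *.trycloudflare.com URL pointing at the local instance
    cloudflared tunnel --url http://localhost:3000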
replies(1): >>43466484 #
2. simonw ◴[] No.43466484[source]
... or a Tailscale network. I've been leaving open-webui running on my laptop on my desk, then going out into the world and accessing it from my phone via Tailscale; works great.
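
The setup is about as small as it gets; a sketch, assuming the usual Docker port 3000 and MagicDNS enabled on the tailnet:

    # on the laptop running open-webui
    tailscale up

    # on the phone: install the Tailscale app, join the same tailnet,
    # then open http://<laptop-hostname>:3000 in the browser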
replies(2): >>43466956 #>>43467521 #
3. wkat4242 ◴[] No.43466956[source]
Yeah, this sounds like the more secure option; you don't want to be dependent on a single flaw in a web service.
4. totetsu ◴[] No.43467521[source]
I would use Tailscale, but I specifically want to use open-webui from a place where I can't install a Tailscale client.
replies(1): >>43468733 #
5. fragmede ◴[] No.43468733{3}[source]
Where's that?