
221 points whitefables | 3 comments
1. sorenjan No.41858410
Can you use a self-hosted LLM that fits in 12 GB of VRAM as a reasonable substitute for Copilot in VSCode? And if so, can you give it documentation and other code repositories to make it better at a particular language and platform?
replies(1): >>41858520 #
2. 0xedd No.41858520
Technically, yes, but it will yield poor results. We did it internally at big corp n+1 and, frankly, it blows. Beyond menial tasks, it's good for nothing but a scout badge.
replies(1): >>41858761 #
3. tbrownaw No.41858761
Is that really that much worse than full Copilot, though? When we tried it this past spring, it was really cool, but not quite useful enough to actually stick with.
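[Editor's note] The "give it documentation and other code repositories" part of the original question usually means retrieval-augmented prompting: index the docs, pull the most relevant chunks for each query, and prepend them to the prompt sent to the local model. Below is a minimal, tool-agnostic sketch of that idea in plain Python. The chunking size, the word-overlap scoring, and the example documents are all illustrative assumptions, not any specific tool's behavior; real setups typically use embedding-based retrieval instead of keyword overlap.

```python
import re

def chunk(text, size=400):
    """Split a document into fixed-size character chunks (illustrative size)."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def tokens(text):
    """Lowercased word set, ignoring punctuation."""
    return set(re.findall(r"\w+", text.lower()))

def score(query, passage):
    """Naive relevance: number of query words that appear in the passage."""
    return len(tokens(query) & tokens(passage))

def build_prompt(query, docs, top_k=2):
    """Prepend the top-k most relevant chunks to the user's question."""
    chunks = [c for d in docs for c in chunk(d)]
    best = sorted(chunks, key=lambda c: score(query, c), reverse=True)[:top_k]
    context = "\n---\n".join(best)
    return f"Use the following documentation:\n{context}\n\nQuestion: {query}"

# Hypothetical documentation snippets for demonstration.
docs = [
    "The foo() function returns a Bar object. Call bar.run() to start it.",
    "Unrelated release notes about packaging and version numbers.",
]
prompt = build_prompt("How do I start foo()?", docs, top_k=1)
print(prompt)
```

The prompt string would then be sent to whatever local model serves the editor integration; only the retrieval step is shown here.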