←back to thread

358 points mohi-kalantari | 1 comments | | HN request time: 0.201s | source
Show context
kayhantolga ◴[] No.46195479[source]
As a .NET developer who actually likes some Microsoft products, I can say this: the Copilot series is the worst thing they've shipped since Internet Explorer—and honestly, it might overtake it. The sad part is they had a huge head start before competitors gained access to powerful models, yet this is what we got.

If you haven’t seen how bad it is, here’s one example: Copilot Terminal. In theory, it should help you with terminal commands. Sounds great. In practice, it installs a chat panel on the right side of your terminal that has zero integration with the terminal itself. It can’t read what’s written, it can’t send commands, it has no context, and the model response time is awful. What’s the point of a “terminal assistant” that can’t actually assist the terminal?

This lack of real integration is basically the core design of most Copilot products. If you’ve been lucky enough to avoid them, good for you. If your company forces you to use them because they’re bundled with a Microsoft license, I genuinely feel your pain.

replies(8): >>46196100 #>>46196148 #>>46196271 #>>46196343 #>>46196418 #>>46196569 #>>46196754 #>>46197123 #
btbuildem ◴[] No.46196418[source]
> lack of real integration is basically the core design of most Copilot products

I would wager a month's wages that this is the doing of some internal Security Review, wherein a bunch of out-of-touchers decided that the customers will want to prefer to be Safe and Secure instead of getting some actual value from integrating copilot into shell workflows.

Meanwhile people are yolo'ing it with various janky DIY wires and duct-tape githobbits that mash together whatever open weights model and user-level access to the system (or worse).

replies(3): >>46196741 #>>46197775 #>>46198613 #
isodev ◴[] No.46196741[source]
> a bunch of out-of-touchers decided that the customers will want to prefer to be Safe

You mean the other way around, right? Because what could possibly go wrong when we let a language model hallucinate its way through which terminal command rhymes best with your prompt according to that SO comment from training data.

replies(1): >>46196786 #
1. chasd00 ◴[] No.46196786[source]
i mentioned this upthread but an LLM with enough access to be fully integrated into all apps/services/files in an enterprise managed workstation sounds like privilege escalation attacks just waiting to happen.