Claude Sonnet will ship in Xcode

(developer.apple.com)
485 points by zora_goron | 42 comments
1. breadwinner ◴[] No.45059612[source]
It seems every IDE now has AI built-in. That's a problem if you're working on highly confidential code. You never know when the AI is going to upload code snippets to the server for analysis.
replies(13): >>45059623 #>>45059634 #>>45059661 #>>45059894 #>>45059943 #>>45060054 #>>45060064 #>>45060101 #>>45060121 #>>45060510 #>>45060668 #>>45061092 #>>45061687 #
2. jama211 ◴[] No.45059623[source]
Well, that depends on whether you give it access or not. Apple's track record with privacy gives me some hope.
3. tcoff91 ◴[] No.45059634[source]
Neovim and Emacs don’t have it built in. Use open source tools.
replies(1): >>45060072 #
4. nh43215rgb ◴[] No.45059661[source]
> "add their existing paid Claude account to Xcode and start using Claude Sonnet 4"

Won't work by default, if I'm reading this correctly.

5. XorNot ◴[] No.45059894[source]
If it's that confidential you should be on an airgapped network.

There's simply no way to properly secure network-connected developer systems.

6. baby ◴[] No.45059943[source]
Not trying to be mean, but I would expect comments on HN on these kinds of stories to be from people who have used AI in IDEs at this point. There is no AI integration that runs automatically on a codebase.
replies(5): >>45059964 #>>45060520 #>>45061318 #>>45061438 #>>45061966 #
7. paradite ◴[] No.45059964[source]
There is automatic code indexing from Cursor.

Autocomplete is also triggered automatically when you place your cursor inside the code.

replies(2): >>45060015 #>>45060095 #
8. rafram ◴[] No.45060015{3}[source]
Yes, Cursor, “The AI Code Editor.”
9. fny ◴[] No.45060054[source]
You do know: when it's enabled.
10. UltraSane ◴[] No.45060064[source]
People working on highly confidential code will NOT have access to the public internet.
replies(1): >>45060337 #
11. simonh ◴[] No.45060072[source]
They both support it via plugins. Xcode doesn't enable it by default; you have to enable it and sign into an account. It's not really all that different.
replies(2): >>45060175 #>>45070755 #
12. LostMyLogin ◴[] No.45060095{3}[source]
Cursor is an AI IDE and not what they are describing.
13. sneak ◴[] No.45060101[source]
No. It’s always something you have to turn on or log into.

Also, there are plenty of editors and IDEs that don’t.

Let’s stop pretending like you’re being forced into this. You aren’t.

replies(1): >>45072466 #
14. bsimpson ◴[] No.45060121[source]
Sublime Text doesn't by default.
15. OsrsNeedsf2P ◴[] No.45060206{4}[source]
If you're worried about someone accessing your unlocked computer to install LLMs, you might need to rethink your security model.
replies(1): >>45060255 #
16. renewiltord ◴[] No.45060255{5}[source]
They could install anything, including Claude Code, and then run it in the background as an agent to exfiltrate data. I'm a security professional. This is unacceptable.
replies(1): >>45060306 #
17. eddyg ◴[] No.45060293{4}[source]
You should install LuLu if you’re that concerned. There are far more nefarious ways of “getting your data”.

https://objective-see.org/products/lulu.html

18. BalinKing ◴[] No.45060306{6}[source]
I think the parent commenter was pointing out that, instead of installing Claude Code, they could just install actual malware. It's like that phrase Raymond Chen always uses: "you're already on the other side of the airtight hatchway."
replies(1): >>45060468 #
19. abenga ◴[] No.45060337[source]
There is a gulf and many shades between "this code should never be on an internet-connected device" and "it doesn't matter if this code is copied everywhere by absolutely anyone".
replies(1): >>45060451 #
20. whatevermom ◴[] No.45060399{4}[source]
Yes. I am so worried as well. This is why I installed an AI to double-check if the password I entered is correct when logging in. Fight fire with fire
21. TheDong ◴[] No.45060401{4}[source]
What commonly gets installed in those cases is actual malware: a RAT (Remote Admin Tool) that lets the attacker later run commands on your laptop (kind of like an OpenSSH server, but one that also punches a hole through NAT and reports to a server from which they can broadcast commands to the entire fleet).

If the attacker wants to use AI to help look for valuables on your machine, they won't install AI on your machine; they'll use the remote shell software to pop a shell session and ask an AI running on one of their own machines to look around in the shell for anything sensitive.

If an attacker has access to your unlocked computer, it is already game over, and LLM tools are quite far down the list of dangerous software they could install.

Maybe we should ban common RAT software first, like `ssh` and `TeamViewer`.

replies(2): >>45060474 #>>45060523 #
22. UltraSane ◴[] No.45060451{3}[source]
To me "highly confidential" would mean "isolated from the internet" or else it isn't going to be "highly confidential" for very long.
replies(1): >>45060743 #
23. renewiltord ◴[] No.45060468{7}[source]
Yes, but Claude Code could install malware when I'm not paying attention. And when I remove it with MalwareBytes, it will return, because LLMs are not AGI.
replies(1): >>45064514 #
24. TheDong ◴[] No.45060509{6}[source]
You know, I should have realized this was a troll account with the previous comment.

I guess that's on me for being oblivious enough that it took this obvious of a comment for me to be sure you're intentionally trolling. Nice work.

25. viraptor ◴[] No.45060510[source]
This is not a realistic concern. If you're working on highly confidential code (in a serious sense of that phrase), your whole environment is already either offline or connecting only through a tightly controlled corporate proxy. There are no accidental leaks to AI from those environments.
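For the sake of illustration, a rough sketch of what that looks like from a developer box (the proxy host and allowlist here are hypothetical): direct egress is blocked, and anything not on the proxy's allowlist never leaves the network.

    # Hypothetical locked-down setup: direct egress is blocked at the firewall
    # and the only route out is an allowlisting corporate proxy.
    import os
    import requests

    os.environ["HTTPS_PROXY"] = "http://proxy.corp.example:3128"  # hypothetical proxy

    try:
        # api.anthropic.com isn't on the allowlist, so the request dies at the proxy
        requests.get("https://api.anthropic.com/v1/models", timeout=5)
    except requests.RequestException as err:
        print("blocked by egress policy:", err)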
replies(2): >>45060556 #>>45060732 #
26. ygritte ◴[] No.45060520[source]
This could change on a daily basis, and it's a valid concern anyway.
27. jumploops ◴[] No.45060523{5}[source]
> They won’t install AI on your machine

Actually, they'll just use the AI you already have on your machine [0].

In this attack, the malware would use Claude Code (with your credentials) to scan your own machine.

Much easier than running the inference themselves!

[0]https://semgrep.dev/blog/2025/security-alert-nx-compromised-...

28. dijit ◴[] No.45060556[source]
Thanks for giving the security department more reasons to think that way.

I spent the last 6 months trying to convince them not to block all outbound traffic by default.

replies(1): >>45060765 #
29. doctorpangloss ◴[] No.45060668[source]
Many enterprises store their code on GitHub, owned by Microsoft, operator of Copilot.

You can use Claude via Bedrock and benefit from AWS trust.

Gemini? Google owns your e-mail. Maybe you're even one of those weirdos who doesn't use Google for e-mail - I bet your recipient does.

So... they have your code, your secrets, etc.
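For anyone curious, a minimal sketch of the Bedrock route with boto3 (the model ID below is a placeholder; check your region's Bedrock model catalog for the exact identifier):

    # Minimal sketch: calling Claude through Amazon Bedrock so the traffic stays
    # inside your AWS account's trust boundary rather than going to a third party.
    import boto3

    bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

    response = bedrock.converse(
        modelId="anthropic.claude-sonnet-4-20250514-v1:0",  # placeholder model ID
        messages=[{"role": "user", "content": [{"text": "Summarize this diff."}]}],
        inferenceConfig={"maxTokens": 512},
    )
    print(response["output"]["message"]["content"][0]["text"])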

30. troupo ◴[] No.45060732[source]
There's a whole range of security concerns and degrees of confidentiality.

For most corporate code (which is highly confidential) you still have proper internet access, but you sure as hell can't send your code to every AI provider just because you want to, or just because it's built into your IDE.

31. troupo ◴[] No.45060743{4}[source]
Have you seen a lot of code from Klarna, Storytel, Spotify (companies I've worked at)?

None of these companies are isolated from the internet.

replies(1): >>45063374 #
32. postalcoder ◴[] No.45060765{3}[source]
The right middle ground is running Little Snitch in alert mode. The initial phase of training the filters and manually approving requests is painful, but it's a lot better than an air gap.
replies(1): >>45061052 #
33. dijit ◴[] No.45061052{4}[source]
That's what I do, but since it's in my control, the security teams don't like it. ;)
34. c_ehlen ◴[] No.45061092[source]
Most of the big corporations will have special contracts with the AI labs with zero-retention policies.

I do not think this will be an issue for big companies.

35. lalo2302 ◴[] No.45061318[source]
GitKraken does.
36. factorialboy ◴[] No.45061438[source]
> There is no AI integration that runs automatically on a codebase.

Don't be naive.

37. Mashimo ◴[] No.45061687[source]
In IDEA, the organisation that controls the license can disable the built-in (remote) AI (not the local autocomplete one).

But I guess the user could still get a third-party plugin.

38. TiredOfLife ◴[] No.45061966[source]
This is HN. 10 years ago that would be true, but now I expect 99% of commenters to have never used the thing they are talking about, or to have used it once 20 years ago for 10 minutes, or to have not even read the article.
39. UltraSane ◴[] No.45063374{5}[source]
I bet their devs are.
40. BalinKing ◴[] No.45064514{8}[source]
Isn't the general advice that if malware has been installed specifically due to physical access, then the entire machine should be considered permanently compromised? That is to say, if someone has access to your unlocked machine, I've heard that it's way too late for MalwareBytes to be reliable....
41. tcoff91 ◴[] No.45070755{3}[source]
That seems perfectly fine and noncontroversial then. Good on Apple for doing it that way.
42. ◴[] No.45072466[source]