    Claude Sonnet will ship in Xcode

    (developer.apple.com)
    485 points by zora_goron | 12 comments
    1. AndyKelley No.45059401
    Apple.com advertising a Mac Mini:

    > Built for Apple Intelligence.

    > 16-core Neural Engine

    These Xcode release notes:

    > Claude in Xcode is now available in the Intelligence settings panel, allowing users to seamlessly add their existing paid Claude account to Xcode and start using Claude Sonnet 4

    All that dedicated silicon taking up space on their SoC, and yet you still have to enter your credit card to use their IDE. Come on...

    replies(7): >>45059455 >>45059523 >>45059667 >>45059674 >>45059881 >>45060309 >>45060720
    2. geor9e No.45059455
    To run a model like Claude locally, Anthropic would need to release the weights to the public, and to their competitors. Those are flagship models.

    They would also need to shrink them way down to even fit. And even then, generating tokens on an Apple Neural Engine would be far slower than an HTTP request to a monster GPU in the sky. Local LLMs, in my experience, are either painfully dumb or painfully slow.
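    Back-of-envelope, with an assumed (undisclosed) 100B-parameter flagship model: even at 4-bit quantization that is roughly 100 × 10⁹ parameters × 0.5 bytes ≈ 50 GB of weights alone, against the 16 GB of unified memory in a base Mac mini, before counting the KV cache or anything else running on the machine.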

    replies(1): >>45059919
    3. aetherspawn No.45059523
    I bet Apple are working on it; it's just not ready yet, and they want to see how much people actually use it.

    It's the Apple way to screw the third party and replace them with their own thing once the ROI is proven (not a criticism; this is a good approach for any business where the capex is large…)

    4. jama211 No.45059667
    Trust me, you wouldn’t want to use a model for agentic code editing that could fit on a Mac mini at this stage.
    replies(1): >>45060314
    5. No.45059674
    6. isodev No.45059881
    "Apple Intelligence", at least the part that's available to developers via the Foundation Models framework is a tiny ~3B model [0] with very limited context window. It's mainly good for simple things like tagging/classification and small snippets of text.

    [0] https://github.com/fguzman82/apple-foundation-model-analysis
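
    For a sense of what that looks like from the developer side, a minimal usage sketch, assuming the LanguageModelSession API that the Foundation Models framework exposes on macOS 26; the tagging prompt is just illustrative:

        import FoundationModels

        // Minimal sketch: ask the on-device ~3B model for a one-word topic tag.
        // Assumes LanguageModelSession / respond(to:) as documented for macOS 26.
        func tag(for note: String) async throws -> String {
            let session = LanguageModelSession(
                instructions: "Reply with a single one-word topic tag for the note you are given."
            )
            let response = try await session.respond(to: note)
            return response.content
        }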

    replies(1): >>45060588
    7. hu3 No.45059919
    Hence the "come on".
    replies(1): >>45078004
    8. jdgoesmarching No.45060309
    Local models and any OpenAI-compatible APIs are available to the Xcode Beta assistant. This is just a dedicated “sign in with x” rather than manual configuration.
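
    For reference, "OpenAI-compatible" here just means anything that answers a /v1/chat/completions request, so a local server such as Ollama or LM Studio qualifies. A rough sketch of such a request; the localhost URL and model name are placeholders for whatever you actually run:

        import Foundation

        // Sketch of the request shape an OpenAI-compatible endpoint expects.
        // The URL and model name below are placeholders, not anything Xcode requires.
        struct ChatMessage: Codable { let role: String; let content: String }
        struct ChatRequest: Codable { let model: String; let messages: [ChatMessage] }

        func askLocalModel(_ prompt: String) async throws -> Data {
            var request = URLRequest(url: URL(string: "http://localhost:11434/v1/chat/completions")!)
            request.httpMethod = "POST"
            request.setValue("application/json", forHTTPHeaderField: "Content-Type")
            request.httpBody = try JSONEncoder().encode(
                ChatRequest(model: "qwen2.5-coder",
                            messages: [ChatMessage(role: "user", content: prompt)])
            )
            let (data, _) = try await URLSession.shared.data(for: request)
            return data // JSON; the reply sits under choices[0].message.content
        }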
    9. esafak No.45060314
    A 128GB Mac Mini M5 would be sweet.
    10. alwillis No.45060588
    Yes, but the Foundation Models framework can seamlessly use Apple's much larger models via Private Cloud Compute or switch to ChatGPT.

    When macOS 26 is officially announced on September 9, I expect Apple to announce support for Anthropic and Google models.

    11. No.45060720
    12. geor9e No.45078004
    Not if they knew how terrible it would be.