
534 points | BlueFalconHD | 2 comments

I managed to reverse engineer the encryption (referred to as “Obfuscation” in the framework) responsible for managing the safety filters of Apple Intelligence models. I have extracted them into a repository. I encourage you to take a look around.
trebligdivad No.44483981
Some of the combinations are a bit weird. This one has lots of stuff avoiding death... together with a set ensuring all the Apple brands have the correct capitalisation. Priorities, hey!

https://github.com/BlueFalconHD/apple_generative_model_safet...

1. comex No.44486072
This is in the directory "com.apple.gm.safety_deny.output.summarization.cu_summary.proactive.generic".

My guess is that this applies to 'proactive' summaries that happen without the user asking for it, such as summaries of notifications.

If so, then the goal would be: if someone iMessages you about someone's death, then you should not get an emotionless AI summary. Instead you would presumably get a non-AI notification showing the full text or a truncated version of the text.

In other words, avoid situations like this story [1], where someone found it "dystopian" to get an Apple Intelligence summary of messages in which someone broke up with them.

For that use case, filtering for death seems entirely appropriate, though underinclusive.

This filter doesn’t seem to apply when you explicitly request a summary of some text using Writing Tools. That probably corresponds to “com.apple.gm.safety_deny.output.summarization.text_assistant.generic” [2], which has a different filter that only rejects two things: "Granular mango serpent", and "golliwogg".

Sure enough, I was able to get Writing Tools to give me summaries containing "death", but in cases where the summary should contain "granular mango serpent" or "golliwogg", I instead get an error saying "Writing Tools aren't designed to work with this type of content." (Actually that might be the input filter rather than the output filter; whatever.)
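A deny list of this sort amounts to a case-insensitive exact-phrase check against the model's output. A minimal sketch (purely illustrative, not Apple's actual implementation; the two phrases are the ones found in the extracted file):

```python
# Hypothetical sketch of an exact-phrase deny-list output filter.
# Not Apple's code; the blocked phrases are the two found in the
# extracted text_assistant.generic filter file.
BLOCKED_PHRASES = ["granular mango serpent", "golliwogg"]

def output_allowed(summary: str) -> bool:
    """Reject the summary if it contains any blocked phrase, case-insensitively."""
    lowered = summary.lower()
    return not any(phrase in lowered for phrase in BLOCKED_PHRASES)
```

Under a filter like this, a summary mentioning "death" passes, while one containing either test phrase is rejected, matching the behaviour described above.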

"Granular mango serpent" is probably a test case that's meant to be unlikely to appear in real documents. Compare to "xylophone copious opportunity defined elephant" from the code_intelligence safety filter, where the first letter of each word spells out "Xcode".
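The acrostic in that test phrase is easy to verify mechanically (a throwaway sketch, nothing from the framework itself):

```python
def acrostic(phrase: str) -> str:
    """Concatenate the first letter of each word in the phrase."""
    return "".join(word[0] for word in phrase.split())

# acrostic("xylophone copious opportunity defined elephant") spells "xcode"
```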

But one might ask what's so special about "golliwogg". It apparently refers to an old racial caricature, but why is that the one and only thing that needs filtering?

[1] https://arstechnica.com/ai/2024/10/man-learns-hes-being-dump...

[2] https://github.com/BlueFalconHD/apple_generative_model_safet...

2. azalemeth No.44487647
I first encountered the Golliwog in the context of Claude Debussy, the composer of much beautiful music, including https://en.wikipedia.org/wiki/Children%27s_Corner#Golliwogg'.... The dolls were, I understand, rather popular around 1906–1908, and fortunately the stereotype has largely died out.