
321 points | denysvitali
denysvitali ◴[] No.45108483[source]
Report: https://github.com/swiss-ai/apertus-tech-report/raw/refs/hea...

Key features

Fully open model: open weights + open data + full training details including all data and training recipes

Massively Multilingual: 1811 natively supported languages

Compliant: Apertus is trained while respecting the opt-out consent of data owners (even retrospectively) and avoiding memorization of training data
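
(A minimal, hypothetical sketch of what retroactive opt-out filtering could look like: drop already-crawled documents whose source domain later opted out. The function, the field names, and the domain-level granularity are illustrative assumptions, not the actual Apertus pipeline.)

    # Hypothetical sketch of retroactive opt-out filtering; names and the
    # domain-level granularity are assumptions for illustration, not the
    # real Apertus data pipeline.
    from urllib.parse import urlparse

    def filter_opted_out(documents, opted_out_domains):
        # documents: iterable of dicts with a "url" key
        # opted_out_domains: set of domains that opted out of AI training
        # (e.g. signals collected after the original crawl)
        kept = []
        for doc in documents:
            domain = urlparse(doc["url"]).netloc.lower()
            if domain not in opted_out_domains:
                kept.append(doc)
        return kept

    docs = [
        {"url": "https://example.org/article", "text": "..."},
        {"url": "https://opted-out.example.com/post", "text": "..."},
    ]
    print(len(filter_opted_out(docs, {"opted-out.example.com"})))  # -> 1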

replies(3): >>45109373 #>>45113812 #>>45142610 #
lyu07282 ◴[] No.45113812[source]
Their struggle with the Nvidia driver bugs they had to work around was very relatable. You'd think that if someone buys 10,752 of their high-end GPUs, they'd get some support with them.
replies(3): >>45142497 #>>45144974 #>>45150592 #
_zoltan_ ◴[] No.45142497[source]
did I miss a blog on this?
replies(1): >>45144029 #
lllllm ◴[] No.45144029{3}[source]
we didn't have time to write one yet, but the tech report already covers a lot of the details
replies(1): >>45147782 #
menaerus ◴[] No.45147782{4}[source]
The report is packed with interesting details. The engineering challenges and solutions chapter in particular shows how things that are supposed and expected to work break when put through massive scale. Really difficult bugs. Great writeup.
replies(1): >>45148399 #
lllllm ◴[] No.45148399{5}[source]
thank you!