Meta's open AI hardware vision

(engineering.fb.com)
212 points by GavCo | 5 comments
Gee101 (No.41851793)
Zuckerberg and Facebook get a lot of hate, but at least they invest heavily in engineering and open source.
replies(6): >>41852156 #>>41852273 #>>41852671 #>>41852972 #>>41853469 #>>41855820 #
1. amelius (No.41852273)
I personally think academia should be training and curating these kinds of models, and the data they are based on, but this is an acceptable second best.
replies(3): >>41853103 #>>41853566 #>>41854970 #
2. 123yawaworht456 (No.41853103)
there's already https://www.goody2.ai/chat
3. asdff (No.41853566)
IMO there are much better ways to spend $300 million on research than firing up a cluster for 60 days to train on internet content.
replies(1): >>41855080 #
4. mistrial9 (No.41854970)
Basically no one in the entire world was willing to spend the kind of money on massive compute and data centers that Meta did spend, is spending, and will spend. The actual numbers are (I think) hard to find, and so large that they are hard to comprehend.
5. nickpsecurity (No.41855080)
Spending $100 million once on an open-source GPT-4-level model would help with a lot of that research, especially after third-party groups fine-tuned it or layered their tools on top of it.

I think the Copilot-equivalent tools alone would quickly make it pay for itself in productivity gains. Research summaries, PDF extraction, and OCR would add even more value.