
321 points by jhunter1016 | 1 comment
mikeryan No.41878605
Technical AI and LLMs are not something I'm well versed in, so as I sit on the sidelines and watch the current proliferation of AI startups, I'm starting to wonder where the moats are outside of access to raw computing power. OpenAI seemed to have a massive lead in this space, but that lead seems to be shrinking every day.
replies(10): >>41878784 >>41878809 >>41878843 >>41880703 >>41881606 >>41882000 >>41885618 >>41886010 >>41886133 >>41887349
1. wongarsu No.41878809
Data. You want huge amounts of high-quality data spanning a diverse range of topics, writing styles, and languages. Everyone seems to balance those requirements a bit differently, and different actors have access to different training data.

There is also some moat in the refinement process (RLHF, model "safety", etc.).
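
To make the RLHF point concrete, here's a minimal sketch of the first stage of that refinement process: training a reward model on pairwise human preferences with a Bradley-Terry loss. The tiny model, random data, and hyperparameters are toy placeholders of my own, not anything from a real lab's pipeline.

```python
# Minimal sketch: pairwise reward-model training, the first stage of a typical
# RLHF pipeline. A tiny scoring model learns to rank a "chosen" response above
# a "rejected" one via the loss -log sigmoid(r_chosen - r_rejected).
# Model, tokenization, and data are toy stand-ins, not any lab's actual setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM = 1000, 64

class TinyRewardModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.encoder = nn.GRU(DIM, DIM, batch_first=True)
        self.score = nn.Linear(DIM, 1)  # scalar reward per sequence

    def forward(self, tokens):  # tokens: (batch, seq_len) of token ids
        h, _ = self.encoder(self.embed(tokens))
        return self.score(h[:, -1]).squeeze(-1)  # reward from last hidden state

model = TinyRewardModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Fake preference pairs: in practice these come from human labelers comparing
# two model responses to the same prompt.
chosen = torch.randint(0, VOCAB, (8, 20))
rejected = torch.randint(0, VOCAB, (8, 20))

for step in range(100):
    loss = -F.logsigmoid(model(chosen) - model(rejected)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final pairwise loss: {loss.item():.3f}")
```

In a full pipeline the trained reward model then scores samples from the base LLM during a policy-optimization stage (e.g. PPO), and the quality of the human preference data feeding it is exactly the kind of hard-to-replicate asset being described here.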