Francois Chollet is leaving Google

(developers.googleblog.com)
377 points by xnx | 7 comments
geor9e ◴[] No.42131955[source]
If I were to speculate, I would guess he quit Google. 2 days ago, his $1+ million Artificial General Intelligence competition ended. Chollet is now judging the submissions and will announce the winners in a few weeks. The timing there can't be a coincidence.
replies(2): >>42132119 #>>42133232 #
paxys ◴[] No.42132119[source]
More generally, there is unlimited opportunity in the AI space today, especially for someone of his stature, and staying tied to Google probably isn't as enticing. He can walk into any VC office and raise a hundred million dollars by the end of the day to build whatever he wants.
replies(2): >>42132971 #>>42133388 #
1. hiddencost ◴[] No.42132971[source]
$100M isn't enough capital for an AI startup that's training foundation models, sadly.

A ton of folks of similar stature who raised that much burnt it within two years and took mediocre exits.

replies(3): >>42133404 #>>42133563 #>>42134587 #
2. NitpickLawyer ◴[] No.42133404[source]
I think we'll start to see a differentiation soon. The likes of Ilya will raise money to do whatever, including foundation models / new arch, while other startups will focus on post-training, scaling inference, domain adaptation and so on.

I don't think building a general foundation model from scratch is a good path for startups anymore. We're already seeing specialised verticals (Cursor, Codeium, both at ~$100-200M funding rounds), and they're both focused on specific domains, not generalist. There are probably enough "foundation" models out there to start working on post-training already, no need to reinvent the wheel.

3. zxexz ◴[] No.42133563[source]
Interesting, I think $100M is totally enough to train a SotA "foundation model". It's all in the use case. I'd love to hear explicit arguments against this.
replies(1): >>42133765 #
4. hiddencost ◴[] No.42133765[source]
There's a bunch of failed AI companies that raised between $100M and $200M with the goal of training foundation models. What they discovered is that they were rapidly outpaced by the large players and didn't have any way to generate revenue.

You're right that it's enough to train one, but IMO you're wrong that it's enough to build a company around.

replies(2): >>42134050 #>>42135726 #
5. AuryGlenz ◴[] No.42134050{3}[source]
I imagine Black Forest Labs (Flux) is doing alright, at least for now. I still feel like they're missing out on some low-hanging fruit financially, though.

But yeah, you’re not going to make any money making yet another LLM unless it’s somehow special.

6. versteegen ◴[] No.42134587[source]
Chollet is a leading skeptic of the generality of LLMs (see arcprize.org). He surely isn't doing a startup to train another one.
7. ak_111 ◴[] No.42135726{3}[source]
Can you please name names? I can't think of any (but I'm not an expert in the space).