
Interview with gwern

(www.dwarkeshpatel.com)
308 points synthmeat | 1 comment
YeGoblynQueenne ◴[] No.42135916[source]
This will come across as vituperative and I guess it is a bit but I've interacted with Gwern on this forum and the interaction that has stuck to me is in this thread, where Gwern mistakes a^nb^n as a regular (but not context-free) language (and calls my comment "not even wrong"):

https://news.ycombinator.com/item?id=21559620

Again, I'm sorry for the negativity, but already at the time Gwern was held up by a certain, large section of the community as an important influencer in AI. For me that's a great example of how the vast majority of AI influencers (who vie for influence on social media rather than through research) are basically clueless about AI and CS and have only second-hand knowledge, which I guess they're good at organising and popularising, but not more than that. It's easy to be a cheerleader for the mainstream view on AI. The hard part is finding, and following, unique directions.

With apologies again for the negative slant of the comment.
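[For context on the distinction at issue: a^nb^n is generated by the context-free grammar S → aSb | ε, but it is not regular, because a recognizer has to count the a's with unbounded memory, which a finite automaton lacks. A minimal recognizer sketch in Python; the function name is illustrative, not from the linked thread.]

```python
def anbn(s: str) -> bool:
    """Recognize the context-free language {a^n b^n : n >= 0}.

    Grammar: S -> aSb | epsilon. The unbounded counter below is
    exactly what a finite automaton cannot have, which is why the
    language is context-free but not regular.
    """
    count = 0
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:           # an 'a' after a 'b' is malformed
                return False
            count += 1
        elif ch == "b":
            seen_b = True
            count -= 1
            if count < 0:        # more b's than a's so far
                return False
        else:
            return False
    return count == 0            # all a's matched by b's
```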

replies(10): >>42136055 #>>42136148 #>>42136538 #>>42136759 #>>42137041 #>>42137215 #>>42137274 #>>42137284 #>>42137350 #>>42137636 #
dilap ◴[] No.42136759[source]
Regarding your linked comment, my takeaway is that the very theoretical task of recognizing an infinite language isn't very relevant to the non-formal, intuitive idea of "intelligence".

Transformers can easily understand a^nb^n intellectually, even though they couldn't recognize whether an arbitrarily long string is a member of the language -- a restriction humans share! Eventually a human, too, would lose track of the count for a long enough string.
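[The point about losing count can be made concrete: a recognizer whose counter saturates at a fixed bound (a stand-in for finite memory) necessarily misclassifies long enough strings, because all counts above the bound look the same. A sketch; `cap` and the function name are illustrative.]

```python
def anbn_bounded(s: str, cap: int) -> bool:
    """a^n b^n recognizer whose counter saturates at `cap`,
    modeling a fixed-memory device. Once n exceeds cap, the true
    count of a's is lost, so some members are rejected and some
    non-members accepted -- the finite-state limitation in miniature.
    """
    count = 0
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:
                return False
            count = min(count + 1, cap)   # saturating counter
        elif ch == "b":
            seen_b = True
            count -= 1
            if count < 0:
                return False
        else:
            return False
    return count == 0
```

With cap=4, for example, it wrongly accepts a^10 b^4 and wrongly rejects a^10 b^10: both strings are indistinguishable to it after the counter saturates.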

replies(2): >>42136846 #>>42136925 #
1. raverbashing ◴[] No.42136925[source]
I agree with your assessment

Yes, LLMs are bad at this. A similar example: resolution-based SAT solvers can't refute pigeonhole formulas without their proofs blowing up exponentially

It is an exceptional case that maybe requires "metathinking", rather than a showstopper issue

(I can't seem to write the grammar name here; the original comment in the discussion had it)
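[On the pigeonhole formulas: "n+1 pigeons into n holes" is the classic hard case for resolution-based SAT solvers, since any resolution refutation of it is exponentially long (Haken's lower bound). A minimal DIMACS-style CNF encoding as a sketch; the helper names are illustrative.]

```python
from itertools import combinations

def pigeonhole_cnf(holes: int) -> list[list[int]]:
    """CNF for 'holes + 1 pigeons each fit into one of `holes` holes'
    (unsatisfiable). Variable v(p, h) is True iff pigeon p is in hole h;
    variables are numbered 1..pigeons*holes, DIMACS style.
    """
    pigeons = holes + 1

    def v(p: int, h: int) -> int:
        return p * holes + h + 1

    clauses = []
    # Every pigeon occupies at least one hole.
    for p in range(pigeons):
        clauses.append([v(p, h) for h in range(holes)])
    # No hole holds two pigeons.
    for h in range(holes):
        for p1, p2 in combinations(range(pigeons), 2):
            clauses.append([-v(p1, h), -v(p2, h)])
    return clauses
```

For holes = 3 this gives 4 "at least one hole" clauses plus 3 × C(4,2) = 18 mutual-exclusion clauses, 22 in total over 12 variables.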