
521 points by hd4 | 1 comment
hunglee2 (No.45643396)
The US attempt to slow China's technological development may succeed in preventing China from directly following the same path, but it may also backfire by forcing Chinese innovation in a different direction. The overall outcome for us all may be increased efficiency as a result of this forced innovation, especially if Chinese companies continue to open-source their advances. In the end we may have reason to thank the US for its civilisational gatekeeping.
segmondy (No.45643618)
May backfire? It's a bit too late for that.

Go back to 2024: Western labs were crushing it.

It's now 2025, and from China we have DeepSeek, Qwen, Kimi, GLM, ERNIE and many more capable models keeping up with Western labs. There are actually now more Chinese labs releasing SOTA models than Western labs.

rasz (No.45650725)
Have you tried using those models? Qwen, for example, can't even do something as basic as cluster analysis on a list of integers. It goes off the rails just reading those integers from a file: it starts babbling about determining the number of digits and indexes, and tries concatenating the numbers into one big string. No idea what is going on with that model.
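
For context, here is a minimal sketch of the kind of task being described, not anything from the thread: read integers from a file and group them into clusters. The filename, the gap threshold, and the simple 1-D gap-based grouping are all assumptions for illustration; the commenter's actual prompt and data are unknown.

    # Sketch only: cluster integers read from a text file.
    # "numbers.txt" and the max_gap threshold are hypothetical.

    def read_integers(path: str) -> list[int]:
        """Read whitespace-separated integers from a text file."""
        with open(path) as f:
            return [int(tok) for tok in f.read().split()]

    def cluster_1d(values: list[int], max_gap: int = 10) -> list[list[int]]:
        """Group sorted integers into clusters, starting a new cluster
        whenever the gap between consecutive values exceeds max_gap."""
        if not values:
            return []
        values = sorted(values)
        clusters = [[values[0]]]
        for v in values[1:]:
            if v - clusters[-1][-1] <= max_gap:
                clusters[-1].append(v)
            else:
                clusters.append([v])
        return clusters

    if __name__ == "__main__":
        nums = read_integers("numbers.txt")  # hypothetical input file
        for i, c in enumerate(cluster_1d(nums)):
            print(f"cluster {i}: {c}")

This is roughly the level of task in question: a few lines of straightforward code, which is why failing at it reads as a basic capability gap rather than a hard problem.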