
Google is winning on every AI front

(www.thealgorithmicbridge.com)
993 points by vinhnx | 2 comments
antirez No.43661765
Gemini 2.5 Pro is as powerful as everybody says. I still also use Claude Sonnet 3.7, but only because the Gemini web UI has issues... (Imagine creating the best AI and then not allowing you to attach Python or C files unless they're renamed .txt.) But the way the model outperforms everyone else is a "that's another league" experience. They also have the biggest search engine and YouTube to leverage the power of the AI they are developing. At this point I too believe they are likely to win the race.
replies(8): >>43662040 #>>43662082 #>>43662153 #>>43662206 #>>43662400 #>>43662640 #>>43662949 #>>43663630 #
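The renaming workaround described above can be sketched as a small script that copies source files alongside themselves with a `.txt` suffix before uploading. This is an illustrative sketch, not something from the thread; the directory layout and extension list are assumptions you would adjust for your own project:

```python
from pathlib import Path
import shutil

def make_uploadable(src_dir, extensions=(".py", ".c", ".h")):
    """Copy source files to sibling .txt files so a picky web UI accepts them.

    The extension list is an illustrative assumption; adjust as needed.
    Returns the list of .txt copies created.
    """
    copies = []
    for path in sorted(Path(src_dir).iterdir()):
        if path.suffix in extensions:
            # e.g. server.c -> server.c.txt (keeps the original name visible)
            target = path.with_name(path.name + ".txt")
            shutil.copyfile(path, target)
            copies.append(target)
    return copies
```

Keeping the original filename inside the `.txt` name (rather than replacing the extension) preserves the language hint for both you and the model.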
1. discordance No.43662040
Instead of renaming files to .txt, you could try Gemini 2.5 Pro through OpenRouter with Roo Code, Cline, or GitHub Copilot. I've been testing GH Copilot [0] and it's been working really well.

0: https://github.blog/changelog/2025-04-11-copilot-chat-users-...

replies(1): >>43670712 #
2. antirez No.43670712
I know perfectly well that I can use the API with any wrapper. I don't do that by choice: my human+AI development style takes the form of a chat, and since I discovered that many models behave differently (especially Gemini 2.5) depending on where you invoke them (I don't know what Google does internally, whether they change temperature / context size / ...), I stick with the default way a provider exposes a model to the public. Besides, while I write a lot of code with the assistance of AI, my use case is mainly code reviews, design verification / brainstorming, and so forth, not so much "write this code for me" (not that I believe there is anything wrong with that, just a matter of preference -- I do it for things like tests, or to get a template when the coding task is just boring library calls to put together: typical use case, "generate the boilerplate to load a JPEG file with libjpeg"). So I keep using the web chat :)