
229 points by modinfo | 1 comment
simonw ◴[] No.40835549[source]
If this is the API that Google are going with here:

    const model = await window.ai.createTextSession();
    const result = await model.prompt("3 names for a pet pelican");
There's a VERY obvious flaw: is there really no way to specify the model to use?

Are we expecting that Gemini Nano will be the one true model, forever supported by this API baked into the world's most popular browser?

Given the rate at which models are improving that would be ludicrous. But... if the browser model is being invisibly upgraded, how are we supposed to test out prompts and expect them to continue working without modifications against whatever future versions of the bundled model show up?

Something like this would at least give us a fighting chance:

    const supportedModels = await window.ai.getSupportedModels();
    if (supportedModels.includes("gemini-nano:0.4")) {
        const model = await window.ai.createTextSession("gemini-nano:0.4");
        // ...
    }
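For example, a page could then feature-detect the API and pin to a model version its prompts have actually been tested against (to be clear, getSupportedModels and createTextSession(modelId) are just the names I'm sketching above, not anything Chrome actually ships, and the version strings are made up):

    // Hypothetical sketch built on the proposed API above; none of these
    // names are part of Chrome's shipped window.ai surface.
    async function createPinnedSession(preferredModels) {
        if (!window.ai || !window.ai.getSupportedModels) {
            throw new Error("window.ai model enumeration not available");
        }
        const supported = await window.ai.getSupportedModels();
        // Walk the caller's preference list (newest tested version first)
        // and only create a session for a version the prompts were tested on.
        for (const id of preferredModels) {
            if (supported.includes(id)) {
                return { id, session: await window.ai.createTextSession(id) };
            }
        }
        throw new Error("no tested model version is available");
    }

    const { id, session } = await createPinnedSession([
        "gemini-nano:0.5",  // hypothetical version strings
        "gemini-nano:0.4",
    ]);
    const result = await session.prompt("3 names for a pet pelican");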
replies(7): >>40835678 #>>40835703 #>>40835717 #>>40835757 #>>40836197 #>>40836971 #>>40843533 #
j10u ◴[] No.40835678[source]
I'm pretty sure that, with time, they will be forced to let users choose the model, just as happened with the search engine...
replies(2): >>40835820 #>>40835903 #
1. zelphirkalt ◴[] No.40835903[source]
Only it will take 5-10 years to be regulated, until they have to pay a measly fine and let users choose. Then we will have the same game as with GDPR compliance now: companies left and right acting as if they just misunderstood the "new" rules and are still learning how this big mystery is to be understood, until a judge tells them to cut the crap. Then we will have the masses, who will not care and will feed all kinds of data into this AI thing, without even asking the people it concerns for consent. And then, of course, Google will claim that it is all the bad users' doing, and that it is so difficult to monitor and prevent.