
1246 points adrianh | 9 comments
1. simonw ◴[] No.44491402[source]
I find it amusing that it's easier to ship a new feature than to get OpenAI to patch ChatGPT to stop pretending that feature exists (not sure how they would even do that, beyond blocking all mentions of SoundSlice entirely).
replies(4): >>44491560 #>>44491579 #>>44492066 #>>44498535 #
2. hnlmorg ◴[] No.44491560[source]
I think the benefit of their approach isn't that it's easier; it's that they still capitalise on ChatGPT's results.

Your solution is the equivalent of asking Google to completely delist you because one page you don't want ended up in Google's search results.

3. mudkipdev ◴[] No.44491579[source]
systemPrompt += "\nStop mentioning SoundSlice's ability to import ASCII data";
replies(1): >>44492192 #
4. PeterStuer ◴[] No.44492066[source]
Companies pay good money to panels of potential customers to hear their needs and wants. This is free market research!
replies(1): >>44497467 #
5. simonw ◴[] No.44492192[source]
Thinking about this more, it would actually be possible for OpenAI to implement this sensibly, at least for the user-facing ChatGPT product: they could detect terms like SoundSlice in the prompt and dynamically append notes to the system prompt.

I've been wanting them to do this for questions like "what is your context length?" for ages. It frustrates me how badly ChatGPT handles questions about its own abilities; that feels like it would be worth some kind of special-case or RAG mechanism to support.
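
Something like this, roughly (purely a sketch: the term-to-note table and the note wording here are made up, not anything OpenAI actually ships):

    SPECIAL_CASE_NOTES = {
        # hypothetical entries; the real list and wording would be OpenAI's call
        "soundslice": "Do not claim SoundSlice features you have not verified.",
        "context length": "State the actual context window for the current model.",
    }

    def build_system_prompt(base_prompt, user_message):
        # append a corrective note only when a trigger term shows up in the prompt
        lowered = user_message.lower()
        notes = [note for term, note in SPECIAL_CASE_NOTES.items() if term in lowered]
        if notes:
            return base_prompt + "\n\n" + "\n".join(notes)
        return base_prompt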

replies(1): >>44496561 #
6. garfij ◴[] No.44496561{3}[source]
Probably less sensible than you think. How many terms would they need to cover? How many would they need to check for _at once_? How many tokens would that add to every prompt that comes in?

Not to mention that dynamically modifying the base system prompt would likely break their entire caching mechanism, given that caching is based on the longest prefix, and I can't imagine the model's system prompt is somehow excluded from that.

7. bobbylox ◴[] No.44497467[source]
But they wouldn't have wanted this particular thing if the AI hadn't told them it existed.
replies(1): >>44498352 #
8. PeterStuer ◴[] No.44498352{3}[source]
You mean they didn't want infinite free personalized guitar practice lessons they can play along with?

Clearly users are already using ChatGPT to generate guitar practice material; it's basically infinite free personalized lessons. For practicing, they want to be able to hear it and play along at variable speed, maybe create slight variations, etc.

Soundslice is a service that does exactly that, except that before, people used sheet music as the source. Way back when I had guitar aspirations, people exchanged binders of photocopied sheet music.

Now the users could have asked ChatGPT to output an SVG of the thing as sheet music (it does; I tested). Soundslice could have done this behind the scenes as a half-hour quick-and-dirty fix while developing a better and more cost-effective alternative.
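
Something along these lines behind the existing import form would probably have done it (just a sketch; the model name, the prompt, and MusicXML as the target format are my guesses, not anything Soundslice actually built):

    # sketch only: ask the model to convert pasted ASCII tab into something the
    # existing importer already understands; model and prompt are placeholders
    from openai import OpenAI

    client = OpenAI()

    def ascii_tab_to_musicxml(ascii_tab):
        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[
                {"role": "system",
                 "content": "Convert this guitar ASCII tab to valid MusicXML. Output only the XML."},
                {"role": "user", "content": ascii_tab},
            ],
        )
        return response.choices[0].message.content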

Look, if at the turn of the century you were a blacksmith living off changing horseshoes, and you suddenly had people mistakenly showing up for a tire change on their car, would you blame the villagers who kept sending them your way, or open a tire-change service? We know who came out on top.

9. LinXitoW ◴[] No.44498535[source]
If you gave a junior-level developer just one or two files of your code, with no ability to look at the rest, and asked them to implement a feature, would none of them make ANY reasonable assumptions about what is available?

This seems similar, and like a decent indicator that most people (aka the average developer) would expect X to exist in your API.