
490 points todsacerdoti | 2 comments
BurningFrog No.44383219
Would it make sense to include, alongside the code, the complete prompt that generated it?
replies(3): >>44383344 #>>44384164 #>>44386462 #
astrobiased No.44383344
It would need to be more than that. The same prompt can produce different results on different models. Even a single model can behave differently depending on how inference is handled, e.g. quantization: the same prompt run against the unquantized and quantized versions of a model can yield different output.
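
A toy sketch (mine, with made-up numbers, not from any real model) of why quantization alone can shift output: symmetric int8 quantization of a small weight matrix introduces rounding error in the logits, and when two logits are close, that error can flip the argmax, i.e. the next token picked.

    import numpy as np

    rng = np.random.default_rng(0)
    w = rng.normal(size=(4, 8)).astype(np.float32)   # toy "weight matrix"
    x = rng.normal(size=8).astype(np.float32)        # toy input activations

    # Symmetric int8 quantization: map weights to 255 integer levels.
    scale = np.abs(w).max() / 127.0
    w_q = np.round(w / scale).astype(np.int8)        # quantized weights
    w_dq = w_q.astype(np.float32) * scale            # dequantized for matmul

    logits_fp = w @ x                                # unquantized logits
    logits_q = w_dq @ x                              # quantized logits
    print("fp32 pick:", logits_fp.argmax(), " int8 pick:", logits_q.argmax())
    print("max logit error:", np.abs(logits_fp - logits_q).max())

When the top two logits differ by less than that error, the two model variants diverge on a token, and everything generated after it can differ too.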
replies(1): >>44383438 #
verdverm No.44383438
Even more so: when you come back in a few years to understand the code, the model will no longer be available.
replies(1): >>44383820 #
galangalalgol No.44383820
One of several reasons to use an open model, even if it isn't quite as good: you can version-control the models and commit the prompts along with the model name and a hash of the parameters. I'm not really sure what value that reproducibility adds, though.
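
For what it's worth, a minimal sketch of the bookkeeping described above: hash the model's weight files and write the prompt, model name, and weights hash into a record committed next to the generated code. All names here (the model name, paths, the JSON layout) are hypothetical.

    import hashlib
    import json
    from pathlib import Path

    def hash_model_weights(weights_dir: str) -> str:
        """SHA-256 over all weight files, read in a stable sorted order."""
        digest = hashlib.sha256()
        for path in sorted(Path(weights_dir).rglob("*")):
            if path.is_file():
                digest.update(path.read_bytes())  # fine for a sketch; chunk for big files
        return digest.hexdigest()

    def write_provenance(prompt: str, model_name: str, weights_dir: str,
                         out_path: str = "PROVENANCE.json") -> None:
        """Write a provenance record meant to be committed with the code."""
        record = {
            "model": model_name,
            "weights_sha256": hash_model_weights(weights_dir),
            "prompt": prompt,
        }
        Path(out_path).write_text(json.dumps(record, indent=2))

    if __name__ == "__main__":
        write_provenance(
            prompt="Write a function that parses ISO 8601 timestamps.",
            model_name="my-open-model-7b",             # hypothetical open model
            weights_dir="./models/my-open-model-7b",   # hypothetical path
        )

With the weights pinned by hash and the prompt in version control, anyone with the same open model can at least attempt to reproduce the generation; with a hosted proprietary model, none of those pieces are guaranteed to exist later.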