
200 points | simonw | 1 comment
syntaxing No.45649620
Ehh, is it cool and time-saving that it figured it out? Yes. But the solution was to get a "better" prebuilt wheel of PyTorch. That's a relatively "easy" problem to solve (though figuring out that this was the problem does take time). It's (probably — I can't afford one) going to be painful when you want to upgrade the CUDA version or pin a specific one. Unlike on a typical PC, you're going to need to build a new image and flash it. I'd be more impressed when an LLM can do that end to end for you.
replies(2): >>45652848 >>45662953
1. cat_plus_plus No.45662953
You can still upgrade CUDA within the forward-compatibility range and install new packages without reflashing.
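The compatibility check the comment relies on can be sketched in code. This is a minimal illustration, not NVIDIA's actual logic: it assumes the CUDA minor-version-compatibility rule (since CUDA 11, a toolkit or wheel sharing the driver's advertised major CUDA version can run without a driver upgrade), and the version strings are made-up examples.

```python
# Sketch: decide whether a wheel built against a given CUDA toolkit
# version can run on a driver advertising some CUDA version, under the
# minor-version-compatibility rule. Illustrative only; the real rules
# (and the separate datacenter forward-compat package) have more detail.

def parse_version(v: str) -> tuple[int, int]:
    """Split 'MAJOR.MINOR' into integers."""
    major, minor = v.split(".")[:2]
    return int(major), int(minor)

def toolkit_compatible(driver_cuda: str, toolkit_cuda: str) -> bool:
    """True if a build for toolkit_cuda should run on a driver
    advertising driver_cuda, assuming minor-version compatibility:
    same major version is enough, even if the toolkit minor is newer."""
    d_major, _ = parse_version(driver_cuda)
    t_major, _ = parse_version(toolkit_cuda)
    return t_major == d_major

# Example (hypothetical versions): a driver advertising CUDA 12.2 can
# run a wheel built for 12.4, but a 13.x wheel needs a driver upgrade.
print(toolkit_compatible("12.2", "12.4"))  # True
print(toolkit_compatible("12.2", "13.0"))  # False
```

So "upgrade within the forward-compatibility range" amounts to staying inside the major version the flashed driver already advertises; crossing a major boundary is what forces a new image.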