It's like saying rm -rf / should have more safeguards built in. It feels unfair to call out the AI based tools for this.
> For example, if a user with appropriate privileges mistakenly runs ‘rm -rf / tmp/junk’, that may remove all files on the entire system. Since there are so few legitimate uses for such a command, GNU rm normally declines to operate on any directory that resolves to /. If you really want to try to remove all the files on your system, you can use the --no-preserve-root option, but the default behavior, specified by the --preserve-root option, is safer for most purposes.
https://www.gnu.org/software/coreutils/manual/html_node/Trea...
* "unreliable" may not be the right word. For all we know, the agent performed admirably given whatever the user's prompt may have been. Just goes to show that even in the relatively constrained domain of programming, where a lot (but far from all) of the outcomes are binary, the room for misinterpretation and error is still quite vast.
Any system capable of automating a complex task will by necessity be more complex than the task at hand. This complexity doesn't evaporate when you throw statistical fuzzers at it.
I hypothesize that a $(git fetch --mirror) would pull down the "orphaned" revision, too, but don't currently have the mental energy to prove it.