I think building some processing off of Vulkan 1.3 was the right move. (Aside, I also just noticed yesterday that Asahi Linux on Mac supports that standard as well.)
FFmpeg arguments, the original prompt engineering
The only options you ever need are tar -x, tar -c (x for extract and c for create). tar -l if you wanna list, l for list.
That's really it. Add -v for verbose, just like with every other tool, if you wish.
Examples:
tar -c project | gzip > backup.tar.gz
cat backup.tar.gz | gunzip | tar -l
cat backup.tar.gz | gunzip | tar -x
You never need anything else for the 99% case.
Surely you mean -t if you wanna list, t for lisT.
l is for check-Links.
-l, --check-links
(c and r modes only) Issue a warning message unless all links to each file are archived.
And you don't need to uncompress separately. tar will detect the correct compression algorithm and decompress on its own. No need for that gunzip intermediate step.
Whoops, lol.
> on its own
Yes, I'm aware, but that's more options, and unnecessary ones too; just compose tools.
Examples:
tar -czf archive.tar.gz foo bar # Create archive.tar.gz from files foo and bar.
tar -tvf archive.tar.gz # List all files in archive.tar.gz verbosely.
tar -xf archive.tar.gz # Extract all files from archive.tar.gz
fwiw, `tar xzf foobar.tgz` = "_x_tract _z_e _f_iles!" has been burned into my brain. It's "extract the files" spoken in a Dr. Strangelove German accent
Better still, I recently discovered `dtrx` (https://github.com/dtrx-py/dtrx) and it's great if you have the ability to install it on the host. It calls the right commands and also always extracts into a subdir, so no more tar-bombs.
If you want to create a tar, I'm sorry but you're on your own.
I don't use tape, so I don't need a tape archive format.
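For what it's worth, basic dtrx usage is a single command (a sketch; the archive name is arbitrary):
$ dtrx foobar.tar.gz # extracts into its own subdirectory instead of all over the cwd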
One would use gemini-cli (or claude-cli),
- and give a natural language prompt to gemini (or claude) on what processing needs to be done,
- with the correct paths to FFmpeg and the media file,
- and g-cli (or c-cli) would take it from there.
Is this correct?
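A minimal sketch of what that might look like, assuming gemini-cli's non-interactive -p/--prompt flag (the ffmpeg path and file names here are hypothetical):
$ gemini -p "Use /usr/bin/ffmpeg to convert /home/me/clips/input.mkv to a 720p H.264 MP4 at /home/me/clips/output.mp4"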
Principle of least surprise and all that.
Gzip only compresses a single file, so .tar.gz lets you bundle multiple files. You can do the same thing with zip, of course, but...
Zip compresses individual files separately in the container, ignoring redundancies between files. But .tar.gz (and .tar.zip, though I've rarely seen that combination) bundles the files together and then compresses them, so can get better compression than .zip alone.
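A quick way to see the effect (a sketch, assuming a typical Unix box with /etc/services; the directory and file names are arbitrary) is to archive a set of identical files both ways and compare sizes:
mkdir demo
for i in $(seq 1 100); do cp /etc/services "demo/copy$i"; done
zip -qr demo.zip demo # compresses each file independently
tar -czf demo.tar.gz demo # bundles first, then compresses across files
ls -l demo.zip demo.tar.gz # the .tar.gz is typically far smaller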
"also always extracts into a subdir" sounds like a nice feature though, thanks for sharing another alternative!
You don't need the z, as xf will detect which compression was used, if any.
Creating is no harder, just use c for create instead, and specify z for gzip compression:
tar czf archive.tar.gz [filename(s)]
Same with listing contents, with t for tell: tar tf archive.tar.gz
tar -caf foo.tar.xz foo
Will create an xz-compressed tarball; the -a flag picks the compressor from the file name suffix.
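Other suffixes work the same way (assuming a tar with -a/--auto-compress and the matching compressors installed; zstd needs a recent GNU tar):
tar -caf foo.tar.gz foo # gzip, from the .gz suffix
tar -caf foo.tar.zst foo # zstd, from the .zst suffix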
A common use case is:
$ cpio -pdumv args
See: $ man cpio
and here is an example from its Wikipedia page, under the "Operation and archive format" section, under the Copy subsection:
Cpio supports a third type of operation which copies files. It is initiated with the pass-through option flag (p). This mode combines the copy-out and copy-in steps without actually creating any file archive. In this mode, cpio reads path names on standard input like the copy-out operation, but instead of creating an archive, it recreates the directories and files at a different location in the file system, as specified by the path given as a command line argument.
This example copies the directory tree starting at the current directory to another path new-path in the file system, preserving file modification times (flag m), creating directories as needed (d), replacing any existing files unconditionally (u), while producing a progress listing on standard output (v):
$ find . -depth -print | cpio -p -dumv new-path
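For completeness, the copy-out and copy-in modes mentioned above look like this (a sketch; the archive path is arbitrary):
$ find . -depth -print | cpio -ov > /tmp/tree.cpio # copy-out: write an archive
$ cpio -idmv < /tmp/tree.cpio # copy-in: extract, creating directories as needed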
I also use it very infrequently compared to tar -- mostly in conjunction with swupdate. I've also run into file size limits, but that's not really a function of the command line interface to the tool.