Dealing with some minified json, switching to iTerm, doing `pbpaste | json_pp | pbcopy` and having a clean output is _so_ nice.
It’s so ingrained, I’m more likely than not to just write it out that way even when I know exactly what I’m doing from the outset.
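If it's that ingrained, it's worth a one-line wrapper. A minimal bash/zsh sketch, assuming macOS's pbpaste/pbcopy and json_pp on PATH (the name ppclip is made up):

    # ppclip (hypothetical): pretty-print the JSON on the clipboard, in place.
    # Buffer json_pp's output first so a parse error doesn't wipe the clipboard.
    ppclip() {
      local out
      out=$(pbpaste | json_pp) || return   # bad JSON: leave the clipboard alone
      printf '%s\n' "$out" | pbcopy
    }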
< foo.json jq . | pbcopy
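The redirection can sit anywhere in a simple command, and jq also takes filenames directly, so these are all equivalent; pick whichever reads best:

    < foo.json jq . | pbcopy   # redirection up front
    jq . < foo.json | pbcopy   # redirection at the end
    jq . foo.json | pbcopy     # no redirection at all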
e.g.
I need to grab some info from textfile.txt to use as arguments to a function.
cat textfile.txt
looks like it's comma-delimited.
cat textfile.txt | cut -d, -f 2-5
ah, it's the third and fourth columns I need
cat textfile.txt | cut -d, -f 3-4 | grep '123456'
perfect
cat textfile.txt | cut -d, -f 3-4 | grep 123456 | tr , ' '
myfunc $(cat textfile.txt | cut -d, -f 3-4 | grep 123456 | tr , ' ')
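Once it settles, the cat can drop out without changing behavior, since cut takes a filename itself; the final line could just as well be:

    # same pipeline, no cat: cut reads the file directly
    myfunc $(cut -d, -f 3-4 textfile.txt | grep 123456 | tr , ' ')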
Lots of times we sort of know what we're working with, but don't remember the particulars (which delimiter, which columns).
I still uselessly use cat, though; it's such a nice way to build a pipeline.
Many of my programs and scripts start their output with the line `# cmd arg1 arg2 arg3 ...` and simply echo back any input lines that start with '#'. That way, I have an internal record of the program that was run and the data file that was read (as well as previous parts of the analysis chain).
And R ignores lines starting with '#', so the record is there but does not affect later analyses.
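A minimal shell sketch of that pattern (script and file names invented for illustration):

    #!/bin/sh
    # analyze.sh (hypothetical): record this invocation, pass earlier
    # records through, then emit the actual results.
    echo "# $0 $*"                           # provenance header for this step
    grep '^#' input.csv                      # echo back records from earlier steps
    grep -v '^#' input.csv | cut -d, -f 3-4  # the analysis proper, data lines only

On the R side, read.table() skips those lines by default (comment.char = "#"), though note read.csv() turns that off, so the record rides along without affecting later analyses.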