I'm not GP, and I use jq all the time, but each time I use it I feel like I'm still a beginner, because I don't get where I want to go on the first several attempts. Great tool, but IMO it is more intuitive to JSON people who want a CLI tool than to CLI people who want a JSON tool. In other words, I have my own preconceptions about how piping should work: on the whole thing, not iterating, and that always trips me up.

Here's an example of my white whale, converting JSON arrays to TSV.

    cat input.json | jq -S '(first|keys | map({key: ., value: .}) | from_entries), (.[])' | jq -r '[.[]] | @tsv' > out.tsv

reply

    <input.json  jq -S  -r '(first | keys) , (.[]| [.[]]) | @tsv'
    <input.json  # redir
    jq
    -S           # sort
    -r           # raw string out
    '
    (first | keys)   # header: keys of the first object (keys sorts them)
    ,                # comma is a generator: emit both sides
    (.[] |           # loop over the input array, binding each object to .
      [              # construct an array
        .[]          # of the values of the bound object
      ])
    | @tsv'          # each generated array becomes . and is rendered as a TSV line
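A quick sanity check of that one-liner against a tiny made-up sample (sample.json and its contents are just for illustration; this assumes jq is on your PATH):

```shell
# hypothetical sample: an array of flat objects with identical keys
printf '[{"a":1,"b":2},{"a":3,"b":4}]' > sample.json
# header line of sorted keys, then one tab-separated row per object
jq -S -r '(first | keys) , (.[] | [.[]]) | @tsv' < sample.json
```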
reply
oh my god how could I have been doing this for so long and not realize that you can redirect before your binary.

I knew cat was an anti-pattern, but I always thought it was so unreadable to redirect at the end

reply
it seems smart until you accidentally type >input.json and nuke the file
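The usual way around that foot-gun is to write to a temp file and only replace the original if the command succeeded (or use sponge from moreutils); the filenames here are made up for illustration:

```shell
# throwaway input file (name is just for illustration)
printf '{"b":2,"a":1}\n' > input.json
# a bare `jq ... <input.json >input.json` would truncate the file before jq reads it;
# writing to a temp file and moving it into place only on success avoids that
jq -S '.' input.json > input.json.tmp && mv input.json.tmp input.json
```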
reply
Here's an easier to understand query for what you're trying to do (at least it's easier to understand for me):

    cat input.json | jq -r '(first | keys) as $cols | $cols, (.[] | [.[$cols[]]]) | @tsv'

That whole map and from_entries step throws it off; it's not a good fit for what you're doing. @tsv expects a bunch of arrays, whereas you're producing a bunch of objects (with the header also being one) and then converting them to arrays. That's an unnecessary step, and it makes the query a little harder to understand.
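For what it's worth, the $cols version also keeps columns aligned when later objects list their keys in a different order, because each row is looked up through $cols rather than relying on the object's own key order (sample data made up):

```shell
# the second object has its keys in a different order; rows still line up
printf '[{"b":2,"a":1},{"a":3,"b":4}]' |
  jq -r '(first | keys) as $cols | $cols, (.[] | [.[$cols[]]]) | @tsv'
```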
reply
Thanks for sharing, this is much better, though I actually think it is the perfect example to explain something that is brain-slippery about jq:

look at $cols | $cols

my brain says hmm that's a typo, clearly they meant ; instead of | because nothing is getting piped, we just have two separate statements. Surely the assignment "exhausts the pipeline" and we're only passing null downstream

the pipelining has some implicit contextual stuff going on that I have to arrive at by trial and error each time since it doesn't fit in my worldview while I'm doing other shell stuff
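If it helps anyone else: an "as" binding passes its input through unchanged, so the pipe after it isn't piping the binding's "result", it's piping the original value. A minimal demonstration:

```shell
# the input (5) is still . on the right side of the binding
printf '5' | jq '. as $x | . + $x'
# prints 10 -- same shape as the $cols query: bind, then keep working on the input
```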

reply
Honestly, both of those make me do the confused-dog-head-tilt thing. I'd go for something sexp-based, perhaps with infix composition, map, and flatmap operators as sugar.
reply
I find it much harder to remember / use each time than awk
reply
> I dream of a world in which all CLI tools produce and consume JSON and we use jq to glue them together.

that world exists and is mature (PowerShell)

reply
I often have trouble figuring out in advance what the end result will be when processing an input array: an array of mapped objects, or a series of self-contained JSON objects? Why? Which one is better? And what if I want to filter out some of the elements as part of the operation?
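In case it helps, the usual rule of thumb is that map(f) always yields a single array, while .[] | f yields a stream of separate values, and select covers the filtering case in either form. A small sketch with made-up data:

```shell
printf '[1,2,3,4]' | jq -c 'map(. * 2)'         # one array: [2,4,6,8]
printf '[1,2,3,4]' | jq -c '.[] | . * 2'        # a stream: one value per line
printf '[1,2,3,4]' | jq -c 'map(select(. > 2))' # filter inside map: [3,4]
```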
reply
Sounds similar to how PowerShell works, and it's not great. Plain text is better.
reply