If you use jq[0] and are wondering why Gron, the answer is at the very bottom of the readme:
jq is awesome, and a lot more powerful than gron, but with that power comes complexity. gron aims to make it easier to use the tools you already know, like grep and sed. gron's primary purpose is to make it easy to find the path to a value in a deeply nested JSON blob when you don't already know the structure; much of jq's power is unlocked only once you know that structure.
The reason is that tostream and fromstream are built to handle multiple top-level JSON texts, since jq normally does too, and that creates an ambiguity which fromstream resolves with a sort of object-terminator event. Filtering tostream's output with grep loses those terminators, and so fromstream cannot operate normally.
But it should be possible to define a function that does allow this, by, e.g., requiring just one top-level JSON text.
The other thing is that a path-based encoding that does not require quotes and commas would be handier -- tostream's output is itself JSON, so it's not shell-friendly. This is gron's brilliant innovation: it's got a path-based encoding of JSON that is easy to deal with in a shell script. (Mind you, I'm not sure that using brackets to denote array indices is all that easy to use, but the need to disambiguate object keys that look like numbers is critical. Also, there's an ambiguity as to keys that have embedded periods ('.') in them. And lastly, even gron can't shake off the string quotes for values.) That jq has the builtin functionality needed to do the same is not good enough if it doesn't actually do it out of the box.
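To make the comparison concrete, here is a rough sketch of gron's path-based encoding in Python (a minimal illustration of the output format, not gron's actual implementation; real gron also bracket-quotes keys that aren't valid identifiers):

```python
import json

def gron(value, path="json"):
    """Recursively emit gron-style assignment lines for a parsed JSON value.
    Minimal sketch: assumes all object keys are plain identifiers."""
    lines = []
    if isinstance(value, dict):
        lines.append(f"{path} = {{}};")
        for key, child in value.items():
            lines.extend(gron(child, f"{path}.{key}"))
    elif isinstance(value, list):
        lines.append(f"{path} = [];")
        for i, child in enumerate(value):
            lines.extend(gron(child, f"{path}[{i}]"))
    else:
        # Leaf values stay JSON-encoded, so strings keep their quotes.
        lines.append(f"{path} = {json.dumps(value)};")
    return lines

print("\n".join(gron(json.loads('{"foo": {"bar": [1, "two"]}}'))))
```

Each line is a self-contained, greppable assignment, which is exactly what makes the encoding shell-friendly.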
It occurs to me that the way to get rid of quotes in string values is to not include the quotes but print the actual string with newlines (and maybe other characters, like double-quotes) escaped.
And the way to get rid of ambiguity regarding object keys that contain periods or " = " (and also square brackets) is to escape them: ".." and " == " or similarly.
Example:
.foo.bar[0].baz == ..blah = this is a\ntwo-line string
where the last key in the path is "baz = .blah".
Also, " = " is a bit annoying. I'd prefer ": ":
.foo.bar[0].baz: this is a\ntwo-line string
The quoting rule for the special chars in keys can then be generic: double them.
.foo.bar[0].baz[[5]]:: ..blah: this is a\ntwo-line string
Here the last key in the path is "baz[5]: .blah". Mind you, this is still not trivial to deal with in a shell script, so perhaps we need some other escaping mechanism -- one that doesn't reuse the escaped characters, such as \u escaping.
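As a sketch of the proposal above (the doubling rule and the ": " separator are the scheme being proposed here, not an existing tool's format), the encoder could look like this in Python:

```python
import json

SPECIAL = ".[]:"  # characters that get doubled when they appear inside a key

def encode_key(key):
    """Escape a key by doubling each special character, per the proposal."""
    return "".join(c * 2 if c in SPECIAL else c for c in key)

def encode_value(value):
    """Print strings raw (no quotes), escaping backslashes and newlines."""
    if isinstance(value, str):
        return value.replace("\\", "\\\\").replace("\n", "\\n")
    return json.dumps(value)

def flatten(value, path=""):
    """Yield 'path: value' lines for every leaf in a parsed JSON value."""
    if isinstance(value, dict):
        for key, child in value.items():
            yield from flatten(child, f"{path}.{encode_key(key)}")
    elif isinstance(value, list):
        for i, child in enumerate(value):
            yield from flatten(child, f"{path}[{i}]")
    else:
        yield f"{path}: {encode_value(value)}"

doc = {"foo": {"bar": [{"baz[5]: .blah": "this is a\ntwo-line string"}]}}
for line in flatten(doc):
    print(line)  # reproduces the example line above
```

Decoding is the mirror image: split on the first ": " that isn't part of a doubled "::", then halve the doubled characters in each key segment.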
What exactly was the benefit of removing quote characters? It does allow for easier grepping in some instances, but as you already covered, the brackets from array notation generally need to be quoted as well. I think you're better off just assuming the need for single quotes when grepping this output, since the fact that the output is valid JavaScript is very useful, IMO (especially with autovivification in JS), and losing that to make it slightly easier to search is a regression in my eyes.
But maybe I'm missing the benefit you're seeing, and it's not about searching?
First off, you'd still have to escape newlines, and probably keep all the other escapes required by JSON. But then the quotes would be unnecessary, thus wasteful. In particular, if I wanted to print a raw string (with escapes) at a particular path, I could first use grep(1) to select that path's line, but then I'd have to write a fairly complex sed(1) command to remove the path and then the quotes, whereas without the quotes I could use grep(1) and cut(1) alone.
Mind you, I'm sticking to jq, as I know it really well. But I'm thinking of other users here. I think the value of a path-based transformation of JSON is ease of use, which motivates me to think about making it even easier to use, such as by removing those quotes.
For me, it's a toss-up whether I would use that or Perl, since chances are I'm doing it as a first step in some other process, and I can just continue on in Perl for the rest of the process anyway.
I find keeping the output as valid JS extremely useful though, since I can just paste a grepped entry into a developer console to get a valid object to play with on a page. That's cutting out a pipe to a js prettifier, pipe to less, search for identifying text, and careful cut and paste to get the enclosing block of text for what ends up being a semi-common action for me. On the other hand, I can get raw strings, but barely ever have need of that, and could fairly easily make an alias for that if it became common.
and this sounds awesome, I'll definitely test Gron. I love jq, but oh boy is it complex. I find its syntax gets very confusing very quickly as soon as you try to be a little fancy.
It's a full-blown, dynamically-typed functional programming language. Well, not quite full-blown: it's missing closures of indefinite extent, but still.
[0] - https://stedolan.github.io/jq/