It pays to buy into the Unix philosophy. Knowing how to pipe commands and being comfortable with Unix tools is a great investment of time as a developer.
Here’s an example. Imagine I have a command that outputs an incredible amount of data, such as noisy log output from a server:
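Something like this (the server and its log lines are made up for illustration):

```
INFO  request handled in 2ms
INFO  request handled in 3ms
INFO  request handled in 2ms
ERROR failed to connect to database
INFO  request handled in 2ms
INFO  request handled in 3ms
```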
If the server is logging hundreds of lines a second, you’re going to be overwhelmed and miss the part that matters: the “ERROR” line buried in the middle of the output.
You can pipe the output to grep to filter out the lines you don’t care about:
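For example (the `server` function below is a made-up stand-in for whatever command actually produces the logs):

```shell
# Stand-in for the real log-producing command (illustrative):
server() {
  printf '%s\n' \
    'INFO  request handled in 2ms' \
    'ERROR failed to connect to database' \
    'INFO  request handled in 3ms'
}

# Keep only the lines that mention ERROR:
server | grep 'ERROR'
# ERROR failed to connect to database
```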
That works great, but what if you also have “DEBUG” lines that you want to see? For instance:
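Say the stream now also contains lines like these (again, illustrative):

```
INFO  request handled in 2ms
DEBUG cache miss for key "abc123"
ERROR failed to connect to database
```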
Now, you’re going to want to see the “DEBUG” lines as well. You can have grep match either “DEBUG” or “ERROR”:
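One way is grep’s extended regular expressions (`-E`), where a single alternation pattern covers both (the `server` function is a made-up stand-in for the real command):

```shell
# Fake log source for illustration:
server() {
  printf '%s\n' \
    'INFO  request handled in 2ms' \
    'DEBUG cache miss for key "abc123"' \
    'ERROR failed to connect to database'
}

# Keep lines matching either DEBUG or ERROR:
server | grep -E 'DEBUG|ERROR'
# DEBUG cache miss for key "abc123"
# ERROR failed to connect to database
```

You could get the same effect with `grep -e DEBUG -e ERROR`, which takes one `-e` per pattern instead of a regex alternation.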
This will output all lines that match either “DEBUG” or “ERROR”. But now, we’re whitelisting every individual thing we want to see in the logs. That works, but it can be more effective to exclude the lines you don’t want to see instead. We can do this with grep -v:
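Inverting the same pattern with `-v` prints everything except the matching lines (the `server` function again stands in for the real command):

```shell
# Fake log source for illustration:
server() {
  printf '%s\n' \
    'INFO  request handled in 2ms' \
    'DEBUG cache miss for key "abc123"' \
    'ERROR failed to connect to database'
}

# Drop lines matching DEBUG or ERROR; everything else passes through:
server | grep -vE 'DEBUG|ERROR'
# INFO  request handled in 2ms
```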
This will output all lines that don’t match either “DEBUG” or “ERROR” (the complement of the previous command). In practice, you point grep -v at the patterns that make up the noise, which is a great way to quiet a chatty log without enumerating everything you want to keep.
I use this pattern all the time. For example, a Solana validator outputs a massive amount of metrics logs:
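The lines look roughly like this (abbreviated and illustrative; the real output is far denser and the exact fields vary):

```
[2024-05-01T12:00:00.000000000Z INFO  solana_metrics::metrics] datapoint: ...
[2024-05-01T12:00:00.100000000Z INFO  solana_metrics::metrics] datapoint: ...
```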
I can filter out these lines to get just the stuff I care about:
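Since the emitting module’s name appears in every line, a single grep -v on it drops all the metrics output (the `logs` function below is a made-up miniature of real validator output):

```shell
# Simulated excerpt of validator logs (illustrative):
logs() {
  printf '%s\n' \
    '[... INFO  solana_metrics::metrics] datapoint: ...' \
    '[... WARN  solana_core::replay_stage] ...' \
    '[... INFO  solana_metrics::metrics] datapoint: ...'
}

# Exclude everything emitted by the metrics module:
logs | grep -v 'solana_metrics'
# [... WARN  solana_core::replay_stage] ...
```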
As you can see, this approach works really well for structured logs, or any logs with a consistent format. Since every Solana validator log line is tagged with the module that emitted it, we can filter out whole categories of log lines easily and effectively.