I wish I could grep my car keys sometimes.
My 5 cents:

- When piping the output of find to xargs, always use the -print0 option of find and the -0 option of xargs. This allows processing files with any allowed characters in their names (spaces, newlines, etc.). (However, I prefer -exec.)
- There's an i command to insert a line in sed; it is better to use it instead of s/^/...\n/. It makes the code more readable (if we can talk about readability of sed code, huh).
- If you want to split a delimiter-separated line and print some field, you need cut. Keep awk for more complicated tasks. (Rough examples of all three below.)
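Rough examples of all three, with made-up file names:

# NUL-separated find | xargs, safe for names with spaces or newlines
find . -name '*.bak' -print0 | xargs -0 rm --

# same cleanup with -exec, no xargs needed
find . -name '*.bak' -exec rm -- {} +

# sed's i command instead of the s/^/...\n/ trick
sed '1i\
inserted first line' notes.txt

# cut for a simple field, awk kept for fancier jobs
echo 'a:b:c' | cut -d: -f2
echo 'a:b:c' | awk -F: '{print $2}'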
- If you want to split a delimiter separated line and print some field, you need cut. Keep awk for more complicated tasks.
Depends on the delimiter too! For anyone else reading this, sed accepts many kinds of delimiters. sed "s@thing@thing2@g" file.txt is valid. I use this sometimes when parsing/replacing text with lots of slashes (like directory lists) so I can avoid escaping a ton of stuff.
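For example, swapping a path prefix without escaping every slash (the paths here are made up):

sed 's@/usr/local/bin@/opt/bin@g' paths.txt
# the same with the usual delimiter means escaping every slash
sed 's/\/usr\/local\/bin/\/opt\/bin/g' paths.txt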
I know, but that's not the case I was talking about. I meant widely used commands like awk '{print $2}' that can be replaced with cut -f2.
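One caveat to that swap (the example strings are invented): cut defaults to the tab delimiter and doesn't collapse repeated delimiters, while awk splits on runs of whitespace by default.

printf 'one\ttwo\tthree\n' | cut -f2        # tab-separated: two
echo 'one two three' | cut -d' ' -f2        # space-separated needs -d' ': two
echo 'one   two three' | cut -d' ' -f2      # repeated spaces: empty field
echo 'one   two three' | awk '{print $2}'   # awk collapses the whitespace: two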
I know you know, as you already demonstrated your higher understanding. I just wanted to add a little bonus trick for anyone reading that doesn't know, and is learning from your examples.
the two are valid and neither one is more correct than the other sooo...
Agree with one and two, and younger me would have agreed with your third point, but I don't think I do anymore.
Yes, cut is the simpler tool and mostly does what you need for those tasks.
But it is just so common to need a slight tweak, or to want to substitute something, or to do a specific regex match, or to handle a weird multi-character delimiter, and you can do it all easily in awk instead of having to pipe three extra times to do everything with the simplest tool.
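Something like this is what I mean (the log format is invented): one awk call handles the match, the multi-character delimiter, and the tweak that would otherwise take an extra pipe or two.

# lines containing ERROR, field after '::', with a trailing .tmp stripped
awk -F'::' '/ERROR/ {sub(/\.tmp$/, "", $2); print $2}' app.log

# roughly the same with the simpler tools chained together
grep ERROR app.log | cut -d: -f3 | sed 's/\.tmp$//'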
I've only ever found a use for sed once, two decades into my career, and that was to work around a bug due to misuse of BigInt for some hash calculations in a Java component; awk remains unused. Bash builtins cover almost everything those tools are typically used for.
find and grep see heavy daily use.
That's wild to me, as I use sed all the time. Quick and easy changes in configs? Bam, sed. I don't even need to open vi when I can grep for what I need, then swap it with sed. Though I imagine more seasoned vi nerds would be able to do this faster.
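Typically something like this (the config and key names are made up; note that -i in-place editing is GNU sed syntax, BSD sed wants -i ''):

grep -n 'max_connections' postgresql.conf
sed -i 's/^max_connections = .*/max_connections = 200/' postgresql.conf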
sed is not for daily use; it is for reusable scripts. For other purposes, interactive editors are more convenient.
What software did you use to put the slide deck together? It seems to work so nicely when placed on a webpage, too...
I don't know what OP used, but it could be any one of the Markdown presentation tools.
I like reveal.js
Your presentation can go in git, looks good anywhere, and is easily shared. It's just rendered HTML.
I always found "find" very confusing. Currently, I'm using "fd", which I think has a more sensible UX.
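For comparison, the same search both ways (the path is arbitrary; note fd skips hidden and gitignored entries by default):

find /var/log -type f -name '*.log'
fd '\.log$' /var/log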
Been using un*x since the 90s; this is all I know. I like awk, but it can get fucking complicated. I once maintained a 5000-line script that parsed CSV to generate JavaScript...
Someone used the wrong tool for the job. If an awk script gets more than a few dozen lines, it's time to use another language/tool to process that data.
Way too many ads on that link for me to read the actual content.
I've gotten tired of weird regex stuff in awk, sed, and grep, so I've moved to perl -E for all but the most basic of things.
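The kind of one-liner I mean (the file name and pattern are made up); lookarounds and non-greedy matches that are painful in POSIX ERE just work:

perl -nE 'say $1 if /name="(.+?)"(?= id=)/' data.xml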
In most cases, extended POSIX regexes are enough and look the same as Perl regexes.
I also used perl until I needed to write highly portable scripts that can run on systems without a perl interpreter (e.g. some minimal Linux containers). Simple things are also simple to do with grep/sed/awk; more complex things can be done with awk but require longer code compared with perl.
I've dealt with systems that lack sed and awk. Bash builtins and other standard tools like cut and tr take care of ... well, everything.
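For instance, the usual one-liner jobs map onto parameter expansion and read (the example line is made up):

line='user:x:1000:/home/user'
echo "${line%%:*}"          # first field, like cut -d: -f1  -> user
echo "${line##*:}"          # last field                     -> /home/user
echo "${line/1000/1001}"    # substitution, like sed s/1000/1001/
IFS=: read -r name pass uid home <<< "$line"   # split into variables
echo "$uid"                 # -> 1000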
Systems with bash but without standard POSIX utils? I know of systems without bash (FreeBSD by default, busybox-based distros, etc.) that still have grep, sed, and awk, but not the other way around.