this post was submitted on 02 Aug 2023

Programmer Humor

[–] thelastknowngod@lemm.ee 16 points 1 year ago (2 children)

I think I really only use GUIs if I am learning something new and trying to understand the process/concepts or if I'm doing something I know is too small to automate. Generally once I understand a problem/tool at a deeper level, GUIs start to feel restrictive.

Notable exceptions are mostly focused around observability (Grafana, New Relic, DataDog, etc.) or GitHub itself. I've used gh-dash before, but the web UI is just more practical for day-to-day use.

For context, I'm in SRE. I feel like 90+% of my day is spent in Kubernetes, Terraform, or CI/CD pipelines. My coworkers tend to use Lens, but I'm almost exclusively in kubectl or the occasional k9s.

[–] JackbyDev@programming.dev 4 points 1 year ago (3 children)

Searching a log file? I want less. Searching all log files? I want log aggregation lol.

[–] Semi-Hemi-Demigod@kbin.social 2 points 1 year ago (1 children)

If I knew what I was looking for I could grep all the log files and pipe the output to another file to aggregate them.
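A minimal sketch of that approach — the log files, paths, and error string here are all made up for illustration:

```shell
# Create two sample log files standing in for logs from different services
mkdir -p /tmp/demo-logs
printf '2023-08-02T10:00:01 ERROR connection refused\n2023-08-02T10:00:02 INFO retrying\n' > /tmp/demo-logs/app1.log
printf '2023-08-02T10:00:03 ERROR connection refused\n' > /tmp/demo-logs/app2.log

# Grep every log for the known error, keep filenames (-H) so you can tell
# which log each hit came from, and sort by the timestamp after the first
# colon so events from different files interleave in time order.
grep -H 'connection refused' /tmp/demo-logs/*.log \
  | sort -t: -k2 \
  > /tmp/demo-logs/aggregated.txt

cat /tmp/demo-logs/aggregated.txt
```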

[–] JackbyDev@programming.dev 2 points 1 year ago (1 children)

The problem is that they're all on different servers. Once you use log aggregation tools like DataDog, Splunk, or Kibana you get it, but before that it's hard to see the benefits. Stuff like being able to see the timestamp of when an error first appeared and then, from the same place, see what else happened around the same time.

[–] Semi-Hemi-Demigod@kbin.social 2 points 1 year ago (1 children)

If I had dozens or hundreds of servers that would make a huge difference, but for under a dozen I think the cost of setting that all up isn't worth the added benefit. Plus if the log aggregation goes down (which I've seen happen during some really hairy issues), you're back to grepping files, so it's good to know how.

[–] JackbyDev@programming.dev 3 points 1 year ago

Totally. I'm talking more from the enterprise perspective. Even apart from that, I'm not sure the cost is worth it at that scale; even with FOSS solutions, the dev hours spent setting it up might not be worth it.

[–] docAvid@midwest.social 2 points 1 year ago (2 children)

One log file, or all, I want grep or awk, maybe with find in front, possibly throw some jq on top if something is logging big json blobs.
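A rough sketch of that kind of pipeline, with invented file paths and JSON field names (and assuming jq is installed): find selects the files, grep cheaply narrows to candidate lines, and jq parses the JSON blobs.

```shell
# Hypothetical JSON-line logs from two services
mkdir -p /tmp/json-logs
printf '{"level":"error","msg":"db timeout"}\n{"level":"info","msg":"ok"}\n' > /tmp/json-logs/svc-a.log
printf '{"level":"error","msg":"cache miss storm"}\n' > /tmp/json-logs/svc-b.log

# find -> grep -> jq: locate logs, pre-filter to error lines without
# parsing JSON, then let jq extract just the message field.
find /tmp/json-logs -name '*.log' -print0 \
  | xargs -0 grep -h '"level":"error"' \
  | jq -r '.msg' \
  > /tmp/json-logs/errors.txt

cat /tmp/json-logs/errors.txt
```

The grep pre-filter matters on big logs: jq only ever sees the lines that might match, instead of parsing every JSON blob.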

[–] JackbyDev@programming.dev 2 points 1 year ago

I feel you. The problem with a lot of Elastic-style document search engines is that they don't let you search for very explicit terms because of how the index is built. I believe the pros outweigh the cons, but I often wish I could "drop into" grep, less, and the like from within the log aggregation tool.

[–] thelastknowngod@lemm.ee 2 points 1 year ago

That's a lot slower at scale than something like Loki.

[–] SpaceNoodle@lemmy.world 3 points 1 year ago (1 children)

GitHub's UI is total garbage compared to basic git commands, though.

[–] thelastknowngod@lemm.ee 1 points 1 year ago

You can't manage pull requests, GitHub Actions, repo collaborators, permissions, or any of the dozens of other things GitHub does from basic git commands alone.