jlh

joined 1 year ago
[–] jlh@lemmy.jlh.name 1 points 1 hour ago

Ideally, buses would be cheaper or subsidized, but driving for recreation is valid. Carpool if you can.

[–] jlh@lemmy.jlh.name 2 points 1 hour ago

The Maverick is not a car; it's a two-ton truck that burns 3.8 gal/100 mi. There's also insurance and maintenance to worry about.

[–] jlh@lemmy.jlh.name 6 points 8 hours ago
[–] jlh@lemmy.jlh.name 1 points 10 hours ago

I think "suburb" in English can refer to nearby cities, too. For example, Alexandria, Virginia is a suburb of Washington, DC.

[–] jlh@lemmy.jlh.name 11 points 13 hours ago (1 children)

If I wanted a device with a plastic screen and a fragile hinge, I'd carry around a Nintendo DS.

[–] jlh@lemmy.jlh.name 25 points 22 hours ago

That symbol just means the water is not potable. You see it on the taps on trains in Sweden.

[–] jlh@lemmy.jlh.name 1 points 1 day ago* (last edited 1 day ago) (1 children)

I think they added some compatibility in the past year or so, but I had issues detecting my microphone on Linux just two weeks ago. I've had some smaller e-commerce sites fail to load properly on Firefox/Librewolf, Red Hat's training website doesn't work on Firefox, and some features in apps like Google Meet and Miro are unavailable. Nothing makes Firefox unusable, and I can always open up ungoogled-chromium when needed, but it is a serious problem for browser diversity and competition that the web now defaults to Chrome.

[–] jlh@lemmy.jlh.name 16 points 1 day ago (1 children)

Don't see it here in Sweden.

[–] jlh@lemmy.jlh.name 1 points 1 day ago (3 children)

I have to switch to Chromium often, unfortunately. Various websites are untested on Firefox, and many apps, such as Teams, are not compatible with it. It's probably better than the early 2000s, but still really bad.

[–] jlh@lemmy.jlh.name 4 points 1 day ago* (last edited 1 day ago)

Ukraine is not attacking civilian infrastructure; what you're describing would be war crimes. That is not what the debate over long-range missiles is about.

Here is a recent video from William Spaniel about the debate: https://www.youtube.com/watch?v=tM0ZTEz7Bzc

[–] jlh@lemmy.jlh.name 7 points 1 day ago

A 19% interest rate is a crisis.

[–] jlh@lemmy.jlh.name 14 points 1 day ago (9 children)

I think performance was part of Chrome's success, but there were also all the memes in 2010 about installing Chrome to replace IE, and the ads that Google ran on their search page. I don't think Pocket came out until Firefox was already deep into its decline. I do think Chrome held onto those users because of its RAM efficiency at the time and nice features like built-in translation. Now users can't switch because the web depends on Chrome, just like back in the IE days.

 

https://web.archive.org/web/20240719155854/https://www.wired.com/story/crowdstrike-outage-update-windows/

"CrowdStrike is far from the only security firm to trigger Windows crashes with a driver update. Updates to Kaspersky and even Windows’ own built-in antivirus software Windows Defender have caused similar Blue Screen of Death crashes in years past."

"'People may now demand changes in this operating model,' says Jake Williams, vice president of research and development at the cybersecurity consultancy Hunter Strategy. 'For better or worse, CrowdStrike has just shown why pushing updates without IT intervention is unsustainable.'"

 

This seems like a really serious vulnerability: any container breakout or malicious image could take over the container host if the containers aren't hardened.
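
For anyone looking to mitigate in the meantime, here's a minimal sketch of the kind of hardening I mean, using the Docker Python SDK. The image name and the specific limits are placeholders for illustration, not anything from the advisory:

```python
# Sketch: run an untrusted image with reduced privileges so a compromised
# container has less leverage over the host. Assumes the Docker Python SDK
# (pip install docker); the image name below is a made-up placeholder.
import docker

client = docker.from_env()

container = client.containers.run(
    "registry.example.com/untrusted-app:latest",  # hypothetical image
    detach=True,
    user="10001:10001",                   # don't run as root in the container
    cap_drop=["ALL"],                     # drop all Linux capabilities
    security_opt=["no-new-privileges"],   # block setuid privilege escalation
    read_only=True,                       # read-only root filesystem
    pids_limit=256,                       # contain fork bombs
    mem_limit="256m",                     # cap memory usage
)
print(container.id)
```

None of this fixes the underlying bug, but layers like these are exactly what stands between "container compromised" and "host compromised".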

 

I wanted to share an observation about the way the latest computer systems work. I swear this isn't an AI hype train post 😅

I'm seeing more and more computer systems these days use usage data or internal metrics to automatically adapt how they run, and I get the feeling that this is a new computing paradigm enabled by the increased modularity of modern computer systems.

First off, I would classify us as being in a sort of "second generation" of computing. Consumer computers in the '80s and '90s were fairly basic: user programs were often written in C or assembly, and often ran directly in ring 0 of the CPU. Leading up to the year 2000, there were a lot of advances in making computers more modular: microkernels, MMUs, higher-level languages with memory-managed runtimes, and the rise of modular programming in languages like Java and Python. These new abstractions let programs reuse code and be a lot more ambitious, so computer systems became much more advanced. We are well into this era now, with VMs and Docker containers taking over computer infrastructure, and modern programming depending on software packages, as you see with NPM and Cargo.

So we're still in this "modularity" era of computing, where you can reuse code and even have microservices sharing data with each other, but the amount of data individual computer systems have access to is often relatively limited.

More recently, I think we're seeing the beginning of "data-driven" computing, which uses observability and control loops to run better and self-manage.

I see a lot of recent examples of this:

  • Service orchestrators like systemd and Kubernetes, which monitor the status and performance of the services they own and use that data for self-healing and to optimize how and where those services run (a toy sketch of this kind of control loop follows this list).
  • Centralized data-collection systems for microservices, which often include automated alerts and control loops. You see a lot of new systems like this, including Splunk, OpenTelemetry, and Pyroscope, as well as internal data-collection systems at all the big cloud vendors. These systems all try to centralize as much data as possible about how services run: not just logs and metrics, but also lower-level data like execution traces and CPU/RAM profiling data.
  • Hardware metrics in a lot of modern hardware. Before 2010, you were lucky if your hardware reported clock speeds and temperatures for its components. Nowadays, hardware components are overflowing with data: every CPU core reports not only temperature but also power usage. You see similar things on GPUs, where tools like nvitop are critical for modern GPGPU operations. Even individual RAM DIMMs now report temperature data. Most impressively, CPUs now use their own internal metrics, like temperature, silicon quality, and power usage, to run more efficiently, as you see with AMD's CPPC system.
  • Of course, I said this wasn't an AI hype post, but I think the use of neural networks to enhance user interfaces is definitely part of this. The way social media uses neural networks to change what is shown to the user, the upcoming "AI search" in Windows, and the way all this usage data is fed back into neural networks make me think that even user-facing computer systems will start adapting to changing conditions using data science.
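
To make the control-loop idea concrete, here's a toy sketch of the observe-decide-act pattern these orchestrators use. This is my own illustration, with made-up thresholds and a pretend worker pool; it assumes psutil is installed:

```python
# Toy observe-decide-act control loop, the pattern behind self-managing
# systems like the Kubernetes horizontal autoscaler. The thresholds and
# the "worker pool" here are invented for illustration.
import time
import psutil

workers = 4                         # hypothetical current pool size
MIN_WORKERS, MAX_WORKERS = 1, 16

while True:
    cpu = psutil.cpu_percent(interval=5)      # observe: sample a metric
    if cpu > 80 and workers < MAX_WORKERS:    # decide: compare to a target
        workers += 1                          # act: scale up
    elif cpu < 20 and workers > MIN_WORKERS:
        workers -= 1                          # act: scale down
    print(f"cpu={cpu:.0f}% -> workers={workers}")
```

Kubernetes' autoscaler does essentially this, just with metrics pulled from its own API instead of psutil, and with pods instead of a counter.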

I have been kind of thinking about this "trend" for a while, but the announcement that ACPI is now adding hardware health telemetry inspired me to finally write up a description of the idea.
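
On the hardware side, Linux already exposes a lot of this telemetry through sysfs. Here's a quick sketch that dumps every hwmon temperature sensor; the paths are standard, but which sensors actually show up depends entirely on your hardware and drivers:

```python
# Read the temperature sensors exposed by the Linux hwmon subsystem.
# Values are reported in millidegrees Celsius; sensor availability
# varies by hardware and loaded drivers.
from pathlib import Path

for hwmon in sorted(Path("/sys/class/hwmon").glob("hwmon*")):
    name = (hwmon / "name").read_text().strip()        # driver/chip name
    for temp in sorted(hwmon.glob("temp*_input")):
        millideg = int(temp.read_text().strip())
        label = temp.name.removesuffix("_input")       # e.g. "temp1"
        print(f"{name} {label}: {millideg / 1000:.1f} °C")
```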

What do people think? Have others seen this trend toward self-adapting systems? Is this an oversimplification of computer engineering?

 

Be warned: today's patch, 13.23, makes the game instantly crash after champion select. Don't start a match on Linux until it's fixed.

https://leagueoflinux.org/

 

It's awful to see our personal privacy and social lives being ransomed like this. €10 seems like price gouging for a social media site, and I'm even seeing a price tag of 150 SEK (~€15) in Sweden.
