smee

joined 3 weeks ago
[–] smee@poeng.link 3 points 1 day ago

The Game

Damn.

[–] smee@poeng.link 5 points 1 day ago

Did you draw that on a phone? Very nice!

👌😩🤌

[–] smee@poeng.link 6 points 1 day ago (1 child)

It's too late, mang. Cæsar died a while back. Stabbed in the back, they say. And the way he was murdered? Stabbed in the back, they say.

[–] smee@poeng.link 4 points 1 day ago

My father drowned in one of them giant beer brewery vats. He did come up for air a couple of times first though.

[–] smee@poeng.link 2 points 1 day ago

Fun fact: lagging is a known cheating method where a player deliberately creates artificial lag for stretches of time to throw the game off for everyone else.

[–] smee@poeng.link 3 points 1 day ago

Shit map, shit team, bad ping, hackers on the other side.

That should about cover it.

[–] smee@poeng.link 1 point 1 day ago

Brother from another mother, perhaps?

[–] smee@poeng.link 4 points 1 day ago

I feel "👁️👄👁️", meaning one has seen the message, would be more appropriate.

[–] smee@poeng.link 5 points 1 day ago

My local grocery store is staffed by remote workers at night.

[–] smee@poeng.link 2 points 1 day ago

It's possible to run local AI on a Raspberry Pi; it's all just a matter of speed and model complexity. I run Ollama just fine on the two P-cores of my older i3 laptop. Granted, running it on the CUDA-capable GPU (graphics card) in my main rig is far faster.
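
For the curious, here's a minimal sketch of talking to a locally running Ollama server from Python, assuming the official `ollama` package is installed and some small model has been pulled (the `llama3.2` name below is my example pick, not from the thread):

```python
# Minimal sketch: chat with a locally running Ollama server from Python.
# Assumes `pip install ollama`, a running server (default: localhost:11434),
# and a pulled model, e.g. `ollama pull llama3.2`. The model name here is
# an assumption; use whatever your hardware can handle.
import ollama

response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])
```

The same code runs on a Pi, an i3, or a big rig; you just wait on the model for longer or shorter, and the server picks up GPU acceleration automatically when it's available.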

[–] smee@poeng.link 1 point 1 day ago

Ollama recently became a Flatpak extension for Alpaca, but it's a one-click install from Alpaca's entry in the software manager. All storage locations are the same, so there's no need to re-download any open models or rebuild tweaked models from the previous setup.
