Simplixt

joined 1 year ago
[–] Simplixt@alien.top 1 points 11 months ago

Haha, for my parents it's better to change an invisible setting in the router than to place a Raspberry Pi in their network or install WireGuard on every device. They are paranoid about any software that must be installed :D

 

Hi all,
I have the following challenges:
- I have friends and family who want to access my services
- But they cannot / don't want to install any VPN
- And I don't want to open my services to the entire internet

My idea: It's good enough if they can access the services from home. They have internet providers with dynamic IP addresses, and in the router you can easily set up a DynDNS service.

So: why not just have a firewall rule that only allows connections from the IP addresses behind those DynDNS domains? Of course, the proxy/firewall would have to recheck the DNS entries regularly.
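
A rough sketch of what I mean, run e.g. every 5 minutes via cron. The hostnames are placeholders, and it assumes an nftables set "allowed_clients" that an accept rule on the WAN side already references:

```bash
#!/usr/bin/env bash
# Refresh a firewall allow-list from the current DynDNS IPs.

HOSTS=("family.dyndns.example" "friends.dyndns.example")   # placeholders

elements=()
for host in "${HOSTS[@]}"; do
    ip=$(dig +short A "$host" | tail -n1)    # resolve the current dynamic IP
    [ -n "$ip" ] && elements+=("$ip")
done

# Only touch the set if at least one lookup succeeded,
# so a DNS outage doesn't lock everybody out.
if [ "${#elements[@]}" -gt 0 ]; then
    nft flush set inet filter allowed_clients
    nft add element inet filter allowed_clients "{ $(IFS=,; echo "${elements[*]}") }"
fi
```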

Has anyone tried such a setup? What solution are you using? Do you think it's similarly secure to a VPN? (Of course, HTTPS connections only, with a Let's Encrypt certificate.)

[–] Simplixt@alien.top 1 points 11 months ago

"These plug-ins from github are out of the question, because I don't even know how to run them and I'm not always at home, so that the server (my computer) is always on and watching the sites."

You're asking on "Selfhosted" for an alternative to a free self-hosted app because you're too lazy to self-host. Wow.

[–] Simplixt@alien.top 1 points 11 months ago

Ask your IT guy. If you don't have one, you shouldn't be self-hosting at all as a company.

[–] Simplixt@alien.top 2 points 11 months ago

Because we can. Just because we can.

[–] Simplixt@alien.top 1 points 11 months ago

Someday I might set up the intrusion prevention system of my OPNsense firewall ...

But I don't feel like I need it. I trust the devices in my home network, and my IoT devices are in a separate VLAN without internet access.

[–] Simplixt@alien.top 1 points 11 months ago

Yes, it's really a shame. It's an awesome project, but they really need someone to integrate a better CalDAV library.

[–] Simplixt@alien.top 2 points 11 months ago (3 children)

"with an nice client?"

You answered your own question. There aren't any user-friendly multi-platform clients with synchronization / conflict resolution / versioning.

[–] Simplixt@alien.top 1 points 11 months ago

Proxmox Backup

I'm using Proxmox Backup Server for my local daily backups.
But for external cloud backups, it's 5 GB for Docker config + persistent data vs. 60 GB for the complete LXC containers. So it's more convenient to back up only the data to the cloud that you really need for an emergency restore.

Uptime Kuma

I try to minimize the number of applications that have access to the Docker socket, as this can be a serious security risk (e.g. a malicious container update because of a hacked GitHub account).

If I can achieve the same goal with just a simple bash script and without additional software, it's the better solution for me :)
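
For example, a minimal sketch of such a check (container names are placeholders):

```bash
#!/usr/bin/env bash
# Verify that the expected containers are up - no extra app
# with access to the Docker socket needed.

EXPECTED=("nextcloud" "bitwarden" "paperless")   # placeholders

for name in "${EXPECTED[@]}"; do
    state=$(docker inspect -f '{{.State.Running}}' "$name" 2>/dev/null)
    if [ "$state" != "true" ]; then
        echo "container $name is not running" >&2
        exit 1
    fi
done
```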

[–] Simplixt@alien.top 1 points 11 months ago (1 children)

You have a WiFi 6E mesh - that's awesome, and it would be really stupid to replace it.

However, you could look into combining the Asus WiFi mesh with a self-hosted firewall,
e.g. using an OPNsense VM as the gateway / DHCP / DNS server for all clients in the network ...

But that's more for playing around.

[–] Simplixt@alien.top 1 points 11 months ago

Versioning helps - I can go back one year in the history.

 

Hi all,

I must say: ChatGPT (or generative AI in general) is such a game changer for my self-hosting.

Today, I wrote the following scripts with GenAI:

  1. A bash script that logs into all my LXC containers on Proxmox, stops all currently running Docker containers, exports all persistent Docker data to my Proxmox host, starts the containers again, and makes a backup via Borg. Afterwards, I get a success notification via Gotify. (Rough sketch below the list.)
  2. A Python script that uses the Portainer API to export the stack configuration of all my 10 Docker environments into easy-to-back-up .yml files (e.g. backup-host_name-container_name.yml)
  3. A bash script that checks whether the defined containers are running on my Docker hosts and, if not, sends a notification to Healthchecks.io
  4. Different templates for Obsidian, e.g. daily / weekly / monthly notes with task management, etc.
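
A rough sketch of script 1. The CT IDs, paths, and the Gotify URL/token are placeholders, and the Borg repo is assumed to already be initialized:

```bash
#!/usr/bin/env bash
set -euo pipefail

CTIDS=(101 102 103)                  # LXC containers that run Docker (placeholders)
EXPORT_DIR=/mnt/backup/docker-data   # placeholder path on the Proxmox host
REPO=/mnt/backup/borg-repo           # placeholder Borg repo

for ctid in "${CTIDS[@]}"; do
    # remember which containers are running, then stop them (pct exec, no ssh needed)
    running=$(pct exec "$ctid" -- docker ps -q)
    if [ -n "$running" ]; then
        pct exec "$ctid" -- docker stop $running
    fi
    # stream the persistent data out of the LXC onto the Proxmox host
    pct exec "$ctid" -- tar -C /opt/docker -cf - . > "$EXPORT_DIR/ct${ctid}.tar"
    # start the previously running containers again
    if [ -n "$running" ]; then
        pct exec "$ctid" -- docker start $running
    fi
done

# deduplicated Borg backup of the exported data ...
borg create --stats "$REPO::docker-{now:%Y-%m-%d}" "$EXPORT_DIR"

# ... and a success notification via Gotify
curl -fsS -F "title=Backup OK" -F "message=Docker data backed up" \
     "https://gotify.example.internal/message?token=YOUR_TOKEN" > /dev/null
```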

It's really fun, as I can now find solutions to more complex problems with only limited programming knowledge. It's not perfect yet - e.g. it used the Portainer API wrong, and I had to steer it in the right direction with error messages and API documentation. Sometimes it also says "that's not possible", but when you give it some hints (e.g. "could you use pct exec instead of ssh?") it comes up with the right solution.
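
For comparison, here is roughly what script 2 boils down to, redone with curl + jq. The URL and API key are placeholders, and the endpoints are how I understand the Portainer CE API (GET /api/stacks, GET /api/stacks/{id}/file):

```bash
#!/usr/bin/env bash
PORTAINER="https://portainer.example.internal"   # placeholder
API_KEY="your-access-token"                      # placeholder

# list all stacks, then dump each stack's compose file to a .yml backup
curl -fsS -H "X-API-Key: $API_KEY" "$PORTAINER/api/stacks" |
    jq -r '.[] | "\(.Id) \(.Name)"' |
    while read -r id name; do
        curl -fsS -H "X-API-Key: $API_KEY" "$PORTAINER/api/stacks/$id/file" |
            jq -r '.StackFileContent' > "backup-${name}.yml"
    done
```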

What are your experiences? Are you using GenAI for your SelfHosting and if yes - where?

My next step will be Ansible - I've had too much respect for it so far, but with some GenAI support I might get everything running ...

[–] Simplixt@alien.top 2 points 11 months ago (2 children)

My backup strategy:

Data:
- Syncthing, with one copy on my clients and one copy on my server, accessible via Nextcloud
- Daily push backup of my Nextcloud data folder via Kopia to Backblaze
- Daily pull backup of my Nextcloud data folder by a QNAP NAS in the basement

VM:
- Daily backup of my VMs to a Proxmox Backup Server running on the QNAP NAS
- Daily backup of my VMs to Backblaze (encrypted beforehand)

Still, I'm not a fan of having just one cloud backup. So I think I will also get a Hetzner Storage Box for Borg backups in addition to Kopia.
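
A first sketch of the Borg side, assuming a Storage Box (the user/host below are placeholders; Hetzner Storage Boxes speak ssh on port 23):

```bash
# one-time repo setup on the Storage Box
borg init --encryption=repokey-blake2 \
    "ssh://u123456@u123456.your-storagebox.de:23/./borg-repo"

# daily archive + pruning, e.g. from cron
borg create --stats \
    "ssh://u123456@u123456.your-storagebox.de:23/./borg-repo::data-{now}" \
    /mnt/nextcloud-data
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
    "ssh://u123456@u123456.your-storagebox.de:23/./borg-repo"
```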

Goal:
- Different Hardware (Server, QNAP, etc.)
- Different Backup software (Syncthing, Kopia, Borg)
- Different Backup technique (Push, Pull, Snapshots)
- Different Locations

[–] Simplixt@alien.top 2 points 11 months ago (1 children)

"On the other hand, if I run Watchtower first, I'm backing up the latest version."

This makes no sense. I hope you are backing up the persistent data, not the Docker images. So before and after Watchtower are identical in the best-case scenario.

(In the worst-case scenario, your persistent data is corrupted after the Docker update because of a bad version update. So the backup should always come first.)
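
If you script it, the right order is trivial to enforce. A minimal sketch (backup.sh is a hypothetical wrapper around whatever backup tool you use; Watchtower's --run-once flag does a single update pass instead of polling):

```bash
# backup first; only run the update if the backup succeeded
/usr/local/bin/backup.sh && \
    docker run --rm -v /var/run/docker.sock:/var/run/docker.sock \
        containrrr/watchtower --run-once
```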

 

Hey all,

I just found healthchecks.io - and it's awesome.
I wrote multiple bash scripts (via ChatGPT, as I'm no programmer), e.g.
- Check in every LXC container whether the defined Docker containers are still running
- Update every LXC container
- Renew the Let's Encrypt certificate in every LXC container

If a task is successful, it pings my Healthchecks instance running on a VPS connected via WireGuard. If the VPS doesn't receive a ping within the defined timeframe, I get an e-mail notification.
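
The skeleton of every one of these scripts is tiny. A sketch with a placeholder ping URL and a hypothetical task script (appending /fail is Healthchecks' standard failure endpoint):

```bash
#!/usr/bin/env bash
# run the actual job, then report success or failure to Healthchecks
PING_URL="https://hc.example.internal/ping/your-check-uuid"   # placeholder

if /usr/local/bin/actual-task.sh; then    # hypothetical: e.g. the cert renewal
    curl -fsS -m 10 --retry 3 "$PING_URL" > /dev/null
else
    curl -fsS -m 10 --retry 3 "$PING_URL/fail" > /dev/null
fi
```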

I really love that I don't have to install anything on the hosts I want to monitor, so it always works.

Any ideas for other metrics that are worth monitoring and that I could implement with simple bash scripts?

(I'm using Proxmox with multiple LXC containers running Docker and apps like Bitwarden, Nextcloud, etc.)

 

Hi all,

as I'm running a lot of Docker containers in my "self-hosted cloud", I'm also a little worried about pulling a malicious Docker container at some point. And I'm not a dev, so I have very limited ability to inspect the source code myself.

Not every Docker container is a "nextcloud" image with hundreds of active contributors and many eyes on the source code. Many self-hosted projects are quite small, GitHub accounts can be hacked, etc. ...

What I'm doing at the moment:

Project selection:
- only select Docker projects with high community activity on GitHub and a good track record

Docker networks:
- use separate, isolated networks without internet access for every container
- if certain APIs need internet access (e.g. geolocation data), I use an NGINX proxy that forwards only that one domain (i.e. a self-made outgoing application firewall; see the sketch below)
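
Roughly like this with plain docker commands - all names are placeholders, and the NGINX config that pins the one allowed domain is not shown:

```bash
# the app joins only an internal network (no route to the internet) ...
docker network create --internal app_internal
docker run -d --name app --network app_internal my/app:latest

# ... while a small NGINX container sits on both the internal network and a
# normal bridge, and its config forwards exactly one external domain
docker network create egress
docker run -d --name egress-proxy --network app_internal \
    -v "$PWD/nginx.conf:/etc/nginx/nginx.conf:ro" nginx:alpine
docker network connect egress egress-proxy
```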

Multiple LXC containers:
- I split my Docker containers across multiple LXC instances via Proxmox; some sensitive containers like Bitwarden run on their own LXC instance

Watchtower:
- no automatic updates, but manual updates once per month with testing afterwards
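
The manual update itself is just a few commands per stack (the path is a placeholder):

```bash
cd /opt/stacks/nextcloud   # placeholder stack directory
docker compose pull        # fetch the new images
docker compose up -d       # recreate only the changed containers
# ... then test, and only move on to the next stack if everything works
```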

Any other tips? Or am I worrying too much? ;)

 

Hi all,

a question for you: how many of your self-hosted apps actually improve your life? Which apps are you really using on a daily/weekly basis?

Many of my running containers are just for ... running containers.

Portainer, Nginx Proxy Manager, Authentik, Uptime Kuma, WireGuard ... they are not improving my life, they are only improving self-hosting. But we are not doing self-hosting just for the sake of it? Are we? ...

Many of my running containers ... are eventually getting replaced by open-source client software

  • I've installed Trilium Notes - but I'm using Obsidian (more plugins, mobile apps, easy backups)
  • I've installed Vikunja - but I'm using Obsidian (connecting tasks with notes is more powerful)
  • I've installed Snapdrop - but I'm using LocalSend (more reliable)
  • I've installed Bitwarden - but I'm using KeePass (easy backups, better for SSH credentials)
  • I've installed AdGuard - but I'm using uBlock (easier to disable for shopping etc.)
  • ...

So, the few self-hosted apps that do improve my life:

File Management

  • Paperless NGX - all my documents are scanned and archived here
  • Nextcloud - all my files accessible via WebUI (& the Photos plugin replaced Immich/PhotoPrism)
  • Syncthing - all my files synchronized between devices and Nextcloud
  • Kopia - backups of all my files, encrypted, into the cloud

And that's a little bit sad, right? The only "job to be done" that self-hosting solves for me is ... file management. Nothing else.

What are your experiences? How does self-hosting make your life better?

(I'm not using self-hosting for music / movies / series nowadays, as streaming is more convenient for me and I'm doing self-hosting mainly for privacy, not piracy, reasons - so that use case is not included in my list ;) My only smart home use case is Philips Hue, and I'm controlling it with Android Tasker.)
