To add: if you take the time to make GPTs for specific subjects and just shovel in PDFs, you basically have expert-level references without ChatGPT's tendency to make things up. I've been testing this by combining PDFs/books into single PDF documents to get the most out of GPTs' document restrictions. If you can find books that are relatively small in size, the 20-PDF limit can easily cover over 400 books, with each book roughly 400 pages. The hardest part is finding enough books for a given subject.
Same here, I really like that it gives you some direction in scripting, a sense of how something could be done. It helped me write a Python script that listens for incoming mails and triggers commands based on the subject. Without ChatGPT I would have spent days figuring out how to do it. Sure, the example code isn't perfect, but it's good enough as a template.
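For illustration, the skeleton of such a mail-triggered command runner might look roughly like this (the server, credentials, and subject-to-command mapping are placeholders, not the actual script):

```python
#!/usr/bin/env python3
# Rough sketch of the idea, not the actual script: poll an IMAP inbox
# and run a command when a known subject line shows up.
# Server, login, and the subject-to-command map are all placeholders.
import email
import imaplib
import subprocess
import time

COMMANDS = {
    "restart plex": ["systemctl", "restart", "plex.service"],  # hypothetical
}

while True:
    with imaplib.IMAP4_SSL("imap.example.com") as imap:
        imap.login("user@example.com", "app-password")
        imap.select("INBOX")
        _, data = imap.search(None, "UNSEEN")
        for num in data[0].split():
            _, msg_data = imap.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            subject = (msg["Subject"] or "").strip().lower()
            if subject in COMMANDS:
                subprocess.run(COMMANDS[subject], check=False)
    time.sleep(60)  # poll once a minute
```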
I had gotten stuck on some code, and someone asked me if I had tried getting ChatGPT to write a Python script to do some BLE stuff in Win11. So we got on a call and started asking ChatGPT.
It wrote the Python code... for Linux. Then it decided to dump some WinRT stuff in there to... make it more Windows-like? Not sure what it was doing; it wasn't for BLE or GATT. We went through a few rounds of trying to explain that we didn't want anything Linux-specific and that it needed to run on Win11.
It told us to install WSL, install Python, and run its script through WSL. Yeah, that won't work for the environment this is running in. So I spent an hour, searched a bit for the one bit of Python code I was getting stuck on, and finished the script myself. It works for what I need, doesn't require WSL, runs in plain Python, and converts to an exe that runs on the systems I need it on.
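For anyone hitting the same wall: one pure-Python route that runs on Win11 without WSL is the bleak library, which talks to the native WinRT Bluetooth APIs on Windows. A minimal discovery sketch, illustrative only (the actual GATT work is omitted):

```python
#!/usr/bin/env python3
# Illustrative only: cross-platform BLE device discovery with bleak
# (pip install bleak). On Windows it uses the native WinRT APIs,
# so no WSL is needed.
import asyncio
from bleak import BleakScanner

async def main():
    devices = await BleakScanner.discover(timeout=5.0)
    for device in devices:
        print(device.address, device.name)

asyncio.run(main())
```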
Having said all that, it DID do a good job on a PowerShell script I needed a few days later. I think it got hung up in some logic loop, thinking that you HAD to have Linux somewhere to do anything with Python.
I probably should have started over with a new chat, priming it with some simple Windows scripting in Python.
That might have worked, anyway.
This is where I learned to create a new session. Take the prompts that got somewhere, and make sure to clearly tell it what you don't want in the first prompt. I also switch between Bard and ChatGPT.
I can now find solutions to more complex problems with only limited programming knowledge.
I would be very careful if you don't know 100% what the proposed solution does. That is why I prefer to read the docs and follow guides or tutorials. Once you understand it, that's easier than asking an AI and hoping for the best.
Well, it's still easier to ask the AI, especially for the boring parts. But you should still always understand the code the AI wrote, otherwise you're going into dangerous territory.
For what it's worth, I am a developer who is fluent in multiple languages and I definitely agree. Even things I could do myself I'll often hand to ChatGPT, because at worst it produces something a new hire would, and I'm very used to code review. At best, it introduces me to new tools or ways of doing things. It feels like I've got a personal intern now.
Just make sure you aren't passing external IPs, ports, or DNS names into GPT unless you are running it locally…
None of the things you mentioned are private or secret. Pretty much the only things that shouldn't be passed in are static credentials.
It's great for creating Home Assistant blueprints too. Just click together an automation, feed it the YAML, and ask it to make a blueprint with entity XY configurable.
Nope.
Nope and nope. It's just too scary that an AI can make in 10 seconds what I would craft manually with all the care and love the work deserves, and that includes the bugs I'd introduce, flaws that make my scripts very personal and unique.
Not doing it.
People said the same thing when cars were invented and started replacing horses for transportation. It's not about fearing something new; it's about learning to use it correctly so it benefits you.
How long did it take to do that?
100% agree. I have used ChatGPT multiple times, from simple notifications to stuff with if/then logic, or even just a simple rclone script to make automating multiple syncs easier. I love it and am very thankful for it.
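As a taste of what such an rclone wrapper can look like, a sketch along these lines (the remote name "cloud" and the folder pairs are made up; a pre-configured rclone remote is assumed):

```python
#!/usr/bin/env python3
# Sketch: run several rclone syncs in one go.
# The remote name "cloud" and the folder pairs are placeholders.
import subprocess

SYNC_PAIRS = [
    ("/srv/photos", "cloud:photos"),
    ("/srv/documents", "cloud:documents"),
]

for src, dst in SYNC_PAIRS:
    print(f"Syncing {src} -> {dst}")
    subprocess.run(["rclone", "sync", src, dst], check=True)
```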
Same. I've written many custom bash scripts with it, and just about the only thing I know about bash is that a script starts with a #!/bin/bash line at the top.
It's also very good at explaining things, though you have to prod it repeatedly to give it direction; otherwise it can get lost in its own cloud.
I would like to see your Obsidian templates.
Against-the-grain opinion: once you reach a certain level, it's actually a lot slower to have GPT generate code for you and then go through the slog of proofreading its obvious errors, and its scary non-obvious ones. I gave it a try for a while because of the praise it was receiving for automation, but I wouldn't really trust it for non-basic code generation. It's great for explainers, though.
Maybe off-topic, but why don't you use the Proxmox backup feature? For me it creates a weekly backup of all my VMs and LXC containers, and restoring is easy-peasy. You can also make use of the Proxmox Backup Server if you have one for offsite backups (or just copy the backup archives).
Also, you may be interested in Uptime Kuma for an easy uptime/health overview and alerting.
Proxmox Backup
I'm using Proxmox Backup Server for my local daily backups.
But for external cloud backup, it's 5 GB for the Docker config + persistent data versus 60 GB for the complete LXC containers. So it's more convenient to back up to the cloud only the data you really need for an emergency restore.
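A rough sketch of that approach, archiving just the configs and data and pushing the archive to the cloud (the paths and the rclone remote name are assumptions):

```python
#!/usr/bin/env python3
# Sketch: archive only the Docker configs and persistent data,
# then push the archive to a pre-configured rclone remote.
# The paths and the remote name "cloud" are assumptions.
import subprocess
import tarfile
from datetime import date

SOURCES = ["/opt/docker/compose", "/opt/docker/volumes"]  # hypothetical
ARCHIVE = f"/tmp/docker-data-{date.today()}.tar.gz"

with tarfile.open(ARCHIVE, "w:gz") as tar:
    for src in SOURCES:
        tar.add(src)

subprocess.run(["rclone", "copy", ARCHIVE, "cloud:backups"], check=True)
```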
Uptime Kuma
I try to minimize the number of applications that have access to the Docker socket, as it is a high-potential security risk (e.g. a malicious container update because of a hacked GitHub account).
If I can achieve the same goal with just a simple bash script and without additional software, that's the better solution for me :)
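For illustration, a minimal health-check loop of that sort, sketched here in Python (the endpoints are placeholders, and the print calls stand in for whatever notification hook you'd use):

```python
#!/usr/bin/env python3
# Sketch: poll a few HTTP endpoints and alert when one is down.
# The URLs are placeholders; swap the print for your notification of choice.
import urllib.error
import urllib.request

SERVICES = ["http://localhost:8080", "http://localhost:9000"]  # hypothetical

for url in SERVICES:
    try:
        urllib.request.urlopen(url, timeout=5)
        print(f"OK: {url}")
    except (urllib.error.URLError, OSError) as exc:
        print(f"ALERT: {url} is down: {exc}")
```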
No