

wg-easy is what you want
DOWN:
I’m currently fighting with my OliveTin config file. I added a simple new config for a button action and the whole thing just shit the bed. Now OliveTin won’t load at all, even after removing the new config. Stupid yaml.
UP:
After reading the Jellyfin docs and their Hardware Encoder Quality section which states
Apple ≥ Intel ≥ Nvidia >>> AMD*
I decided to spin up a test server on the M1 Mini that’s been sitting unused in my basement for a couple of months now, to see if I can get better performance out of Jellyfin on the M1 versus where it’s running currently: an Intel i7 that’s going on 10ish years old now.
I also spun up baserow and directus containers to see which one I want to use for my database needs.
Audiobookshelf for sure. It handles audiobooks fabulously, and it handles ebooks as well.
I use it to manage my eBook library, but not as the reader. You can set up a “send to ereader” option to email the ebooks to your reader of choice. So I just shoot them off to my pocketbook ereader when I want to read one.
Easiest way would be to just add a sleep 15 command at the top of the script. Time how long it takes your wifi to come up, and adjust the sleep time with a 2-3 second buffer in case it takes longer for some reason.
More exact would be to create a systemd service for your script that depends on network connectivity to execute.
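If you go the systemd route, a minimal sketch looks like this (the unit name and script path are placeholders, adjust for your setup):

```ini
# /etc/systemd/system/my-startup-script.service (hypothetical name/path)
[Unit]
Description=Run my script once the network is actually online
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
ExecStart=/home/you/bin/my-script.sh

[Install]
WantedBy=multi-user.target
```

Then `systemctl daemon-reload && systemctl enable --now my-startup-script.service`. One caveat: network-online.target only actually waits if your distro’s wait-online service (NetworkManager-wait-online.service or systemd-networkd-wait-online.service) is enabled.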
Another for radicale. Been using it for years now. It’s great.
I use GNU Stow and my self hosted git forge to manage and back up my config files, with a 3-2-1 backup strategy for the repo: three copies, on at least two mediums, with one offsite.
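For anyone curious what stow actually does, here’s a rough demo of the symlink layout it creates. This uses a temp dir standing in for `$HOME` and plain `ln -s` for illustration; real stow does the tree-walking and conflict checks for you:

```shell
# Simulate: a dotfiles git repo with one "package" dir per app,
# e.g. dotfiles/bash/.bashrc, stowed into the home directory.
home=$(mktemp -d)
mkdir -p "$home/dotfiles/bash"
echo 'export EDITOR=vim' > "$home/dotfiles/bash/.bashrc"

# Equivalent of: cd "$home/dotfiles" && stow --target="$home" bash
ln -s "$home/dotfiles/bash/.bashrc" "$home/.bashrc"

# The "installed" config is just a symlink back into the repo,
# so committing and pushing the repo backs up the live config.
readlink "$home/.bashrc"
cat "$home/.bashrc"
```

Since every live config file is a symlink into the repo, `git add -A && git commit && git push` from the dotfiles dir is the whole backup step.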
Same here. luks encrypted drive in my work locker.
Right now I sneaker net it. I stash a luks encrypted drive in my locker at work and bring it home once a week or so to update the backup.
At some point I’m going to set up a RPI at a friend’s house, but that’s down the road a bit.
That’s okay. This is the first one of these I’ve seen, and I’m going to try to organize something like this at my local library.
I don’t use AI at all. What you described is the principal reason. I also don’t like how these giant corpos are sucking up the entirety of human output to train these models without a care to the implications of it.
Right! It’s kinda wild when you do see them. I always equate it to the feeling of being in a casino.
What really throws me is tv commercials. When I do see one, like in a waiting room or something, all I can think is, “people fall for this?”
I don’t know of any open source projects that do this, but if you’re at all comfortable with bash (or another scripting language) and a little self hosting, you can roll your own with OliveTin, which is what I did for all my personal data tracking.
I have it running on a raspberry pi zero w and access it through the pwa.
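To give a flavor of the setup, here’s a rough sketch of an OliveTin config.yaml button that calls a tracking script. The script path and the argument name are made up for the example; check the OliveTin docs for the exact fields:

```yaml
# Hypothetical button: logs a weight reading via a shell script you write.
actions:
  - title: Log weight
    icon: "📝"
    shell: /opt/scripts/log-weight.sh {{ kg }}
    arguments:
      - name: kg
        type: int
```

Each button just shells out to your script, so the data can land wherever your script puts it (a CSV, a SQLite file, whatever).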
Hey now. In harsh3466 land only I have the authority to mint pinecones!
I am always shocked when I have to use a browser without an ad blocker. How do people tolerate it?
I mean, I get it. I know many people have no idea about adblocking, etc. But goddam. It’s so awful without it.
+1
I self host vaultwarden and it’s great. It’s an easy self host, and in my experience it has never gone down on me.
That being said, my experience is anecdotal. If you do go the vaultwarden route, realize that your vault is still accessible on your devices (phone, whatever) even if your server goes down or you just lose network connectivity. The clients hold local, encrypted-at-rest copies of your vault that are periodically synced.
Additionally, regardless of the route you take you should absolutely be practicing a good 3-2-1 backup strategy with your password vault, as with any other data you value.
Doesn’t solve the whole problem, but here’s a great resource for car manuals called OPERATION charm
ffs. Of course they had to slap AI on top of it. Goddammit.
I’d encourage learning. The more you understand, the better you can control your data and maintain your services. You don’t need to be an expert, but I’d encourage working towards relying less on GPT.