When Amazon Web Services went offline, people lost control of their cloud-connected smart beds, getting stuck in reclined positions or roasting with the heat turned all the way up.
I think 4K media streaming does need a fair bit of infrastructure management.
My Jellyfin can stream 4K just fine, even remotely through a VPN, so I'm not sure what you mean.
Depending on transcoding you might need a GPU, but still nothing a standard “gaming spec” PC can't handle.
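To illustrate what I mean, here's a toy sketch of the decision (my own simplification, not Jellyfin's actual logic): the server only transcodes when the client can't play the file as-is, and that's the only time the GPU even matters.

    # Toy model of the direct-play vs transcode decision (my simplification,
    # not Jellyfin's real code). Bitrates are in Mb/s.
    def playback_mode(file_codec, file_mbps, client_codecs, link_mbps):
        if file_codec in client_codecs and file_mbps <= link_mbps:
            return "direct play (server just hands over the file, no GPU involved)"
        return "transcode (this is where a GPU helps)"

    # 4K HEVC remux to a TV that speaks HEVC natively, over gigabit LAN:
    print(playback_mode("hevc", 60, {"hevc", "h264"}, 1000))  # direct play
    # Same file to an old tablet that only does H.264:
    print(playback_mode("hevc", 60, {"h264"}, 1000))          # transcode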
Come to think of it, my internet provider does allow uploads up to 25 Mb/s, and that's the highest tier available for consumers in my area. Technically that's a subscription, but realistically it's billed more like water or electricity.
The upload limit is also a purely artificial cap; they could easily quadruple it if they wanted.
Also, realistically the use case for 4K movies is usually your home couch, so it gets streamed at LAN speed. Quality is often better than the usual streaming providers because they cheat to keep bandwidth down.
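Back-of-envelope numbers, if anyone wants to check me (the bitrates are ballpark assumptions, not measurements):

    # Rough bitrate comparison: what fits through which pipe. Ballpark figures.
    UHD_REMUX_MBPS   = 60    # typical 4K Blu-ray remux, somewhere around 40-80
    PROVIDER_4K_MBPS = 16    # roughly what big streaming services serve as "4K"
    UPLOAD_CAP_MBPS  = 25    # my ISP's upload cap
    LAN_MBPS         = 1000  # wired gigabit at home

    for name, pipe in [("gigabit LAN", LAN_MBPS), ("25 Mb/s upload", UPLOAD_CAP_MBPS)]:
        verdict = "direct play" if UHD_REMUX_MBPS <= pipe else "transcode down"
        print(f"Full remux over {name}: {verdict}")
    print(f"Provider '4K' is ~{PROVIDER_4K_MBPS} Mb/s vs ~{UHD_REMUX_MBPS} Mb/s for a remux")

So on the couch the full-bitrate file plays untouched, and remotely it gets squeezed to roughly what the big providers give you anyway.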
This isn’t meant as a slight, but I take it you don’t work in IT. You are way underestimating what it takes to run a service at the scale these large companies do. Homelabbing is cool and a great way to get off these providers, but we as individuals have completely different requirements. A proper cloud service is incredibly complex with multiple environments, rigid change controls, global availability, zero allowable downtime, etc. You can’t just wing it with a few desktops.
Must be different requirements indeed. But yours don’t sound like typical consumer requirements. Why do we need the same scale as a large corporation?
I can respect the corporate ability to serve thousands at a time but a typical household simply doesn’t need that.
A few of my friends and I all work in IT, and we each have a dedicated Proxmox machine that runs all of these things just fine. Nextcloud has so far only failed me once when I needed it, and that was actually a Cloudflare issue; it still worked locally.
Navidrome I use all day, every day, and I need it accessible from anywhere. I haven't updated or checked the container since setup and it has been stable as a rock. Fuck Spotify, which doesn't have the bootlegs I listen to anyway.
The end goal, which I've achieved, is that I have no need for subscriptions and I actually own my data, which is the point, right?
My actual hobbyist goal is to create something that can persist locally if the internet one day disappears.