I self-host a lot, but mostly on cheap VPSes, in addition to a few services on local hardware.
However, these figures also don't account for the time and money it takes to maintain these networks and equipment. Residential electricity isn't cheap; internet access isn't cheap, especially if you have to get business-class internet to get upload speeds over 10 or 15 Mbps, or to avoid TOS violations for running what the ISP considers commercial services (even if it's just for you), mostly because of cable company monopolies; cooling the hardware isn't cheap, especially if you live in a hotter climate; and then there's maintaining the hardware and OS, upgrades, offsite backups for disaster recovery, and all the other costs. For me, VPSes work, but for others, maintaining the OS and software takes too much time. And just figuring out what software to host, and then how to set it up and properly secure it, takes a ton of time.
This is a point many folks don't take into account. My average cost per kWh right now is $0.41 (yes, California, yay). So it costs me almost $400 per year just to keep some older hardware running 24x7.
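The "almost $400 per year" figure can be sanity-checked with a quick sketch. The ~110 W average draw is an assumption here (a measured figure close to it appears later in the thread):

```python
# Sanity check: annual electricity cost of an always-on box.
# Assumptions: ~110 W average draw, $0.41/kWh (California rate above).
watts = 110
rate_per_kwh = 0.41

kwh_per_year = watts / 1000 * 24 * 365   # kWh consumed per year
annual_cost = kwh_per_year * rate_per_kwh

print(f"{kwh_per_year:.0f} kWh/year -> ${annual_cost:.0f}/year")
```

At those numbers it comes out to roughly $395/year, which matches the "almost $400" claim.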
I solved this by installing solar panels. They produce more electricity than I need (enough to cover charging an EV when I get one in the future), and I should break even (in terms of cost) within 5-6 years of installation. I had them installed last year under NEM 2.0.
I know PG&E wants to introduce a fixed monthly fee at some point, which throws off my break-even calculations a bit.
Some VPS providers have good deals, and you can often find systems with 16GB RAM and NVMe drives for around $70-100/year during LowEndTalk Black Friday sales, so it's definitely worth considering if your use cases can be better handled by a VPS. I have both: a home server for things like photos, music, and security camera footage, and VPSes for things that need to be reliable and up 100% of the time (websites, email, etc.).
That sounds excessive: it's almost $1.10/day, which amounts to more than 2 kWh per 24 hours, i.e. an average draw of over 80 W. You may want to invest in a TDP-friendly build. I'm running an AMD APU (known for shitty idle consumption) with RAID 5 and still hover under 40 W.
This isn't speculation on my part; I measured the consumption with a Kill-a-watt. It's an 11-year-old PC with 4 hard drives and multiple fans (because it's in a hot environment), and hard drive usage is significant because it's running security camera software in a virtual machine. The host OS is Linux Mint. It averages right around 110 W. I'm fully aware that's very high relative to something purpose-built.
I think the main culprit is the CPU/motherboard, so that's the only thing needing replacement. There are many cheap alternatives (under $200) that could halve the consumption and would easily pay for themselves within a year of use. There's a Google Doc floating around listing all the efficient CPUs and their TDPs. Just a suggestion; I'm pretty sure it would pay off its price after a year. There's absolutely no need for a 110 W average draw unless you're running LLMs on it, and even then it shouldn't be that high.
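The "pays for itself in a year" claim can be sketched out. The $200 upgrade cost and the assumption that it halves consumption from 110 W to 55 W are the commenter's rough numbers, not measured figures:

```python
# Hypothetical payback estimate for a ~$200 CPU/motherboard upgrade
# assumed to halve average draw from 110 W to 55 W, at $0.41/kWh.
old_watts, new_watts = 110, 55
rate_per_kwh = 0.41
upgrade_cost = 200

saved_kwh_per_year = (old_watts - new_watts) / 1000 * 24 * 365
saved_per_year = saved_kwh_per_year * rate_per_kwh
payback_years = upgrade_cost / saved_per_year

print(f"saves ${saved_per_year:.0f}/year, payback in {payback_years:.1f} years")
```

Under those assumptions the savings come to roughly $198/year, so the payback period is indeed about one year, though at a cheaper electricity rate (say €0.15/kWh, mentioned below) it would stretch to nearly three.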
Omg, I pay €30 for 1 Gbps down / 0.7 Gbps up (ten more for symmetrical 10 Gbps; I don't need it and can't even use more than 1 Gbps, but my inner nerd wants it) and €0.15/kWh.
BTW, the electricity cost is partly or fully offset when you're heating your apartment/house, depending on your heating system. For me, in the winter, I write it off entirely.