This is a point many folks don’t take into account. My average per-kWh cost right now is $0.41 (yes, California, yay). So it costs me almost $400 per year just to have some older hardware running 24x7.
This sounds excessive: that’s almost $1.10/day, which amounts to more than 2 kWh per 24 hours, i.e. an average draw upwards of 80 W. You’ll need to invest in a TDP-friendly build. I’m running an AMD APU (known for shitty idle consumption) with RAID 5 and still hover under 40 W.
This isn’t speculation on my part; I measured the consumption with a Kill-A-Watt. It’s an 11-year-old PC with 4 hard drives and multiple fans because it’s in a hot environment, and hard drive usage is significant because it’s running security camera software in a virtual machine. The host OS is Linux Mint. It averages right around 110 W. I’m fully aware that’s very high relative to something purpose-built.
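For anyone sanity-checking the math, here’s a quick back-of-envelope in Python using only the figures from this thread (110 W average, $0.41/kWh):

    # Annual cost of a 110 W average draw at $0.41/kWh
    avg_watts = 110
    rate = 0.41  # USD per kWh

    kwh_per_year = avg_watts / 1000 * 24 * 365      # ~964 kWh
    cost_per_year = kwh_per_year * rate             # ~$395
    print(f"{kwh_per_year:.0f} kWh/yr -> ${cost_per_year:.0f}/yr (${cost_per_year / 365:.2f}/day)")

That lands at ~$395/yr, about $1.08/day, which matches both the “almost $400 per year” figure and the ~$1.10/day estimate above.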
Right, and spend even more money.
I think the main culprit is the CPU/motherboard, so that’s the only thing that needs replacing. There are many cheap alternatives (under $200) that could halve the consumption and would easily pay for themselves within a year of use. There’s a Google Doc floating around listing the efficient CPUs and their TDPs. Just a suggestion, but I’m pretty sure it would pay off its price after a year. There’s absolutely no need for a constant 110 W unless you’re running LLMs on it, and even then it shouldn’t be that high.
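A minimal sketch of that payback math, assuming a $200 CPU/motherboard swap that halves the measured 110 W draw (the halving is this suggestion, not a measurement):

    # Rough payback period for a more efficient CPU/motherboard
    upgrade_cost = 200           # USD -- the "under $200" figure above
    watts_saved = 110 - 55       # assumes consumption is halved
    rate = 0.41                  # USD per kWh

    savings_per_year = watts_saved / 1000 * 24 * 365 * rate   # ~$198
    print(f"saves ${savings_per_year:.0f}/yr -> payback in {upgrade_cost / savings_per_year:.1f} years")

At $0.41/kWh the savings come to roughly $198/yr, so the upgrade really does pay for itself in about a year.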
Omg, I pay 30€ for 1 Gbps down / 0.7 Gbps up (ten more for symmetrical 10 Gbps; I don’t need it and can’t even use more than 1 Gbps, but my inner nerd wants it) and 0.15€/kWh.
BTW, the electricity cost is somewhat or totally negated when you heat your apartment/house, depending on your heating system: with resistive heating the server’s waste heat offsets heater use one-for-one, while with a heat pump it’s only a partial offset, since the pump delivers more heat per kWh of electricity. For me, in the winter I write it off entirely.
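To put a number on “depending on your heating system”, here’s a tiny sketch; the heat pump COP of 3 is an assumed typical value, not something from this thread:

    # How much of the server's electricity cost comes back as useful heat?
    server_kwh = 964    # yearly heat dumped into the home by a ~110 W server
    cop = 3.0           # assumed heat pump coefficient of performance

    # Resistive heating: 1 kWh of server heat replaces 1 kWh of heater use.
    offset_resistive = server_kwh
    # Heat pump: the same heat would only have cost server_kwh / COP to produce.
    offset_heat_pump = server_kwh / cop
    print(f"resistive: {offset_resistive:.0f} kWh offset; heat pump: {offset_heat_pump:.0f} kWh")

So with resistive heating the winter write-off is essentially one-for-one, while with a heat pump only about a third of the server’s electricity cost is recovered.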