I’ve always been curious how much the efficiency rating matters alongside the wattage.
It’s always going to depend on usage, but I’ve heard it estimated at just over $100.
I always find questions like this interesting because it always comes down to usage; I guess people forget that time is a variable.
But just for fun:
To find the cost for any amount of time, the formula is: Time (hours) × Cost ($/kWh) × Load (kW) / Efficiency = total cost.
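If you want to plug in your own numbers, here’s a rough Python sketch of that formula (the function and argument names are just mine, nothing official):

```python
def electricity_cost(hours, rate_per_kwh, load_kw, efficiency):
    """Cost in dollars: time (h) * rate ($/kWh) * load (kW) / PSU efficiency.

    Dividing by efficiency converts the load the components draw into
    what the PSU actually pulls from the wall.
    """
    return hours * rate_per_kwh * load_kw / efficiency
```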
1 kWh is $0.13 in my area, and there are 8,760 hours in a year. Let’s assume a 1 kW PSU running first at full load, then at half load.
At full load, it will cost 8,760 hours × $0.13/kWh × 1 kW / 0.87 ≈ $1,308.97.
At half load, it will cost 8,760 hours × $0.13/kWh × 0.5 kW / 0.90 ≈ $632.67.
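Same math with the sketch above, just so the rounding is reproducible:

```python
# Full load: 1 kW at ~87% efficiency, running 24/7 at $0.13/kWh
print(round(electricity_cost(8760, 0.13, 1.0, 0.87), 2))  # 1308.97
# Half load: 0.5 kW at ~90% efficiency
print(round(electricity_cost(8760, 0.13, 0.5, 0.90), 2))  # 632.67
```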
Now, realistically, let’s say you never touch grass and play 16 hours per day (leaving the optimal 8 hours a day for sleep), and you’re playing the latest AAA online multiplayer game, “Beer, Grass, and Anime Girls”, which draws 600 W on average.
At this load, it will cost 5,840 hours × $0.13/kWh × 0.6 kW / 0.90 ≈ $506.13.
But this is unrealistic, because we all know gamers spend loads of money just to watch YouTube and browse their favorite websites. So let’s say 100 W.
At this new load ;), it will cost 5,840 hours × $0.13/kWh × 0.1 kW / 0.87 ≈ $87.26.
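And the two 16-hours-a-day scenarios with the same helper:

```python
# 16 hours/day for a year = 5,840 hours
print(round(electricity_cost(5840, 0.13, 0.6, 0.90), 2))  # 506.13 (gaming at 600 W)
print(round(electricity_cost(5840, 0.13, 0.1, 0.87), 2))  # 87.26  (browsing at ~100 W)
```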
I have my PC plugged into a UPS that logs how much energy has been consumed. Over my PC’s lifetime so far (around two months), it has logged $11.61 and 89.3 kWh used, with an average of 615.84 Wh.
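If you want a rough annualized projection from that log (treating “around two months” as 60 days, which is obviously a guess, and assuming my usage stays the same):

```python
logged_cost = 11.61   # dollars over roughly two months, per the UPS log
logged_kwh = 89.3     # kWh over the same period
days_logged = 60      # rough guess at "around two months"

# Scale both figures to a full year
print(round(logged_cost * 365 / days_logged, 2))  # ~70.63 dollars/year
print(round(logged_kwh * 365 / days_logged, 1))   # ~543.2 kWh/year
```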
If you can heat your room with the PC, you’re playing too much.
uh oh…
Jesus, this is what I asked for, but damn, this is a lot (thank you @elaniselan).
Thank you PSU Sensei