I'm really starting to think that data centers should be required to invest in the electricity generation to cover their own usage in order to build or expand. Not just pass the cost onto the local utilities, but actually build out power generation (solar, wind, etc.), and maybe even require generation in excess of 100% of their usage.
Aren't utilities better equipped to plan, build, and operate sources of electricity?
Don't they already pass the cost of that onto consumers?
Utilities require long-term planning. A 22% increase in one year is an extreme outlier by any normal grid-planning standard. Where I live, the high point in the last 5 years was a 2.8% annual increase (see https://www.ieso.ca/power-data/demand-overview/historical-de... ). Building large amounts of new capacity takes years to decades.
If hyperscalers keep pushing that kind of increase, they will eventually learn that the supply and cost of energy become limiting factors. When demand outstrips supply, prices go bonkers and rolling blackouts become a fact of life. See the California grid crisis of 2000-2001, although in that case the supply shortage was the result of blatant market manipulation by Enron.
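To put those two growth rates in perspective, here's a minimal Python sketch of the compounding arithmetic. The 2.8% and 22% rates come from the comment above; the 10-year horizon and the baseline of 100 are illustrative assumptions.

    # Compare cumulative demand under two constant annual growth rates.
    # Rates are from the comment above; horizon and baseline are assumptions.
    def demand_after(years, annual_growth, baseline=100.0):
        """Project demand forward under constant annual percentage growth."""
        return baseline * (1 + annual_growth) ** years

    for label, rate in [("typical grid, 2.8%/yr", 0.028),
                        ("hyperscaler-driven, 22%/yr", 0.22)]:
        print(f"{label}: {demand_after(10, rate):.0f} after 10 years (baseline 100)")

    # typical grid, 2.8%/yr: 132 after 10 years (baseline 100)
    # hyperscaler-driven, 22%/yr: 730 after 10 years (baseline 100)

At a sustained 22%/yr, demand roughly doubles every 3.5 years, which is well inside the lead time for building most new generation.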
In a way, usually utilities are state controlled, and there are different levels of funding for expansion projects, as well as varying levels of efficiency, corruption, or manipulation.
I'm perfectly fine if, to satisfy that requirement, the data center operator pays the utility company for the cost of expanding existing infrastructure. But I want those expenses, whether via taxes or direct fees on increased demand, charged directly to the specific consumer (the data center) that is likely to consume more energy in a day than most of the city's residents do.
> usually utilities are state controlled
This is not even close to how anything works. There's some state regulation but most GSOs/TSOs/DSOs/IPPs are private companies, SGEs, or nonprofit co-ops. All RTOs/ISOs are nonprofits but own no grid assets.
It's cheaper to raise rates for consumers than it is to build new generation capacity.
In most areas, large commercial customers get preferential lower rates than residential customers do. So it follows that hyperscalers are likely to get even lower rates, while the neighbors in previously rural areas and small towns can "enjoy" the constant noise, utility rate increases, and tax increases that subsidize bitcoin mining and AI pipedreams.
All the recent reporting shows investment greater than 10X revenues. Who's using all the electricity?
AI
The circular financing deals for Nvidia chips indicate false demand. So are these chips actually being used, and if so, where are the circular deals for electricity?