That setup is a dream for a lot of people, but it's not always easy to make happen depending on state regulations (and how powerful the utilities there are)...
Since the local power company here only pays 10 cents per kWh for solar power (which they resell at a greater profit), I decided to run a small crypto miner, and I still have excess power on a 22 kW system.
I don't know of anywhere where it's not legal to be solar powered, but there were several thousand dollars in costs associated with engineering plans and permits.
I think this is a common reason for disappointment in solar incentives. At least half of your power bill pays for transmission, and the half that pays for generation has to be built out such that overall supply meets demand at all times, rather than simply supplying a number of kWh per day regardless of instantaneous demand. You can’t treat the retail “price” per kWh that you pay as the value of supplying a kWh to the grid; it’s much more likely that the utility is taking a (subsidized) loss paying you 10c per solar kWh.
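To make the arithmetic concrete, here's a minimal sketch of that decomposition, using assumed numbers (the 20c retail rate and 50/50 split are illustrative, not from any actual tariff):

```python
# Sketch: decomposing a retail electricity rate into generation vs. T&D.
# All figures are assumptions for illustration, not real tariff data.
RETAIL_RATE = 0.20          # $/kWh paid by the customer (assumed)
TRANSMISSION_SHARE = 0.5    # "at least half of your power bill pays for transmission"

generation_value = RETAIL_RATE * (1 - TRANSMISSION_SHARE)
print(f"Implied generation value: ${generation_value:.2f}/kWh")
# Under these assumptions, a 10c/kWh credit for exported solar roughly
# matches the generation component alone, not the full retail rate.
```

The point is just that the export credit and the retail rate are prices for different products: one is raw energy, the other is energy plus delivery and reliability.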
Electricity on the local distribution node has a value equal to the cost of generation plus the distribution. That's the value of it, what we pay. So by supplying the kWh locally to neighbors, the grid costs have been avoided. But the value is still the same.
Now, the T&D infrastructure has already been built, and the utility wants to get paid no matter what, but if they were a private company and not a monopoly, they wouldn't have a right to get compensated for their investment no matter what, because every company buys capital at risk. And that's for the good of the economy.
There needs to be some sort of forcing function to incentivize this cheaper form of power delivery, that avoids a lot of transmission and distribution costs. And that forcing function is the price that we pay those who generate the electricity.
The utility of course loses on every kWh they don't generate, because they want to sell more electricity. However, since they have a monopoly, we need other regulation to ensure that innovation that results in lower overall costs actually results in lower prices for consumers.
So far, the utilities have snowed the public and the PUCs such that they get away with murder on this transition. We need a grid, but we do not need the utility. And if the utility cannot come up with a business model that works as a regulated monopoly when we have local generation, then we need to change the regulatory model, most likely eliminating the monopoly.
There's a lot to learn from Texas here for the rest of the country.
The infrastructure has not “already been built” - it is constantly under expansion and maintenance, and the bonds used to fund construction also need to be repaid.
I think your mental model is that the grid isn't smart enough to pay you what you think your excess, unreliable power is worth (which you stated to be the entire retail cost of power, including transmission and distribution) because of incompetence and corruption at the utility monopolies. I think that is a pretty uncharitable take. It’s a hard problem, and people generally want reliable and cheap. You can’t make microgrids reliable and plentiful without a ton of diverse generation (which already exists on the macro-grid) OR a ton of storage, both of which are very expensive. It is a problem worth solving, but it needs to be considered with a realistic view of what people are actually paying for when they pay their power bill.
Grids are sized for peak, and without solar that peak is midday in most places, meaning that distributed behind-the-meter solar makes the grid cheaper.
Utilities, when they argue that solar is worth less, are not arguing on the merits of the issue but only selectively advancing arguments that benefit them. They will never present the totality of the issue.
It is up to others to push back against utilities' narrow views with a more complete view of the picture and what's possible.
Having a residential power connection from the grid allows you to demand up to 200 amps of power, at any time of day or night, 365 days a year, with zero notice. The power company has to build the lines to support that potential demand, whether you use it or not. Over all of California, distributed solar has probably reduced the expenditures we would have needed to make on new transmission and generation facilities compared to a world without distributed solar, but that doesn't affect the baseline cost of a ubiquitous grid that serves from Crescent City to the border with Arizona at Yuma, and all points between.
No they haven't. The grid cost is to build and maintain the wires and equipment. Your solar output isn't reliable enough for them to downsize the grid, so even though selling to a neighbor bypasses the grid it doesn't reduce the cost of having a grid.
What you could do is split out the grid cost, make it be a fixed fee per location instead of per-kWh. That would drop the price of buying a kWh until it's much closer to the price of selling.
But if you do that, someone with a lot of solar panels would end up with even less money in their pocket, since their reduced kWh purchases used to let them skimp on grid fees, and now that no longer happens.
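A quick sketch of that restructuring arithmetic, with made-up numbers (the $50/month grid cost, 500 kWh average usage, and 10c/kWh generation rate are all assumptions for illustration):

```python
# Sketch: bundled per-kWh grid recovery vs. a fixed per-location grid fee.
# All numbers are assumptions for illustration only.
GRID_COST_PER_HOME = 50.0   # $/month fixed cost of wires/equipment (assumed)
AVG_USAGE_KWH = 500         # kWh/month for a typical home (assumed)
GEN_RATE = 0.10             # $/kWh generation cost (assumed)

# Today: grid cost is recovered per-kWh, bundled into the retail rate.
per_kwh_grid = GRID_COST_PER_HOME / AVG_USAGE_KWH   # $0.10/kWh
bundled_rate = GEN_RATE + per_kwh_grid              # $0.20/kWh

# A solar home that only buys 100 kWh/month from the grid (assumed):
solar_usage = 100
bill_bundled = solar_usage * bundled_rate                   # pays little toward the grid
bill_fixed = GRID_COST_PER_HOME + solar_usage * GEN_RATE    # pays full grid fee
print(f"bundled: ${bill_bundled:.2f}, fixed fee: ${bill_fixed:.2f}")
```

Under the bundled rate the solar home pays $20/month; under the fixed-fee structure it pays $60/month, which is exactly the shift described above: the fixed fee stops low-usage solar homes from avoiding grid costs.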
> Mrs. Wells changed her housework habits because for part of the year it costs her more than six times as much to use electricity from 8 A.M. to 11 A.M. and 5 P.M. to 9 P.M. as it costs during the rest of the day.
https://www.nytimes.com/1975/06/29/archives/experimenting-wi...
Current CAISO data shows that overall demand still peaks in the late afternoon to early evening. I picked a day in mid-August, and demand at 7pm (39 GW) is roughly a third higher than at the solar noon of 1pm (29 GW).
In conclusion, the retail price of your electricity includes the engineering and infrastructure required to make your power delivery reliable most of the time, which is much more valuable than the raw kilowatts coming off of your solar panels.
The United States electric grid data is freely available and pretty neat: https://www.eia.gov/electricity/gridmonitor/dashboard/electr... Choose a grid or a state to get regional data, and you'll see that region's peak is usually 4-7pm. You can even see that weekend peaks are a bit lower, and that there's a second peak at ~10am when people get to work.
Historically in California, peak load has been in the afternoon, which I count as midday. At least, it's when solar panels are still pumping out a ton of power:
https://www.caiso.com/Documents/CaliforniaISOPeakLoadHistory...
You're posting a random day in winter in California, where overall consumption is low even at its highest, because there's very little demand for cooling. True peak for the California grid is ~50GW, not 25GW like today. You're also omitting all the residential solar that never gets on the grid that drives down midday demand in that graph.
Texas also has midday peaks. Here's today's data, and you'll see that even though it's winter and very little AC is needed, the peak is midday:
https://www.ercot.com/content/cdr/html/loadForecastVsActualC...
My statement was qualified with "most places." There will undoubtedly be some places with other peaks for which solar will not shave the peak. But in most places distributed solar will shave the peak.
And if you didn't know that, and think that I'm too "us vs. them," then you should go look at the arguments made in regulatory proceedings and IRPs etc.
The utilities invoke preposterous technical arguments all the time. Yes, the grid should be reliable, but making it more decentralized and adding storage all over will make it far more reliable.
Industrial scale battery packs are quite often cheaper than new transmission lines. And we're going to need a lot more transmission or transmission alternatives in the future as more of our energy needs are electrified.
I don't dispute that some distribution might need to be upgraded to fully take advantage of the cost savings that distributed solar and storage present.
But you'll never find the utilities making the case for engineering a more reliable cheaper system, if that system is cheaper, because they will make less money. It would be financially irresponsible for them to make that case, and in fact they must try their hardest to increase the amount of money that is spent on fixed grid assets, that they can directly rate base.
This is not being overly "us vs. them" this is simple economics and incentives of regulated monopolies. Utilities are great at responding to the financial incentives put before them. Sometimes those financial incentives are making the grid reliable. But I don't know of a single regulated monopoly that has been financially incentivized to lower grid costs.
https://www.caiso.com/Documents/CaliforniaISOPeakLoadHistory...
(Note also in your visualisation that all times are Eastern and should be adjusted for different localities. And if you go to a summer week rather than a winter week, you'll find the true peak, which is much higher, and which has a pretty standard curve with a peak that overlaps sunlight hours.)
Transmission savings are the big thing with distributed solar and storage. And transmission is the bottleneck for most projects looking to connect to the grid right now. Not only is it expensive, it's slow to build.
They have this summer's data too, though there's no way to link directly, and it still peaks at ~7pm: https://i.imgur.com/16mssuH.png . Using the 16th as an example, peak demand was 44,008 megawatthours at 8pm PDT. Compare that to their generation graphs, which you can separate into sources like solar: on the 16th, peak solar generation came at 11am with 13,201 megawatthours. By 6pm, it was down to 853 megawatthours. By peak time, it was nothing. My own residential solar matches that curve on that date.
It's not that they have sunk costs, it's that they have ongoing costs. The grid cost does not drop when you send excess solar to a neighbor. To actually avoid grid costs you need to reduce your max watts in a way that the power company can rely on.
> Transmission savings are the big thing with distributed solar and storage. And transmission is the bottleneck for most projects looking to connect to the grid right now. Not only is it expensive, it's slow to build.
Storage can save on transmission, but it has to be set up the right way. Solar and storage working together can do even better, but they also have to be set up the right way. Solar by itself doesn't make a big difference to peak transmission load.
Mid day is the middle of the day, as in noon. You might as well be arguing that you define three as five.
5PM is not "mid day". So you're cherry-picking time frames, making up definitions, and still not showing a midday peak in energy use; you're showing a late-afternoon peak.
You're just seeing the data for today. You can select any day you want.
Let's look at a really generous day for you: the peak annual usage from 2020, 47,121 MW on August 18 at 15:57. On that day, the peak was indeed at 15:57. However, demand remains high for hours past that: above 99% of peak until 5:30pm, and above 90% of peak until almost 9pm. Solar production is down to under 1,000 MW by 6:45pm. Thus we have over two hours of near-peak demand when solar is not helping at all. No amount of additional solar (without batteries) will ever cover that 6:45-9pm window of high demand.
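The shape of that argument is easy to sketch. The hourly demand and solar numbers below are made up to roughly match the description above (they are not actual CAISO data); the logic just flags hours where demand is near peak but solar output is negligible:

```python
# Sketch: find near-peak hours that solar cannot serve.
# Hourly demand and solar output (MW) are illustrative values shaped like
# the Aug 18, 2020 description in the comment above, NOT real CAISO data.
demand = {15: 46500, 16: 47121, 17: 46900, 18: 46000, 19: 44500, 20: 43500, 21: 41000}
solar  = {15: 9000,  16: 7000,  17: 4000,  18: 1500,  19: 300,   20: 0,     21: 0}

peak = max(demand.values())
gap_hours = [h for h in demand
             if demand[h] > 0.90 * peak   # demand still above 90% of peak
             and solar[h] < 1000]         # but solar output is negligible
print(gap_hours)
```

With these assumed numbers, the 7pm and 8pm hours show up as the gap: high demand, essentially no solar. That gap is what storage (or other dispatchable capacity) has to fill.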