Category Archives: SG Investment Cost

Related to the costs and benefits of SG deployments

“Shape-Shifting” Load Curves



Dom Geraghty

Sub-Title: Achieving a “Net Zero” or “Net-Negative” Capacity Build

In previous dialogs we have analyzed and forecast the capital investment requirements of the power system through 2030 ($400 billion for SG applications and $1.6 trillion for power system infrastructure), and presented strategies for reducing those costs, such as the 80/20 rule for SG applications deployment and the potential for SG asset management applications to optimize the replacement of aging infrastructure.

We have also written about the shift to a “net” load forecast for dispatching decisions, and the increasing uncertainty of the “net” load forecast as self-optimizing customers make autonomous decisions about local energy use and distributed resource activation.

While replacement of failing aged equipment can’t be avoided, what about capacity needed for demand growth?  A major longer-term benefit of certain SG applications is the deferral of capital expenditures for new power plants through better use of existing plants and equipment.

How does that work? Many of us who have worked in the power sector look at the average utilization rates of 50% - 65% for existing capacity (generation, transmission, and distribution) and, even allowing for peak-to-average demand ratios, can’t help but think that the utilization rates could be increased, maybe substantially.

Could we even achieve a “net-zero” capacity build for the next decade or two with the help of SG applications? Could we reduce the amount of replacement required with a “net-negative” capacity build? What would the implications be for the current utility business model?

The New Load Curve

Dispatchers no longer dispatch to the traditional load forecast, i.e., the gross demand of the customer; they use the “net” load forecast: the load that the utility sees at the point of common coupling.

The utility or ISO can itself shape/manage the “net” load through dispatchable demand response, distributed generation (DG), distributed storage (DS), and Virtual Power Plant programs.

However, self-optimizing customers will shape their own load curve autonomously (with little to no visibility to the utility or ISO) by dispatching DG or DS, programming price-responsive devices (including high-speed M2M control loops), and/or charging EVs.
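As a minimal numerical sketch (all values hypothetical), the “net” load is the gross customer demand less behind-the-meter generation and storage discharge, plus new loads such as EV charging:

```python
# Hypothetical hourly values (MW) for one feeder, for illustration only.
gross_load        = [5.0, 6.2, 7.8, 7.1]  # customers' gross demand
dg_output         = [0.0, 1.5, 2.5, 1.0]  # behind-the-meter PV/DG
storage_discharge = [0.0, 0.0, 0.5, 0.0]  # distributed storage discharging
ev_charging       = [0.3, 0.0, 0.0, 1.2]  # new load from EV chargers

# Net load = gross demand - local generation - storage discharge + EV charging
net_load = [round(g - dg - s + ev, 2)
            for g, dg, s, ev in zip(gross_load, dg_output,
                                    storage_discharge, ev_charging)]
print(net_load)  # [5.3, 4.7, 4.8, 7.3]
```

Note how the net curve can diverge sharply from the gross curve hour to hour; the utility sees only the net values at the point of common coupling.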

The uncertainty of the “net” load forecast will increase due to the unknown price-elastic behavior of customers in dynamic pricing regimes and the intermittency of distributed PV generation.

The transactive energy concept aims to create dynamic equilibrium in an integrated power market for all participants including end-use customers, where prices clear at all nodes in the power system simultaneously (or close to real-time), resulting in a continuing matching of power supply and demand. We are very far from that capability at present (both technologically and policy-wise), and some less optimal but possibly more practical approaches may intervene, such as an orderly backlog management/queuing approach based on traffic engineering, see here.

By How Much Can Future Gross and Net Load Curves Differ?

The Elephant in the Room: Addressing the Affordability of a Rejuvenated, Smarter Grid



Dom Geraghty


Adding Up All Of The Costs

The cost of transitioning to Smart Grid (SG) 2.0 is high. The cost of replacing aging power system infrastructure is high. The cost of complying with new energy policy and regulations is high. For the most part, all three of these major cost burdens on the electric power system have been analyzed separately. We need to look at the whole picture.

In the tables below, we construct a 30-year cumulative budget and net benefits forecast for the U.S. power system, integrate all three of the above major cost burdens, and suggest a practical cost management strategy that could save as much as $750 billion.

Business-As-Usual Cost

If we take a business-as-usual approach to the replacement of aging power system infrastructure and concurrent deployment of the SG, the 30-year total net costs (costs minus benefits) for the transition to the SG are estimated to be about $1.2 trillion, based on our integration and curation of numerous credible studies -- see here and here. So, despite the considerable benefits generated by the SG, they are dwarfed by the costs of deployment.

Making Electricity Bills More Affordable in the SG Era

The net result is a large and economically debilitating increase in consumers’ electricity bills, and questions around the grid operators’ ability to maintain present-day levels of service reliability.

We were sufficiently concerned by the current "silo-ed" approach to costs, and by the size of the totaled costs, that we decided to explore an optimized least-cost deployment strategy, the goal of which would be to trade off ubiquitous deployment of the SG against a more “surgical” deployment that optimizes the value of SG applications.

According to our estimates, if we optimize the deployment of SG applications in a least cost deployment strategy, the net costs of this new smart/automated power system can be reduced substantially. In the calculations below, we estimate that the net costs over the 30-year period could be as low as $450 billion, plus or minus some as-yet unaccounted-for costs and savings as presented at the end of this dialog.
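The arithmetic behind the headline numbers is straightforward (values in nominal, undiscounted $ billions, taken from the estimates above):

```python
# 30-year net costs (costs minus benefits) from the dialog's estimates, $ billions
bau_net_cost = 1200        # business-as-usual deployment (~$1.2 trillion)
least_cost_net_cost = 450  # optimized, "surgical" least cost deployment

savings = bau_net_cost - least_cost_net_cost
print(savings)  # 750 -> the "as much as $750 billion" cited earlier
```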

Even with a least cost strategy, electricity bills will not be reduced from today’s levels (see also here) – but they will grow at a slower rate, and it would seem that they will be manageable, based on the assumptions below. Note that in all scenarios, there is no avoiding the front-end loading characteristics of the transition to the SG. What we can do, though, is to first deploy those SG applications that have the highest potential for creating near-term cash savings.

Here is a sequence of seven fairly self-explanatory tables that tell the story. Assumptions used in the forecasts are provided after the tables at the end of this dialog. All dollar amounts in the tables below are in nominal $ billions, undiscounted.

Cost of Replacing Aging Infrastructure and Deploying SG 2.0

Tables 1 and 2 below present business-as-usual net cash flows for rejuvenating power system infrastructure and deploying SG 2.0.

Table 1: Budget Forecast for Power System Infrastructure*

Cash flow through each 5-year period ($ billions, nominal, undiscounted):

Investment category                                            Yr 1-5   Yr 6-10   Yr 11-15   Yr 16-20   Yr 21-25   Yr 26-30
Power system infrastructure investments -- aging
  infrastructure replacement + capacity to serve demand
  growth (G, T, and D) ($1.6 trillion in total)
Cyber-security ($22 billion in total) -- also, see here          -4       -4        -4         -4         -4         -4
Environmental compliance ($110 billion) -- see also here
  and here and here
Total infrastructure investment by period
Grand total (nominal, undiscounted $)

Table 2: Budget Forecast for Transitioning to SG 2.0*

A Supply/Demand Curve for “Grid Flexibility” Products – The Price of “Optionality”




Dom Geraghty


Operators of the power system are facing a paradox. On the one hand, the grid is becoming smarter, which gives operators an extra “edge” in managing it; on the other hand, policy changes associated with the promotion of renewable power and customer choice are creating increased uncertainty in both supply and demand. Yet operators must still deliver power with an average service reliability of 3 x 9s (the LOLP or LOEP target in system planning models), while also maintaining short-term stability and security of supply, even as the inertia of the system decreases.

Supply Uncertainty

Increases in supply uncertainties are caused by the steadily increasing penetration of renewable energy production as a result of RPS mandates. Delivery uncertainties are on the rise due to lagging transmission construction leading to congestion.

Demand Uncertainty

Demand uncertainty is increasing as self-optimizing end-users install distributed generation and storage, smart appliances that use M2M controls, and chargers for EVs, and take advantage of time-differentiated pricing.

Electricity dispatch is based on minute-to-minute forecasting and clearing of customers’ “net load”: the end-use customer’s load minus any local power generation or discharging of energy storage devices.

But the utility or system operator usually does not have visibility into, or interconnection with, this generation behind the meter – creating demand uncertainty and an inability to take advantage of the distributed generation in reliability emergencies.

Price Volatility and Price Elasticity-Created Uncertainty

The market as structured also creates short-term operational challenges for the grid operators. Prices in wholesale markets can change rapidly over short periods of time, e.g., 15 minutes, leading to sharp changes in the availability of supply and in the level of demand, driven by price elasticity. These impacts of price volatility, combined with the increased percentage of intermittent resources, create the need for additional fast-acting reserves to maintain the grid operator’s target service reliability level.
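One common way to model the price-elastic piece of this uncertainty is a constant-elasticity demand function; the sketch below uses hypothetical numbers, since the actual elasticities are exactly what operators do not yet know:

```python
def price_responsive_demand(base_demand_mw, price, ref_price, elasticity):
    """Constant-elasticity demand: demand scales as (price/ref_price)**elasticity.
    A negative elasticity means demand falls as prices rise."""
    return base_demand_mw * (price / ref_price) ** elasticity

# A 15-minute price spike from $50/MWh to $200/MWh with an assumed
# elasticity of -0.1 trims roughly 13% of load in short order.
before = price_responsive_demand(1000.0, 50.0, 50.0, -0.1)
after = price_responsive_demand(1000.0, 200.0, 50.0, -0.1)
print(round(before), round(after))  # 1000 871
```

Even a small elasticity produces swings of this size under volatile prices, which is why additional fast-acting reserves are needed.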

Managing Uncertainty

How can the power system operator cope with the increased physical- and market-driven uncertainties?

We suggest that a “least cost” coping approach can consist of a combination of (1) investments in the “smart grid” (smart sensors, advanced controls, data analytics), and (2) investments in flexibility products (to be defined below).

Providing 99.87% Reliability* Is Going to Cost a Lot More – Are There Related SG 2.0 Business Opportunities?



Dom Geraghty


Historically, utilities have provided about “3 x 9s” reliability. The cost of this reliability is currently bundled into the price of electricity. This includes the cost of maintaining reserves, contingency plans, and automatic generation control (AGC) to cover the stochastic behavior of forced outages and electricity demand.

This cost is going up. Why?

Bulk Power Supply Uncertainty Is Affecting Reliability

The implementation of the RPS mandates is increasing the proportion of intermittent power production plants and, in turn, decreasing the inertia, i.e., the damping ability, of the power system. As a result, a substantial amount of extra generation reserves and ancillary services is required to cover the increased uncertainty of supply while maintaining “3 x 9s” reliability levels. Recognizing this, most ISO markets trade various reserve and ancillary service products.

The transmission system is becoming more congested, and there is widespread resistance to building new transmission lines. As a result, to maintain target levels of reliability and system security, more contingencies and remedial action plans and systems are needed to cover the increased uncertainty of delivery capability. Recognizing this, the ERCOT wholesale market trades month-ahead “congestion revenue rights” products.

Real-World Examples of Related Supply-Side Reliability Events

A recent article by Dr. Paul-Frederik Bach, an expert in power system operations, discusses the impact of renewables penetration on the German power grid. “The number of interventions has increased dramatically from 2010-2011 to 2011-2012...

Bottlenecks are often detected in local grids. It makes no difference to the owner of a wind turbine if local or national grids are congested... In an attempt to establish an impression of the extent of interventions in Germany, EON Netz will be used as an example...

During the first quarter of 2012, EON Netz has issued 257 interventions. The average length was 5.7 hours. Up to 10 interventions have been issued for the same hour. A total of 504 hours had one or more interventions. Thus, there have been interventions active for 23.1 percent of the hours during the first quarter of 2012...

The total amount of curtailed energy from wind and CHP is probably modest, but the observations seem to indicate that German grids are frequently loaded to the capacity limits. Strained grids have a higher risk of cascading outages caused by single events.”
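The 23.1 percent figure in the quoted example is easy to reproduce: the first quarter of 2012 (a leap year) contained 91 days:

```python
# Q1 2012: January (31) + February (29, leap year) + March (31) = 91 days
hours_in_q1_2012 = 91 * 24               # 2,184 hours
hours_with_interventions = 504           # from the EON Netz figures above

share = hours_with_interventions / hours_in_q1_2012
print(f"{share:.1%}")  # 23.1%
```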

Another informative and very detailed analysis of a widespread outage in Europe in 2006 -- one which overloaded power lines and transformers in Poland by 120% and 140%, respectively -- can be found here. It includes a very interesting map of the European interconnected system showing voltage phase angle differences between substations varying from +60° to -50° across the region.

Demand-Side Uncertainty Is Also Affecting Reliability

Traditional power plants provide short-term frequency and voltage control, but only over a limited bandwidth.

However, the power industry does not have closed loop control between demand and supply.

How Do You Spend $400 Billion? Part V: A Business Opportunity – Identifying the “Top 20%” High-Value Nodes in the Power System



Dominic Geraghty


In Part IV of this dialog series, we presented a Least Cost Strategy for deploying SG 2.0. The present dialog discusses how we identify and screen the most valuable projects within the Least Cost Strategy and how we address the risks and uncertainties inherent in the deployment of these projects.

In all situations, it is the business case that provides the basis for the go/no-go investment decision of an SG 2.0 application in a particular location. As we’ve said previously, a complete business case will include market, regulation, power system impact, other technical considerations, and an assessment of the risks/uncertainties. It will also include an assessment of qualitative factors.

The primary prioritization metric for projects is the benefit-to-cost ratio of the business case, subject to meeting hard constraints such as service reliability and environmental compliance.
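A sketch of how that prioritization might work in code (the project names, costs, and benefits are hypothetical; real screening would use full business-case models):

```python
# Hypothetical candidate projects: (name, cost $M, benefit $M,
# meets hard constraints such as service reliability and environmental
# compliance). Names and numbers are illustrative only.
projects = [
    ("feeder_automation_A",  40, 160, True),
    ("substation_sensors_B", 25,  60, True),
    ("storm_hardening_C",    80, 120, False),  # fails a hard constraint
    ("volt_var_control_D",   30, 150, True),
]

# Drop projects that violate hard constraints, then rank by benefit-to-cost
# ratio and fund greedily within a fixed budget.
eligible = [p for p in projects if p[3]]
ranked = sorted(eligible, key=lambda p: p[2] / p[1], reverse=True)

budget, funded = 70, []
for name, cost, benefit, _ in ranked:
    if cost <= budget:
        funded.append(name)
        budget -= cost

print(funded)  # ['volt_var_control_D', 'feeder_automation_A']
```

The greedy ratio ranking is only a first cut; interactions between projects (see the power system simulation discussion below) can change the ordering.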

Identifying High-Potential SG 2.0 Opportunities

Developing a business case is a very resource-intensive task. We don’t want to do it for every potential SG 2.0 application. Is there an efficient way to develop a first-cut list of the “Top 20%” (most valuable) applications? Yes, power system operators can speed up the process.

These power system operators, i.e., vertically-integrated utilities, utilities, distribution companies, ISOs, and RTOs, will be the main users of SG 2.0 applications. They are the power system experts. They should be able to develop an initial list of high-potential opportunities for SG 2.0 applications based on their knowledge of stressed system nodes and likely optimal locations.

Project and power system simulation models can then be used to confirm (or not) the value of the applications for these locations. This is a business opportunity for software developers and SG 2.0 vendors.

As part of this identification process, the benefits that an SG 2.0 application may bring to the broader power system need to be considered (see the discussion on power system simulation models below).

Developing Business Cases

Having screened for the most valuable SG 2.0 application opportunities first, detailed business cases are developed for these opportunities. The results will lead to some re-ranking of the initial list. Certain trade-offs will still be inherent in the process, based in part on the level of risk that decision-makers are willing to tolerate, e.g., their valuation of short-term versus long-term savings and their views on an acceptable level of risk.

It is entirely likely that the business cases will exhibit our desired characteristic discussed in Part IV, i.e., conforming to the 80/20 rule, which projects that 20% of the projects will provide 80% of the benefits of SG 2.0 applications. This is a critical element in achieving our “least cost” goal: a significant reduction in cost relative to a less selective, less discriminating deployment approach.

How Do You Spend $400 Billion? Part IV: A Least Cost Strategy for SG 2.0



Dom Geraghty


Given the high costs and risks associated with SG 2.0, we recommended in Part III that SG 2.0 be implemented based on a “Managed Deployment Strategy”.

What is a Managed Deployment Strategy? It comprises:

(1) Picking SG 2.0 applications that provide the best benefit to cost ratio, prioritized within an affordable budget,

(2) Building-in sufficient flexibility to minimize the impact of major future uncertainties, and

(3) All the while maintaining the target level of service reliability.

That is, our Managed Deployment Strategy combines budgeted high-benefit-to-cost-ratio investment commitments (abbreviated to “Least Cost Strategy”) with a risk management strategy.

In this dialog, we look at the Least Cost Strategy component. In the following dialog we will talk about identifying high-value SG 2.0 applications and evaluating their business cases within the context of a risk-managed, Least Cost Strategy.

Developing a Least Cost Strategy for SG 2.0 Deployment

It has been estimated in several studies that a national deployment of SG 2.0 would cost about $400 billion and provide a benefit to cost ratio of 3:1, including the value of qualitative benefits (which comprised a substantial portion of the total benefits). The studies assume significant national market penetrations for different SG 2.0 applications.

Some studies of service-area deployments by utilities have been less optimistic about the benefits, suggesting a benefit to cost ratio of approximately 1:1. These studies have included fewer, or no, qualitative benefits, which likely accounts for the lower benefit to cost ratio. These studies also include fairly significant market penetrations for different SG 2.0 applications.

In both the national and the utility cases, the costs of various SG 2.0 applications have been derived from similar data sources. We don’t think that there is a lot of disagreement about the “ball-park” costs of deploying various SG 2.0 applications. However, there is much less consensus about the benefits.

We have suggested that the SG 2.0 costs could be much less than the top-down forecast as a result of the 80%/20% rule, i.e., experience in similar situations indicates that one may be able to obtain perhaps 80% of the benefits of SG 2.0 applications by deploying into about 20% of the power system.

Assuming that this rather convenient rule applies here, we could reduce the investment budget for SG 2.0 deployment substantially below the projected $400 billion above, and yet still derive most of its benefits. We’ll see – it sounds somewhat optimistic, but we do buy into the idea of “surgical” deployment of SG 2.0 applications, i.e., focusing on, and limiting the deployments to, high-benefit situations.
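As a rough illustration of what the 80%/20% claim implies (the benefit numbers below are hypothetical, chosen only to show the skewed distribution the rule assumes):

```python
# Hypothetical benefit estimates ($ billions) for 10 candidate deployment
# areas, ranked from most to least valuable -- deliberately skewed, as the
# 80/20 rule assumes.
benefits = [220, 110, 20, 15, 12, 10, 8, 6, 5, 4]
total = sum(benefits)  # 410

# Share of total benefits captured by deploying into only the top 20%
# of areas (2 of 10)
top_20pct_share = sum(benefits[:2]) / total
print(f"{top_20pct_share:.0%}")  # 80%
```

Whether real SG 2.0 benefits are this skewed is exactly the open question; if they are more uniform across the system, "surgical" deployment captures far less.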

Caveat Emptor!

We’ve received a warning. The original business cases for AMI have proven to date to be weak.

How Do You Invest $400 Billion? – Part III: Planning Assumptions and Trade-Offs



Dominic Geraghty


What Is the Transition to SG 2.0 About, in Reality?

The transition is not just about an orderly, four-stage SG architecture change, driven by logical change-outs of information and communications technology, as presented in Part II. It is important to have thought this through, but it is not enough, and the transition will likely look quite different. This road-map will have detours.

It won’t happen that way because economics will determine in what parts of the system the transition will occur – the changes will take place through “cherry-picking” the highest benefit-cost ratio SG 2.0 applications. Realizing cash benefits will be a priority to offset the high capital requirements of power systems, i.e., to create “avoided costs”.

IMG_2597-150x150Neither will the transition be just about architecture and economics – it will be about the timeliness of supportive regulatory and policy changes. Regulations and policy deeply affect the economic incentives and outcomes of SG 2.0 applications.

The transition will be also mediated by tougher requirements on technology readiness. To date, value-added applications that were supposed to be provided by AMI installations have had mixed delivery results. SG 2.0 vendors will be increasingly required to demonstrate value propositions, to “prove” them. Demand for “pilot demonstrations” will increase, and “system acceptance tests” will become more stringent. Certification of functionality and interoperability will become the norm.

And lastly, a successful transition is about end-use customers, who are concerned about the size of their electricity bills and about receiving an adequate level of service reliability – will the continued polarization of the political processes that govern power system investments create a reliability crunch in stressed locations? Customers have already exhibited hardening resistance to AMI installations, in part because they do not see benefits. Will customers increasingly “self-optimize” by building more DG in the light of higher bills and the potential for lower reliability?

To understand how the transition will occur, and at what pace, we need to build in assumptions about all of the above determining factors as we develop corporate and individual business cases for SG 2.0 applications.

Competing for the Capital Required for Utility SG 2.0 Installations

How Do You Invest $400 Billion? Part II: The CEO of the SG Develops a Budget Forecast



Dom Geraghty


This is the second part of a four-part dialog series.

The first dialog in this series presented, from a top-down perspective, the costs and benefits of SG 2.0 deployment plans created by five different, credible groups.

This second dialog prepares and discusses a conceptual multi-decade budget for a national SG 2.0 deployment, an approach that could also be used by individual utilities.

The third dialog will provide a holistic budget forecast for the deployment of SG 2.0 and the replacement of aging infrastructure for the power system as a whole. Our final dialog will suggest a “Managed Deployment Strategy” for SG 2.0 that combines a least cost approach and risk management.

We have already noted that the only meaningful way to choose which individual SG 2.0 applications to deploy is from a bottom-up, project-oriented approach within the context of the regulatory, market, and business environment, including taking account of the costs and benefits of any power system impacts. Our mission here at SGiX is to facilitate these project choices.

As a first step in fulfilling this mission, the “ground-work” as it were, we feel it would be useful to create an understanding of the “big picture” -- the national context within which these individual SG 2.0 business cases are to be evaluated and financed. That is the purpose of this current series of dialogs.

The CEO of the “National Smart Grid” Does Some SG 2.0 Deployment Planning

Imagine for a minute that you are the CEO for the entire ‘national smart grid’. You are concerned about the large capital investment requirement over a multi-decade period of time, how you are going to finance this multi-decade investment, and perhaps the “softness” of some of the benefits.

How would you go about evaluating an investment commitment, and determine the recommendations you’d want to present to your Board?

Well, for sure you’d want to prepare a budget forecast.

But to develop a budget, you would first need a road map of the expected deployment of SG 2.0.  So, you would review the road-maps that have been developed by various credible entities, and talk to the leadership of these efforts.

Some SG 2.0 Deployment Road-Maps

One credible road-map for the transition has been presented in the 2011 joint report by Cisco/IBM/SCE (link) entitled: “Smart Grid Reference Architecture: Volume 1”, referencing and making use of previously published seminal work by NETL, GridWise®, NIST, and IEEE. The Cisco/IBM/SCE proposed transition moves gradually through four stages -- a senior Cisco executive estimated that the transition would take about 30 years to accomplish:

(a)    Stage 1:  Today’s silo-ed architecture, e.g., metering/billing and EMS/SCADA as separate architectures

How Do You Invest $400 Billion (in the SG)? — Part I



Dominic Geraghty


SGiX is about business cases. Obviously, one needs a very strong set of business cases to commit to an investment of as much as $400 billion for SG 2.0, even if it is over a period of 30 years.

DSC_1310-150x150We are going to propose that a “Managed Deployment Strategy” might give us the best chance of success, i.e., delivering prioritized, well-defined benefits over time coupled with a commensurately-paced investment rate. We will define and present the “Managed Deployment Strategy” in Part 2 of this dialog series.

In all of this, we are assuming that the above required capital for SG 2.0 can be made available even in the context of an industry that appears to need an additional ~$1.7 trillion for infrastructure investments over the same period of time, estimated as follows:

Estimate of the Total Capital Requirements of the Power Sector through 2030

Investment Category                                      $ Billions   Source/Notes
SG 2.0                                                                See Dialog, here
Traditional Power System Infrastructure (G, T, and D)                 See Dialog, here; Zpryme and Pike market research reports cited by GreenTechMedia
Environmental Compliance                                              EEI 2011
Policy-Driven Subsidies                                  $ ?          For example: RPS, EVs, DG, Storage, etc.

Caveat: the above estimates and those below are based on data from a variety of different sources. The sources reflect somewhat different or overlapping definitions of the costs (capital investments), and it is possible that some of the benefits may be double-counted. Some other costs and benefits were not counted -- see below.

Analyzing SG Benefits and Costs

Let’s first review what we know about the “top-down” aggregate benefits and costs of the SG, and identify the “gaps” in terms of inclusion and quantification. We are then positioned for our following Part II dialog about the desirable attributes of a “Managed Deployment Strategy” for SG 2.0.

From a business case perspective, we know that it is meaningless to evaluate an aggregated investment of $400 billion in terms of aggregated top-down benefits.

Business Case for the SG is About Automation and Control



Dom Geraghty


A Final "Pivot" in the Definition of SG 2.0

We’ve come some way in defining what the SG is, and what it is not, but we are not quite there yet. It is time for a (hopefully) final “pivot”, the purpose of which is to propose a definition of the SG that provides a solid and clear foundation upon which to develop our SG 2.0 business cases.

Here, we’ll first summarize our key conclusions derived from the series of previous dialogs about the “State of the Smart Grid”. Then we’ll propose a new (narrower) definition of SG 2.0 applications. Please click on the Smart Grid 2.0 "Category" to the right if you would like to see all of the previous SG dialogs.


Some Key Conclusions So Far About the SG, Based on Our Previous Dialogs

SG Costs/Deployment Duration

(1)    It will cost about $400 billion to implement the SG nationally

(2)    Required power system infrastructure replacement will cost about $1.6 trillion over the same period

(3)    Full implementation of the SG will take about 30 years, and will evolve as a hybrid of legacy and new systems, with increasing interoperability being supported by a combination of custom APIs and the development and promulgation of new standards

(4)    The total cost estimate above likely includes everything but the kitchen sink, and we might expect that the costs, while very substantial, will not be quite as high, based on a more thorough, and more granular, evaluation of a practical and economically viable deployment plan. We will suggest such an approach in what we are calling “A Managed Deployment Strategy for the SG” in our next dialog

SG Definition

(5)    In everyday conversations, the definition of the SG is plastic – the SG is viewed as including many elements that are only peripherally, at best, “smart”. For example, depending on the individual, the SG connotes or includes renewable energy, sustainability, CleanTech, electric vehicles, distributed generation, AMI, energy storage, distribution automation, and/or demand response

(6)    We’ve pointed out that AMI is not the SG – it is infrastructure – see the previous presentation of our new definition of SG 2.0

Power System Control

(7)    Power systems have used closed loop control for decades for generation and transmission in the form of the AGC software application on an EMS. ISO dispatch decisions are based on load forecasts (every 5 minutes, hour, day) and tight, reactive, management of Area Control Error (ACE) and system frequency. The electric distribution system does not use closed loop control.

(8)    Demand forecasts have become increasingly uncertain and volatile as customers begin to self-optimize their power usage


(9)    Policy changes necessary to enable the realization of SG benefits have lagged the deployment of the SG, thus negatively impacting its ability to achieve its own fundamental policy goals

Policy and the SG

(10) The SG and CleanTech policies are symbiotic – while the SG is not CleanTech, some CleanTech elements, e.g., RPS mandates and end-use customer choice, require that the electricity grid be “smarter” if we are to maintain our present level of service reliability

(11) SG capability is also needed because of other policy-created changes in the power system, e.g.,  increasingly dynamic loads, increased intermittency of distributed power production, charging of EVs, penetration of ADR, smart appliances and HANs, and the increased potential for electricity distribution system instabilities -- we will discuss this latter concern in an upcoming dialog
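As context for conclusion (7) above, the standard Area Control Error calculation that AGC regulates can be sketched in a few lines (the interchange and bias numbers are hypothetical; production AGC adds filtering, ramping, and unit-dispatch logic):

```python
def area_control_error(actual_interchange_mw, scheduled_interchange_mw,
                       actual_freq_hz, scheduled_freq_hz=60.0,
                       bias_mw_per_0_1hz=-50.0):
    """Standard ACE form: (actual - scheduled interchange)
    minus 10*B*(actual - scheduled frequency), with the frequency
    bias B expressed in MW per 0.1 Hz (negative by convention)."""
    return (actual_interchange_mw - scheduled_interchange_mw) \
        - 10.0 * bias_mw_per_0_1hz * (actual_freq_hz - scheduled_freq_hz)

# Over-generation example: exporting 30 MW above schedule while frequency
# runs 0.02 Hz high -> positive ACE, so AGC backs generation down.
ace = area_control_error(130.0, 100.0, 60.02)
print(round(ace, 1))  # 40.0
```

Nothing comparable closes the loop on the distribution side today, which is the gap the conclusions above point to.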


As we’ve shown, the SG is not infrastructure, or CleanTech, or AMI.

The real business of the SG consists of automation and control systems:

  1. Sensors with embedded smart control firmware for local control
  2. Communications to enable systems control for a variety of time domains
  3. Control software with embedded algorithms for operations management
  4. M2M (fast response) and hybrid M2M/human control loops (slower response)
  5. “Big data” mining for critical control loop information
  6. Power system and sub-system control loop simulations and analysis (including customer response to market prices -- market response is one of the control loops and it interacts with, and affects, physical system control loops)

Thus, SG 2.0 provides the requisite control systems to support and integrate the operations of (a) CleanTech power installations, (b) the traditional power system infrastructure, and (c) power markets.

SG 2.0 automation sits on top of these three operations. It is a prerequisite to the success of the Smart Grid and power-related CleanTech policy, broadly defined.

Ironically, if we consider AMI to be a system of sensors, then it can be viewed as falling under the rubric of “automation”, since AMI provides data that can be used for control systems with slower required response times. That is, under our new “stripped-down” definition of SG 2.0 as automation and control, if the SG is really a smart control system, then AMI is part of the SG’s control infrastructure.

SG 2.0 As “Automation and Control”: Business Opportunities and Cases

Defining SG 2.0 as automation and control disentangles the evaluation of SG 2.0 applications businesses from investments in traditional power infrastructure, AMI, and CleanTech.

It provides us with a logical connection between SG 2.0 and existing AMI systems that provide some of the necessary inputs for SG 2.0 automation applications.

It clarifies and focuses the context within which we must develop and evaluate business cases for SG 2.0 applications.

There are numerous automation and control business opportunities across the entire SG value chain. We will present the more interesting of these in subsequent dialogs.

As always, comments are appreciated, in the box below.