The electric grid is the high voltage system that connects power generators to consumers through the power delivery system. Power stations generate medium voltages that are stepped up so that electricity can be efficiently transmitted. Stepdown transformers (not all of which are shown in the figure) decrease the voltages delivered to consumers. The grid represents the high voltage transmission system that connects bulk power generation with the medium voltage distribution systems that supply most consumers. (Some large, typically industrial consumers are connected directly to the grid.)
Controlling the dynamic behavior of the interconnected electricity system is a great engineering and operational challenge. After all, power flow responds to the laws of physics: It flows freely over all available paths, roughly in inverse proportion to the magnitude of the impedance. And demand for electricity is constantly changing. Millions of consumers switch lights or air conditioners on and off; businesses cycle their office equipment and production processes. Generation and demand must be kept in balance over large regions to ensure that voltage and frequency are maintained within narrowly prescribed limits: from 59.98 to 60.02 Hz, say, for frequency. If either voltage or frequency strays too far from its prescribed range, the resulting mechanical stresses can severely damage power-system equipment. Thus the electric grid requires a protective overlay to minimize damage and to ensure that system operators can rapidly restore power when problems arise. — “Transforming the Electric Infrastructure”
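The inverse-impedance behavior described above can be sketched numerically. This is a toy illustration with made-up impedance values, not a real load-flow calculation:

```python
# Toy sketch: power dividing among parallel transmission paths roughly in
# inverse proportion to each path's impedance. Values are hypothetical.

def flow_split(total_mw, impedances):
    """Divide total_mw among parallel paths inversely to impedance."""
    inverse = [1.0 / z for z in impedances]
    total_inverse = sum(inverse)
    return [total_mw * w / total_inverse for w in inverse]

# Three hypothetical parallel paths with impedances of 2, 4, and 8 ohms:
flows = flow_split(700.0, [2.0, 4.0, 8.0])
print(flows)  # [400.0, 200.0, 100.0] -- the lowest-impedance path carries most
```

The point of the sketch is simply that no operator "routes" the 700 MW; the split falls out of the impedances, which is why contracts and physical flows can disagree.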
The previous post announcing the release of the Center for Smart Energy report stimulated some Internet recursion. I dove into the World Changing archives. An article entitled “Smart Grids, Grid Computing, and the New World of Energy”, referenced by the more recent article, notes that “the success of distributed energy is ultimately dependent upon the increasing availability of computer-enabled power networks, or ‘smart grids.’ And smart grids for distributed power, in turn, will increasingly rely upon the availability of distributed computing.”
In the New World of Energy article, World Changing editor Jamais Cascio reports that a connection of smart grids and grid computing is already underway in the UK. The idea of smart grids for me resonated with previous notes about Collective Intelligence and, more specifically, with informational intelligence, i.e., how well systems can maintain reliability and security by the collection, exchange and analysis of data and metadata. Can you say Metadata? Sure, I knew you could.
There is another issue. The innovators suggest that distributed electric power can be shared in a manner similar to distributed processing or distributed backup. In “Fixing the Power Grid,” Jamais Cascio used the distributed file-sharing program BitTorrent as an analogy and possible model for grid ties and distributed electric power systems. Another example could be DIBS (Distributed Internet Backup System), although I personally found its lack of an implemented peer-finder service unsatisfactory and instead went with duplicity, which offers similar security and perhaps greater efficiency, but requires a pre-existing arrangement for the remote storage.
DIBS is a backup system, not a file-sharing system, and is intended “to provide a simple, secure, and robust system for backing up data by exchanging it with peers on the Internet.” DIBS automatically encrypts all data transmissions so that peers cannot access your data. DIBS aims to provide high robustness to failures by “use of Reed-Solomon (RS) codes, which are an optimal type of erasure correcting code similar to what is used in high level RAID.”
One reason that I bothered to suggest DIBS as an example is that it depends upon a Peer Finder Service. The software matches an advertised request with a proposed service; the match is automatic within the specified parameters of the request and the offered contracts. The difference between distributed file backup and the trading of a distributed electric power system is that with a DIBS contract you set it and forget it, whereas transmission loading transactions require real-time flexibility, particularly in terms of pricing. So, in some ways, transmission loading trade is more like the automated trading systems used by investors.
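A minimal sketch of what such request/offer matching might look like. The field names here (`gigabytes`, `uptime`) are illustrative, not DIBS's own:

```python
# Hypothetical sketch of a peer-finder match: pick the first offer that
# satisfies every parameter of an advertised request. Field names are
# made up for illustration and are not drawn from DIBS itself.

def match(request, offers):
    """Return the first offer meeting the request's parameters, else None."""
    for offer in offers:
        if (offer["gigabytes"] >= request["gigabytes"]
                and offer["uptime"] >= request["min_uptime"]):
            return offer
    return None

offers = [
    {"peer": "A", "gigabytes": 5,  "uptime": 0.90},
    {"peer": "B", "gigabytes": 20, "uptime": 0.99},
]
best = match({"gigabytes": 10, "min_uptime": 0.95}, offers)
print(best["peer"])  # B -- peer A has too little space and uptime
```

A set-and-forget backup contract stops here; a power-trading version would have to rerun this match continuously as prices and line congestion change.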
In the New World of Energy article, Jamais Cascio suggested that the Grid needed fixing. The basis for this observation came from the important Physics Today December 2004 article. The authors, Gellings and Yeager, logically presented their case that the aging infrastructure must be modernized.
Among the numerous challenges facing the electricity industry are the rapid increase in wholesale transactions between such entities as independent power producers and distribution utilities; increasing grid congestion; continuing low levels of infrastructure investment; the application of technology to allow more options for consumers; the growing need for better grid security; and the precision power requirements of a digital society.
Characteristics of a Modernized Grid
Some form of load sharing within the North American power network has existed for more than 50 years. Modernization means giving the Grid the capability to rapidly detect, analyze, and respond to perturbations, and to restore service afterward.
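The “rapidly detect” piece can be sketched as a simple out-of-band check against the frequency limits quoted earlier (59.98–60.02 Hz). Real protective relaying is far more involved; this just shows the deadband idea:

```python
# Toy detection step: flag frequency samples that stray outside the
# narrow band quoted from the Physics Today article (59.98-60.02 Hz).
LOW_HZ, HIGH_HZ = 59.98, 60.02

def out_of_band(freq_hz):
    """True if a frequency sample falls outside the prescribed band."""
    return freq_hz < LOW_HZ or freq_hz > HIGH_HZ

samples = [60.00, 60.01, 59.95, 60.00]    # hypothetical telemetry
alarms = [f for f in samples if out_of_band(f)]
print(alarms)  # [59.95] -- the excursion a protective overlay must act on
```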
The goal is carbon-neutral power, i.e., power that avoids adding to the amount of carbon dioxide in the atmosphere. Electric power generation from biomass is one example of renewable energy. The larger picture is a reduction of emissions that pollute the environment, which from my perspective means avoiding nuclear power, with its remarkably toxic and long-lasting pollution, in favor of greater use of solar, wind, and wave power.
For example, as previously noted, “just one-tenth of one percent of the built-up surface area covered with five percent efficient plastic solar would mean a total energy generation potential of over 5.6 gigawatts.”
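A back-of-envelope check of that figure, using assumed inputs (peak insolation and panel area) that may not match the original article's own numbers:

```python
# Back-of-envelope check of the quoted 5.6 GW potential. Both constants
# below are assumptions for illustration, not the cited article's inputs.
PEAK_INSOLATION_W_PER_M2 = 1000.0  # rough peak solar irradiance at ground
EFFICIENCY = 0.05                  # five-percent-efficient plastic solar
AREA_KM2 = 112.0                   # hypothetical 0.1% of built-up area

area_m2 = AREA_KM2 * 1e6
peak_power_gw = area_m2 * PEAK_INSOLATION_W_PER_M2 * EFFICIENCY / 1e9
print(round(peak_power_gw, 1))  # 5.6
```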
As mentioned previously, grid optimization goes hand in hand with IT (Information Technology) that can help the industry monitor and control the transmission of electric power in a cost-effective manner.
In 2001 an important article, The Energy Web, appeared in Wired magazine. Steve Silberman writes that the plan for the distribution of electricity in North America is for every node in the network to be “awake, responsive, adaptive, price-smart, eco-sensitive, real-time, flexible, humming – and interconnected with everything else.”
Notes the Physics Today article:
The North American power delivery system, however, was not designed to meet the pace and rigor of competitive markets. The large number of wholesale transactions breaks down the cohesiveness of the system. Kirchhoff’s law, not federal law, governs power flow, and so the industry had to quickly come up with a method to adjudicate contracts between buyers and sellers of electricity. The system now in place allows for any of the more than 150 grid area operators in the US to declare that they are unable to accommodate a transaction: That declaration is made through a request for transmission loading relief.
The article goes on to observe that the number of level two (“I cannot fully accommodate the transaction”) or higher calls for transmission loading relief in North America has increased about tenfold over the past seven years. The Wired article, written before the Northeast blackout of 2003, describes the preceding 1965 blackout that prompted thinking about a smarter energy network.
To a satellite in orbit, it must have looked like a major constellation was being snuffed out. First Toronto went black, then Rochester, Boston, and finally New York City. In just 13 minutes, one of the crowning achievements of industrial engineering – the computer-controlled power grid of the 80,000-square-mile Canada-United States Eastern Interconnection area – was toast.
The blackout of 14 August 2003 encompassed an even greater segment of the North American grid. Did it signify a failure of the new, improved system that researchers at incubators of our energy future, such as the Pacific Northwest National Laboratory and the Bonneville Power Administration, had begun to describe with phrases like the Intelligent Grid, the Energy Net, and the Energy Web? Yes and no… The introduction to the 2003 Electricity Technology Roadmap suggested that the most recent Northeast blackout was a symptom of already strained infrastructure and others have claimed that within the failure there was success: the time taken to restore power was much less than in 1965.
Research by EPRI (the Electric Power Research Institute) has shown that poor grid reliability cost the U.S. economy an estimated $100 billion in a single year, and that was a good year, without a major blackout. Improved power system reliability would reduce this economic burden, which is carried primarily by industry and passed along to the consumer.
One improvement is a switch away from electromechanical relays; as the Wired article describes, “trying to precisely manage activity on the grid with electromechanical relays has become the art of narrowly averting disaster.” The North American power industry is investing in replacing electromechanical relays with solid-state devices as a means of improving power system reliability.
Tapping the new fleet of energy resources will require something that is already hard to come by for system operators – the ability to tell power where to go. FACTS (flexible AC transmission system), a breed of solid-state devices developed by EPRI and Westinghouse that was 20 years in the making, promises to give transmission companies and system operators the capacity to deliver measured quantities of power to specified areas of the grid. In the real-time interactive energy marketplace, technologies like FACTS will allow system operators to send power along “transactional pathways,” rather than just down the paths of least resistance.
There is a third issue that relates to the above characteristics and the investment of resources. Some networks are scalable and some are “scale-free,” and any distributed system designed to facilitate load sharing across a distributed power network would need to accommodate both. Using approaches common to distributed systems promises to improve efficiencies. Current distributed electric power systems have a ways to go before they can be fully responsive to future requirements for electricity.