Public and private sectors must partner to address generative AI’s interdependent energy and security requirements
Unlocking what pundits tout as the visionary potential of generative AI may require unprecedented amounts of electrical power, but it is unclear if our current energy infrastructure is up to the task.
That question is especially pressing given the rise of gen AI and the competing demands the country faces from other high-priority goals, such as building out electric vehicle charging infrastructure, prioritizing domestic advanced manufacturing, and pursuing climate-change initiatives.
Addressing the need for more power will require deeper partnership between the public and private sectors, with participants ranging from federal agencies and state public utility commissions to private-sector data center operators and energy companies.
Putting the energy problem into perspective
Both data center owners and public utilities are already grappling with the challenges of powering gen AI at scale. A server performing large language model training, which is at the heart of gen AI, can use up to seven times as much power as a server used for cloud computing or e-commerce. These gen AI servers also often run at full speed, 24/7, unlike the variable loads typical of other workloads.
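For a rough sense of how those two factors compound, consider the back-of-the-envelope sketch below. The per-server wattage and utilization figures are illustrative assumptions, not measurements from any particular facility or from this article.

```python
# Rough, illustrative comparison of annual energy use for a conventional
# cloud/e-commerce server versus a gen AI training server.
# All per-server figures are assumptions chosen only to show scale.

HOURS_PER_YEAR = 24 * 365

cloud_watts = 500                 # assumed average draw of a typical cloud server
cloud_utilization = 0.40          # assumed variable load averaging 40% over the year

ai_watts = cloud_watts * 7        # "up to seven times as much power"
ai_utilization = 1.0              # running at full speed, 24/7

cloud_kwh = cloud_watts * cloud_utilization * HOURS_PER_YEAR / 1000
ai_kwh = ai_watts * ai_utilization * HOURS_PER_YEAR / 1000

print(f"Cloud server:  ~{cloud_kwh:,.0f} kWh per year")   # ~1,752 kWh
print(f"Gen AI server: ~{ai_kwh:,.0f} kWh per year")      # ~30,660 kWh
print(f"Difference:    ~{ai_kwh / cloud_kwh:.0f}x")       # roughly 18x
```

Under these assumed numbers, the around-the-clock utilization multiplies the seven-fold power difference by another factor of 2.5, which is why the gap per server can run well beyond the headline ratio.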
To support gen AI power requirements, the handful of companies that operate the hyperscale data centers that house gen AI operations are already envisioning their next generation of facilities as campuses the size of small towns. These new facilities will feature a dedicated on-site power plant, unlike the individual city-block-sized data centers we see today.
Building this energy infrastructure will likely demand new technologies, faster permitting and approval processes, and a strong focus on energy efficiency at every stage, from generation and transmission through end use. Data centers also require considerable quantities of water for cooling, since as much as 98% of the power they consume can be dissipated as heat rather than converted into computation.
Gen AI security
Security is a critical concern given gen AI’s potential impact on our daily lives, economic prosperity and national security. The focus has primarily been on securing access to and the integrity of the foundational AI models, the mathematical weights that shape their accuracy, the training data of large language models, and the queries made against these models, which can reveal sensitive information about users and organizations.
On balance, gen AI security to date has been a narrative of “good news, bad news.” The good news is that there are reasonably effective controls for each area of gen AI security. The bad news is that successful malicious exploitation has occurred in these areas when gen AI providers and users assumed others were responsible for security.
While adequate security measures are available, they must be specified and ordered a la carte. We lack the equivalent of the widely understood shared security responsibility model that exists for cloud computing. It took years of breaches and incremental progress for cloud security to mature, and we can ill afford to repeat that timeline given the growing prominence of gen AI. Government can help drive the evolution and adoption of such a model by explicitly addressing security expectations in its contracts for gen AI services, by tying funding grants to adherence to security best practices, or potentially through regulation.
The electrical grid is a diverse environment, with large, well-resourced utilities and small rural producers that may lack a full-time IT department or dedicated cybersecurity staff all operating on the same regional grid. Utility providers already face significant challenges: they’re breached twice as often as organizations in other sectors and are generally slower to respond, incurring higher costs as a result.
How the federal government can help
At the federal level, a veritable alphabet soup of agencies will likely play active roles in addressing power or security for gen AI. Some AI data center complexes plan to use on-site nuclear power in the form of small modular reactors, which would involve the Nuclear Regulatory Commission and the Department of Energy; the latter recently approved the conceptual design of a plant focused on advanced nuclear fuel recycling.
Entities like the North American Electric Reliability Corporation and the Federal Energy Regulatory Commission will continue to play important roles in ensuring grid resilience and reliability. In terms of security, the resources and insight provided by the Cybersecurity and Infrastructure Security Agency will be invaluable, while the National Institute of Standards and Technology develops models and taxonomies that will enhance both cybersecurity and technical/operator interoperability.
The convergence of power and security
Cybersecurity is about ensuring the confidentiality, availability and integrity of information and information-driven services. We usually think of attacks on availability as denial-of-service campaigns that flood a provider with bogus requests and crowd out legitimate user activity, but gen AI’s availability could also be attacked indirectly through the power grid. While data centers have on-site backup power, a sustained attack on power generation or transmission that leads to a prolonged outage could still interrupt gen AI services. Both nation-state adversaries and criminal groups already target our energy grid, and as gen AI becomes more important, the energy sector will become an even more attractive target.
The energy sector is a regulated industry in which state and local governments, alongside federal agencies, play an important role. Energy providers can’t simply convince their boards of directors to allocate funds for cybersecurity, because utility rates are typically set at the state level by public utility commissions. Similarly, permitting for construction of energy facilities often involves multiple levels of government, including local planning commissions and building inspectors. Solutions can be further complicated by NIMBY attitudes and other non-technical factors.
The public and private sectors must partner to address both the power and security implications of gen AI. Organizations, including government agencies, shouldn’t be blind to the challenges that inevitably arise when a new and still-evolving technology such as gen AI is integrated into our everyday lives, our critical infrastructure, and the essential functions of government.
Jim Richberg is head of cyber policy and global field CISO at Fortinet, and a Fortinet federal board member. He previously served as the national intelligence manager for cyber in the Office of the Director of National Intelligence, and oversaw the implementation of comprehensive national cybersecurity initiatives under Presidents Barack Obama and George W. Bush.