
New cyber reality: With great interdependence comes great liability

Biden's cybersecurity strategy rightly advocates for more regulation. For companies doing security right, there’s no need to panic.

For more than a decade, government leaders have grappled with an inescapable reliance on digital technologies and communications without an aggressive approach to securing them. Technology vendors have pushed their products to market under the pretense that liability shifts once products are delivered, bolstering their position in the marketplace with claims of security by design or after-market protections. Security products and partnerships offer a complex tapestry of add-ons meant to backstop the impossible task of identifying and mitigating every potential threat or exploit.

In security consulting, there’s an adage suggesting a 60/40 rule when analyzing sectors’ willingness to sink costs into impending regulation without a forcing mechanism. Sixty percent of companies will likely wait and see how 40% of leading companies respond. For cybersecurity regulations, it’s more likely 80/20. The national cybersecurity strategy released Thursday decidedly states that’s not good enough. While there’s clearly room for improvement at every level, companies already taking cybersecurity seriously should not be panic-stricken by the new strategy document.  

Mobilizing concurrent themes  

The national cybersecurity strategy was not released in a vacuum. U.S. agencies such as the Cybersecurity and Infrastructure Security Agency and the National Institute of Standards and Technology have updated various strategies, standards, recommendations and best practices in the past year. The new Network and Information Security Directive, or NIS2, in Europe tasks member states with a strategic cyber reassessment, asking entities to assess the proportionality of their risk management activities, their individual degree of exposure to risk and the societal and economic impacts of potential incidents. Meanwhile, the United Nations is attempting to improve international law enforcement capacity in cybersecurity, most notably by centering on legal specificity around “intent” and “intentionality” when actors or groups carry out potentially criminal activities.


Owners and operators of critical infrastructure — oil and gas, power and utilities, water treatment and purification facilities, manufacturing, transportation, hospitals, connected buildings and more — are responsible for securing their operations and processes from the inside out, with assorted regulatory and compliance requirements within and across each sector. Critical infrastructure cybersecurity presents a massive needle-in-a-haystack problem. Where information technology faces many vulnerabilities likely to be exploited in similar ways across mainstream and ubiquitous systems, operational technology is often a proprietary, case-by-case affair. Oversimplifying that difference leaves a contextual gap when translating roles and responsibilities into tasks and capabilities for government, and into business continuity and disaster recovery for industry.

Visibility gaps across critical and interdependent sectors allow the threat landscape to keep growing. The prevailing argument that market forces are not enough sits alongside the fact that some data does suggest regulation can work. As cybersecurity experts regularly point out, it has always been the how that matters most. And if we’re talking about available data, we only have regulatory data from regulated industries. In the same vein, we have more attack data in sectors where technologies and communications provide data to analyze, i.e., where security logs, tools and monitoring exist.

Sector-specific security mandates were on the table well before the release of the national strategy. The strategy overwhelmingly welcomes private sector input, workforce development, vendor cooperation and the adoption of security by design and cyber-informed engineering. How success is measured will determine the real impact, or failure, of the strategy in the years to come. That said, the agencies and authorities outlined above have always struggled to get cyber metrics right, even with audacious goal setting.

Stones thrown at glass houses  

The new strategy demands a holistic and prudent reckoning with institutional baggage and accumulated security neglect. Unfortunately, there is no single source of truth to turn to for advice across the myriad agencies and authorities interoperating at the federal level. Each company therefore must identify internal teams or champions to act as its own independent advisors, conducting literature reviews and consensus mapping that cross-reference relevant standards, regulations, suggestions and best practices. Security leaders and teams then must map:

  • The status of their security program, risk ownership, and visibility gaps 
  • Existing management and mitigation tools, resources, and capacity
  • The development environment of third-party products and security management of suppliers  
  • Enterprise content management, data security and PII
  • Operational products and services, hardware, software, IoT, cloud offerings, etc.  
  • Upstream and downstream supply chain 
  • Operational technology and cyber-physical security   
  • The sea of available add-on security products 

This status quo continues to confuse stakeholders, by accident or design, as to which party is in the best position to avoid losses. It comes as no surprise that this model has not served any industry well. Risk management has countless starting points and no finish line. Risk tolerance therefore becomes a cycle: entities map the necessary security components of their organizations, attempt to understand how those components satisfy various portions of existing standards, regulations, suggestions and best practices, and hope that compliance regimes measure the right things, which are ultimately industry-specific and therefore restart the cycle.

The forest or the trees?  

The government has an undeniable duty to direct efforts and regulations to avoid worst-case scenarios in the physical world, in cyberspace and at their convergence. To date, roles and responsibilities from government efforts have not translated into appropriate tasks and capabilities for implementation. Government entities want software and hardware inventories, mapped CVEs, tracked threat intelligence and bottom-up situational awareness, but lack the capacity to collect this security census data on a national level. Industry has the individual capacity to collect this information, and the government’s latest strategy implores it to do so.
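To make that census idea concrete, here is a minimal, hypothetical sketch of how a single asset owner might cross-reference a small software inventory against published CVEs using NIST’s public NVD 2.0 API. The asset names, CPE identifiers and inventory structure are illustrative assumptions, not a prescription; real programs rely on dedicated asset-discovery and vulnerability-management tooling rather than a one-off script.

```python
# Hypothetical sketch only: cross-reference a toy software inventory against
# published CVEs via NIST's public NVD 2.0 API. Asset names, CPE identifiers
# and the inventory structure are illustrative assumptions.
import requests

NVD_URL = "https://services.nvd.nist.gov/rest/json/cves/2.0"

# Toy inventory: each asset lists CPE names for the software it runs.
inventory = {
    "historian-server-01": ["cpe:2.3:a:openssl:openssl:1.1.1k:*:*:*:*:*:*:*"],
}

def known_cves(cpe_name):
    """Return the CVE IDs that NVD currently associates with a CPE name."""
    resp = requests.get(NVD_URL, params={"cpeName": cpe_name}, timeout=30)
    resp.raise_for_status()
    payload = resp.json()
    return [v["cve"]["id"] for v in payload.get("vulnerabilities", [])]

for asset, cpe_names in inventory.items():
    for cpe in cpe_names:
        for cve_id in known_cves(cpe):
            print(f"{asset}: {cpe} -> {cve_id}")
```

Even a sketch this small illustrates why the burden falls on individual owners and operators: only they know what is actually installed, and where.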


There is an indisputable disconnect in functional context for overarching federal cyber governance. Debating the hot-button issues of vendors hacking back and making companies liable for software insecurity doesn’t help us help more asset owners across critical infrastructure get security right. As previously pointed out, “there is a thriving global private sector for cybersecurity products and solutions which is increasingly lucrative and largely unregulated.” If iron sharpens iron, software vendors and technology manufacturers should absolutely hold themselves to a standard of care. If mistakes are made along the way, heads do not need to roll for better lessons to be learned and for those lessons to lead to actionable outcomes. Regardless of the national strategy, the “basics” of which components a security program needs remain relatively unchanged.

Danielle Jablanski is an OT cybersecurity strategist at Nozomi Networks and a nonresident fellow at the Cyber Statecraft Initiative of the Atlantic Council’s Scowcroft Center for Strategy and Security.
