In February 2021, an incident at the water supply of a coastal municipality near Tampa, Florida, USA, received major attention. According to Pinellas County Sheriff Bob Gualtieri, the amount of sodium hydroxide (lye) in the water supply serving approximately 15,000 people was adjusted from 100 parts per million (ppm) to more than 11,100 ppm. He gravely reported that this was "dangerous stuff." Lye is commonly used to purify water supplies, so its presence alone was not noteworthy, but as chemists often say, "the dose makes the poison." At the levels indicated, it was indeed a dangerous situation. So, how did it happen?
The investigation revealed that some form of remote access was enabled on the industrial control systems (ICS), which comprised five computers and several iPads. This was not immediately a cause for concern, because the administrators used remote access tools regularly. During the intrusion, which lasted approximately five minutes, the perpetrator was able to launch nonstandard programs and change the amount of lye to be released into the water supply.
This event tapped into concerns long raised by cybersecurity professionals the world over, namely that cybersecurity efforts for critical infrastructure, including water, wastewater, food, agriculture and dams, are inadequately funded. Many of these sectors rely on operational technology (OT) as opposed to IT. The distinction matters because OT directly controls physical processes, which elevates the technology from a purely informational asset to an operational one. The underlying technology of OT systems is often outdated or is configured for remote access in ways that trade security for availability of resources. It is this combination that made the water supply event look like a long-forecast cyber tragedy with real-world impact. Thankfully, the event was stopped before the water sickened or poisoned anyone. The lesson for all was that the event needed to be taken seriously and budgets needed adjusting to ensure that such events never happen again, a refrain one often hears after such tragedies.
Except, it appears this entire event never happened, or at least not for the reasons initially supposed. According to more recent reporting, a member of the internal staff accessed the system and clicked the wrong amount of lye to add. The employee immediately corrected the error and dutifully reported it to a supervisor. Media reporting nonetheless connected the event to adversarial cybersecurity concerns and framed it as a hack, triggering a four-month investigation by the US Federal Bureau of Investigation (FBI) and the US Secret Service. That investigation concluded with no evidence of a hack.
It is easy for those in the cybersecurity profession to be energized by talk of cyberattacks. After all, if we are the network defenders we are told we are, this is our raison d'être. However, risk professionals must take a probabilistic view and understand that, energized or not, resources need to be allocated to the most probable and impactful areas of concern.
Hanlon's Razor is an adage that says, "Never attribute to malice that which can be adequately explained by stupidity." Although somewhat harsh in this context (after all, everyone makes mistakes), it captures the essence of what happened during the water supply incident. After the event occurred, the immediate belief was that there must be some nefarious actor dedicated to poisoning thousands of Floridians. In reality, someone clicked the wrong button. Much less exciting, but considerably more likely. I often joke that, as I age, my injuries stem from far less fantastical feats of bravado. The gash on my leg is no longer from defending my family against a wild coyote; instead, I bumped into the coffee table. The real joke, however, is that it was always that way, and thankfully so.
In cybersecurity, we often pay scant attention to preventing user errors. Why should there be a button that allows a user to dispense a lethal dose of lye into a water supply, with nothing but the hope that it is never pushed? Can we instead implement programmatic features that avoid that scenario altogether? We should never ignore our external adversaries in cybersecurity, but neither should we ignore the adversary in ourselves.
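As a purely illustrative sketch of what such a programmatic guardrail might look like, consider the short Python example below. The limits, names and two-person approval flag (SAFE_RANGE_PPM, REVIEW_THRESHOLD_PPM, set_lye_setpoint) are assumptions made for this example; they are not drawn from the plant's actual control system or from any particular ICS product.

    # Illustrative guardrail on a chemical dosing setpoint.
    # All names and limits are hypothetical, for discussion only.

    SAFE_RANGE_PPM = (0.0, 200.0)   # assumed hard engineering limit for lye dosing
    REVIEW_THRESHOLD_PPM = 50.0     # changes larger than this need a second operator

    class SetpointRejected(Exception):
        """Raised when a requested dosing change violates a guardrail."""

    def set_lye_setpoint(current_ppm: float, requested_ppm: float,
                         approved_by_second_operator: bool = False) -> float:
        low, high = SAFE_RANGE_PPM
        # 1. Never accept a value outside the physically safe envelope.
        if not (low <= requested_ppm <= high):
            raise SetpointRejected(
                f"{requested_ppm} ppm is outside the safe range {SAFE_RANGE_PPM}")
        # 2. Large jumps require explicit two-person confirmation.
        if (abs(requested_ppm - current_ppm) > REVIEW_THRESHOLD_PPM
                and not approved_by_second_operator):
            raise SetpointRejected(
                "Change exceeds review threshold; second-operator approval required")
        return requested_ppm

    if __name__ == "__main__":
        print(set_lye_setpoint(100.0, 110.0))   # small change within range: accepted
        try:
            set_lye_setpoint(100.0, 11100.0)    # the reported 11,100 ppm value: rejected
        except SetpointRejected as err:
            print("Rejected:", err)

With controls of this kind in place, a mistaken click (or, for that matter, a malicious one) cannot push the dose beyond a safe envelope without deliberate, multi-person intervention.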
Jack Freund, Ph.D., CISA, CISM, CRISC, CGEIT, CDPSE, NACD.DC
Is a vice president and head of cyberrisk methodology for BitSight, coauthor of Measuring and Managing Information Risk, 2016 inductee into the Cybersecurity Canon, ISSA Distinguished Fellow, FAIR Institute Fellow, IAPP Fellow of Information Privacy, (ISC)2 2020 Global Achievement Awardee and the recipient of the ISACA® 2018 John W. Lainhart IV Common Body of Knowledge Award.