Let’s say it’s four weeks until Election Day 2020 and threat actors are busy at work planning to disrupt the election. Election officials are working to thwart their nefarious plans: Who will come out on top? That’s the biggest question Operation BlackOut sought to answer in a simulated election event held on Aug. 19.
More than 40 cybersecurity professionals gathered virtually for the event, organized and run by Cybereason, to see whether the essential mechanisms of an election could be successfully attacked and whether they could, to any significant extent, be defended. The participants came from law enforcement, government, security consulting, and academic organizations and were divided into red and blue teams contending over a successful election in the mythical city of Adversaria.
One of the rules of the simulation was that voting machines were off-limits: The red team couldn’t assume that the equipment for actual voting was vulnerable.
And that led to an interesting situation and philosophical question: If what the adversary was attacking and what the defenders were protecting were two different things, could both sides claim success?
Four weeks to election
Each team submitted one development (a capability being built) and two actions in each of four turns; each turn represented one week. From the beginning, it was obvious that the two teams had different priorities.
The blue team focused on providing a safe voting environment that was available to as many voters as possible. The red team sought to disrupt voting and reduce confidence in the election’s results.
Both teams shared an early realization: communication was the primary battleground. The blue team sought to get in front of misinformation with clear instructions from authoritative sources such as the mayor and supervisor of elections, while the red team turned to hacking social media influencer accounts, official municipal accounts, and disinformation robocalls to sow confusion.
One of the lessons drawn by both sides was how inexpensive it was for the red team to have an impact on the election process. There was no need to “spend” a zero-day or invest in novel exploits: Manipulating social media is a well-known tactic, and robocalls cost little to nothing.
Countering the red team’s tactics relied on coordination between the various government authorities and ensuring communication redundancy between agencies. Anticipating disinformation plans that might lead to unrest also worked well for the blue team, as red team efforts to bring violence to polling places were put down before they bore fruit.
The red team also tried to interfere with voting by mail: It hacked a major online retailer to push more packages than usual through the USPS and used label printers to place bar codes carrying instructions to reset sorting machines on a small percentage of those packages. While there was some slowdown, there was no significant disruption of the mail around the election.
Lessons learned
The blue team was successful in providing safe voting for the election and in making sure that ballots cast, whether in person or by mail, were counted. The red team was successful in raising doubts about the election’s validity, especially among voters from one party; in the simulation, those doubts persisted in 30% of voters for at least a year after the election.
So what lessons did participants take away from the exercise?
One lesson is that the low cost of sowing confusion means it doesn’t take a nation-state: Any group with an interest in disrupting an election, or in undermining faith in its results, has a shot at doing so. That possibility leads into the next lesson.
No one can account for every possibility; the attacker still has an advantage. To even the odds a bit, defenders must consider as many possibilities as they can and make sure that lines of communication and responsibility are clear. Tabletop exercises (like this one) can be crucial in broadening defenders’ thinking about just what is possible.
Those lines of communication need to run between government agencies and between government and private-sector organizations so that as much as possible can be learned and shared. Defenders should also develop playbooks for a wide range of scenarios and distribute them to the organizations and individuals responsible for carrying out the defensive tactics they contain.
A final lesson was the extent to which “winning” is a game of perception: The public perception must be that the election is safe, accurate, and reliable. In this case, the blue team deployed police officers to polling places to visibly reassure voters that voting was safe. Other locations and environments might require other actions to produce the same result. Defenders who know the population they’re defending can put that knowledge to use for the best results in a critical activity.