COLUMBUS (WCMH) — Everyone knows that you have to have good cybersecurity to protect data from falling into the hands of those who would steal it.
Often that means added security measures, sometimes requiring extra software or hardware, to block unauthorized access.
Those security measures are only as good as the people using them, and when it comes to elections and their security, nothing is more important.
Ohio Secretary of State Frank LaRose has a passion for elections and making sure they are fair, secure, and unimpeachable is his mission.
He has done about as much as he can to ensure that the primaries a few weeks away, and the General Election set for November, will be accurate; that every person eligible to vote will get to; and that every vote will be counted.
His efforts, and those of lawmakers at the Statehouse, are being lauded by a partnership between the University of Southern California and Google that is taking a cybersecurity training course to all 50 states.
Their target audience: campaign professionals, to impress upon them the need for security; election officials, to provide best practices some of which are coming from Ohio; and the public, to educate them on another aspect of election security that they are both subject to and play a role in.
LaRose points out the reality of the situation here in Ohio.
“All the work at the Board of Elections, that’s done in a bipartisan way, where there is two locks and two keys for all these doors,” LaRose said. “I always joke, it’s like those 1980’s submarine movies where it takes two keys to launch the torpedo. That’s how it is at the Board of Elections. The room where the tabulation equipment is stored has a Republican lock and a Democratic lock. The room where the machines are stored has a Republican lock and a Democratic lock. Everything is supervised by bipartisan teams. The machines are never connected to the internet. They’re tested before the election, by bipartisan teams of experts, and they’re audited after the election where we require a post-election audit. All of those safeguards that go into place. Our foreign adversaries know that. They know that they can’t actually change the results of an election or tinker with the numbers or that kind of thing. But what they can do is they can sow doubt.”
That sowing of doubt is where the second half of Monday’s training focused. Partially for campaigns and partially for the public, the information provided shows just how easy it can be for someone to manipulate another based on biases.
LaRose went on to describe how social media can be used against us.
“This is an example from Texas where a foreign operative, a foreign adversary, created a Black Lives Matter Facebook group and also created a White Supremacist Facebook group, and got them both followers and got people engaged in this, and then a few months later, they said, ‘We’re gonna have a rally,’” said LaRose. “Well, guess what? They scheduled the White Supremacist at the same time and location as the Black Lives Matter rally. What do you think would have happened if that had gone down?”
In that instance, law enforcement sniffed out the duplicitousness of the dual rallies and was able to head off the confrontation.
In some ways, there is a greater risk of election interference coming from social media than from cyberattacks on election equipment and organizations.
According to USC Annenberg fellow for communication security Marc Ambinder, when people ingest information, it affects different parts of their brain. Positive information they agree with triggers a part of the brain that chemically rewards them. At the same time, there is another part of the brain that is triggered by things that disgust people, and the reaction likewise rewards the person chemically.
Ambinder said that is why trying to reason with people over something like a deeply held belief is extremely difficult to do with facts alone. He added they may ultimately accept the facts to be true, but may find ways to rationalize things so they can still believe what they want in spite of them.
Ambinder also says things that are said loudly and with confidence are more often believed, and so messages are being constructed to influence people with that in mind.
“Nobody wants to be fooled by disinformation, so the more that you can sensitize to the fact that they might be fooled, and say, ‘Hey, do you really want to fall for this?,’ that’s actually a really effective tactic against people who are loud and people who are certain and people who are bullies,” said Ambinder.
Things are getting even trickier, however, as technology opens up the possibility of manipulating videos so convincingly that the deception can be difficult to recognize.
“We have now simple, readily accessible tools to create really good looking fake videos, and so candidates are all going to be facing people out there who will create videos of, whether it’s Donald Trump or whoever the Democrats nominate, saying something they never said, doing things they never did,” said USC Annenberg Exec. Dir. for the Election and Cyber Security Initiative Adam Clayton Powell III.
Powell said such videos can be difficult for the public to spot because detecting them requires a great deal of expertise. Several universities have set up a kind of video fact-checker that will analyze a video to determine its veracity.
The bad news is, information moves lightning fast today, so by the time they get ahold of it and can analyze it, the damage will have already been done.
Powell hopes that quick responses to these false videos will help minimize the damage that can be done.
Something similar to this happened not too long ago. A video was posted by someone claiming it was Iran firing missiles into Iraq at American forces. The video was several years old and from another part of the world.
Despite that, the video, and the message it carried, picked up 150 impressions on Twitter in an hour and was retweeted by verified users including some who worked for news networks.
According to Ambinder, the video migrated to a number of different online platforms within minutes.
Another impact outside influencers want to have on our elections and on the democratic process as a whole is for Americans to simply give up and not participate because they think it’s pointless or too complicated.
They bombard people with disinformation to put them in a position of civic paralysis, according to Ambinder. That disinformation can be amplified by prominent people you trust, if they themselves are fooled into thinking it is true.
If you don’t participate in the election process because you have been convinced your voice doesn’t count or it doesn’t matter, then the enemies of democracy have won. That is why it is important to insist that disinformation is rejected by everyone.
I asked both major political parties whether they would commit to rejecting disinformation spread by anyone, including one of their own candidates or their campaigns, as a way to assure the people of Ohio that their concerns over the security and fairness of our elections are squarely in mind.
Ohio Democratic Party Chairman David Pepper made the commitment without stipulation: “Yeah, I’m against that and if someone on my side is doing it, I would basically not allow for it.”
Ohio Republican Party Chairman Jane Timken also made a commitment. “As much as my job is to protect our party brand and our candidates from disinformation and misinformation, I will do that,” she said.
LaRose made a commitment, but his statement limited it to calling out any disinformation he sees about our elections and the security of them.
The one thing everyone involved agrees on is that Ohio voters need to vote, and that sitting on the sidelines because you feel disenfranchised is not in anyone’s best interest.
Here are some tools provided by Ambinder for protecting yourself from being duped; he recommends bookmarking them:
Using these tools can help you figure out if the messages you are seeing on social media are true or intentionally created to deceive or manipulate you.