At Cyber Week, held at Tel Aviv University last month, experts from some fifty countries gathered to discuss computer-based threats to the economy, the population, and the government. As speaker Menny Barzilay pointed out, these threats aren’t accompanied on the evening news by pictures of crying babies or galumphing tanks, so they seem a little less immediate than they are. Barzilay is Chief Technical Officer of the Blavatnik Interdisciplinary Cyber Research Center, and “Interdisciplinary” does sound like a way of saying “this is too fuzzy to belong anywhere specific.”
But the immediacy was stressed by Dr. Tehilla Shwartz Altshuler of the Israel Democracy Institute. First she reminded the audience that we’re already accustomed to being tracked by our computers. We know that if we research claw hammers on the web today, then tomorrow, when we look for socks on Amazon.com, we’ll be offered a claw hammer or two. And we don’t feel too disturbed; where shopping is concerned, our sensitivities regarding privacy have worn away. But then Dr. Altshuler pointed out that someday the same technique could be applied where thinking is concerned. What if a computer algorithm tracked our opinions and made sure to respond with what it considered appropriate material? We’d all have difficulty finding our way around the marketplace of ideas, and where would that leave our democracy?
Next at the rostrum, Noa Elefant Loffler spoke about the problem of offensive videos on YouTube. She’s a Senior Public Policy Manager at Google Israel, and Google owns YouTube. On the one hand, she said, algorithms and human inspectors can spot harassment, hate, nudity, and violence, and anything forbidden by local law or by YouTube policy is removed. Police and some NGOs have the status of “trusted flaggers,” whose complaints take priority, but complaints from the public receive attention too. On the other hand, there are videos that are nearly but not quite bannable — for example, a video that doesn’t incite violence but carries a message of religious extremism or racial superiority. Such borderline videos are consigned to a “view only” status on YouTube. They are allotted no comment area, they aren’t allowed to carry advertising, they can’t be shared, and they don’t pop up in YouTube’s recommendations to anyone. In fact, rather than being steered toward similar videos, those who watch a “view only” video will be steered toward videos that carry a more acceptable message — perhaps something about the dangers of fanaticism or the importance of tolerance.
In other words, computer programs are already attempting to control our opinions. YouTube’s actions sound benign, but who decides when religious belief amounts to extremism? Who decides when sociological observations border on racism?
When hackers make a website collapse by overloading it with input, the term is “denial of service.” Yigal Unna of the National Cyber Directorate speaks of a “cognitive denial of service,” which occurs when our minds are overwhelmed by tendentious input such as fake news, or even news of real events timed to panic us at a critical moment.
MK Yair Lapid warned that if the recent American and French elections were targets for interference, the next Israeli elections certainly will be too. The interference needn’t be computerized tampering with the tally. He mentioned the famous election-day moment when Bibi Netanyahu called on his followers to counterbalance the Arab vote, and electronic media amplified the call. Suppose someone deceived the public with a false rumor during voting hours. Or suppose someone simply turned off the traffic lights.
Apparently it will be a long while before all the threats can be addressed, from computer intrusion to manipulative media, but I’d like to raise one relatively low-tech suggestion.
There at Cyber Week, a video from Guy Mizrahi of the RayZone group showed a crowd of people crossing the street. The video was special because the individual people had actually crossed the same street at different times. They were all superimposed on the same clip and as they walked, each was accompanied by a label showing the time of photography. The software — too expensive for anyone but a government, Mizrahi said — made surveillance a lot easier.
Cameras are cheap now, they can reside everywhere, and a video stream can be captured and processed like any other data stream. I think that the simplest classic form of election fraud, a form still suspected here and there in Israel, can easily be discouraged by cameras. If we can have cameras on the Temple Mount, the polling place isn’t too holy for them. Well, except for the voting booth itself.
Along with the human beings who watch the voters’ envelopes being collected, opened, and counted — who are never a big enough force, and who can be distracted, bribed, or intimidated — there should be cameras watching everything the staff is up to. In case of any challenge, it should be possible to replay the video and see whether the ballots that were cast wound up being counted correctly.
Technology can solve some old problems, on its way to creating new ones.