How do you promote free speech on the web? According to the executive chairman of Google, you develop technologies that give it limits.
Sound counterintuitive? Well, in an op-ed for the New York Times this week, Eric Schmidt argues just that. Here’s the essence of his proposal:
It’s our responsibility to demonstrate that stability and free expression go hand in hand. We should make it ever easier to see the news from another country’s point of view, and understand the global consciousness free from filter or bias. We should build tools to help de-escalate tensions on social media — sort of like spell-checkers, but for hate and harassment.
Spell-checkers for harassment? I’m not confident this would work well, particularly considering how poor a job actual spell-checkers have done in creating a web free of grammatical errors. Spell-checkers can also make incorrect assumptions about your intentions as a writer. Even if you agreed to fight web harassment with algorithms, how would you distinguish between hate speech and descriptions of it? And what would it look like to the user when the spell-checker suggests “what you really meant to say”?
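To see why telling hate speech apart from a description of it is genuinely hard, consider a minimal keyword-based flagger. This is a toy sketch, not anyone’s actual system; the blocklist and messages are invented for illustration:

```python
# A deliberately naive "harassment spell-checker": flag any message that
# contains a blocklisted term. The blocklist here is a placeholder.
BLOCKLIST = {"idiot", "loser"}

def flag(message: str) -> bool:
    """Return True if the message contains a blocklisted word."""
    words = {w.strip('.,!?"\'').lower() for w in message.split()}
    return bool(words & BLOCKLIST)

# A direct insult is flagged...
print(flag("You are an idiot"))                   # True
# ...but so is a message merely *reporting* abuse, which a
# moderation tool should presumably leave alone.
print(flag('He called me an "idiot" yesterday'))  # True
# And an innocuous message passes.
print(flag("Have a nice day"))                    # False
```

Both the insult and the victim’s account of it trip the filter, because keyword matching sees no difference between using a slur and mentioning one; anything smarter has to model context, which is exactly where the “spell-checker” analogy breaks down.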
Even if the logistics behind Schmidt’s proposal are questionable, his points on harassment are salient. A recent New York Times feature on “swatting,” for example, showed the extraordinary lengths one web user went to in targeting hundreds of women who were active on the gaming network Twitch. If you’re unsure what “swatting” is, you’re not alone. Most of the law enforcement agencies drawn in didn’t get it either when these cases started popping up. For context, watch the video below:
Web harassment has reached such a level of complexity that it often demands a response from law enforcement, a response our legal system is not currently up to date enough to empower law enforcement officials to give. When hate crimes are committed across international borders through secret networks of false phone numbers, responding through investigation becomes both costly and legally futile.
Scale these concerns up to matters of national security—as many this week have done in reflecting on how social media has given power to ISIS—and you’ll start to see brilliant people like Schmidt formulating vague proposals for attacking a problem that is similarly undefined.
As we discussed in our last class in the case of Michelle Obama, Google is increasingly in a position to make ethical decisions. On the topic of free speech, just what lengths should Google go to in creating safe spaces for users, and what exactly does a “safe space” on the web look like?
Interestingly, much of what Schmidt is getting at in his proposal is fairness and balance in news. He is arguing that when countries exert their power to limit free speech, they make the web an unfair place. A more international web, one that provides a diversity of perspectives, will presumably solve the problem of harassment and hate by creating empathy and holding political leaders accountable for their manipulations of the internet.
So where does this business with the spell check come in? What happens when, as in the case of “the serial swatter,” harassment goes beyond written form?
And in judging just what harassment is, are you not simply introducing another form of bias?
It’s a big problem, and it demands a little more than a spell-checker.