The real Trolley Problem in tech

With all the talk about autonomous cars in general media this year, we all got a refresher in Ethics 101 and the Trolley Problem. The Trolley Problem is where you, as an onlooker, see a trolley barreling towards 5 people. There is a switch you can throw that diverts the trolley so it kills 1 previously safe person instead of 5. What do you do? Do you take part in an act which kills 1 person while saving 5? Do you refuse to take part in an act of violence, but then willingly let 5 people die because of it? No right answers, just a theoretical problem to think through and see all the trade-offs.

But as fun as this all is, autonomous cars are not the big Trolley Problem in tech. Organizing and promoting information is.

Right now, if you put the phrase “Was the holocaust real?” into Google, you’ll get back 10 results. 8 will be various websites and articles that make the case that it was not real, but a giant hoax. The second hit (not the first) is the Wikipedia article on Holocaust Denial, and further down there is a link from the United States Holocaust Memorial Museum talking about all the evidence presented at Nuremberg.

8 out of 10.

The argument we get in tech a lot is that because results are generated by an algorithm, they are neutral. But an algorithm is just a set of instructions a human built once upon a time. When it was being built or refined, some human looked at a small number of inputs and what it output, and made a judgement call that it was good. Then they fed it a lot more input, far more than any human could digest, and let it loose on the world, under the assumption that the testing input was representative enough that it would produce valid results for all input.

8 out of 10.

Why are those the results? Because Google came up with an insight years ago: webpages have links, people produce webpages, and important sites with authoritative information get linked to quite often. In the world before Google, this was largely true, because once you found a gem on the internet, if you didn’t write it down somewhere, finding it again later was often impossible. In the 20 years since Google arrived, and with the growth of the internet, that’s less true.
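To make that insight concrete, here is a toy sketch of the classic PageRank idea that grew out of it: a page’s importance is estimated from how many pages link to it, weighted by the importance of those linking pages. The graph, site names, and damping factor below are made up for illustration only; this is not Google’s actual ranking code, which has long since grown far beyond link counting.

# Toy illustration of "links confer authority" (classic PageRank idea).
# Not Google's production ranking; all names and numbers are hypothetical.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline share of rank.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # A page with no outgoing links spreads its rank evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                # Otherwise its rank is split among the pages it links to.
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Hypothetical example: the most-linked-to page ranks highest,
# regardless of whether what it says is true.
web = {
    "memorial": [],
    "blog_a": ["memorial"],
    "blog_b": ["memorial", "blog_a"],
    "fringe_site": ["blog_b"],
}
print(sorted(pagerank(web).items(), key=lambda kv: -kv[1]))

The point of the sketch is simply that the ranking signal is popularity of linking, not truth of content, which is why a well-linked fringe site can outrank a barely-linked accurate one.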

It’s also less true about basic, well-understood facts. There aren’t thousands of people writing essays about the Holocaust anymore. There is, however, a fringe of folks trying to actively erase that part of history. Why? I honestly have no idea. I really can’t even get into that head space. But it’s not unique to this event. There are people who write that the Sandy Hook shooting was a hoax too, and harass the families who lost children during that event.

8 out of 10.

Why does this matter? Who looks these things up? Maybe it’s just the overly radicalized who already believe it? Or maybe it’s the 12 year old kid who is told something on the playground and comes home to ask Google the answer. And finds that 8 out of 10 results say it’s a hoax.

What could be done? Without Google intervening, people could start writing more content on the internet saying the Holocaust was real, and eventually Google might pick that up and shift the results. Maybe only 6 out of 10 for the hoax. Could we get enough popular sites that the truth could even be a majority, and the hoax would only get 4 out of 10? How many person-hours, websites, and Twitter posts do we need to restore this truth?

As we’re sitting on Godwin’s front lawn already, let’s talk about the problem with that. 6 million voices (and all the ones that would have come after them) that would have stood up here are gone. The side of truth literally lost a generation to this terrible event.

So the other answer is that Google really should fix this. They control the black box. They already down-rank sites for malware and less accessible content, and soon for popup windows. The algorithm isn’t static; it keeps being trained.

And the answer you get is: “that’s a slippery slope. Once humans start interfering with search results, that could be used by the powerful to suppress ideas they don’t like.”

It is a slippery slope. But it assumes you aren’t already on that slope.

8 out of 10.

That act of taking billions of documents and choosing 10 to display is an act of amplification. What gets amplified is the Trolley Problem. Do we just amplify the loudest voices? Or do we realize that the loudest voices can use that platform to silence others?

We’ve seen this already. We’ve seen important voices online that were expressing nothing more than “women are equal, ok?” get brutally silenced through doxing and other means. Some people who are really invested in keeping truth off the table don’t stop at making their case; they actively attack and harass and send death threats to their opponents. So now that field of discourse keeps tilting towards the ideologues.

8 out of 10.

This is our real Trolley Problem in tech. Do you look at the playing field, realize that it’s not level, that the ideologues are willing to do far more and actually knock their opponents off the internet entirely, and do something about it? Or do you, through inaction, just continue to amplify those loud voices? And so the playing field tips further.

Do we realize that this is a contributing factor as to why our companies are so much less diverse than society at large? Do we also realize that lack of diversity is why this doesn’t look like a problem internally? At least not to 8 out of 10 folks.

8 out of 10.

I don’t have any illusions that this is going to change soon. The Google engine is complicated. Hard problems are hard.

But we live in this world, both real and digital. Every action we take has an impact. In an increasingly digital world, the digital impact matters as much as, or even more than, the real-world one. Those of us who have a hand in shaping what that digital world looks like need to realize how great a responsibility that has become.

Like the Trolley Problem, there is no right answer.

But 8 out of 10 seems like the wrong answer.


2 thoughts on “The real Trolley Problem in tech”

  1. In my view, the Google search example in the post is working the way it should. The query results are not based upon the ratio of holocaust denial sites to legitimate sites, but upon the way the query is stated. If a person really, truly is looking for information about the holocaust, she would simply google “holocaust”. Doing so returns 10 reputable websites, and no holocaust denial websites. It’s only if one really wants to see viewpoints that challenge the overwhelming historical evidence that one would google “Was the holocaust real?” In that case, Google is smart enough to understand that you’re specifically looking for viewpoints that doubt the conventional one. Google search even has the wisdom to return Wikipedia’s holocaust denial article as being highly relevant. In other words, the query “Was the holocaust real?” strongly biases the search in a very different direction from just “holocaust”. This difference in queries and corresponding results is quite a useful feature, and should not be removed.

    As I see it, the real problem is that many people seek out and uncritically accept these conspiratorial viewpoints. This is a people problem, not a technology problem. Conspiracy theories, of which holocaust denial is one of many, have been extensively studied by scholars of many different stripes. I highly recommend Wikipedia’s comprehensive article on Conspiracy Theory, which categorizes conspiracy theories along multiple dimensions, explains their psychological and sociological motivations, and in so doing provides unflattering characterizations of conspiracy theory adherents.


    1. The issue I have with that is the following.

      Google doesn’t come with a manual. There is no manual that says “Google a term to get the truth about it; pose it in the form of a question to get all the crazy conspiracies about it.” Yet that is effectively the interface that exists.

      Google is pushing towards conversational interfaces. The entire Google Assistant ad campaign (running now) is about asking natural language questions: https://www.youtube.com/watch?v=FPfQMVf4vwQ. So it’s not really clear why people should realize that natural language questions aren’t appropriate for Google, especially if they aren’t in tech and don’t understand the complexities.

      I’m honestly less concerned about those people that *really* want this information, though it’s not really clear to me that “was the holocaust real?” signals that intent. I’m more concerned by people who stumble down this path because they were trying to understand that thing they overheard that can’t have been true. And then, unless they are going to devote a ton of time to it, they find the surface of a bunch of stuff which points in a really bad direction.

      We can always blame the user. They did it wrong. But part of our goal as shapers of these systems should be to help the user make fewer mistakes. And when users do make them, find ways to correct it.

