Category Archives: Social

Fluidity of Language

Having a toddler definitely makes you realize how fluid our brains are at mapping and adapting to language changes. And how quickly those changes can become a dialect that breaks down understanding.

Here's something I never considered before being a parent: how difficult is your child's name for them to say? It turns out that if you give your child a name with both a W and an R, those sounds are pretty late on the sound acquisition timeline. So they aren't going to use their name as a token for themselves during early speech development, because they physically can't pronounce it.

So, my daughter latched on to the other token that was constantly being used in her direction: "you". When I first saw that emerging, it was completely confusing until I figured out the logic of how she got there. For the first week I even tried to stamp it out. But, you know what, language is organic.

So, we're living with pronoun inversion for the moment. After two weeks of it, my brain rewired to make it normal, and I don't miss a beat any more. The only time I really realize it's a thing is when friends who haven't seen her in a while come over and she talks to them. A "You have mama bear" gets interpreted as gift giving instead of a statement of fact. And the misinterpretation brings scowls.

A lot hinges on a single word sometimes, and on the assumption that we are all using these tokens the same way. But even in normal adult interactions, we aren't. It gives me a finer appreciation of how, even if you think you understand people, you need to double check.

Over communicating

I once had a college class where the instructor was in the process of writing a textbook for the class. So we were being taught out of photocopies of the draft textbook. It wasn't a very good class.

It wasn't that he wasn't a good writer. He was. The previous semester I'd had a great class using one of his textbooks, but taught by a different professor. There were some places where the text made a lot of sense to me, and some places where the non-author professor's different approach made far more sense. With two points of view, it's about synthesizing an understanding. If something from the book didn't really stick, something from class might. And each aha moment made everything you'd read or heard before make a bit more sense.

I was reminded of this this morning, reading through some technical documentation that was light on content and heavy on references. It was written that way so as to not repeat itself (oh, this part is explained over here instead). And while that may make sense to the authors, it doesn't make for an easy on-ramp for people trying to learn.

It's fine to over communicate. It's good to say the same thing a few different ways over the course of a document, and even be repetitive at times. Because human brains aren't disk drives, we don't fully load everything into working memory, and then we're there. We pick up small bits of understanding in every pass. We slowly build a mental approximation of what's there. And people have different experiences that resonate with them, and that make more sense to them.

There isn't one true way to explain topics. So, when in doubt, over communicate.


Universal Design Problems

Credit: Amy Nguyen

A great slide came across Twitter the other day, which rang really true after a heated conversation with someone at the OpenStack PTG. They were convinced certain API behavior would not be confusing, because the users would have carefully read all the API documentation and understood a set of caveats buried in there. They were also astonished by the idea that people (including those in the room) write software against APIs by skimming, smashing bits into a thing, getting one successful response, and shipping it.

The theme of the slide is really empathy. You have to have empathy for your users. They know much less about your software than you do. And they have a different lived experience, so even the way they approach whatever you put out there might be radically different from what you expected.

The real Trolley Problem in tech

With all the talk about autonomous cars in general media this year, we all got a refresher in Ethics 101 and the Trolley Problem. In the Trolley Problem, you as an onlooker see a trolley barreling towards 5 people. There is a switch you can throw that will make it kill 1 previously safe person instead of 5. What do you do? Do you take part in an act which kills 1 person while saving 5? Or do you refuse to take part in an act of violence, but then willingly let 5 people die because of it? No right answers, just a theoretical problem to think through and see all the trade-offs.

But as fun as this all is, autonomous cars are not the big Trolley Problem in tech. Organizing and promoting information is.

Right now, if you put the phrase "Was the holocaust real?" into Google, you'll get back 10 results. 8 will be various websites and articles that make the case that it was not real, but a giant hoax. The second hit (not the first) is the Wikipedia article on Holocaust Denial, and a link further down from the United States Holocaust Memorial talks about all the evidence presented at Nuremberg.

8 out of 10.

The argument we get in tech a lot is that because results are generated by an algorithm, they are neutral. But an algorithm is just a set of instructions a human built once upon a time. When it was being built or refined, some human looked at a small number of inputs, looked at what it output, and made a judgement call that it was good. Then they fed it a lot more input, far more than any human could digest, and let it loose on the world, under the assumption that the testing input was representative enough that it would produce valid results for all input.

8 out of 10.

Why are those the results? Because Google came up with an insight years ago that webpages have links, people produce webpages, and important sites with authoritative information get linked to quite often. In the world before Google, this was hugely true, because once you found a gem on the internet, if you didn't write it down somewhere, finding it again later was often impossible. In the 20 years since Google, and in the growth of the internet, that's less true.
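That insight is the core of PageRank: treat the web as a graph and let link structure vote on authority. Here's a minimal sketch of the idea; the toy link graph, damping factor, and iteration count are illustrative assumptions, not Google's actual implementation:

```python
# Minimal PageRank power iteration over a toy link graph.
# Maps each page to the pages it links to (hypothetical pages).
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],  # "c" is the most linked-to page
}

def pagerank(links, damping=0.85, iterations=50):
    nodes = list(links)
    n = len(nodes)
    # Start with rank spread evenly across all pages.
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        # Each page keeps a small baseline, then receives shares
        # of rank from every page that links to it.
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, outgoing in links.items():
            share = damping * rank[node] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

ranks = pagerank(links)
# The page with the most inbound links ("c") ends up ranked highest.
print(max(ranks, key=ranks.get))
```

The point of the sketch is the assumption baked into it: rank flows along links, so a page is "authoritative" exactly to the degree that other pages bother to link to it. When people stop writing and linking, that signal decays.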

It's also less true about basic understood facts. There aren't thousands of people writing essays about the Holocaust anymore. There is, however, a fringe of folks trying to actively erase that part of history. Why? I honestly have no idea. I really can't even get into that head space. But it's not unique to this event. There are people who write that the Sandy Hook shooting was a hoax too, and harass the families who lost children during that event.

8 out of 10.

Why does this matter? Who looks these things up? Maybe it's just the overly radicalized who already believe it? Or maybe it's the 12 year old kid who is told something on the playground and comes home to ask Google the answer. And finds 8 out of 10 results say it's a hoax.

What could be done? Without Google intervening, people could start writing more content on the internet saying the Holocaust was real, and eventually Google might interpret that and shift results. Maybe only 6 out of 10 for the hoax. Could we get enough popular sites that the truth could even be a majority, and the hoax would only get 4 out of 10? How many person-hours, websites, and Twitter posts do we need to restore this truth?

As we're sitting on Godwin's front lawn already, let's talk about the problem with that. 6 million voices (and all the ones that would have come after them) that would have stood up here are gone. The side of truth literally lost a generation to this terrible event.

So the other answer is that Google really should fix this. They control the black box. They already down-rank other sites for malware and less accessible content, and soon for popup windows. The algorithm isn't static; it keeps being trained.

And the answer you get is: "that's a slippery slope. When humans start interfering with search results that could be used by the powerful to suppress ideas they don't like."

It is a slippery slope. But it assumes you aren't already on that slope.

8 out of 10.

That act of taking billions of documents and choosing 10 to display is an act of amplification. What gets amplified is the Trolley Problem. Do we just amplify the loudest voices? Or do we realize that the loudest voices can use that platform to silence others?

We've seen this already. We've seen important voices online that were expressing nothing more than "women are equal, ok?" get brutally silenced through doxing and other means. Some people who are really invested in keeping truth off the table don't stop at making their case; they actively attack and harass and send death threats to their opponents. So that field of discourse keeps tilting towards the ideologues.

8 out of 10.

This is our real Trolley Problem in tech. Do you look at the playing field, realize that it's not level, that the ideologues are willing to do far more and actually knock their opponents off the internet entirely, and do something about it? Or do you, through inaction, just continue to amplify those loud voices? And the playing field tips further.

Do we realize that this is a contributing factor as to why our companies are so much less diverse than society at large? Do we also realize that lack of diversity is why this doesn't look like a problem internally? At least not to 8 out of 10 folks.

8 out of 10.

I don't have any illusions this is going to change soon. The Google engine is complicated. Hard problems are hard.

But we live in this world, both real and digital. Every action we take has an impact. In an increasingly digital world, the digital impact matters as much as, or even more than, the real one. Those of us who have a hand in shaping what that digital world looks like need to realize how great a responsibility that has become.

Like the Trolley Problem, there is no right answer.

But 8 out of 10 seems like the wrong answer.