Tag Archives: technology

The real Trolley Problem in tech

With all the talk about autonomous cars in the general media this year, we all got a refresher in Ethics 101 and the Trolley Problem. In the Trolley Problem, you as an onlooker see a trolley barreling towards 5 people. There is a switch you can throw that will kill 1 previously safe person instead of 5. What do you do? Do you take part in an act which kills 1 person while saving 5? Do you refuse to take part in an act of violence, but then willingly let 5 people die because of it? There are no right answers, just a theoretical problem to think through to see all the trade-offs.

But as fun as this all is, autonomous cars are not the big trolley problem in tech. Organizing and promoting information is.

Right now, if you put the phrase "Was the holocaust real?" into Google, you'll get back 10 results. 8 will be various websites and articles that make the case that it was not real, but a giant hoax. The second hit (not the first) is the Wikipedia article on Holocaust Denial, and further down is a link from the United States Holocaust Memorial Museum talking about all the evidence presented at Nuremberg.

8 out of 10.

The argument we get in tech a lot is that because results are generated by an algorithm, they are neutral. But an algorithm is just a set of instructions a human built once upon a time. When it was being built or refined, some human looked at a small number of inputs, looked at what it output, and made a judgement call that it was good. Then they fed it a lot more input, far more than any human could digest, and let it loose on the world, under the assumption that the test input was representative enough that it would produce valid results for all input.

8 out of 10.

Why are those the results? Because Google came up with an insight years ago: people produce webpages, webpages have links, and important sites with authoritative information get linked to quite often. In the world before Google this was hugely true, because once you found a gem on the internet, if you didn't write it down somewhere, finding it again later was often impossible. In the 20 years since Google arrived, and with the growth of the internet, that's less true.
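To make that insight concrete, here is a minimal, illustrative sketch of the link-analysis idea (a PageRank-style iteration), not Google's actual ranking code. The tiny link graph is made up; the point is just that pages linked to by important pages become important themselves.

```python
# A toy PageRank-style iteration: rank flows along links, so heavily linked
# pages accumulate rank. Purely illustrative; Google's real system is far more
# than this one signal.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Made-up graph: "c" is linked from everywhere, so it ends up ranked highest.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(pagerank(links))
```

Notice that nothing in the loop asks whether a page is true, only whether it is linked to. That is the whole problem being described here.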

It's also less true about basic understood facts. There aren't thousands of people writing essays about the Holocaust anymore. There is, however, a fringe of folks trying to actively erase that part of history. Why? I honestly have no idea. I really can't even get into that head space. But it's not unique to this event. There are people who write that the Sandy Hook shooting was a hoax too, and who harass the families who lost children in that event.

8 out of 10.

Why does this matter? Who looks these things up? Maybe it's just the overly radicalized who already believe it. Or maybe it's the 12-year-old kid who is told something on the playground and comes home to ask Google the answer. And finds 8 out of 10 results say it's a hoax.

What could be done? Without Google intervening, people could start writing more content on the internet saying the Holocaust was real, and eventually Google might pick that up and shift the results. Maybe only 6 out of 10 for the hoax. Could we get enough popular sites that the truth could even be a majority, and the hoax would only get 4 out of 10? How many person-hours, websites, and Twitter posts do we need to restore this truth?

As we're sitting on Godwin's front lawn already, let's talk about the problem with that. 6 million voices (and all the ones that would have come after them) that would have stood up here are gone. The side of truth literally lost a generation to this terrible event.

So the other answer is that Google really should fix this. They control the black box. They already down-rank sites for malware and for less accessible content, and soon for popup windows. The algorithm isn't static; it keeps being trained.

And the answer you get is: "that's a slippery slope. Once humans start interfering with search results, that power could be used by the powerful to suppress ideas they don't like."

It is a slippery slope. But it assumes you aren't already on that slope.

8 out of 10.

That act of taking billions of documents and choosing 10 to display is an act of amplification. What gets amplified is the Trolley Problem. Do we just amplify the loudest voices? Or do we realize that the loudest voices can use that platform to silence others?

We've seen this already. We've seen important voices online that were expressing nothing more than "women are equal, ok?" get brutally silenced through doxing and other means. Some people who are really invested in keeping truth off the table don't stop at making their case; they actively attack, harass, and send death threats to their opponents. So that field of discourse keeps tilting towards the ideologues.

8 out of 10.

This is our real Trolley Problem in tech. Do you look at the playing field, realize that it's not level, that the ideologues are willing to do far more and actually knock their opponents off the internet entirely, and do something about it? Or do you, through inaction, just continue to amplify those loud voices, while the playing field tips further?

Do we realize that this is a contributing factor as to why our companies are so much less diverse than society at large? Do we also realize that lack of diversity is why this doesn't look like a problem internally? At least not to 8 out of 10 folks.

8 out of 10.

I don't have any illusions this is going to change soon. The Google engine is complicated. Hard problems are hard.

But we live in this world, both real and digital. Every action we take has an impact. In an increasingly digital world, the digital impact matters as much as, or even more than, the real-world one. Those of us who have a hand in shaping what that digital world looks like need to realize how great a responsibility that has become.

Like the Trolley Problem, there is no right answer.

But 8 out of 10 seems like the wrong answer.


A record player in a car, what could go wrong

What’s the connection between the Beatles’ George Harrison, boxing legend Muhammad Ali, and Chrysler cars? The Highway Hi-Fi: a vinyl record player that just happened to be the world’s first in-car music system. It appeared 60 years ago this spring, in 1956, and should have been a smash hit. It was innovatory, a major talking point, arrived as the car market was booming as never before, and it came with much press hype. It also had the backing of a leading motor manufacturer. What could possibly go wrong?

Source: Forgotten audio formats: The Highway Hi-Fi | Ars Technica

It's a fascinating story, made even more so because it was basically proprietary formats and copyright tangles that killed it so quickly.

Choose Boring Technology

But of course, the baggage exists. We call the baggage "operations" and to a lesser extent "cognitive overhead." You have to monitor the thing. You have to figure out unit tests. You need to know the first thing about it to hack on it. You need an init script. I could go on for days here, and all of this adds up fast.

...

The problem with "best tool for the job" thinking is that it takes a myopic view of the words "best" and "job." Your job is keeping the company in business, god damn it. And the "best" tool is the one that occupies the "least worst" position for as many of your problems as possible. It is basically always the case that the long-term costs of keeping a system working reliably vastly exceed any inconveniences you encounter while building it. Mature and productive developers understand this.

via Dan McKinley :: Choose Boring Technology.

I've had way too many conversations recently that had some variation of "people should just get over it and learn new things". New has a lot of cost. Even if it's good, new brings load. If you exceed people's digestion rate for new technology, they hit a wall, give up, and go home mad.

Kids and Computers

It's a common thread among computer professionals to complain about "kids these days" when we look at potential new hires. It's always hard to separate how much of that is real vs. how much of it is just what people do when they get older, i.e. complain about those young-uns on your lawn.

So this was an interesting, refreshing look at what it means that Kids can't use computers, especially when it comes to what we screwed up on.

But the curriculum isn’t the only area in which we’ve messed up. Our network infrastructures in UK schools is equally to blame. We’ve mirrored corporate networks, preventing kids and teachers access to system settings, the command line and requiring admin rights to do almost anything. They’re sitting at a general purpose computer without the ability to do any general purpose computing. They have access to a few applications and that’s all. The computers access the internet through proxy servers that aggressively filter anything less bland than Wikipedia, and most schools have additional filtering software on-top so that they can maintain a white-list of ‘suitable sites’.

I hadn't thought about that perspective before, but by playing network lockdown, you reduce computing skills. I actually wonder if this same problem is happening on corporate networks as well, and whether it's one of the reasons large companies get so bad at organic innovation. Lock everything down, and no one can actually explore new ideas.

Via @anteaya.

A tale of two tech teams

The Atlantic just published an in-depth look at the tech team behind the Obama campaign. It's a little personality heavy, because they are trying to make tech interesting to the average reader, but putting that aside, there is quite a bit of detail on the team and the tech structure behind the campaign.

Contrast that with what happened in the other campaign, where this was clearly not a core part of what they were doing.

Migration to Google Email

In late 1999 I claimed my last name as a domain, and I have had various email and web solutions hanging off of it ever since. This past weekend I migrated off of self-hosted email to Google Apps for Domains. My email address remains the same, but the infrastructure is Google's. There were 3 main converging trends that drove me there: spam, client innovation, and protocol integration.

Because most people host email with one of the big three (Google, Yahoo, Microsoft), spam fighting techniques for the little guy have largely stagnated. If you are big enough, there are other algorithms you can apply to patterns across millions of subscribers at once. For individual filtering, especially with an email address that's been constant for over a decade, the best you can do is SpamAssassin, which last released in 2011 and realistically hasn't done anything innovative in the last 3 years. So recently my false positive and false negative pools have been overlapping in a way that means a lot of manual work. Thunderbird's Bayesian filtering is quite good, and makes up for some of this. So when my laptop is running while I'm getting mobile email, my spam rate on mobile is low. When it's not, about 80% of what gets to my mobile inbox is spam.
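For a sense of what that Bayesian filtering does under the hood, here is a minimal, illustrative sketch of the idea, not Thunderbird's actual implementation: count word frequencies in known ham and spam, then score a new message by the combined odds of its words. The tokenization, smoothing, and training corpus are simplified assumptions.

```python
# Toy Bayesian spam scoring: train word counts on ham/spam, score new mail.
import math
from collections import Counter

def train(ham_msgs, spam_msgs):
    ham, spam = Counter(), Counter()
    for m in ham_msgs:
        ham.update(m.lower().split())
    for m in spam_msgs:
        spam.update(m.lower().split())
    return ham, spam

def spam_probability(msg, ham, spam):
    ham_total, spam_total = sum(ham.values()), sum(spam.values())
    log_odds = 0.0
    for word in msg.lower().split():
        # Add-one smoothing so unseen words don't zero out the score.
        p_spam = (spam[word] + 1) / (spam_total + 2)
        p_ham = (ham[word] + 1) / (ham_total + 2)
        log_odds += math.log(p_spam / p_ham)
    return 1 / (1 + math.exp(-log_odds))

# Made-up training data, just to show the shape of the thing.
ham_counts, spam_counts = train(
    ["lunch tomorrow?", "meeting notes attached"],
    ["win a free prize now", "free offer click now"])
print(spam_probability("free prize offer", ham_counts, spam_counts))
```

The catch, as described above, is that this only works when the trained model is running, which is why mobile-only days let so much spam through.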

Spam by itself wouldn't have pushed me over the edge, but Mozilla deprecating new development on Thunderbird was another blow. Email is powerful because it is universal: it can be accessed by any device at any size, with multiple clients interacting with the same data at the same time. Desktop email has gotten the short end of the stick in recent years, again because most people are hosting with the big 3, but Mozilla was still making a valiant attempt to keep email open. They've now decided it's more interesting to chase Chrome than to provide value here. That's their call, but it's sad to see desktop email take that hit.

Lastly, there are lots of quite interesting tools growing up to integrate email with a new social world: tools that give you profiles of your contacts via social networks, or make it easy to convert email into tasks. All great stuff. None of it works with IMAP or desktop email. All the innovation around email right now is using the GMail API and Chrome extensions to modify the GMail web interface.
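As an illustration of what that kind of integration looks like, here is a hedged sketch using the GMail API via the google-api-python-client library. The `unread_subjects` helper is hypothetical, and `creds` is assumed to hold OAuth credentials obtained elsewhere; the auth flow is omitted entirely.

```python
# Sketch: list unread inbox messages and pull their Subject headers via the
# GMail API. Assumes google-api-python-client is installed and that `creds`
# is an already-authorized OAuth credentials object (auth flow not shown).
from googleapiclient.discovery import build

def unread_subjects(creds, limit=10):
    service = build("gmail", "v1", credentials=creds)
    resp = service.users().messages().list(
        userId="me", q="in:inbox is:unread", maxResults=limit).execute()
    subjects = []
    for ref in resp.get("messages", []):
        msg = service.users().messages().get(
            userId="me", id=ref["id"], format="metadata",
            metadataHeaders=["Subject"]).execute()
        headers = msg.get("payload", {}).get("headers", [])
        subjects.extend(h["value"] for h in headers if h["name"] == "Subject")
    return subjects
```

Nothing like this exists for a random self-hosted IMAP server, which is exactly the point: the integration ecosystem has grown up around GMail, not around open protocols.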

So the migration is on, and I've nearly got my email history dating back to 2000 into Google now. In the process I found I'd actually lost my 2008 and 2009 archives, which I've mostly restored from backup. That would explain why some things weren't showing up in search that I expected. Already, the spam filtering is a huge win, and eventually I'll get used to the web UI for some things (I'm still going to keep using Thunderbird in combination for a while).

The biggest challenge in this whole process is that because of how Google has wedged Plus into everything, having both a GMail account and an Apps account causes some real confusion on the Plus side, because there is no way to tell Google they are the same person. That's just going to be confusing for a while, and if anyone has best practices around that, let me know.

Tech Volunteerism

Twice in the last month I've been contacted by friends I've made in the local tech community with questions about tech volunteering they are doing, or planning to do, for local non-profits. I was, hopefully, able to provide them with some pointers and info to help them out.

I find that awesome. Not the me helping them out part, but the fact that they've gotten engaged and are giving back some of their vital skills to local organizations in need.

Over the past couple of years, through my work with the Poughkeepsie Farm Project and the IBM year of service, I've realized that tech volunteerism is quite a rare thing. While there are a lot of techies in our area, when most of them volunteer, they do so in a non-tech role. They are board members and program leaders, which is good and important, but the very real technology needs are often overlooked.

Those conversations, plus a few others in the last month, have made me really start thinking about ways to encourage and nurture more of this in our area. I'd love to have a peer group where I could share these experiences and learn from others. That's a whole other master plan.

So, if you are a techie of any sort (developer, designer, IT guru), consider giving those skills back to your local community. It's something very few can give, and very many need.

Maybe it has more to do with never being exposed to nature

NY Times: Technology Leads More Park Visitors Into Trouble

The national parks’ history is full of examples of misguided visitors feeding bears, putting children on buffalos for photos and dipping into geysers despite signs warning of scalding temperatures.

But today, as an ever more wired and interconnected public visits the parks in rising numbers — July was a record month for visitors at Yellowstone — rangers say that technology often figures into such mishaps.

People with cellphones call rangers from mountaintops to request refreshments or a guide; in Jackson Hole, Wyo., one lost hiker even asked for hot chocolate.

Though the article doesn't really stress this point, this has always been a problem. People who are clueless about nature, possibly because they've been sheltered from it in the cities or suburbs, are clueless whether or not they have a cellphone or GPS.

Volunteer Motivations

The whole OLPC-goes-Windows debacle has been going on for months now, creating incredible polarization on many fronts. A huge part of what actually excited many of the XO laptop volunteers was the chance for a Linux breakout market. I really think that senior leadership lost track of the fact that those blogging up the XO effort to its launch were largely in it for the Linux angle. By deciding to go the "natural route" and replace Linux with XP, lots of people have lost interest in the effort, including myself.

Putting more XP machines into the world isn't something very interesting to me, and it seems to go against the whole notion of the computer being a learning tool at all levels. I guess now the children of developing nations will be learning PowerPoint instead of Python programming. What a shame.

Beware the Anti-Market

A vendor can often be their own worst competition if they create good technology, but put it out in a way that is more limiting, in platform support or licensing, than their prospective users would like it to be. I've often referred to this as the Anti-Market among colleagues. The rule of the Anti-Market is more or less as follows:

If you create a technology that is useful, but 90% of your prospective market can't use it for various reasons, they've got a good chance of getting together and writing a replacement for your product.

Example 1: KDE vs. Gnome

Gnome grew out of the anti-market that KDE created. KDE is built on Qt. Back in the early days of KDE, Qt was licensed in rather funny ways by Trolltech. The funny license meant that Red Hat (and other Linux distros) didn't want to ship it. Mandrake was originally just Red Hat + KDE to fill that need. But with the bulk of the KDE user market blocked because of bad licensing, a void existed to be filled. Gnome did that. A decade later Gnome is the primary desktop environment on nearly every major distro, and while KDE 4 has gotten some recent press, it is definitely now a minority player.

KDE was brought down because it created an anti-market. People wanted that kind of function, but the way it was delivered was not acceptable to its users.

Example 2: Java vs. Mono on the Linux Desktop

How many Linux desktop apps are you running right now, or have ever run, that are Java based? How many that are Mono based? The only Java apps I run on the desktop with any frequency are Azureus and Freemind. On the Mono side, F-Spot and Tomboy have seen a lot more use. Until very recently, Java remained under a license that made including it with the Linux platform quite an issue. Mono is under an MIT license, and has been since day one. While Mono has a number of shortcomings, the fact that it's so young and yet so much more used than Java in the Linux desktop space speaks a bit to the anti-market that Sun created by waiting forever to open source their baby.

Example 3: MySQL vs. everyone else

In 1995 Linux was already being used to run key parts of the internet. None of the traditional ISVs were paying attention to it (DB2 showed up on Linux in 1998 and, to my knowledge, was the first big database vendor there). You know what you need to run the internet? A reasonable database. MySQL popped out of the anti-market created by there being a platform people were using quite a bit but that lacked ISV support. People needed the function, but couldn't get it even if they wanted to pay for it.

I continue to be amazed at how much of an anti-market MySQL took advantage of.

Closing thoughts

The Linux Desktop space is full of anti-market applications, some of which have even seeped back into the Windows world, like OpenOffice, Gimp, and Pidgin. Adobe just made a very astute move and got Air out for Linux before they created a new anti-market there. While the Linux Desktop space isn't the highest-volume space for users, the developer-to-user ratio in the space is very high, which means that ignoring it carries a real chance of creating an anti-market.

I'd love to hear other people's thoughts or examples here, comments are open, have at it.