Algorithm: Explain yourself!
by Paul Curzon, Queen Mary University of London
Your life is being controlled by algorithms. If your family has car
insurance then an algorithm will probably have authorised it. Want to
open a bank account? An algorithm decides whether you can. Want to buy a
house one day? An algorithm will decide if you should be given a loan.
Algorithms are doing the counting in elections, deciding who wins. They
are running Twitter bots that generate vast numbers of tweets to persuade
you to vote one way or another - to control what you think and so how
you make decisions. In some countries algorithms decide when a person
should be released from prison. In the future, in Europe at least,
algorithms are going to have to do more than just make decisions: they
are going to have to explain themselves too. That means computer
scientists are going to have to come up with some creative algorithms to
do it. Explaining a decision is much harder than making one.
As algorithms take over more and more of our lives, it becomes ever more
important that they can justify their decisions. If we don't
understand the algorithms then who knows if they are making the best
decisions or using valid reasons. If you don't realise that pressure
groups use bots, you may be fooled into believing you are joining a
popular protest, when actually you are being manipulated by a powerful
and shady bot-master. In future, algorithms will be making even more
decisions - about what medicine you should be given when ill, whether
you should be allowed into a nightclub based on the way you have
dressed, whether you are a security risk, whether you should be given a
job, ...
Racist programs
A problem is that the algorithms making the decisions rarely explain
themselves. They just give answers. People assume that computers never
make mistakes and are impartial, but neither is true. Take prison
sentences, for example: it turned out that the algorithms making the
decisions in the US were more likely to give African-Americans long
sentences than others for exactly the same offence. Programs can be
racist! Your race or religion shouldn't lead to you being treated
differently, but that could increasingly be secretly happening. Prison
sentence computers just follow the rules they are programmed with. If
the rules as written are biased, the decisions will be too.
The algorithm making the decisions gave African-Americans
longer prison sentences
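To see how biased rules give biased answers, here is a toy sketch (not any real sentencing system - the rules, features and thresholds are all invented for illustration). A rule can look neutral while quietly acting as a proxy for something it shouldn't:

```python
# Toy illustration only: a rule-based decision procedure whose
# rules encode a biased proxy feature. Not any real system.
def sentence_months(offence_severity, prior_convictions, neighbourhood_risk_score):
    """Follow fixed rules; if a rule is biased, so is every decision."""
    months = 6 * offence_severity + 3 * prior_convictions
    # This rule looks neutral, but a "neighbourhood risk" score can be
    # a proxy for race, silently baking bias into every outcome.
    if neighbourhood_risk_score > 7:
        months += 12
    return months

# Two people, exactly the same offence and record...
a = sentence_months(offence_severity=2, prior_convictions=0, neighbourhood_risk_score=3)
b = sentence_months(offence_severity=2, prior_convictions=0, neighbourhood_risk_score=9)
print(a, b)  # 12 24 - the proxy rule alone doubles the sentence
```

The program never mentions race anywhere, yet its decisions can still treat two identical cases very differently.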
Learning to be like that
It's worse than that though. The latest programs work things out for
themselves. They still follow algorithms, but those algorithms tell them
how to learn rather than what to do directly. For example, if you want
to automatically detect faces in your photo collection, you give one of
these algorithms lots of pictures of faces labelled with who they are.
With enough images to train on, the algorithm works out for itself whose
face is whose, and it can then identify faces in new pictures it hasn't
seen before. This is massively effective. It is the technology behind
music recognition programs, self-driving cars, and the program that
recently beat the best human at Go, too.
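The idea of learning from labelled examples can be sketched with the simplest learner there is: a nearest-neighbour classifier. Real face recognisers use far more powerful methods (deep neural networks), and the "face" features below are made up, but the principle is the same - the algorithm is told how to learn from data, not what rules to apply:

```python
# A minimal sketch of learning from labelled examples:
# a 1-nearest-neighbour classifier (toy features, invented data).
def train(examples):
    # "Training" here is just remembering the labelled examples.
    return list(examples)

def predict(model, features):
    # Label a new item with the label of its closest training example.
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda ex: distance(ex[0], features))[1]

# Pretend "face" features: (eye spacing, nose length), with name labels.
model = train([((2.0, 1.0), "Alice"), ((5.0, 4.0), "Bob")])
print(predict(model, (2.2, 1.1)))  # Alice - closest to her example
```

Notice that nobody wrote a rule saying what makes a face "Alice": the answer comes entirely from patterns in the training data.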
The problem is that these programs just spot patterns. If it's hard to
work out why an algorithm based on explicit rules makes the decisions it
does, it's even harder when it is just spotting patterns. The patterns it spots may
not be the ones we think (see
Employ the Best or Bust!).
Laws about rules
Now the European Parliament has passed new laws about programs that use
personal data to make decisions about European citizens. In future,
these programs must be capable of explaining their decisions, so that
decisions can be challenged if they seem unfair. This is a challenge to
computer scientists. It means they have to invent new programs that can
extract arguments from their decision-making processes, together with
ones that can turn those arguments into explanations we can all follow.
Both need the kinds of algorithms that creativity researchers have been
working on.
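What might a decision that explains itself look like? Here is a minimal sketch of one way to do it for a simple rule-based decision: alongside each answer, return the chain of reasons that led to it. The loan rules and thresholds are invented purely for illustration:

```python
# Toy sketch of a self-explaining decision procedure: it returns
# its answer together with the argument behind it.
# The loan rules and thresholds here are invented for illustration.
def loan_decision(income, debt):
    reasons = []
    if income < 20000:
        reasons.append("income below 20,000 threshold")
        return "refused", reasons
    reasons.append("income meets 20,000 threshold")
    if debt > income * 0.5:
        reasons.append("debt exceeds half of income")
        return "refused", reasons
    reasons.append("debt within half of income")
    return "approved", reasons

decision, reasons = loan_decision(income=30000, debt=20000)
print(decision)            # refused
print("; ".join(reasons))  # the argument behind the decision
```

For explicit rules this is easy; the hard research problem is producing explanations like this from programs that learned their own patterns from data.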
More laws like this might seem bad for business, but in the software
world regulation like this can be great, as it drives innovation.
Companies that can invent algorithms able to explain themselves will
soon have a big competitive advantage.