Future perfect
Author: vix.reeve@jisc.ac.uk
Dave Coplin inspired Jisc members at this year’s Digifest with a thought-provoking keynote. We asked him: if you could redesign the education system from scratch, to make it fit for the 21st century, what would you do?
We’re still preparing people for the Victorian production line. We’re teaching kids not to speak out of turn and not to question. We squash creativity. We teach skills that have become commodities in today’s technological world.
For example, we want children and young people to be able to remember lots of facts in a world where they have access to every fact our society has ever known. We teach our kids to be calculators when we all have access to more powerful calculators than ever before – in our pockets. But what about collaboration skills? Creativity? Empathy and relating to each other as human beings?
Freedom of the algorithm
I see a future world where algorithms do much of the work that we do today. We will have been freed from repetitive tasks to do the work that doesn’t fit neatly in a box and that takes us to places we couldn’t go before.
So, given that’s the world, what are the skills we’re going to need in people so that they can really live up to that potential? Creativity, empathy and accountability. I want an education system that delivers that.
I want an education system that helps people acquire a healthy relationship with technology, in the same way that education should help us acquire a healthy relationship with reading, writing and arithmetic. I want an education system that helps people acquire a healthy relationship with each other.
Future of work
I think that’s where the future of work is heading. You can see that many organisations are beginning that journey, but I don’t see many universities or colleges, secondary schools or primary schools really leaning into that opportunity.
We’ve got to help people understand that vision of the future. It is easy to paint a negative picture: either the robots are going to take our jobs, or the technology’s never going to be up to scratch. What we don’t do is say, “Let me show you what’s possible with this technology. Let me show you what we could do as a society IF we could use it in the right way.”
We need to start with envisioning the world that we could create if only we could equip people with the right kind of skills. The first thing we’ve got to do is inspire people. This isn’t a technological debate. It’s a debate about human culture, about individuals, about effectiveness versus efficiency.
Informed decisions
Fear is a massive barrier. But it’s often misplaced and it can make us lose sight of the opportunities technology offers. The key is to equip people to make informed decisions and help them to manage their personal relationship with technology. Schools make students switch off their phones, or confiscate them, but is that missing an opportunity to teach kids to have a healthy relationship with technology?
It can be harder to make those informed decisions if the balance of power is skewed, with too much power in the hands of big technology companies. Those companies need to do two things. Firstly, they have to be transparent. They need to explain: if you give me your data, this is what is going to happen to it. And people need to be free to choose whether or not to give it.
Secondly, the value exchange needs to be really explicit. People need to understand not only what is going to happen to their data, but also what value they are going to receive as a result.
Sharing the conversation
We cannot leave it to the tech companies alone. We all have a part to play. We have to be informed. We have to be part of the conversation, whether it’s Facebook, Google, Microsoft or Twitter: this is the service we expect, this is what we’d like, this is the relationship we’d like with you.
If you completely trust the algorithm without bringing in any human judgment, you deserve what you get. If you drive your car the wrong way down a one-way street or over a cliff because your satnav told you to, you’ve only got yourself to blame. Whatever the situation, we need to understand how the algorithm made the calculation it did – and factor in bias – and then decide what to do.
How am I going to choose the correct course of action? What human judgment can I add to the algorithm’s output to arrive at the right choice? That’s a core skill that we have to learn – as individuals and as organisations – and it is a core skill we have to teach. And we have to do that quickly.
Algorithms are only going to get more powerful and faster, and we are in a symbiotic relationship with them: it’s humans plus machines, not humans against machines. We need to trust that the computer can calculate the answer really quickly but then we need to interpret the answer and choose the correct course of action. And that’s kind of the relationship that we want.