Jim Al-Khalili is optimistic that the voices of rationalism are louder than those of ignorance. But he says perhaps the loudest and most isolated voice of all - that of US president Donald Trump - is beyond reason. "I’ve gone beyond the point where I feel that talking to President Trump about sensible issues would actually get through to him,” he says.
To fight back, Al-Khalili believes in the value of events such as the March for Science and scientists pushing to be heard in discussions about the world’s biggest issues - climate change, agriculture and access to water, for instance. As a theoretical physicist and science commentator, he has the platform to speak up. He talks to WIRED about why it’s important that others do the same and why governments have to back them.
Those of us who have a public voice feel that we can’t shirk our social responsibilities any more than anyone else can. So much of modern life depends on an understanding of how science works, the method of science, and the gathering of evidence, that I think scientists have to speak out where necessary. There are lots of issues - climate change, genetic engineering, artificial intelligence, how we produce our energy, food and water, and how we look after the environment - that are going to need a big input from science. It's not going to be the only answer, but scientists have a moral responsibility to speak up and make sure their voices are heard along with everyone else's.
There are those who have said that going out on a march and shouting about things is not going to change the world. But, actually, it does. It is important for those voices of rationalism and reason to be heard, and to be heard as loudly as, if not louder than, the voices of bigotry and ignorance. We can no longer sit back and hope that the world will come to its senses because that’s the sensible thing to do. Things like the March for Science highlight the importance of gathering evidence, the importance of what is true and what isn’t true, and how vital a role science plays in 21st-century society.
The media has to follow certain ethical guidelines, rules and strict regulations that are not properly in place when it comes to big corporations like Facebook and Google. They are so big and so far-reaching in terms of how they seep into all our lives, and in the way they behave and react to news, that there needs to be much more of a dialogue. In the future I could imagine the United Nations not just being a United Nations of autonomous political countries but also having to include some of these larger corporations. They have to be part of world government; they can’t just carry on working independently in a free capitalist society. They have a huge responsibility today.
I’m not sure I could stomach it, to be honest. I’ve gone beyond the point where I feel that talking to President Trump about sensible issues would actually get through to him. Whether he is deliberately ignoring advice, reasonable advice, about evidence, about what is true and what isn’t, or whether he just doesn’t have the intellectual ability to cope with these things, I simply think it’s a lost cause. For me, I’m hoping the world, particularly America, sits back and waits for this to pass like a bad dream. I’m not sure I could talk any sense into Donald Trump, I’m afraid.
What’s changed now is simply that we have the internet and we have social media. So whereas in the past these people were perhaps more isolated voices, they can now shout loudly through social media, they realise there are other like-minded people, and they gain in confidence. But we shouldn’t think that because it seems to be a bit more in our faces these days, it’s worse than it was before. I’m still optimistic. I still think that the voices of rationalism - those who understand that science can be a force for good provided we know how to use it and discuss it ethically - are actually winning out over these other voices.
Where I do think there are perils we have to talk about is with the more benign and more immediate issues - things like autonomous killer drones that are already used by the military, or even simple things like how you program safe decision-making into driverless cars. If the AI in the car is programmed to safeguard its passengers above all else, then what? I think these problems are solvable, but because the technology is changing so quickly, what’s important when it comes to AI is that we have much more of an open dialogue about the issues that are going to confront us just a few years from now.
It is true that some areas of scientific research have a more immediate impact on society, and there is an argument that scientists from lots of different disciplines should devote some time - that there should be more of a concerted effort - to tackling things like antimicrobial resistance, which I think is an immediate threat to the world. Climate change, of course, is another big concern. These are issues facing the world that are going to need government policy: they’re going to need finances thrown at them, and they’re also going to need scientists to get their heads together to try and tackle them.
This article was originally published by WIRED UK