Sunday, August 26, 2018

DANGEROUS DIVERSIONS AND DISTRACTIONS

Trees of Life Vs The Drones, gouache on paper, 30 x 42 cm, 2016

POLITICAL SHENANIGANS
This week's political turmoil in Canberra has left many Australians scratching their heads. I will not go into the politics - that would be a never-ending, Alice-in-Wonderland, down-the-rabbit-hole kind of trip! You can Google for more information! However, for foreign readers, briefly: a sitting Prime Minister, Malcolm Turnbull, has been ousted by recalcitrant forces from within his own party, the Liberal Party. This is not the first time in recent history that this has happened, to either a sitting Liberal Prime Minister or a Labor Prime Minister.

But, apart from the politics, what about the distraction these kinds of events cause - distraction from the more important issues politicians should be paying attention to? And there are many.

Here's one that concerns me.

ARMS RACE
Since becoming interested in the burgeoning research area of existential risk posed by emerging technologies, I have become increasingly concerned by political inattention to the growing and potential threats posed by advances in technology - whether designed for military purposes, appropriated by defence forces, or taken up by aberrant state and non-state individuals or groups. Examples of new and emerging technologies with the potential - through mal-intent, accident or unintended consequences - to cause major mayhem, civilisational collapse or human species demise are biotechnology, nanotechnology and artificial intelligence. Combining or linking them could be extremely worrying.*

Adding to the alarm is the apparent escalation in arms research and development utilising new and emerging technologies, including machine learning and artificial intelligence. Many voices around the world are concerned, and significantly, many of them are scientists. For example, a number of AI researchers and developers, including Prof Toby Walsh, Prof Stuart Russell and Prof Noel Sharkey, are concerned about increasingly autonomous weapons, that is, weaponry utilising machine learning and artificial intelligence. The Future of Life Institute, based in Cambridge, Massachusetts, has facilitated open letters clearly expressing that AI research and development must be for the benefit of humanity. These letters have been signed by thousands of scientists, philosophers, International Studies scholars and others. One letter specifically addresses concerns about lethal autonomous weapons. I quote from this letter, written in 2015 - three years ago: "If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow."

The idea of a twenty-first century arms race is very worrying. So, today I was pleased to see an interview by David Aaronovitch with historian Yuval Noah Harari published in this weekend's Weekend Australian Magazine*. I am familiar with Harari's work, but have not yet read Sapiens: A Brief History of Humankind or 21 Lessons for the 21st Century. These are on my list! In the interview Harari makes it clear that national political intrigues divert attention away from globally significant issues, such as the potential outcomes of emerging technologies. Three years after the Future of Life open letter, here is a sobering quote from Harari:

    "But in 2018, we are already in a very serious arms race. The Chinese realised it, I think, three 
     or four years ago; the Europeans are realising it now. But the world is in an arms race, and this 
     is terrible news because you cannot regulate this explosive technology if you are in an arms race."

It is ironic that Harari's interview appeared in the Weekend Australian Magazine the weekend after a week of befuddling political shenanigans that clearly diverted attention away from proper government. It is likely that this coming week will be overshadowed by post-mortem examinations of the last one, intrigues about a new ministry, and what might happen to the so-called 'insurgents' within the Liberal Party. More diversions! Meanwhile, Australia, like the rest of the world, faces repercussions from extremely important issues relating to globally significant and accelerating developments in militarised and militarise-able technology. I have previously written about Australia and a global arms race. Please read my post DRONES AND CODE: FUTURE NOW. The title of the post is also the title of a painting.

BACKGROUND
I have a long-term interest in research on the existential risk posed by emerging technologies. This interest came into focus in 2015 through research I undertook for a research Masters degree, an M.Phil, at the University of Queensland. Part of my research included examinations of contemporary militarised technology, including unmanned drone technology, night vision technology, pervasive surveillance capabilities and autonomy in systems. While I was researching, I became aware of how fast developments in militarised technology were accelerating. I also became aware that, as the months ticked by, more and more countries were either developing new technology or purchasing it. And I became more aware of the increasing militarise-ability of civilian technology. An arms race, twenty-first century style, is not a fantasy!

Trees of Life Vs The Drones (above)
This painting depicts a swarm of drones threatening two trees, trees-of-life. The drones represent 21st century militarised technology - unmanned, weaponised, and potentially equipped with some autonomous systems. The drones seem to 'march on' relentlessly! But, let's not give up, the trees-of-life declare! 

Our Bright Future (below)
The contours of the continent of Australia are formed by binary code repeating/instructing AUSTRALIA. This painting was inspired by watching Kevin Slavin's fabulous TED talk How Algorithms Shape Our World. At one point he says, "It's a bright future if you're an algorithm."

Really?
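For readers curious about the code in the painting, here is a minimal sketch of the underlying idea, assuming a standard 8-bit ASCII encoding (the painting's actual binary pattern may differ):

    # A sketch of the encoding idea: each letter of "AUSTRALIA" written as its
    # 8-bit ASCII binary code, which can be repeated over and over to draw a contour.
    word = "AUSTRALIA"
    binary = " ".join(format(ord(letter), "08b") for letter in word)
    print(binary)
    # 01000001 01010101 01010011 01010100 01010010 01000001 01001100 01001001 01000001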

Our Bright Future, gouache on paper, 30 x 42 cm, 2016


* Check out the Centre for the Study of Existential Risk, University of Cambridge.
* Weekend Australian Magazine: if you have an account you can access the interview online.

Cheers,
Kathryn
