
We can prevent autonomous weapons - machines making life and death decisions - but now is the time to act.

Jody Williams, Nobel Peace Prize Laureate

Immoral Code is a documentary that contemplates the impact of Killer Robots in an increasingly automated world - one where machines make decisions over who to kill or what to destroy.

Automated decisions are being introduced across all parts of society. From pre-programmed bias to data protection and privacy concerns, there are real limitations to these algorithms - especially when that same autonomy is applied to weapons systems.

Life and death decisions are not black and white, on or off, 1s and 0s. They’re complex and difficult. Reducing these decisions to automated processes raises many legal, ethical, technical, and security concerns.

The film examines whether there are situations where it’s morally and socially acceptable to take life, and importantly - would a computer know the difference?


Technology is developing rapidly, and with that comes an urgent need for diplomacy to keep pace.

The race to develop these autonomous weapons means this is no longer a problem of the future - Killer Robots now pose an immediate and credible threat to international law and security.

The Heart of Immoral Code

The growing influence of computer processing and automation means that decision-making is often a digital process - devoid of the context that comes from human insight. But as humans, we approach everyday decision-making through the lens of our own individual moral codes, each shaped by our culture, our upbringing, beliefs, and ethics.

These 'moral codes' help us make the 'right' decisions in complex situations. But ask yourself, what would your moral code tell you if that decision meant the difference between life and death? What if it were a soldier? A child soldier? A child? It’s certainly not going to be a simple binary decision - a yes or no - life just isn’t that simple.

And yet somehow, we expect the programming of machines to understand and interpret these complexities, to make binary decisions over whether people live, or die.

To demonstrate this point, the film poses a series of increasingly complex moral questions to our participants – a group of everyday people selected for their diversity and varied life experiences – and asks them to make those binary decisions – yes, or no. Life or death.

Amongst this we have a panel of experts who offer their insight and understanding on the complex and often nuanced subject of automated weapons – Killer Robots.

Our Experts include:

The Art of Immoral Code

Killer Robots and automated weaponry, whilst driven by technology, are fundamentally a human issue. And so, when it came to planning how we launch the Immoral Code film, we knew we wanted that to come through loud and clear - to share the vibrancy and eclectic nature of human endeavour, and at the same time contribute something positive to the world around us.

So rather than pour money into traditional advertising methods - having an agency create a killer cover image and sharing it for all the world to see - we instead commissioned a series of artists to give us their take on the issues presented in the film.

And we could not be more proud of the outcome!

I knew I wanted to play with the idea that robots can be given somewhat ‘human’ features to try and make them seem less threatening and more trustworthy - yet adding these likenesses always ends up really unsettling to me.

Anni Jyn, Artist

We’ve had some incredible pieces created for us. From street artist Fokawolf’s uniquely subversive take on a common digital interface, through Anni Jyn’s beautifully haunting hand-drawn robotic figures, Heath Kane's visually arresting overprints, and on to YAYA’s vibrant and humorous characters - they all approach the subject in a different way - extending the story of Immoral Code further with their unique interpretation of the film’s message.

Immoral Code artists Heath Kane and UNDUN

Technology has played a big part in many of these pieces too – VOID ONE’s use of Augmented Reality brings his mecha piece to life beautifully, giving an extra dimension to his works, and UNDUN brings binary code to life in vibrant colour, dutifully recreating a portrait of a woman in a sea of 1s and 0s - each individually coloured to demonstrate the diversity and complexity of human decision-making.

We’ll be sharing the works of all the artists involved through our social media channels, but be sure also to keep an eye out in cities across the globe as we spread our posters far and wide. We’ll also be running regular giveaways featuring these unique works, as well as sticker packs featuring art from all the artists involved.

Frequently Asked Questions

The technologies we’re worried about are limited. They don’t recognise people as people; instead they reduce living people to data points. Our complex and highly individual identities are lost: our physical features and patterns of behaviour are analysed, pattern-matched, and sorted into profiles. Decisions about us are made by machines according to which pre-programmed profile we fit into.

Killer robots already exist. There are even adverts on YouTube for autonomous weaponised drones. They use the same technology that has proven bias.

In the rush to make technological leaps, we’re losing human control - these are machines making decisions over who to kill or what to destroy. Some countries are already using systems that can identify and select targets autonomously.

Proven biases and system flaws are not safe. There’s currently a race to be first to integrate these technologies into weapons, with little concern for the dangers.

Our humanity should not be reduced to physical features or patterns of behaviour analysed by systems unable to understand concepts of life, human rights or the nuances of individuality.

If we allow machines to be in control of killing people, we lose human control. We become a secondary consideration. We need to draw a line now.

Thousands of tech experts and hundreds of companies working in the tech sector have raised concerns. They believe that autonomous weapons would be unpredictable and unreliable, vulnerable to hacking or spoofing.

Our knowledge of the remote biometric surveillance industry for example, tells us that increased performance and more ‘precise’ machine vision identification will be used to augment existing biases and discriminatory patterns that have already been demonstrated within AI and facial recognition systems.

Countries blocking a ban and the private companies supporting them are already competing with each other to develop new, ever more automated technologies. If left unchecked the world could enter a destabilizing robotic arms race.

We don’t want to remove the ability to protect lives, but that shouldn’t mean the loss of meaningful human control over such weapons. This would cross a technological and ethical line that would fundamentally change society.

We need your help to change the situation. A demonstration of public demand can overcome the issues faced in bringing about new international law on autonomy in weapons systems.

We have the opportunity to avoid unnecessary loss of life by acting now. States are discussing what to do right now, and the world’s eyes are on them. Make it clear what you want and what type of society you want to live in. Sign our petition here to make your voice heard.