Having overdosed on the Sci-Fi Channel during the past few weeks, Martyn Day considers the devastation that could be wreaked if killer robots and software-driven bankers continue to be let loose on the world
With the Iraq wars we all got used to the ‘missile cam’ videos as cruise missiles closed on their targets. Now, every week, it seems we hear about some strategic strike in Afghanistan or Iraq made by an unmanned drone. There have been over 60 such attacks in Pakistan alone in the last three years, resulting in a considerable number of civilian deaths. These machines are piloted remotely from 7,000 miles away in California, and running unmanned missions in the Middle East has become standard practice – every army now wants an air force of drones because it’s ‘war without risk’.
Currently America has 200 Predator and 30 Reaper unmanned aerial vehicles (UAVs) and next year plans to spend £3.29 billion on unmanned combat vehicles. The UK had two Predators but one crashed in Iraq last year. In total there are now over 4,000 robots deployed in Iraq and Afghanistan. Besides Britain and the US, another 43 countries have programmes to develop military robots. South Korea and Israel currently deploy armed robot border guards, and China and Singapore are increasing their use. Killer robots are becoming big business.
With advanced developments under way, current machines will look crude and dumb compared to future generations. Featuring artificial intelligence, these machines will make battlefield decisions on their own – less drone and more killer robot.
This may all sound very ‘Terminator’, but the American Office of Naval Research has recently produced a hefty report which delivers a stark warning. While these robots have cognitive advantages over human soldiers, the report warns they are not error-proof, which could lead to civilian casualties or friendly-fire incidents. It goes on to estimate that by 2047 unmanned aircraft will be able to determine whether or not to strike a target. But, as with all software, there is no guarantee they will do only what they are programmed to do: there are millions of lines of code to debug, coupled with the unpredictability of combat.
The report suggests that rules covering legal, social and political issues should be built into the machines. It also calls for debate and research into the morality of robot warfare. However, there is a rush to market in the design and manufacture of these robot warriors, with a US congressional mandate stipulating that by 2010 a third of all deep-strike aircraft, and by 2015 a third of all land vehicles, must be unmanned. This has given rise to robot tanks such as the Talon SWORDS, which comes armed with machine guns and grenade launchers. These robots and drones are cheap to manufacture, require fewer personnel and, according to the navy, perform better in complex missions – one battlefield soldier could launch a large-scale robot attack in the air and on the ground against an enemy position.
The next generation of these machines will be akin to what we find in our sci-fi nightmares.
Dr Noel Sharkey, Professor of Artificial Intelligence and Robotics at Sheffield University, recently gave his own warning: “The military have a strange view of artificial intelligence based on science fiction. The nub of it is that robots do not have the necessary discriminatory ability. They can’t distinguish between combatants and civilians. It’s hard enough for soldiers to do that. I do think there should be some international discussion and arms control on these weapons but there’s absolutely none.”
Sharkey also pointed out that the psychological effects of war are very different for remote operators than for someone sitting in the pilot’s seat. The carnage witnessed while chasing the Republican Guard back from Kuwait in a ‘turkey shoot’ demoralised US pilots; under remote control that experience would be highly sanitised.
While we probably still have a little time before killer robots rule the world, we have all seen the devastation that can be meted out by a small group of greed-warriors, more commonly known as bankers. They left the business landscape looking like something out of Terminator 1, 2 or 3 (take your pick), and automation systems also played their part in creating volatility and helping to cause our recent global meltdown.
There’s something called High Frequency Trading (HFT), which essentially uses computers to scour the markets for trends in trading patterns. Running on supercomputers and using complex algorithms, these systems can buy or sell huge volumes of shares milliseconds ahead of everyone else. HFT now accounts for the majority of daily trades and is so common that all banks have HFT systems – this is literally machines trading with machines… may the best algorithm win.
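The kind of trend-spotting at the heart of these systems can be illustrated with a deliberately simplified sketch. Real HFT systems are vastly faster and more sophisticated; the moving-average crossover rule below is just a textbook example of a trend signal, not anything a bank actually runs, and the price figures are invented:

```python
# Illustrative only: a classic moving-average crossover rule, the
# simplest kind of trend signal an automated trader might compute.

def moving_average(prices, window):
    """Average of the most recent `window` prices."""
    return sum(prices[-window:]) / window

def signal(prices):
    """'buy' when the short-term average is above the long-term one,
    'sell' when it is below, otherwise 'hold'."""
    short = moving_average(prices, 3)
    long_ = moving_average(prices, 8)
    if short > long_:
        return "buy"
    if short < long_:
        return "sell"
    return "hold"

rising  = [100, 100, 101, 101, 102, 103, 104, 105]   # invented prices
falling = list(reversed(rising))

print(signal(rising))   # recent prices above the longer trend: "buy"
print(signal(falling))  # recent prices below the longer trend: "sell"
```

The edge in real HFT comes not from the rule itself but from computing and acting on it microseconds before anyone else.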
When the market makes a major move, these systems can become self-fulfilling, creating their own momentum as they chase a trend and making things worse. It’s hard to tell a computer program to ‘calm down’.
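That feedback effect can be sketched in a toy simulation. Everything here is an illustrative assumption – the pressure rule, the number of traders and the size of the shock are made up – but it shows how trend-chasers can turn a one-off dip into a rout:

```python
# Toy illustration (not a real trading model): trend-following agents
# that buy into rising prices and sell into falling ones amplify a shock.

def simulate(steps, followers, shock_at=5, shock=-2.0):
    """Price starts at 100; each follower pushes the price 0.1 further
    in the direction of the last move (an invented momentum rule)."""
    prices = [100.0, 100.0]
    for t in range(steps):
        move = prices[-1] - prices[-2]          # last price change
        direction = 1 if move > 0 else -1 if move < 0 else 0
        pressure = followers * 0.1 * direction  # herd piles onto the trend
        change = pressure + (shock if t == shock_at else 0.0)
        prices.append(prices[-1] + change)
    return prices

calm = simulate(20, followers=0)   # nobody chasing the trend
herd = simulate(20, followers=5)   # five momentum algorithms

# The same one-off shock produces a far larger total fall when
# trend-chasers pile in behind it.
print(f"final price, no followers:   {calm[-1]:.1f}")   # 98.0
print(f"final price, with followers: {herd[-1]:.1f}")   # 91.0
```

With no followers the shock knocks the price down once and it stays put; with followers, each fall triggers more selling, which triggers more falls.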
Human beings are undoubtedly very clever, and we have mastered many tools and techniques for designing and manufacturing very efficient products to do our dirty work. We don’t like dying but we do like winning wars, and when researching this article I found it shocking to read about the level of automation already in place. Technologically powerful governments could unleash automated butchery on any second or third world country. By removing the risk, don’t we make war more likely? And what if the technology goes rogue? It looks like WW3 will be a robot war of a kind never seen before.
The other combative environment, banking, is also in an aggressive technological arms race. Here too, without human supervision or intervention, automated systems could interact in totally unpredictable ways, causing untold carnage – not unlike their poorly supervised human equivalents.
While our ‘masters’ are in command, engineers are, unfortunately, the enablers of all this. Pay us some shekels and we will give people what they want. But surely there’s a moral quandary in designing armaments and death machines? We need to remember that Auschwitz was designed by architects and engineers.