Researchers have warned that robots and computers will commit more crime than humans by 2040.

Tracey Follows from The Future Laboratory, which helps businesses plan for the future, said: “Once robots can be hacked to become suicide-bombing machines, lone-robot attacks could become rife.”

Follows also predicts that artificial intelligence and machine learning could enable robots to programme themselves to commit crimes. “My forecast would be that by 2040 more crime will be committed by machines than by humans,” she said.

The news has led to fears that rogue driverless cars and drones could also be a threat if they are hacked or re-programmed. Security adviser Raj Samani said: “It’s only a matter of time before we see instances of people left helpless, unable to drive their cars unless they pay up a ransom.”

Aaron Yates, chief executive of the security firm Berea, told Raconteur: “If you can convince the vehicle its GPS telemetry is wrong with a signal jammer, you will be able to pilfer vehicles at leisure.”

There are also domestic dangers. Google engineers have previously warned of the risk of cleaning robots harming their owners if the owners get in the way while the machines are tidying up, The Sun reports.

Representatives from all Nato member states met in Belgium yesterday to discuss how to improve their defences against hackers and cyber attacks, which could cause as much harm as conventional military attacks.

According to the UK’s National Crime Agency, cyber-crime is on the increase and accounted for 53 per cent of all crimes last year.

One person who is not so concerned is Ron Chrisley, director of the Centre for Cognitive Science at the University of Sussex. “The fact is, robots, despite what one might be encouraged to believe from sci-fi, and despite what may happen in the far future, currently lack what we consider real intentions, emotions and purposes,” he wrote in The Conversation last year. “And contrary to recent alarmist claims, nor are they going to acquire those capacities in the near future.”

He warns that blaming machinery when it does not have its own autonomy and intentions “raises the danger of scapegoating the robot, and failing to hold the human designers, deployers and users involved fully responsible”.

Image credit & Article via: The Week