This sounds a little like Minority Report to us. China is looking into predictive analytics to help authorities stop suspects before a crime is committed.
“Code is law,” as Lawrence Lessig described it in ‘Code and Other Laws of Cyberspace’, refers to the idea that computer code has progressively established itself as a way of regulating behavior as powerful as legal code.
With the advent of blockchain technology, code is assuming an even stronger role in regulating people’s interactions.
However, while computer code can enforce rules more efficiently than legal code, it also comes with a series of limitations.
When should a criminal defendant be required to await trial in jail rather than at home? Software could significantly improve judges’ ability to make that call—reducing crime or the number of people stuck waiting in jail.
In a new study from the National Bureau of Economic Research, economists and computer scientists used data from hundreds of thousands of New York City cases to train an algorithm to predict, from a defendant’s rap sheet and court records, whether he or she was a flight risk. When tested on more than a hundred thousand further cases that it hadn’t seen before, the algorithm proved better than judges at predicting what defendants would do after release.
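The study’s basic recipe — fit a model on one set of cases, then score it on held-out cases it never saw — can be sketched in a few lines. The data, the failure-to-appear (FTA) features, and the threshold model below are all invented stand-ins for illustration; the NBER study used real court records and a far richer machine-learning model.

```python
import random

random.seed(0)

# Synthetic stand-in records: (prior_arrests, prior_ftas, flees), where
# "flees" marks whether the defendant failed to appear. Purely illustrative.
def make_record():
    priors = random.randint(0, 10)
    ftas = random.randint(0, min(priors, 5))
    # Assumed relationship: more prior FTAs -> higher chance of fleeing again.
    flees = random.random() < 0.1 + 0.15 * ftas
    return (priors, ftas, flees)

data = [make_record() for _ in range(5000)]
train, held_out = data[:4000], data[4000:]  # held-out cases the model never saw

# Fraction of records the rule "predict flight if prior FTAs >= threshold"
# classifies correctly.
def accuracy(threshold, records):
    correct = sum((ftas >= threshold) == flees for _, ftas, flees in records)
    return correct / len(records)

# "Training": pick the threshold that best separates the training cases.
best_threshold = max(range(6), key=lambda t: accuracy(t, train))

print("best threshold:", best_threshold)
print("held-out accuracy: %.2f" % accuracy(best_threshold, held_out))
```

The point of scoring on `held_out` rather than `train` is the same as in the study: a model judged only on the cases it was fit to will look better than it really is.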
A future in which homeowners display “Beware of Drone” signs on their property may not be too far off. As CNN reports, Sunflower Labs, a startup based in Silicon Valley and Zurich, Switzerland, plans to beta test its drone-based home security system in mid-2017.
Researchers have created a machine that they claim can tell if a person is a convicted criminal simply from their facial features. The artificial intelligence, created at Shanghai Jiao Tong University, was able to correctly identify criminals from a selection of 186 photos nine out of 10 times by assessing their eyes, nose and mouth.
Over the past year or two, someone has been probing the defenses of the companies that run critical pieces of the Internet. These probes take the form of precisely calibrated attacks designed to determine exactly how well these companies can defend themselves, and what would be required to take them down. We don’t know who is doing this, but it feels like a large nation state. China or Russia would be my first guesses.
Using state driver’s license data, US law enforcement agencies have created a huge network of ID photographs that can be searched using facial-recognition software, raising legal and privacy concerns about its use.
Photographs of more than 117 million adult US citizens are now part of the “perpetual line-up,” according to a report by that name published Tuesday by the Center on Privacy and Technology at the Georgetown University Law Center.
Researchers have warned that, by 2040, robots and computers will commit more crime than humans.
Tracey Follows from The Future Laboratory, which helps businesses plan for the future, said: “Once robots can be hacked to become suicide-bombing machines, lone-robot attacks could become rife.”
Robots are becoming an inevitable part of our future.
But questions remain over whether the increased use of artificial intelligence will be a good thing for humanity.
Now academics are becoming concerned that autonomous machines will break the law – and we will be powerless to stop them.
Aptonomy Inc. has developed drone technology that could make prison breaks, robberies or malicious intrusions of any kind impossible for mere mortals.
Dubbing it a kind of “flying security guard,” the company has built its system on top of a drone often used by movie-makers, the DJI S1000+, a camera-carrying octocopter.
Police are having to investigate a fourfold rise in the number of crime reports involving shop-bought drones – including allegations that they are being flown by paedophiles over children’s playgrounds, by peeping toms spying through bedroom windows, by burglars scoping out people’s properties, and even by cash-point scammers recording PINs.