Level 5 self-driving means autonomous cars can drive themselves anywhere, at any time, in any conditions.
How do you beat Tesla, Google, Uber, and the rest of the multi-trillion-dollar automotive industry, with massive brands like Toyota, General Motors, and Volkswagen, to a full self-driving car? Perhaps by finding a way to train your AI systems that is 100,000 times cheaper.
It’s called Deep Teaching.
Perhaps not surprisingly, it works by taking human effort out of the equation.
And Helm.ai says it’s the key to unlocking autonomous driving, including cars driving themselves on roads they’ve never seen, using just one camera.
“AI fundamentally changes how we develop apps and what apps can do, and I would say we’re at the beginning of that revolution,” Intuit CTO Marianna Tessel said.
In a conversation with Nara Logics CEO Jana Eggers at Transform 2020 today, Tessel outlined some of the key ways AI is changing the mindset at Intuit, with a focus on the app development process. Notably, Intuit is trying to adapt to AI much as it adapted to the emergence of smartphones.
This AI turns blurry pixelated photos into hyperrealistic portraits that look like real people. The system automatically increases any image’s resolution up to 64x, ‘imagining’ features such as pores and eyelashes that weren’t there in the first place.
Duke University researchers have developed an AI tool that can turn blurry, unrecognizable pictures of people’s faces into eerily convincing computer-generated portraits, in finer detail than ever before.
Previous methods can scale an image of a face up to eight times its original resolution. But the Duke team has come up with a way to take a handful of pixels and create realistic-looking faces with up to 64 times the resolution, ‘imagining’ features such as fine lines, eyelashes and stubble that weren’t there in the first place.
“Never have super-resolution images been created at this resolution before with this much detail,” said Duke computer scientist Cynthia Rudin, who led the team.
The system cannot be used to identify people, the researchers say: It won’t turn an out-of-focus, unrecognizable photo from a security camera into a crystal clear image of a real person. Rather, it is capable of generating new faces that don’t exist, but look plausibly real.
While the researchers focused on faces as a proof of concept, the same technique could in theory take low-res shots of almost anything and create sharp, realistic-looking pictures, with applications ranging from medicine and microscopy to astronomy and satellite imagery, said co-author Sachit Menon ’20, who just graduated from Duke with a double-major in mathematics and computer science.
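The core idea behind the Duke method (published as PULSE) is to search a face generator’s latent space for high-resolution images that, when scaled back down, match the blurry input, rather than trying to sharpen the input directly. The sketch below illustrates only that downscaling-consistency search loop; the random linear “generator,” the image sizes, and the optimizer settings are stand-in assumptions for illustration, not the actual system, which uses a pretrained GAN:

```python
import numpy as np

def downscale(img, factor):
    """Average-pool a square image by an integer factor (a stand-in for
    the fixed downscaling operator the method assumes)."""
    n = img.shape[0] // factor
    return img.reshape(n, factor, n, factor).mean(axis=(1, 3))

def make_toy_generator(size, latent_dim, seed=0):
    """Stand-in for a pretrained face generator: a fixed random linear
    map from latent vectors to images. The real method uses a GAN."""
    basis = np.random.default_rng(seed).standard_normal((size * size, latent_dim))
    return lambda z: (basis @ z).reshape(size, size)

def latent_search(lr_img, factor, latent_dim=8, steps=3000, step_size=0.1):
    """Search latent space for a high-res image whose downscaled version
    matches the observed low-res input (finite-difference gradients)."""
    size = lr_img.shape[0] * factor
    gen = make_toy_generator(size, latent_dim)

    def loss(z):
        return np.sum((downscale(gen(z), factor) - lr_img) ** 2)

    z = np.zeros(latent_dim)
    eps = 1e-5
    for _ in range(steps):
        base = loss(z)
        # forward-difference estimate of the downscaling-loss gradient
        grad = np.array([(loss(z + eps * np.eye(latent_dim)[i]) - base) / eps
                         for i in range(latent_dim)])
        z -= step_size * grad
    return gen(z), loss(z)
```

Because the search only enforces consistency after downscaling, many different “imagined” high-res faces can explain the same low-res input, which is exactly why the output is plausible rather than identifying.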
The classic eye exam may be about to get an upgrade. Researchers have developed an online vision test—fueled by artificial intelligence (AI)—that produces much more accurate diagnoses than the sheet of capital letters we’ve been staring at since the 19th century. If perfected, the test could also help patients with eye diseases track their vision at home.
“It’s an intriguing idea” that reveals just how antiquated the classic eye test is, says Laura Green, an ophthalmologist at the Krieger Eye Institute. Green was not involved with the work, but she studies ways to use technology to improve access to health care.
The classic eye exam, built around the Snellen chart, has been around since 1862. The farther down the chart a person can read, the better their vision. The test is quick and easy to administer, but it has problems, says Chris Piech, a computer scientist at Stanford University. Patients start to guess at letters when the letters become blurry, he says, which means they can get different scores each time they take the test.
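One way a probabilistic test can account for guessing, sketched below, is to model each response with a psychometric curve that floors at the guessing rate instead of dropping to zero, then estimate the acuity threshold by maximum likelihood over many trials. This is an illustration only, not the Stanford team’s actual model; the guessing rate, curve slope, and size scale are all assumed parameters:

```python
import numpy as np

GUESS = 0.1  # assumed chance of naming one of 10 Sloan letters by luck

def p_correct(letter_size, acuity, slope=3.0):
    """Psychometric curve: probability of naming a letter correctly.
    Larger letters are easier; the curve floors at the guessing rate."""
    p_see = 1.0 / (1.0 + np.exp(-slope * (letter_size - acuity)))
    return GUESS + (1.0 - GUESS) * p_see

def estimate_acuity(sizes, responses):
    """Maximum-likelihood estimate of the acuity threshold on a grid."""
    sizes = np.asarray(sizes, dtype=float)
    responses = np.asarray(responses, dtype=bool)
    grid = np.linspace(0.0, 2.0, 401)
    best, best_ll = grid[0], -np.inf
    for a in grid:
        p = p_correct(sizes, a)
        ll = np.sum(np.where(responses, np.log(p), np.log(1.0 - p)))
        if ll > best_ll:
            best, best_ll = a, ll
    return best

# simulate a patient with true threshold 1.0 on an arbitrary size scale
rng = np.random.default_rng(0)
sizes = rng.uniform(0.0, 2.0, 500)
responses = rng.random(500) < p_correct(sizes, 1.0)
estimate = estimate_acuity(sizes, responses)
```

Because lucky guesses are expected by the model rather than mistaken for genuine vision, repeated runs converge toward the same threshold, in contrast to the run-to-run variability of the letter chart.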
Users of the homepages of the MSN website and the Edge browser will now see news stories selected and curated by AI
Dozens of journalists have been sacked after Microsoft decided to replace them with artificial intelligence software.
Staff who maintain the news homepages on Microsoft’s MSN website and its Edge browser – used by millions of Britons every day – have been told that they will no longer be required because robots can now do their jobs.
Around 27 individuals employed by PA Media – formerly the Press Association – were told on Thursday that they would lose their jobs in a month’s time after Microsoft decided to stop employing humans to select, edit and curate news articles on its homepages.
Russian researchers from HSE University and Open University for the Humanities and Economics have demonstrated that artificial intelligence can infer people’s personality from ‘selfie’ photographs better than human raters do. Conscientiousness proved easier to recognize than the other four traits, and personality predictions based on female faces were more reliable than those based on male faces. The technology could be used to find the ‘best matches’ in customer service, dating or online tutoring.
‘We want to make anything and everything on the platform shoppable’
Facebook is launching what it’s calling a “universal product recognition model” that uses artificial intelligence to identify consumer goods, from furniture to fast fashion to fast cars.
It’s the first step toward a future where the products in every image on its site can be identified and potentially shopped for. “We want to make anything and everything on the platform shoppable, whenever the experience feels right,” Manohar Paluri, head of Applied Computer Vision at Facebook, told The Verge. “It’s a grand vision.”
The drones would fly alongside Air Force warplanes, doing jobs too dangerous or dull for pilots.
The Air Force is soliciting the aerospace industry to provide flyable “Skyborg” drones by 2023.
The drones will be powered by artificial intelligence, capable of taking off, landing, and performing missions on their own.
Skyborg will not only free manned pilots from dangerous and dull missions but allow the Air Force to add legions of new, unpiloted, cheap planes.
The U.S. Air Force is finally pushing into the world of robot combat drones, vowing to fly the first of its “Skyborg” drones by 2023. The service envisions Skyborg as a merging of artificial intelligence with jet-powered drones. The result will be drones capable of flying alongside fighter jets, carrying out dangerous missions. Skyborg drones will be much cheaper than piloted aircraft, allowing the Air Force to grow its fleet at a lower cost.
In a time of COVID-19 disruption, futurists can accelerate organizational recovery and capacity. When partnered with purpose-built AI, augmented intelligence can also spur radical innovation.
Machine learning, task automation and robotics are already widely used in business. These and other AI technologies are about to multiply, and we look at how organizations can best take advantage of them.
COVID-19 disruption has left enterprises with no choice but to reassess digital transformation investments and roadmaps. While less important projects are delayed, transformation projects involving AI and automation are receiving significant attention right now. In just the last 60 days, adoption of AI technologies across the enterprise has surged, with an unmistakable sense of urgency.
One area where AI can make a tremendous impact, yet one we’re not really talking about, is modeling future scenarios based on the myriad new data stemming from pandemic disruption. Beyond automation, adding an AI Futurist as a virtual strategic advisor to the C-Suite can help executives navigate this Novel Economy as it takes shape over the next 36 months. In a time when no playbook, expertise, or best practices exist, perhaps this is AI’s moment to shine.
Machine-learning models trained on normal behavior are showing cracks, forcing humans to step in to set them straight.
In the week of April 12-18, the top 10 search terms on Amazon.com were: toilet paper, face mask, hand sanitizer, paper towels, Lysol spray, Clorox wipes, mask, Lysol, masks for germ protection, and N95 mask. People weren’t just searching, they were buying too—and in bulk. The majority of people looking for masks ended up buying the new Amazon #1 Best Seller, “Face Mask, Pack of 50”.
When covid-19 hit, we started buying things we’d never bought before. The shift was sudden: the mainstays of Amazon’s top ten—phone cases, phone chargers, Lego—were knocked off the charts in just a few days. Nozzle, a London-based consultancy specializing in algorithmic advertising for Amazon sellers, captured the rapid change in this simple graph.
Engineers at the Georgia Institute of Technology have designed the first robot capable of not only playing music, but creating music—and its name is Shimon.
The musical robot was trained on a vast data set of everything from progressive rock to jazz to rap. Shimon takes this knowledge of past music and uses algorithms to come up with unique robot music of his own.
The rise of artificial intelligence in the workplace to enable and sustain the digital workforce is a clear trend for 2020.
Artificial intelligence, machine learning, neural networks: whatever fancy term industry coins for it, it is sophisticated computer technology that is becoming widely used to understand and improve business and customer experiences. You have probably heard of it before, but as defined today it is an area of computer science that emphasizes the creation of intelligent machines that work and react like humans.
Here are ten AI trends to be on the lookout for this year: