Researchers Develop World’s Most Powerful Neuromorphic Processor for AI

By Alex McFarland


In a major leap forward for the field of artificial intelligence (AI), an international team of researchers led by Swinburne University of Technology has developed the world’s most powerful neuromorphic processor for AI. It operates at more than 10 trillion operations per second (TeraOps/s), fast enough to process ultra-large-scale data.

The work was published in the journal Nature. 

Led by Swinburne’s Professor David Moss, Dr. Xingyuan Xu, and Distinguished Professor Arnan Mitchell of RMIT University, the team created an optical neuromorphic processor capable of operating more than 1,000 times faster than any previous one. The system can also process ultra-large-scale images, which matters for applications such as facial recognition, where previous optical processors have fallen short.

Professor Moss is Director of Swinburne’s Optical Sciences Centre, and he was named a top Australian researcher in physics and mathematics in the field of optics and photonics by The Australian.

“This breakthrough was achieved with ‘optical micro-combs,’ as was our world-record internet data speed reported in May 2020,” he said.

Continue reading… “Researchers Develop World’s Most Powerful Neuromorphic Processor for AI”

AI Matches or Beats Human Diagnoses in Award-Winning Study


Denti.AI’s artificial intelligence (AI) software accurately identified pathoses that were missed by board-certified dental experts in an award-winning study by Manal Hamdan, a graduate student at the University of North Carolina Adams School of Dentistry.

Hamdan concluded that Denti.AI’s technology had comparable or improved results in detecting apical radiolucencies and that it has the potential to reduce provider fatigue and diagnostic errors.

The study won the American Academy of Oral and Maxillofacial Radiology’s 2020 Albert G. Richards Graduate Student Research Award.

“Dr. Hamdan’s research is an excellent example of university and business collaboration utilizing experts in radiology, deep learning, and statistical analysis to produce clinically relevant and potentially game changing results validating software that can elevate patient care,” said Dr. Don Tyndall, professor of oral and maxillofacial radiology at the Adams School.

“Our team at Denti.AI is committed to being at the forefront of implementing artificial intelligence into clinical practices through rigorous academic validation and collaboration with leading academic institutions,” said Dmitry Tuzoff, founder and CEO of Denti.AI.

Continue reading… “AI Matches or Beats Human Diagnoses in Award-Winning Study”

AI can now help us detect disease at its earliest stages


Combining genomics, MRI scans and artificial intelligence will usher in a new era in healthcare

In many countries, Covid-19 has spread because of popular scepticism about science, political manipulation of data and an abundance of inaccurate information, spread in large part on social media but fuelled from some of the highest levels of government. In 2021, we will understand that only by developing new, science-based approaches to disease detection will we avoid similar future catastrophes.

My own country, the United States, has already provided a live demonstration of this fact. States that enforced practices such as social isolation/distancing, hand washing and use of sanitisers and face masks have had the lowest per capita rates of Covid-19 infection, averaging around 100 to 200 cases per 100,000 people. Compare that figure to the 2,300 per 100,000 people in states that did not enforce these measures.

Reconnecting science with healthcare will have impressive results. One weakness of the world’s response to the pandemic has been its unwillingness to use science to predict vulnerabilities in individuals before they become ill. Not only have many countries been reluctant to perform widespread testing on people who show no Covid-19 symptoms, but they have also been caught seemingly by surprise when pre-existing conditions have exacerbated the disease. In 2021, we will see the application of a multimodal testing approach to detect propensities to diseases in people at very early stages. 

Continue reading… “AI can now help us detect disease at its earliest stages”

DeepMind MuZero AI can master games without knowing the rules

JC Torres 


The holy grail of AI has always been to enable computers to learn the way humans do. The most powerful AIs today, however, still rely on having certain known rules, like the rules of chess or Go. Human learning is often messy and inferential, picking up the rules of life as we go. DeepMind has long been trying to create such AIs using games as its environment and test suite. Google’s sister company focused on AI research has just revealed its latest achievement, MuZero, an AI that can master a game without learning the rules beforehand.

DeepMind’s previous AIs like AlphaGo have been widely covered in media for beating human champions in their respective games. Impressive as they may have been, they were still a few steps shy of the ultimate goal. AlphaGo, in particular, had the advantage of knowing not only the rules of Go but also domain knowledge and data from human players. Its successors, AlphaGo Zero and AlphaZero, could still bank on having the rule book to learn from.

While these AIs excelled in games with complex strategies but simple visuals, they struggled when applied to more visually complex games where the rules are not so easy to infer. That’s where the new MuZero AI comes in, and DeepMind used a selection of Atari games, like Ms. Pac-Man, to test it out.

Continue reading… “DeepMind MuZero AI can master games without knowing the rules”

McDonald’s Restaurants Are Putting Cameras, Sensors, and AI Technology in Their Dumpsters

By B.N. Frank

Here’s why some McDonald’s restaurants are putting cameras in their dumpsters

McDonald’s restaurants are putting cameras in their dumpsters and trash containers in a bid to improve recycling and save money on waste collection. Nordstrom department stores are doing this as well.

Jason Gates spends a lot of his time thinking about trash, and how we can generate less of it.

Since 2013 his San Francisco-based startup, Compology, has used cameras and artificial intelligence to monitor what’s thrown into dumpsters and trash containers at businesses such as McDonald’s restaurants and Nordstrom department stores. The point is to make sure dumpsters are actually full before they’re emptied and to stop recyclable materials like cardboard from being contaminated by other junk so it, too, doesn’t become waste.

“We’ve found that most businesses and people have the right intentions about recycling, but oftentimes they just don’t know what the proper way to recycle is,” Gates, CEO of Compology, told CNN Business’ Rachel Crane.

To help them do it correctly, Compology puts trash-monitoring cameras and sensors inside industrial waste containers. The cameras take photos several times each day and when the container is lifted for dumping. An accelerometer helps trigger the camera on garbage day.

AI software analyzes the images to figure out how full the container is and can also let a customer know when something is where it shouldn’t be, such as a bag of trash tossed into a dumpster filled with cardboard boxes for recycling. Gates said the company’s cameras can cut the amount of non-recyclable materials thrown in waste containers by as much as 80%.
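As a rough sketch, the decision layer that sits on top of such a vision model could look something like the following. The function name, item labels, and 80%-full pickup threshold are illustrative assumptions, not Compology's actual software:

```python
# Hypothetical decision logic over a vision model's outputs:
# a fill-level estimate plus labels for the items it detected.
RECYCLABLE = {"cardboard", "paper"}

def assess_dumpster(fill_fraction, detected_items, pickup_threshold=0.8):
    """Turn model outputs into simple service decisions.

    fill_fraction: 0.0-1.0 estimate of how full the container is.
    detected_items: labels the vision model assigned to visible contents.
    """
    # Anything that isn't a known recyclable counts as contamination
    # in a container designated for recycling.
    contaminants = [item for item in detected_items if item not in RECYCLABLE]
    return {
        "ready_for_pickup": fill_fraction >= pickup_threshold,
        "contaminated": bool(contaminants),
        "contaminants": contaminants,
    }
```

A mostly full recycling dumpster with a trash bag in it would be flagged both as ready for pickup and as contaminated, so the customer can pull the bag out before the load is spoiled.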

Continue reading… “McDonald’s Restaurants Are Putting Cameras, Sensors, and AI Technology in Their Dumpsters”


Bonnie Burton

In Japan, not only can you have artificial intelligence pick your mate, but you can also have two giant Pikachu mascots standing by as you say “I do.”

Japan’s Cabinet Office is asking for budget approval for a new dating service driven by artificial intelligence.

Finding the perfect mate can feel impossible, especially when in-person interactions have come to a screeching halt due to COVID-19 lockdowns. But if you live in Japan, the government there wants to help you find eternal love — or at least your future spouse — using artificial intelligence.

In an effort to boost Japan’s declining birth rate, the government has been trying to help single heterosexual men and women find true love so they get married and start families. The number of annual marriages in Japan has fallen from 800,000 in 2000 to 600,000 in 2019.

According to Sora News 24, roughly 25 of Japan’s 47 prefectures currently have some sort of government-run matchmaking service for singles where the users plug in their preferences for a potential mate — including age, income and educational level. The dating services then provide a list of other users who meet their criteria. 
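A minimal sketch of the matching step those services perform. The field names, data shapes, and thresholds here are assumptions for illustration, not the schema of any actual prefectural system:

```python
# Hypothetical preference filter: return candidates who satisfy the
# user's stated criteria for age, income, and education level.
def matches(candidates, min_age, max_age, min_income, education_levels):
    return [
        c for c in candidates
        if min_age <= c["age"] <= max_age
        and c["income"] >= min_income
        and c["education"] in education_levels
    ]

candidates = [
    {"name": "A", "age": 28, "income": 4_000_000, "education": "university"},
    {"name": "B", "age": 45, "income": 6_000_000, "education": "high school"},
]

picks = matches(candidates, min_age=25, max_age=35,
                min_income=3_000_000, education_levels={"university"})
```

The AI-driven upgrade the Cabinet Office is seeking budget for would presumably go beyond this kind of hard filtering, but plain criteria matching is what the existing services provide.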


AI warning: Robot soldiers only 15 years away from ‘changing face’ of warfare – expert


ARTIFICIAL INTELLIGENCE (AI) empowered fighting robots will soon transform combat, a military expert has warned.

General Sir Nick Carter, the UK’s Chief of the Defence Staff, last week suggested the British Army may one day fill its ranks with “robot soldiers”. This may seem a daunting prospect, and one that will never come to fruition – but a military expert has now predicted such highly intelligent military robots are actually a mere 15 years away from “changing the face” of warfare.

The high-tech machines will employ cutting-edge AI to inform strategy concerning the “layout of the land and possible threats” in real-time, Charles Glar has revealed.

Continue reading… “AI warning: Robot soldiers only 15 years away from ‘changing face’ of warfare – expert”

What can AI learn from human intelligence?


At HAI’s fall conference, scholars discussed novel ways AI can learn from human intelligence – and vice versa.

Can we teach robots to generalize their learning? How can algorithms become more commonsensical? Can a child’s learning style influence AI?

Stanford Institute for Human-Centered Artificial Intelligence’s fall conference considered those and other questions to understand how to mutually improve and better understand artificial and human intelligence. The event featured the theme of “triangulating intelligence” among the fields of AI, neuroscience, and psychology to develop research and applications for large-scale impact.

HAI faculty associate directors Christopher Manning, a Stanford professor of machine learning, linguistics, and computer science, and Surya Ganguli, a Stanford associate professor of neurobiology, served as hosts and panel moderators for the conference, which was co-sponsored by Stanford’s Wu Tsai Neurosciences Institute, Department of Psychology, and Symbolic Systems program.

Speakers described cutting-edge approaches—some established, some new—to create a two-way flow of insights between research on human and machine-based intelligence, for powerful application. Here are some of their key takeaways.

Continue reading… “What can AI learn from human intelligence?”

UK Army could be 25-percent robotic by 2030, says British general


Do you want Skynet? Because this is how you get Skynet

The big picture: The UK military is moving forward with plans to develop and deploy several thousand combat robots, some of which might be autonomous. So far, militaries worldwide have avoided letting unmanned technologies make combat decisions on their own. Semi-autonomous drones have a pilot who is always at the controls, so humans, not AI, make the final strike decisions.

British Army leaders think that by 2030 nearly a quarter of the UK’s ground troops will be robots. That is almost 30,000 autonomous and remote-controlled fighting machines deployed within about a decade.

“I suspect we could have an army of 120,000, of which 30,000 might be robots, who knows?” General Sir Nick Carter told The Guardian in an interview.

Continue reading… “UK Army could be 25-percent robotic by 2030, says British general”

How artificial intelligence may be making you buy things


If you are getting told off for spending too much on wine, maybe you can blame it on artificial intelligence

The shopping lists we used to scribble on the back of an envelope are increasingly already known by the supermarkets we frequent.

First via the loyalty cards we scan at checkouts, and increasingly via our online baskets, our shopping habits are no longer a secret.

But now more retailers are using artificial intelligence (AI) – software systems that can learn for themselves – to automatically predict and encourage our specific preferences and purchases like never before.

Continue reading… “How artificial intelligence may be making you buy things”

If you train robots like dogs, they learn faster


Instead of needing a month, the robot mastered new “tricks” in just days with reinforcement learning.

Treats-for-tricks works for training dogs — and apparently AI robots, too.

That’s the takeaway from a new study out of Johns Hopkins, where researchers have developed a new training system that allowed a robot to quickly learn how to do multi-step tasks in the real world — by mimicking the way canines learn new tricks.
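The "treats for tricks" idea corresponds to reward shaping in reinforcement learning: give the robot a small reward for each completed sub-step of a task rather than only at the very end. A toy sketch of why that speeds things up, with made-up numbers and no claim to match the Johns Hopkins system:

```python
import random

# Toy multi-step task: each sub-step must succeed before the next is
# attempted. "value" is a per-step learned estimate, and (as a crude
# stand-in for improving skill) the success probability rises with it.
def train(steps_in_task, episodes, shaped, lr=0.5, seed=0):
    rng = random.Random(seed)
    value = [0.0] * steps_in_task
    for _ in range(episodes):
        for step in range(steps_in_task):
            success = rng.random() < 0.5 + 0.5 * value[step]
            # shaped: a "treat" after every successful sub-step;
            # unshaped: a treat only when the final sub-step succeeds.
            reward = 1.0 if success and (shaped or step == steps_in_task - 1) else 0.0
            value[step] += lr * (reward - value[step])
            if not success:
                break  # a failed sub-step ends the attempt
    return value

shaped = train(3, episodes=200, shaped=True)
unshaped = train(3, episodes=200, shaped=False)
```

With this naive per-step update, the unshaped learner never gets any signal on the early sub-steps (their value stays at zero), while the shaped learner improves on every sub-step from the first episode. Real RL algorithms do propagate a final reward backwards, but slowly; intermediate treats shortcut that credit-assignment problem, which is the intuition behind the days-instead-of-a-month result.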

Continue reading… “If you train robots like dogs, they learn faster”

Disney’s robot can perform realistic eye movements & social interaction

In the past few years, scientists and engineers have developed robots for automated systems that perform repetitive tasks. Meanwhile, Disney Research has been developing human-like robots with abilities ranging from performing stunts to making eerily lifelike eye contact.

Disney Research recently published a paper describing a realistic and interactive gaze system for an Audio-Animatronic humanoid. Previous robot development has focused on the technical implementation of human interaction. The team’s latest advance instead approaches gaze interaction “through the lens of character animation where the fidelity and believability of motion are paramount,” the authors wrote.

For nearly three decades, Disney has been developing animatronic figures, or life-like robots combined with audio and visual elements. These animal or human characters are seen in Disney theme parks around the world.

Continue reading… “Disney’s robot can perform realistic eye movements & social interaction”