Lexus is reimagining the future of self-driving cars.
Although the future of autonomous cars is certainly exciting, much of what it will look like remains unknown. Will we still sit in the “driver’s” seat, or will the interiors of new cars look more like a café? This is one of the questions that Lexus is trying to answer.
The luxury carmaker partnered with two TED fellows to figure out what the future of self-driving vehicles will look like. The project also aims to ease fears about losing the interactive part of driving.
Although true autonomous cars won’t be a reality for most consumers anytime soon, addressing these problems now will help make their adoption much smoother.
Concept illustration of the adaptable Wheel-and-Leg Transformable Robot currently being developed under a Defense Advanced Research Projects Agency contract.
A team of researchers is creating mobile robots for military applications that can determine, with or without human intervention, whether wheels or legs are better suited to the terrain being crossed. The Defense Advanced Research Projects Agency (DARPA) has partnered with Kiju Lee at Texas A&M University to enhance these robots’ ability to travel self-sufficiently through urban military environments.
The DARPA OFFensive Swarm-Enabled Tactics (OFFSET) program awarded a follow-on contract to Lee, associate professor in the Department of Engineering Technology and Industrial Distribution and the J. Mike Walker ’66 Department of Mechanical Engineering, and her team of graduate students. The award follows her earlier work under OFFSET Sprint-3 developing a mixed-reality swarm simulator with embedded consensus-based decision making for adaptive human-swarm teaming, which was showcased alongside other participating teams at OFFSET’s third field experiment (FX3).
Artist’s impression of the autonomous road-repair system, which looks part tank, part road roller. The Robotiz3d vehicle should be seen on UK roads next year.
Scientists are building autonomous repair robots that will use AI to identify and fix potholes in UK roads.
Liverpool spin-out Robotiz3d is planning to put its robots on UK streets in 2021.
The weird vehicles look like a cross between a road roller and a heavy-duty tank.
They use AI to identify potholes and can deposit and flatten asphalt as a quick fix.
The electric, self-driving bots – which are being built by a spin-out company from the University of Liverpool called Robotiz3d – can find small cracks in the road and cover them with asphalt.
General Motors’ Cruise autonomous vehicle unit says it will pull the human backup drivers from its vehicles in San Francisco by the end of the year.
Cruise CEO Dan Ammann said in a statement that the company got a permit Thursday from California’s Department of Motor Vehicles to let the cars travel on their own.
The move follows last week’s announcement from Waymo that it would open its autonomous ride-hailing service to the public in the Phoenix area in vehicles without human drivers.
Waymo, a unit of Google parent Alphabet Inc., is hoping to eventually expand the service into California, where it already has a permit to run without human backups.
Cruise has reached the point where it’s confident that it can safely operate without humans in the cars, spokesman Ray Wert said. There’s no date for starting a ride service, which would require further government permission, he said.
Polestar accelerates the shift to sustainable mobility, by making electric driving irresistible.
Parking prices, congestion charges, fuel stations – costs and diversions no one had to face on the way to work in the age of the horse and carriage. The switch from horse-riding to automobiles changed the kind of materials used to build roads – slippery asphalt replaced cobbled streets and dirt roads.
Autonomous driving and other new transportation modes are key technological megatrends in the infrastructure industry. This calls for the built environment to adjust to these latest mobility technologies as they shape the future of roads and real estate construction.
In the public sphere, adaptation to these new mobility technologies has already begun in the regulatory space. Nations across Asia, Europe and North America, for example, have already issued autonomous-testing permits and introduced regulations for self-driving cars on public roads.
It’s 2025 and driverless cars still aren’t zooming around everywhere. Where are the chilled-out passengers on their phones, or napping, as an invisible “driver” navigates a crowded intersection?
They’re still mostly stuck in the back seat as a human driver shuttles them around. They’re likely in a highly automated, autonomous-capable vehicle, but a human is still there monitoring the machine. That doesn’t mean robo-vehicles aren’t on the road. Instead, they’re working behind the scenes: picking up our groceries, filling trucks with our endless online shopping purchases, and hauling crates of produce across the country.
The pandemic made us more comfortable with the idea of autonomous vehicles, but most industry experts still predict a slow transition to their widespread adoption in the U.S. When you’re avoiding exposure to a deadly disease, perhaps a driverless robotaxi, like the Waymo One service in suburban Phoenix, looks more attractive. But autonomous tech and testing regulations won’t accelerate just because of sudden mainstream acceptance and new social distancing norms.
Motional, the new brand from self-driving startup Aptiv and Hyundai, asked just over 1,000 U.S. adults in July how they perceive autonomous vehicles (AVs). More than 60 percent said AVs “are the way of the future.” A quarter of those surveyed said they are interested in experiencing the tech regularly. A year earlier, the American Automobile Association (AAA) surveyed a similarly sized group of Americans and found 71 percent were afraid to ride in a self-driving car. (Note: how the two groups’ demographics compare is unknown.)
The next five years will likely continue to shift and refocus how we think about self-driving technology. While self-driving ride-shares won’t be the norm, more people will have experienced autonomy on the road. Motional CEO Karl Iagnemma thinks that by 2025, “if you haven’t taken a driverless journey you will know someone who has.”
Researchers at the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory and the University of Texas at Austin have developed an algorithm that could have big implications for autonomous vehicles. With the algorithm, autonomous ground vehicles are able to improve their own navigation systems by watching a human drive.
The approach developed by the researchers is called adaptive planner parameter learning from demonstration, or APPLD. It was tested on an Army experimental autonomous ground vehicle.
The research was published in IEEE Robotics and Automation Letters under the title “APPLD: Adaptive Planner Parameter Learning From Demonstration.”
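The core idea behind APPLD is to tune an existing planner's parameters (e.g., speed limits or sampling rates) so that the planner reproduces a human demonstration. The sketch below illustrates that idea in one dimension with a hypothetical toy planner and a single parameter; the planner, parameter names, and random-search optimizer are all illustrative stand-ins, not the paper's actual method.

```python
import random

def toy_planner(params, start, goal, n_steps):
    """Toy 1-D planner: moves toward the goal, with step size capped by
    the tunable parameter 'max_speed' (standing in for a real planner's
    parameter set)."""
    x = start
    path = [x]
    for _ in range(n_steps):
        step = max(-params["max_speed"], min(params["max_speed"], goal - x))
        x += step
        path.append(x)
    return path

def trajectory_loss(params, demo, start, goal):
    """Squared distance between the planner's path and the demonstration."""
    path = toy_planner(params, start, goal, n_steps=len(demo) - 1)
    return sum((p - d) ** 2 for p, d in zip(path, demo))

def learn_params_from_demo(demo, start, goal, n_samples=200, seed=0):
    """Random search standing in for the black-box optimizer used to fit
    planner parameters to a demonstration segment."""
    rng = random.Random(seed)
    best, best_loss = None, float("inf")
    for _ in range(n_samples):
        cand = {"max_speed": rng.uniform(0.05, 2.0)}
        loss = trajectory_loss(cand, demo, start, goal)
        if loss < best_loss:
            best, best_loss = cand, loss
    return best

# Demonstration: a human "drove" at a cautious 0.5 units/step toward a goal at 10.
demo = [min(10.0, 0.5 * i) for i in range(21)]
learned = learn_params_from_demo(demo, start=0.0, goal=10.0)
# learned["max_speed"] lands near the demonstrated 0.5
```

The point is that the planner itself is never retrained; only its parameters are adjusted until its behavior matches what the human showed it.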
A Ford Escape automatically stops as a pedestrian crosses in front during a self park demonstration, Tuesday, Aug. 25, 2020 in Detroit.
Ford, Bosch and real estate company Bedrock are teaming up to test technology that will let vehicles park by themselves in parking decks. The companies are testing the technology using floor-mounted sensors and computers that can control mainly existing features in the Ford Escape.
They say the technology is likely to arrive before fully autonomous vehicles are in widespread use, because the sensors and computers can be installed in the parking deck itself rather than in each vehicle.
RFISee, a developer of affordable imaging radars for the automotive industry, is unveiling the first Phased Array 4D imaging radar on a chip. RFISee’s all-weather radar has demonstrated the ability to detect cars from 500 meters and pedestrians from 200 meters, with an angular resolution finer than 1°.
The company’s engineers have adapted phased-array antenna technology, used in military systems including the F-35 fighter jet and air-defence systems, while reducing the price to the level of current automotive sensors. Prototypes of RFISee’s radar are under evaluation by top automotive OEMs and Tier-1s.
Unlike many traditional and newer radar types, RFISee’s patented 4D imaging radar uses a powerful focused beam based on proprietary phased-array radar technology. The focused beam, created by dozens of transmitters, rapidly scans the field of view. The receivers deliver a much-improved radar image, a better signal-to-noise ratio, and a detection range for obstacles such as cars and pedestrians that is six times longer than that of existing radars. The competitive edge of RFISee’s radar prototype has already been proven in extensive testing.
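Why a focused beam helps so much follows from the standard radar range equation: maximum detection range scales with the fourth root of the transmitted power-gain product, so a sixfold range improvement demands a very large increase in effective gain. A back-of-the-envelope sketch (the specific gain numbers are illustrative, not RFISee's figures):

```python
def max_range_scale(tx_gain_factor, rx_gain_factor=None):
    """Relative change in maximum detection range from the radar range
    equation R_max ∝ (P * G_t * G_r / SNR_min) ** 0.25, all else equal."""
    if rx_gain_factor is None:
        rx_gain_factor = tx_gain_factor  # same array transmits and receives
    return (tx_gain_factor * rx_gain_factor) ** 0.25

# A 36x effective gain on both transmit and receive (e.g., from beam
# focusing across dozens of elements) yields a 6x detection range,
# since (36 * 36) ** 0.25 == 6.
scale = max_range_scale(36.0)
```

The fourth-root scaling is why modest-looking range claims imply dramatic gain improvements under the hood.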
Helm.ai today announced a breakthrough in unsupervised learning technology. This new methodology, called Deep Teaching, enables Helm.ai to train neural networks without human annotation or simulation for the purpose of advancing AI systems. Deep Teaching offers far-reaching implications for the future of computer vision and autonomous driving, as well as industries including aviation, robotics, manufacturing and even retail.
Artificial intelligence, or AI, is commonly understood as the science of simulating human intelligence processes with machines. Supervised learning refers to training neural networks to perform tasks from labeled examples, typically provided by a human annotator or a synthetic simulator, while unsupervised learning enables AI systems to learn from unlabeled data, inferring structure and producing solutions without pre-established input-output pairs.
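The distinction can be made concrete with two tiny examples: a supervised fit, where every input comes with a labeled output, and an unsupervised clustering, where structure must be inferred from unlabeled data alone. This is a generic illustration of the two paradigms, not Helm.ai's Deep Teaching method.

```python
def supervised_fit(xs, ys):
    """Supervised: every input x comes with a labeled output y; learn the
    slope w minimizing squared error (least squares through the origin)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def unsupervised_cluster(xs, iters=10):
    """Unsupervised: no labels; infer two groups in 1-D data (2-means).
    Minimal sketch with no empty-cluster handling."""
    c0, c1 = min(xs), max(xs)  # initialize centers at the extremes
    for _ in range(iters):
        a = [x for x in xs if abs(x - c0) <= abs(x - c1)]
        b = [x for x in xs if abs(x - c0) > abs(x - c1)]
        c0, c1 = sum(a) / len(a), sum(b) / len(b)
    return c0, c1

w = supervised_fit([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])           # labels given: w = 2.0
c0, c1 = unsupervised_cluster([1.0, 1.2, 0.8, 5.0, 5.2, 4.8])  # groups found near 1.0 and 5.0
```

Deep Teaching's claim is that useful training signal can be obtained in the second regime, without the human-annotated labels the first regime depends on.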
The expression “elephant in the room” refers to an important question that everyone knows about but no one wants to discuss because it makes them uncomfortable.
Today, in the area of Autonomous Vehicles (AVs), there are three elephants in the room that are worth exploring.