As companies rush to develop delivery robots, disinfection systems, and autonomous vehicles, their engineering teams must maximize productivity amid pandemic and business constraints. New software and hardware platforms put artificial intelligence at the edge to facilitate robotics innovation and speed to market. For instance, ADLINK Technology Inc. said its Edge AI portfolio automates edge computing processes to free developers from the need for advanced knowledge of data science and machine learning models.
In June, ADLINK released the ROScube-I, a new controller using Intel processors that it said supports a wide variety of sensors and actuators to enable faster, easier, and scalable development of robotics applications. The Taiwan-based company also launched the ROScube-X, which uses the NVIDIA Jetson AGX module for robotics applications requiring artificial intelligence and minimal power consumption. Both controllers are built on ROS 2, the latest version of the open-source Robot Operating System.
“The difference between autonomous machines such as a Roomba and an autonomous vehicle is just a matter of scale, complexity, and risk,” said Joe Speed, field chief technology officer at ADLINK. “Passenger-carrying applications are stalled because of the coronavirus, but AMRs [autonomous mobile robots], AGVs [automated guided vehicles], and cargo robots are seeing two years’ worth of expected demand within six months.”
Defining edge AI and sharing more machine vision R&D
“We don’t use the mangled definition of ‘edge computing’ that network operators use,” Speed told The Robot Report. “We’re not just talking about the edge of telecommunications networks; we’re moving compute where sensors and data are generated — not the office or the data center. These controllers are for robots in industries like aerospace and vehicle manufacturing.”
“ADLINK’s technology and software partners enable things like optical inspection, hand-eye coordination for robots, and rugged industrial applications,” he said. “Compute can be built right into the end of the robot arm.”
“There are people doing a lot with computer vision and hardware-accelerated machine learning for object detection and SLAM [simultaneous localization and mapping], but not enough gets contributed upstream,” Speed noted. “With ROS, we have a solid framework, and some university research is published, but there is not enough posting of code to GitHub.”
“Intel has done well with OpenVINO, enabling the MoveIt path planner, and for perception, adding Movidius support for the RealSense cameras in ROS. With its Keem Bay modules, it has achieved Xavier-level performance on a fifth of the power budget,” he said. “NVIDIA historically hasn’t done a lot of direct contributions to ROS, but I think that’s going to change with Jetson GPUs [graphics processing units].”
ROScube-X. Source: ADLINK
ADLINK touts ties to the global robotics community
“Unlike random box builders, ADLINK has software labs around the world,” said Speed. “We’re big contributors to ROS 2, along with Amazon Web Services [AWS], Bosch, iRobot, and Open Robotics, as well as to the Eclipse Foundation and Autoware, which provide Tier 1 middleware for autonomous driving.”
Speed said he expects big hardware and middleware players to increasingly support ROS. “ARM recently joined as a contributor. Its plans for profiling, optimization, and building trust in ARM support affect the behavior of robots,” he said. “The Cyclone DDS [Data Distribution Service] is very small, fast, and efficient. If it can move data with half the CPU cycles, it has implications for battery life.”
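In ROS 2, the DDS implementation Speed mentions is swappable at runtime through the RMW (ROS middleware) abstraction layer. As a sketch, assuming a ROS 2 Foxy installation with the Cyclone DDS RMW package present, switching a node over to Eclipse Cyclone DDS looks like this:

```shell
# Select Eclipse Cyclone DDS as the ROS 2 middleware (RMW) implementation.
# Assumes a ROS 2 Foxy install with the rmw_cyclonedds_cpp package available.
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp

# Any node launched from this shell now moves its data over Cyclone DDS:
ros2 run demo_nodes_cpp talker
```

Because the choice is an environment variable rather than a code change, developers can benchmark CPU cycles and battery impact across DDS vendors without touching their application.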
ADLINK says ROS 2 designed for commercial developers
ROS was widely viewed as not robust enough for commercial applications, but the Foxy Fitzroy build presents new opportunities, Speed said.
“A lot of people tried ROS and didn’t like it,” he acknowledged. “ROS Classic was very much a tool for academics, and people would harden it and add to it. They didn’t advertise it, but companies like Cruise were originally built on ROS and NVIDIA Isaac.”
“It’s not obvious to outsiders, but ROS 2 has a heavy investment from technology corporations,” said Speed. “There are dozens of contributing engineers at AWS, and those who aren’t roboticists are software engineers and testing and QA [quality assurance] people with serious skills.”
“We knew what was needed for ROS 2 — fault tolerance and real-time analytics for safety,” he said. “We rearchitected ROS to be built on DDS, which is used by industry, the U.S. Department of Defense, and NASA. DDS is in every Fujitsu fiber switch, switches in 15 navies, and medical and farming robots.”
“By building robots on a solid foundation, middleware can get to autonomy faster,” Speed claimed. “Engineers can get to bigger, heavier things that move faster or with more valuable cargo or people.”
“If you were on the fence, Foxy is the time,” he said. “That release has the contributions of 150 engineers. If you look at the development, testing, and release for commercial software, it would have been an $8 million effort.”
“When you look at ROS or ROS 2, I challenge you to find more than 10 robotics startups that are not using it,” said Speed. “ROS is coming to robot arms, too. Trajekt Sports has developed a robot that can accurately replicate any recently recorded MLB pitch.”
Pandemic puts demands on robotics development
“While autonomous passenger vehicle programs have slowed a bit or are stalled, delivery robotics has sped up,” said Speed. “For example, Damien Declerq at Neolix has described a 20x increase in orders in 90 days. The pandemic has changed the timelines.”
“I have friends at Box Robotics in Philadelphia who have 36 years of experience in warehouse robots,” he said. “They say COVID-19 has not only accelerated the shift from retail stores to e-commerce, but it has also put a real strain on warehouses and supply chains.”
“We’re working with Ouster and SICK on using ROS 2 perception to make a warehouse vehicle safely drive twice as fast. This would save 30% of the cost and increase throughput,” Speed said. “Most vehicles are rated at 2 m/sec. but really move at 1 to 1.5 m/sec., when they should be able to move at the speed of a human or a forklift, about 5 m/sec.”
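The quoted speeds imply a simple back-of-the-envelope calculation. The speeds below are from the article; the route length is a hypothetical figure chosen only to make the arithmetic concrete:

```python
# Back-of-the-envelope: effect of AGV travel speed on time per trip.
# Speeds are those quoted in the article; the route length is hypothetical.
route_m = 120.0          # hypothetical one-way route length in a warehouse, meters

typical_speed = 1.5      # m/s, typical real-world AGV speed per the article
doubled_speed = 3.0      # m/s, the "twice as fast" target
forklift_speed = 5.0     # m/s, human/forklift pace cited as the eventual goal

for v in (typical_speed, doubled_speed, forklift_speed):
    print(f"{v} m/s -> {route_m / v:.0f} s per leg")
# 1.5 m/s -> 80 s per leg
# 3.0 m/s -> 40 s per leg
# 5.0 m/s -> 24 s per leg
```

Doubling speed halves the time per leg, so the same fleet can move roughly twice the payloads per hour — which is where the claimed cost savings and extra throughput come from.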
ADLINK’s customers include a major aerospace manufacturer that requires any robots that it buys to support ROS, Speed said. “ROS Industrial is starting to be built into the procurement requirements of big manufacturers,” he said.
ROS 2 development environment. Source: ADLINK
Fleet management challenges ahead
“There is a multiyear program to automate Changi General Hospital in Singapore,” he said. “It’s a heterogeneous environment in which robots must integrate with hospital systems and with one another. A robot can summon an elevator, roll into it, and make way for another robot. These robots are already sharing space with humans, doing everything from resupply to bedside care, all working together.”
ADLINK is also working with major e-commerce providers on autonomous delivery, AMRs, and self-driving vehicles, but Speed declined to name any partners.
“With Autoware for designing autonomous cargo delivery, a robot could pick up parts in one part of a car factory, navigate onto a street and into another building to deliver them at the right moment,” he said. “This combines indoor and outdoor navigation and operating in areas with other robots.”
“ADLINK got me to join partly with its work on assistive technologies,” recalled Speed. “We’re working on an autonomous bus with MIT, Princeton, AARP, IBM, and the Mayo Clinic. With cloud-based remote operations, help from AWS, and ROScube-I and ROScube-X, robotics developers can meet these urgent timelines.”
The post ADLINK CTO describes value of edge AI, ROS 2, and faster robotics development appeared first on The Robot Report.