By Yulin Wang, Technology Analyst at IDTechEx
Amid high inflation, rising labor costs, labor shortages, an energy crisis, and the shift to hybrid working, the manufacturing industry experienced significant turmoil in 2022. Although some of these pressures have begun to ease, 2023 is still expected to be a challenging year for the industry.
Forbes recently published an article, “The 5 Biggest Business Trends In 2023 Everyone Must Get Ready For Now”, listing several key transitions in the manufacturing industry. One of the most interesting is the move toward digital transformation. Digital transformation, along with Industry 4.0, has been a buzzword for many years, yet despite the fancy name it often remains a vague concept for manufacturers. IDTechEx believes that digital manufacturing can be deconstructed into two main themes: safe human-robot interaction (HRI) to achieve higher productivity, and an increased level of autonomous mobility for transporting materials and goods. Both themes are ultimately enabled by cutting-edge sensor technologies.
Safe HRI to Achieve Higher Productivity
Safety has always been the overarching priority when using robots and machines in the manufacturing industry. Robots can pose a variety of hazards to workers: industrial robots, for example, are designed to operate at a safe distance from people, but they traditionally lack the sensory capabilities required to detect nearby humans.
Recently, with the rapid adoption of collaborative robots (cobots), human operators are directly exposed to the robots’ workspace, which increases the risk of collisions. To mitigate these safety concerns, IDTechEx has seen multiple sensors, such as force and torque sensors, LiDAR, and tactile sensors, being installed on robots to give them better environmental perception and collision avoidance capabilities. Among the most critical applications of sensors in robotics are proximity detection and collision detection.
Proximity detection can be achieved using photoelectric sensors (photoelectric fences), LiDAR, and capacitive proximity sensors. Photoelectric sensors/light curtains can be an ideal solution for industrial robots. A safety light curtain consists of a transmitter and a receiver: the transmitter emits modulated infrared light, which the receiver picks up to form an array of light beams (the light curtain). When a human operator enters the protected zone and blocks one or more beams, the receiver detects the interruption and the internal control circuit outputs a signal to the machine, causing it to slow down or stop, thereby preventing a potential collision.
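To make this interlock logic concrete, the sketch below shows, in simplified Python, how a controller might poll the beam states and command a stop when any beam is interrupted. The read_beam_states and command_machine interfaces are hypothetical placeholders for illustration, not any specific vendor’s API, and real safety light curtains implement this in certified hardware rather than application code.

```python
# Minimal sketch of safety light-curtain interlock logic (hypothetical interfaces).
import time

NUM_BEAMS = 32           # number of infrared beams forming the curtain (example value)
POLL_INTERVAL_S = 0.005  # polling period; real safety controllers react within milliseconds

def read_beam_states() -> list[bool]:
    """Hypothetical driver call: True means the beam reaches the receiver unobstructed."""
    raise NotImplementedError("replace with the receiver's actual interface")

def command_machine(action: str) -> None:
    """Hypothetical output to the machine controller, e.g. 'run' or 'stop'."""
    raise NotImplementedError("replace with the machine's actual interface")

def monitor_curtain() -> None:
    while True:
        beams = read_beam_states()
        if not all(beams):
            # One or more beams are blocked: an operator has entered the protected zone.
            command_machine("stop")
        else:
            command_machine("run")
        time.sleep(POLL_INTERVAL_S)
```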
By contrast, force and torque sensors are commonly used on cobots for collision detection. Unlike industrial robots, cobots share their workspace with humans, meaning that a physical light curtain or fence would not suffice. IDTechEx has noticed that the majority of commercialized cobots are equipped with at least one force/torque (F/T) sensor at their joints. F/T sensors serve two main functions: force measurement and collision detection. They are typically installed near the robot’s end-effector to measure the forces applied during a task.
Depending on the task, an expected force range is preset; when a collision occurs, the force or torque detected by the sensor exceeds this predetermined range, signaling the robot to stop its operation. With the increasing safety requirements of HRI, more F/T sensors are expected to be installed. At this stage, most cobots have a single F/T sensor, typically around the end-effector. However, IDTechEx has noticed that a few cobot OEMs (e.g., Franka Emika) are starting to incorporate torque sensors in every joint to enable better force control and collision detection.
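As a rough illustration of this threshold logic, the following Python sketch compares measured joint torques against preset limits and flags a collision when any reading falls outside the expected range. The torque limits and the read_joint_torques interface are illustrative assumptions, not figures or APIs from any particular cobot.

```python
# Minimal sketch of threshold-based collision detection from F/T sensor readings.
# Limits and the sensor interface below are illustrative assumptions.

TORQUE_LIMITS_NM = [40.0, 40.0, 25.0, 25.0, 10.0, 10.0]  # per-joint limits in Nm (example values)

def read_joint_torques() -> list[float]:
    """Hypothetical driver call returning one torque reading per joint, in Nm."""
    raise NotImplementedError("replace with the robot's actual sensor interface")

def collision_detected(torques: list[float], limits: list[float]) -> bool:
    """A collision is flagged when any joint torque exceeds its preset limit."""
    return any(abs(t) > limit for t, limit in zip(torques, limits))

def safety_step() -> str:
    torques = read_joint_torques()
    if collision_detected(torques, TORQUE_LIMITS_NM):
        return "stop"      # halt motion until an operator acknowledges the event
    return "continue"
```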
Increased Level of Autonomous Mobility
Autonomous mobility is one of the most important aspects of a robot’s autonomy: it requires the robot to navigate, localize itself, and avoid obstacles. In the manufacturing industry, mobile robots, especially automated guided vehicles (AGVs) and autonomous mobile robots (AMRs), are used for material transportation. Their autonomous mobility is enabled by sensors such as LiDAR, cameras, and ultrasonic sensors.
Each sensor type has benefits and drawbacks, and in practice multiple sensors are usually combined to achieve the best overall performance. For instance, LiDAR is relatively easy to use and is immune to poor weather, but it usually comes at a high cost. By contrast, cameras and imaging sensors are the only ones that can be used for object classification/recognition, but they perform poorly in adverse weather or limited visibility. In the manufacturing industry, IDTechEx believes that cameras will be increasingly adopted because these robots tend to work in well-controlled indoor environments with stable illumination.
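As a simplified illustration of how such sensors might be combined, the sketch below gates an AMR’s speed on LiDAR range data and only invokes a camera-based classifier when an object is close enough to matter. The ranges, classifier, and LiDAR interfaces are hypothetical placeholders, not a description of any specific AMR stack.

```python
# Minimal sketch of combining LiDAR ranging with camera-based classification
# for obstacle handling on an indoor AMR. All interfaces and values are illustrative.

SLOW_DOWN_RANGE_M = 2.0   # start slowing when an object is within 2 m (example value)
STOP_RANGE_M = 0.5        # stop when an object is within 0.5 m (example value)

def nearest_obstacle_range_m() -> float:
    """Hypothetical call returning the closest LiDAR return in the robot's path, in meters."""
    raise NotImplementedError("replace with the LiDAR driver")

def classify_obstacle() -> str:
    """Hypothetical camera classifier, e.g. returning 'person', 'pallet', or 'unknown'."""
    raise NotImplementedError("replace with the vision pipeline")

def plan_speed() -> float:
    """Return a commanded speed in m/s based on fused sensor information."""
    distance = nearest_obstacle_range_m()
    if distance < STOP_RANGE_M:
        return 0.0                                 # too close: stop regardless of object type
    if distance < SLOW_DOWN_RANGE_M:
        label = classify_obstacle()
        return 0.2 if label == "person" else 0.5   # be more cautious around people
    return 1.0                                     # clear path: travel at nominal speed
```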
At this stage, IDTechEx believes that many indoor AGVs in the manufacturing industry can operate at Level 3 autonomy, meaning that the onboard systems can handle most of the autonomous driving tasks and multiple AGVs can be monitored simultaneously by one operator. With the trend toward Level 4 and higher levels of autonomy, IDTechEx believes that more robust sensors will be incorporated. A detailed analysis of the market forecast can be found in IDTechEx’s latest research, “Sensors for Robotics 2023-2043: Technologies, Markets, and Forecasts”.
To find out more about this IDTechEx report, including downloadable sample pages, please visit www.IDTechEx.com/rosensors.
COVER IMAGE: Asmag.com