Neocortex Enables Random Part Handling and Automated Assembly
These applications deliver high-resolution, real-time 3D vision, identifying complex, highly reflective, or black objects in 6 DOF – whether machined, glass, or plastic – and can be combined with 3D inspection prior to robot placement.
Neocortex flexibly recognizes a wide variety of objects and enables the robot to react in real time to different orientations, bin depths, and lighting conditions. Sensor input and dynamic machine control provide real-time, reactive control in chaotic environments.
This flexible random bin picking video shows how Neocortex 4.1 processes real-time 3D sensor data and provides intelligent part recognition as vision guidance to the robot in under 500 ms – for every move. Note how, in the last part of the video, the robot reacts and still finds the object when the bin is moved!
This video shows a Universal Robotics and Doerfer Companies demonstration of their adaptive vision guidance solution for pharmaceuticals and medical devices, eliminating the need for vibratory, centrifugal or tray feeders. It locates pharmaceutical bottles randomly placed in a bin, guides a Motoman 6-axis robot to pick up these bottles pneumatically, reorients them, and places them with lids facing up into an assembly platform.
Fast – Up to 60 parts/min in any orientation or tilt
Flexible – Rigid/semi-rigid, opaque/tinted objects
Scalable – Off-the-shelf 3D sensors for just-right accuracy
Tested – With major robot manufacturers; 3 sigma reliability
ROI – Less than 24 months payback
Universal Robotics has combined automated intelligence with high-speed control to enable 3D sensor input to update robot behavior in real-time. Universal’s applications integrate the best sensors and equipment with three main components to ensure measurable return on investment.
Intelligence (Neocortex). Universal’s patented Neocortex provides intelligence for operational analysis or flexible machine control in chaotic environments, enabling identification and manipulation of boxes, cases, and cartons never seen before.
Sensing (Spatial Vision Robotics). Sensor data, managed by Spatial Vision Robotics software, provide a complete, continuous 3D update of the entire workstation area. Universal engineers are experts at mixing various sensor technologies to generate accurate multi-dimensional vision guidance and inspection.
Motion Control. Universal’s expert motion control programming optimizes throughput and accuracy of complex motion.
Traditional vision approaches can’t automatically detect and identify complex parts randomly placed in a pile or a bin. Even with the problem simplified, the resulting systems were overly complex, too slow for normal operations, or delivered a low return on investment.
The Neocortex Random Bin Picking breakthrough from Universal uses a suite of sensors that integrates off-the-shelf time-of-flight sensors, structured-light sensors, lasers, or cameras for stereoscopic vision. This cost-effective approach eliminates expensive fixturing and automated tables, and works well under varying lighting conditions.
The standard application moves one part in any orientation at up to 60 parts per minute. Whether loosely or tightly packed, on a tray or in a bin, the parts can be in any orientation. The application dynamically provides 3D vision guidance to the robot, as well as dynamic obstacle avoidance in case the bin is bumped during operation or isn’t placed exactly in the right spot.
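To make the throughput figures concrete, here is a rough serial timing budget derived only from the 60 parts/min rate and the sub-500 ms vision guidance cited above (a sketch; in practice sensing and motion may overlap, so this is a worst case):

```python
PARTS_PER_MIN = 60        # rated throughput from the application spec
VISION_MS = 500           # upper bound on vision guidance time per move

cycle_ms = 60_000 / PARTS_PER_MIN   # total time budget per part: 1000 ms
motion_ms = cycle_ms - VISION_MS    # time left for the pick-and-place move
print(cycle_ms, motion_ms)          # -> 1000.0 500.0
```

If guidance runs while the robot is still moving (pipelined), the motion budget grows accordingly; the serial figure is simply the most conservative reading of the numbers.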
Neocortex software can work with any popular industrial robot and with a wide variety of sensors. Universal selects the combination of sensors required to accurately pick and/or inspect the part. Typical 3D accuracy requirements from customers range from ±0.25 mm to ±4.0 mm. The Neocortex Random Bin Picking application can be integrated with new robots or retrofitted to existing robots, whether 6-axis articulated robots or high-speed 3- or 4-axis delta robots.
Depending on the sensing required, other touch, acceleration, and proximity sensors may be added. The Application provides the 3D position (X, Y, Z) and pose (Rx, Ry, Rz) of complex parts in arbitrary orientations in a bin, on a conveyor, or in an assembly area. The parts can be machined, plastic, clear, reflective, or various colors.

After intrinsic calibration of the cameras and sensors, and extrinsic calibration of the robot, the Application coordinates 3D input to the robot. It processes an overall 3D view of the parts, selects the part(s) to be robotically picked, transmits the position and pose to the robot controller, and monitors the motion of the robot as it places the part. The Application can match CAD models but doesn’t require them; it can be trained to recognize a wide variety of objects. When the bin is empty, it can alert the factory automation via a PLC.
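The pipeline above – a 6-DOF result per part, a selection step, and an empty-bin signal – can be sketched as follows. All names here are hypothetical, and the top-most-part heuristic is only illustrative; Neocortex’s actual selection logic is not described in the source:

```python
from dataclasses import dataclass

@dataclass
class PartPose:
    """6-DOF result handed to the robot controller:
    position (X, Y, Z) in mm and orientation (Rx, Ry, Rz) in degrees."""
    x: float
    y: float
    z: float
    rx: float
    ry: float
    rz: float

def select_pick(candidates, bin_floor_z=0.0):
    """Illustrative selection heuristic: prefer the highest part in the
    bin, since it is least likely to be occluded by other parts."""
    reachable = [p for p in candidates if p.z > bin_floor_z]
    if not reachable:
        return None  # no parts found -> caller alerts the PLC (bin empty)
    return max(reachable, key=lambda p: p.z)

# Two detected parts; the one at z = 120 mm sits on top of the pile.
detections = [PartPose(10, 40, 55, 0, 0, 90), PartPose(-5, 12, 120, 0, 10, 45)]
best = select_pick(detections)
print(best.z)  # -> 120
```

An empty candidate list is the cue for the empty-bin PLC alert mentioned above; in a real cell the selected pose would then be transformed into the robot’s base frame using the extrinsic calibration before being sent to the controller.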