Neocortex – Software with an IQ®
Neocortex® is a new form of artificial intelligence that uses sensor information to learn. It discovers patterns in chaotic environments that are relevant to an assigned task, then analyzes those patterns to understand complexity and improve its process. It is based on technology originally co-developed by NASA and Vanderbilt University.
It is unlike any other technology. The software is independent of any hardware, allowing it to be used for a host of applications, from data analysis to robot and motor control. It uses sensor information to discover multi-dimensional patterns in dynamic environments.
By automatically handling a wide range of constantly changing parts, Neocortex reduces the need for manual changeovers.
Learns new attributes of never-before-seen SKUs
Does not become outdated
Reacts to changes in objects and the environment
Remote expert support with cloud-based backup
Provides ongoing volume-based efficiency metrics
Includes software upgrades
ROI in less than 24 months
Phone: 615-366-7281
“Neocortex artificial intelligence represents a significant breakthrough which was needed for robotics to be utilized in warehouses and fulfillment centers.”
Roger Christian, VP, Yaskawa America, Inc., Motoman Robotics
Neocortex handles a wide variety of objects in stride. Whether moving multiple bag types from a bin or unloading a trailer with different case sizes, Neocortex handles every object in real time. It can differentiate colors, markings, barcodes, or SKU numbers. Neocortex distinguishes between normal and arbitrary object traits, such as a box flap bent over versus a cardboard edge.
Typical vision systems must be pre-configured to match an object with a CAD model, known shape, surface, or distinguishing feature. Due to processing speed and limited memory, these vision systems cannot handle more than one or two dozen known objects at random during operation.
In contrast, in Neocortex’s first application, the Unlimited Depalletization Application, it learned the attributes that define a “box.” It reacted to a moving box, found a case with a unique combination of labels it had never seen, and identified a damaged carton without preconfigured images, geometry, or details.
Neocortex at the workcell provides volume throughput analysis. Material volume flow is analyzed by tracking physical objects on each pallet or in each bin, yielding loading and unloading densities and speeds by time of day, shift, SKU, and vendor. This provides additional information for better decision-making, such as load balancing and improved pallet densities.
Neocortex keeps track of every object it encounters with enough specificity to identify an object’s shape, volume, labels, and damage (if any). Data with this level of granularity is used to improve stacking for storage and trailer loading, warehouse throughput sequencing, object damage reporting, and reusable container tracking.
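To illustrate how per-object tracking data of this kind can roll up into throughput metrics, here is a minimal sketch in Python. The record fields (`sku`, `shift`, `volume_cm3`, `damaged`) and the aggregation keys are illustrative assumptions, not Neocortex's actual data model.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical record of one handled object; field names are illustrative,
# not Neocortex's actual schema.
@dataclass
class ObjectRecord:
    sku: str
    shift: str
    volume_cm3: float
    damaged: bool

def throughput_metrics(records):
    """Aggregate per-object records into per-(shift, SKU) counts,
    volume totals, and damage tallies."""
    totals = defaultdict(lambda: {"count": 0, "volume_cm3": 0.0, "damaged": 0})
    for r in records:
        key = (r.shift, r.sku)
        totals[key]["count"] += 1
        totals[key]["volume_cm3"] += r.volume_cm3
        totals[key]["damaged"] += int(r.damaged)
    return dict(totals)

records = [
    ObjectRecord("A-100", "day", 12000.0, False),
    ObjectRecord("A-100", "day", 12000.0, True),
    ObjectRecord("B-200", "night", 30000.0, False),
]
metrics = throughput_metrics(records)
print(metrics[("day", "A-100")])
# → {'count': 2, 'volume_cm3': 24000.0, 'damaged': 1}
```

The same grouping could be keyed by vendor or time-of-day bucket to produce the loading/unloading density and speed reports described above.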
As variation increases, the need for learning on-the-fly also increases. This is the situation with logistics tasks, where nothing is guaranteed to be in its proper place or configuration. This randomness and uncertainty require a new way to approach package/parts handling. As unexpected variability is introduced, traditional programming approaches fail. Hard-coded programs cannot define unanticipated objects in an unstructured environment.
Previous generations of artificial intelligence tend to be brittle and cumbersome. They require pre-established, specific rules of action based on historical trends. This works when enough historical data is available, and when future events mimic previous activities. Greater variations require more and more rules, making artificial intelligence more complex, slower to react in real time, and always a step behind new changes.
Neocortex uses sensor information to learn. Neocortex discovers patterns in a chaotic environment that are relevant to an assigned task. This cybernetic intuitive learning process is mathematically represented in Neocortex, and is similar to how all mammals naturally learn. It is a dynamic process whereby experience builds fine-tuned understanding.
Neocortex’s interactive intelligence is patterned after the mammalian Sense > Act > Learn process.
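The Sense > Act > Learn cycle can be sketched with a toy example. The code below is a minimal illustration of the control flow only, using a simple bandit-style value update; the environment, actions, and learning rule are assumptions for demonstration and do not represent Neocortex's actual algorithms.

```python
import random

random.seed(0)  # make the toy run reproducible

def sense(env):
    """Sense: read the current state from the environment (here: a random state id)."""
    return random.randrange(len(env))

def learn_loop(env, actions, episodes=1000, epsilon=0.1, lr=0.2):
    # value[state][action]: learned estimate of how well an action works in a state
    value = [[0.0] * len(actions) for _ in env]
    for _ in range(episodes):
        state = sense(env)                        # Sense
        if random.random() < epsilon:             # occasionally explore
            a = random.randrange(len(actions))
        else:                                     # otherwise exploit experience
            a = max(range(len(actions)), key=lambda i: value[state][i])
        reward = env[state](actions[a])           # Act: environment scores the action
        value[state][a] += lr * (reward - value[state][a])  # Learn: refine estimate
    return value

# Toy environment: state 0 rewards "grip", state 1 rewards "push".
env = [lambda act: 1.0 if act == "grip" else 0.0,
       lambda act: 1.0 if act == "push" else 0.0]
values = learn_loop(env, ["grip", "push"])
```

After enough cycles, the learned values favor the right action in each state, mirroring how experience builds fine-tuned understanding without pre-programmed rules.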
In collaboration with the Defense Advanced Research Projects Agency (DARPA), NASA Johnson Space Center aimed to build Robonaut 1 — a humanoid robot complete with dexterous manipulation — to work alongside its human counterparts.
In addition, artificial intelligence — which uses computers to perform tasks that usually require human intelligence — was lacking for robots to be able to respond to unanticipated changes in their environment. This was a particular concern for Robonaut, since there can be delays or even failures in the communication between mission control and the International Space Station. Without commands, the robot would become useless.
In 2001, in cooperation with DARPA, Johnson Space Center began working with Vanderbilt University and other universities.
Previous generations of artificial intelligence required pre-programming or hard programming of rules in order for the robot to determine how to respond. All of the objects in an environment had to be labeled and classified before the robot could decide how to treat them. Alan Peters, Associate Professor at Vanderbilt, aimed for software that could support robot autonomy by enabling it to sense a new object, determine its attributes, and decide how best to handle it.
Testing on Robonaut 1 demonstrated that Alan Peters’ algorithms were able to produce learned knowledge from sensory and motor-control interactions, just like mammals, but without a program written to tell it what to do.
For Alan Peters, the work led to several patents related to robotics intelligence. Now, he serves as the Chief Technology Officer at Universal Robotics, a software engineering company in Nashville, Tennessee, where the NASA-derived technology is available in a product called Neocortex.
“The results significantly informed the mechanical and electrical design of the second generation, or Robonaut 2. [The students of Alan Peters] took it to a new level and gave the robot the ability to reason about how to handle and interact with objects and tools. It is now running on Robonaut 2 in space.”
Automation and Robotics Engineer at NASA Johnson Space Center