
Redefining Robotics: Next Generation Warehouses

People picture robots evolving to look more like humans, but in reality, the evolution of robotics involves things you can’t actually see. For Bastiane Huang at Osaro, the development of robots means greater advances in autonomy. Building brains for robots gives them more flexibility across tasks and creates significant traction, not only in the business world but in every field touched by cutting-edge robotics. Hint: that’s just about every field. Let’s take a look at how we’re redefining robotics and how Huang sees the future shaping up.

[Related Article: Shared Control For Dynamic Systems Between Humans and Robots]

Machine Learning in Robotics

Machine learning helps robots move from automation to autonomy. We’ve used robots for single tasks to streamline production, but mastering multiple tasks requires more. Moving up the levels of autonomy gives us greater freedom in deployment and reduces the need for (expensive and time-intensive) human intervention.

The autonomy we see in deployment today still requires substantial human intervention. Even in warehouses, automation struggles with the variety of products that need to go into boxes. The advances we’ve made on the assembly line can’t compete with the simple human ability to put objects of different shapes and sizes into boxes. That’s the gap we’re currently trying to close.

Deep networks could be the secret to this evolution. With deep learning, a robot can make the same kinds of adjustments humans make in their own environments: it learns an action, repeats it, and refines it along the way until it is proficient at the target task.

Reinforcement like this lets humans move away from programming robots for every little thing. It also finally allows computers to tackle “hard to describe” problems. Robots can handle variability and adjust accordingly while learning at scale. The short version? Faster learning with less human intervention.
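
To make that trial-and-error idea concrete, here’s a minimal sketch of a reinforcement-style learning loop: a toy agent tries candidate grasp angles, gets a simulated success signal, and gradually settles on the angle that works. The environment, reward values, and parameters are illustrative assumptions, not Osaro’s actual system.

```python
import random

# Hypothetical toy problem: choose one of a few grasp angles for an object.
# Reward is 1 for a successful (simulated) pick, 0 otherwise.
ACTIONS = [0, 45, 90, 135]          # candidate grasp angles (degrees)
TRUE_BEST = 90                      # assumed "correct" angle for this object

def simulate_pick(angle):
    """Pretend physics: picks near the best angle usually succeed."""
    success_prob = 0.9 if angle == TRUE_BEST else 0.2
    return 1.0 if random.random() < success_prob else 0.0

q = {a: 0.0 for a in ACTIONS}       # value estimate per action
alpha, epsilon = 0.1, 0.2           # learning rate, exploration rate

for episode in range(2000):
    # Explore sometimes, otherwise exploit the current best estimate.
    if random.random() < epsilon:
        a = random.choice(ACTIONS)
    else:
        a = max(q, key=q.get)
    reward = simulate_pick(a)
    # Incremental update: nudge the estimate toward the observed outcome.
    q[a] += alpha * (reward - q[a])

print(q)  # the estimate for the best angle should dominate
```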

We also have a labor problem. Smaller labor forces require us to rethink the kinds of tasks we need humans to do, and the turnover rate for these jobs is incredibly high, costing businesses big money that is sometimes passed on to the consumer. On top of that, the rise of e-commerce could put serious strain on scale if we can’t figure out how to reduce the load on human workers and rebuild fulfillment around robotics.

Challenges and Opportunities

There are a few challenges that arise with warehouse robotics at this stage. Humans in warehouse work have no trouble with reflective packaging or judging how much force to use for a particular product. If you’re packing a tube of gel, you understand that squeezing too hard will burst the package. Machines, not so much. These are what Huang sees as the biggest challenges facing robotics at warehouse scale.

Sensors

Camera quality limits how well real-world sensors perform. They often fail when faced with reflective surfaces or translucent packaging. Osaro’s answer is a combination of image recognition and segmentation techniques. For example, the company uses convolutional neural networks plus pixel-level prediction to increase accuracy rates for object recognition.
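
As a rough illustration of the general approach (not Osaro’s proprietary model), here’s a minimal fully convolutional network in PyTorch that produces a class prediction for every pixel, the kind of pixel-level output a segmentation system builds on. The layer sizes and class count are placeholder assumptions.

```python
import torch
import torch.nn as nn

class TinySegmenter(nn.Module):
    """Minimal fully convolutional net: per-pixel logits for N object classes."""
    def __init__(self, num_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        # A 1x1 convolution produces a class score for every pixel.
        self.classifier = nn.Conv2d(32, num_classes, kernel_size=1)

    def forward(self, x):
        return self.classifier(self.features(x))

model = TinySegmenter(num_classes=3)
image = torch.rand(1, 3, 128, 128)   # dummy RGB image
logits = model(image)                # shape: (1, 3, 128, 128)
mask = logits.argmax(dim=1)          # predicted class per pixel
print(mask.shape)                    # torch.Size([1, 128, 128])
```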

Training Data

Reinforcement learning is data-hungry. Osaro augments its data with imitation learning, in which a person demonstrates to the robot how to perform a task. The company is also experimenting with new forms of learning, such as fusion learning, to reduce the amount of data needed to fully train a model.
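
Here’s a minimal sketch of imitation learning in its simplest form, behavioral cloning: a small policy network trained with supervised learning on human demonstrations. The data shapes and the four grasp poses are hypothetical stand-ins, not real demonstration data.

```python
import torch
import torch.nn as nn

# Hypothetical demonstration data: observation features paired with the action
# a human operator chose (e.g., an index into a set of grasp poses).
demo_features = torch.rand(256, 32)          # 256 demos, 32-dim features
demo_actions = torch.randint(0, 4, (256,))   # 4 possible grasp poses

policy = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 4))
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Behavioral cloning: plain supervised learning on the demonstrations,
# so the robot starts from human behavior instead of random exploration.
for epoch in range(20):
    optimizer.zero_grad()
    logits = policy(demo_features)
    loss = loss_fn(logits, demo_actions)
    loss.backward()
    optimizer.step()
```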

Scalable Use Cases

Manufacturing can vary so much from factory to factory that it’s tough to transfer learning from one warehouse to another. Starting with scalable use cases could help get robotics into the hands of more warehouses faster.
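
One common way to make a use case scale across sites is transfer learning: reuse a feature backbone trained at one warehouse and fine-tune only a small head on data from the new site. The sketch below shows that generic pattern under assumed shapes and data; it is not a description of Osaro’s pipeline.

```python
import torch
import torch.nn as nn

# Hypothetical model trained at warehouse A: a shared feature backbone
# plus a small head that scores candidate grasp points.
backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU())
head = nn.Linear(64, 1)

# To adapt to warehouse B, keep the backbone fixed and retrain only the
# head on a small amount of data collected at the new site.
for p in backbone.parameters():
    p.requires_grad = False

site_b_features = torch.rand(64, 32)   # small sample from warehouse B
site_b_scores = torch.rand(64, 1)      # e.g., observed pick success rates

optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
for epoch in range(50):
    optimizer.zero_grad()
    pred = head(backbone(site_b_features))
    loss = nn.functional.mse_loss(pred, site_b_scores)
    loss.backward()
    optimizer.step()
```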

Integration

The system should integrate tightly with the software you already have. Unfortunately, the landscape is fragmented: each robotics company has proprietary software, methods, and products, which makes full integration difficult. For Osaro, that meant remaining a software company to provide flexibility; the customer can choose the right robot hardware to work with the software Osaro provides.
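
A common way to cope with that fragmentation is a thin hardware-abstraction layer: the picking software programs against one interface, and each vendor’s proprietary SDK gets its own adapter. The sketch below is a generic illustration of that pattern; the class and method names are made up for the example.

```python
from abc import ABC, abstractmethod

class RobotArm(ABC):
    """Common interface the picking software programs against."""
    @abstractmethod
    def move_to(self, x: float, y: float, z: float) -> None: ...
    @abstractmethod
    def close_gripper(self) -> None: ...

class VendorAArm(RobotArm):
    """Hypothetical adapter wrapping one vendor's proprietary SDK."""
    def move_to(self, x, y, z):
        print(f"VendorA: moving to ({x}, {y}, {z})")
    def close_gripper(self):
        print("VendorA: gripper closed")

def execute_pick(arm: RobotArm, target):
    # The picking logic never touches vendor-specific code directly.
    arm.move_to(*target)
    arm.close_gripper()

execute_pick(VendorAArm(), (0.4, 0.1, 0.05))
```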

Model Training

Customers want models that work. Additional training needs to be quick and continuous. Providing pre-trained models so that the customer doesn’t have to worry about training is a great way to get robotics into the right hands. When additional training is required, Osaro continues to work with the client as a software company.
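
Here’s a rough sketch of what “pre-trained plus continuous training” can look like in practice: ship a model’s weights, load them at the customer site, and keep fine-tuning on new data as conditions change. The file name, model, and data here are hypothetical.

```python
import torch
import torch.nn as nn

# A stand-in for a model pre-trained on generic picking data.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
torch.save(model.state_dict(), "pretrained_picker.pt")   # ship this to the customer

# At the customer site: start from the shipped weights instead of from scratch...
deployed = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
deployed.load_state_dict(torch.load("pretrained_picker.pt"))

# ...then keep training in small increments as new items and conditions appear.
optimizer = torch.optim.Adam(deployed.parameters(), lr=1e-4)
new_obs = torch.rand(32, 16)                # hypothetical new observations
new_labels = torch.randint(0, 2, (32,))     # e.g., pick succeeded / failed
for step in range(10):
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(deployed(new_obs), new_labels)
    loss.backward()
    optimizer.step()
```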

Osaro’s Approach

Osaro is working within these parameters to design robotic systems that aren’t thrown off by environmental issues like lighting variation or shape inconsistencies. The company is building scalable, software-based picking solutions that depend on the kind of deep reinforcement learning that can handle autonomous picking.

This type of machine learning is interdisciplinary at its best. We need to take care of back-end infrastructure and design solutions for tight integration. Supporting research into better sensors and servers, so that customers don’t have to spend a fortune on them, is also part of the future of warehouse robotics.

[Related Article: Trends in AI: Towards Learning Systems That Require Less Annotation]

Thoughts on the Future

As accuracy improves, using robotics in industries such as automotive and food will become more common. Food brings new challenges, and Osaro is blending these capabilities to work with the food industry. The company recently worked with a Japanese firm on assembling bento boxes, a difficult task when the nuggets are small and not uniform in shape. The team increased noise and trained on difficult objects to better reflect real-world conditions.
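
Injecting noise and variation into training data is often done with domain-randomization-style augmentation. The sketch below shows the general idea, randomly varying brightness, adding sensor noise, and occluding patches; the specific transforms and values are assumptions for illustration, not the actual training setup.

```python
import torch

def randomize(image):
    """Apply noise and variation to a training image (illustrative values)."""
    # Random brightness shift (simulates lighting variation on the line).
    image = image * torch.empty(1).uniform_(0.6, 1.4)
    # Additive sensor noise.
    image = image + 0.05 * torch.randn_like(image)
    # Random occlusion patch (simulates items overlapping in the bin).
    h, w = image.shape[-2:]
    y = torch.randint(0, h - 16, (1,)).item()
    x = torch.randint(0, w - 16, (1,)).item()
    image[..., y:y + 16, x:x + 16] = 0.0
    return image.clamp(0.0, 1.0)

batch = torch.rand(8, 3, 128, 128)   # dummy training images
augmented = torch.stack([randomize(img) for img in batch])
```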

Robotics will continue to improve as companies like Osaro pursue better training methods, deeper learning, and more consistent autonomy. In the future, as we move past redefining robotics and into deploying them more widely, we’ll move through the stages of independence so that we can finally bring robotics into our workforce where we need it most.
