How Autonomous Mobile Robots (AMRs) Use Vision Technology to Operate Safely in Dynamic Environments
December 19, 2019
Noon - 1 PM ET
ABOUT THIS WEBINAR
Innovative distributors, fulfillment centers, and manufacturers are rapidly adopting Autonomous Mobile Robot (AMR) systems and platforms to automate workflows throughout the facility, addressing labor shortages while improving productivity, efficiency, and flexibility. To be fully collaborative, AMRs need to navigate a space shared with people, forklifts, and other material handling equipment, often in a congested, dynamic facility. In this webinar you will learn how AMRs leverage advanced vision technologies to perceive the world.
- An overview of AMR sensor technology and how it works
- Sensor limitations in practical applications
- How AMRs use 2D and 3D vision to navigate safely and avoid dynamic obstacles
Presented by Fetch Robotics
Russell Toris, Director of Robotics, Fetch Robotics
Russell Toris is Director of Robotics at Fetch Robotics. His passion is bringing robots out of the lab and into the world to be used by everyday people. Throughout his career, Toris has built and distributed tools that enable researchers, developers, and end-users to interact with robots intuitively using popular cloud and web-based technologies. His research in Learning from Demonstration, Cloud Robotics, and Mobile Manipulation has focused primarily on non-expert end-users, which has helped Fetch Robotics bring robots into the workplace. Toris received his PhD in Computer Science from Worcester Polytechnic Institute.