Robot navigation
For any mobile device, the ability to navigate in its environment is important. Avoiding dangerous situations such as collisions and unsafe conditions (temperature, radiation, exposure to weather, etc.) comes first, but if the robot has a purpose that relates to specific places in its environment, it must find those places.
This article presents an overview of the skill of navigation, identifies the basic building blocks of a robot navigation system and the main types of navigation systems, and takes a closer look at their related components.
Robot navigation means the robot's ability to determine its own position in its frame of reference and then to plan a path towards some goal location. In order to navigate in its environment, the robot or any other mobility device requires a representation, i.e. a map of the environment, and the ability to interpret that representation.
Navigation can be defined as the combination of the three fundamental competences[citation needed]:
- Self-localisation
- Path planning
- Map-building and map interpretation
"Map" in this context denotes any one-to-one mapping of the world onto an internal representation.
Robot localization denotes the robot's ability to establish its own position and orientation within the frame of reference. Path planning is effectively an extension of localisation, in that it requires the determination of the robot's current position and the position of a goal location, both within the same frame of reference or coordinate system. Map building can take the shape of a metric map or any notation describing locations in the robot's frame of reference.
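As a concrete (and purely illustrative) example of how the three competences fit together, the sketch below represents the map as an occupancy grid, takes the robot's localised position as a grid cell, and plans a path to the goal with breadth-first search; the grid, positions, and function names are all hypothetical.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over an occupancy grid.

    grid  -- 2D list; 0 = free cell, 1 = obstacle
    start -- (row, col) of the robot's current (localised) position
    goal  -- (row, col) of the target location
    Returns a list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Reconstruct the path by walking the parent links backwards.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return None  # goal not reachable in this map

# Map, robot position, and goal are all expressed in the same grid frame.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(plan_path(grid, start=(0, 0), goal=(2, 0)))
```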
Vision-based navigation
Vision-based navigation or optical navigation uses computer vision algorithms and optical sensors, including laser-based range finders and photometric cameras using CCD arrays, to extract the visual features required for localization in the surrounding environment. While there is a range of techniques for navigation and localization using vision information, the main components of each technique are (a localisation sketch follows this list):
- representations of the environment.
- sensing models.
- localization algorithms.
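As an illustration of how these components interact, the following sketch (hypothetical, not taken from the survey cited below) localises a robot in 2D: the environment representation is a set of known landmark positions, the sensing model is a range measurement to each landmark, and the localization algorithm is a least-squares fit using SciPy.

```python
import numpy as np
from scipy.optimize import least_squares

# Environment representation: known 2D landmark positions (map frame).
landmarks = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])

# Sensing model: the robot measures its range to each landmark (noisy).
measured_ranges = np.array([5.2, 6.9, 7.1])

def residuals(pose):
    """Difference between predicted and measured ranges for a candidate pose."""
    predicted = np.linalg.norm(landmarks - pose, axis=1)
    return predicted - measured_ranges

# Localization algorithm: least-squares fit starting from a rough guess.
estimate = least_squares(residuals, x0=np.array([1.0, 1.0]))
print("estimated position:", estimate.x)
```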
In order to give an overview of vision-based navigation and its techniques, these techniques can be classified into indoor navigation and outdoor navigation.
Indoor navigation
The easiest way of making a robot go to a goal location is simply to guide it to this location. This guidance can be done in different ways: burying an inductive loop or magnets in the floor, painting lines on the floor, or placing beacons, markers, bar codes, etc. in the environment. Such Automated Guided Vehicles (AGVs) are used in industrial settings for transportation tasks. Indoor navigation of robots is also possible with IMU-based indoor positioning devices.[1][2] A minimal control loop for line guidance of this kind is sketched below.
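The following toy sketch implements proportional line following; the sensor and motor functions are hypothetical placeholders for whatever hardware interface an actual AGV exposes.

```python
def follow_line(read_line_sensors, set_wheel_speeds, base_speed=0.3, gain=0.5):
    """Proportional controller keeping an AGV centred over a painted line.

    read_line_sensors -- hypothetical callable returning (left, right)
                         reflectance readings from two floor-facing sensors
    set_wheel_speeds  -- hypothetical callable taking (left, right) wheel speeds
    """
    while True:
        left, right = read_line_sensors()
        # A non-zero difference means the line has drifted toward one sensor;
        # steer back by speeding up one wheel and slowing the other.
        error = left - right
        set_wheel_speeds(base_speed + gain * error,
                         base_speed - gain * error)
```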
There is a wide variety of indoor navigation systems. A basic reference for both indoor and outdoor navigation systems is "Vision for mobile robot navigation: a survey" by Guilherme N. DeSouza and Avinash C. Kak.
See also "Vision based positioning" and AVM Navigator.
Outdoor navigation
Some recent outdoor navigation algorithms are based on convolutional neural networks and machine learning, and are capable of accurate turn-by-turn inference.[3]
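As an illustration only (this is not the architecture of the cited paper), turn-by-turn inference can be framed as image classification: a small convolutional network maps a camera frame to one of a few steering commands. A minimal PyTorch sketch:

```python
import torch
import torch.nn as nn

class TurnClassifier(nn.Module):
    """Toy CNN mapping a camera frame to a steering command.

    Classes (hypothetical): 0 = go straight, 1 = turn left, 2 = turn right.
    """
    def __init__(self, num_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling; input size agnostic
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

model = TurnClassifier()
frame = torch.randn(1, 3, 120, 160)          # one RGB camera frame
command = model(frame).argmax(dim=1).item()  # inferred turn command
```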
Autonomous Flight Controllers
Typical open-source autonomous flight controllers have the ability to fly in fully automatic mode and perform the following operations (a mission of this kind is sketched in code after the list):
- Take off from the ground and fly to a defined altitude
- Fly to one or more waypoints
- Orbit around a designated point
- Return to the launch position
- Descend at a specified speed and land the aircraft
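The following sketch shows how such a mission could be sequenced; the vehicle object and all of its methods are hypothetical stand-ins for a real flight controller's command interface (e.g. MAVLink messages).

```python
# Hypothetical mission sequencer; every vehicle method below is an
# assumed stub, not the API of any particular flight controller.
def fly_mission(vehicle, waypoints, altitude=30.0, descent_speed=1.0):
    vehicle.takeoff(altitude)                 # take off to a defined altitude
    for wp in waypoints:
        vehicle.goto(wp)                      # fly to each waypoint in turn
    vehicle.orbit(center=waypoints[-1], radius=20.0)  # orbit a designated point
    vehicle.return_to_launch()                # return to the launch position
    vehicle.land(speed=descent_speed)         # descend at a set speed and land
```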
The onboard flight controller relies on GPS for navigation and stabilized flight, and often employs additional satellite-based augmentation systems (SBAS) and an altitude (barometric pressure) sensor.[4]
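GPS altitude and barometric altitude have complementary error characteristics: GPS is noisy but does not drift, while barometric altitude is smooth but drifts with weather. A minimal, purely illustrative complementary filter (not taken from the cited source) blends the two:

```python
def fuse_altitude(fused_prev, baro_delta, gps_alt, alpha=0.98):
    """One step of a complementary altitude filter.

    fused_prev -- previous fused altitude estimate (m)
    baro_delta -- change in barometric altitude since the last step (m)
    gps_alt    -- current GPS altitude reading (m)
    alpha close to 1 trusts the smooth barometric signal short-term while
    letting GPS slowly correct barometric drift.
    """
    return alpha * (fused_prev + baro_delta) + (1 - alpha) * gps_alt

# Example: a climbing vehicle whose two sensors disagree slightly.
fused = 100.0
for baro_delta, gps_alt in [(0.5, 101.0), (0.5, 101.4), (0.4, 101.9)]:
    fused = fuse_altitude(fused, baro_delta, gps_alt)
    print(round(fused, 2))
```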
See also
- Neato Robotics
References
^ Chen, C.; Chai, W.; Nasir, A. K.; Roth, H. (April 2012). "Low cost IMU based indoor mobile robot navigation with the assist of odometry and Wi-Fi using dynamic constraints". Proceedings of the 2012 IEEE/ION Position, Location and Navigation Symposium: 1274–1279. doi:10.1109/PLANS.2012.6236984.
^ GT Silicon (2017-01-07), An awesome robot with cool navigation and real-time monitoring, retrieved 2018-04-04
^ Ran, Lingyan; Zhang, Yanning; Zhang, Qilin; Yang, Tao (2017-06-12). "Convolutional Neural Network-Based Robot Navigation Using Uncalibrated Spherical Images" (PDF). Sensors. MDPI AG. 17 (6): 1341. doi:10.3390/s17061341. ISSN 1424-8220.
^ http://autoquad.org/wiki/wiki/configuring-autoquad-flightcontroller/flying/
- DeSouza, G. N.; Kak, A. C. (February 2002). "Vision for mobile robot navigation: a survey". IEEE Transactions on Pattern Analysis and Machine Intelligence. 24 (2): 237–267. URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=982903&isnumber=21179
- Dixon, Jonathan; Henlich, Oliver (10 June 1997). Mobile Robot Navigation.
Further reading
- Becker, M.; Dantas, Carolina Meirelles; Macedo, Weber Perdigão, "Obstacle Avoidance Procedure for Mobile Robots". In: Paulo Eigi Miyagi; Oswaldo Horikawa; Emilia Villani (Org.). ABCM Symposium Series in Mechatronics, Volume 2. 1st ed. São Paulo, SP: ABCM, 2006, v. 2, pp. 250–257. ISBN 978-85-85769-26-0.
External links
- Line tracking sensors for robots and their algorithms