ARK
The ARK will save all Kilobots from their simplicity!
Source code
Source code of ARK is available on GitHub at: https://github.com/DiODeProject/KilobotArena
Source code of the Camera Calibration Software is available on GitHub at: https://github.com/DiODeProject/KilobotArenaCalibration
Publication
The ARK system has been published in:
A. Reina, A. J. Cope, E. Nikolaidis, J. A. R. Marshall and C. Sabo. ARK: Augmented Reality for Kilobots. IEEE Robotics and Automation Letters, 2(3):1755-1761, 2017.
If you use the ARK technology in your experiments, please cite this work in your paper.
Supplementary material
Here, we provide the supplementary video of the article A. Reina et al. ARK: Augmented Reality for Kilobots. IEEE Robot. Autom. Lett. 2017.
YouTube link: https://www.youtube.com/watch?v=K0KvPzhOSDo
The supplementary video showcases ARK's functionalities through three demos.

In Demo A, ARK automatically assigns unique IDs to a swarm of 100 Kilobots.

Demo B shows how ARK can automatically position 50 Kilobots, one of the typical preliminary operations in swarm robotics experiments. These operations are tedious and time-consuming when done manually; ARK saves researchers' time and makes operating large swarms considerably easier. Automating the operation also gives more accurate control of the robots' start positions and removes undesired biases in comparative experiments.

Demo C shows a simple foraging scenario in which 50 Kilobots collect material from a source location and deposit it at a destination. The robots are programmed to pick up one virtual flower inside the source area (the green flower field), carry it to the destination (the yellow nest), and deposit the flower there. When performing an action in the virtual environment, a robot signals by lighting its LED blue. When a robot picks up a virtual flower from the source, the source shrinks for the rest of the robots (its diameter decreases by 1 cm); similarly, when a robot deposits a flower at the destination, that area's diameter increases by 1 cm. This demo shows that robots can perceive (and navigate) a virtual gradient, can modify the virtual environment by moving material from one location to another, and can autonomously decide when to change the virtual environment that they sense (either the source or the destination).
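To illustrate Demo C's environment dynamics, here is a minimal, self-contained C++ sketch of how the tracker-side logic might resize the two circular virtual areas whenever a robot signals with its blue LED. All names (VirtualArea, onRobotSignal, the coordinates) are hypothetical and chosen for illustration only; they are not ARK's actual API.

```cpp
#include <algorithm>
#include <cmath>
#include <iostream>

// Hypothetical 2D point and circular virtual area (illustrative, not ARK's API).
struct Point { double x, y; };

struct VirtualArea {
    Point  center;
    double diameter_cm;

    // True if the robot position lies inside this circular area.
    bool contains(const Point& p) const {
        const double dx = p.x - center.x;
        const double dy = p.y - center.y;
        return std::sqrt(dx * dx + dy * dy) <= diameter_cm / 2.0;
    }
};

// Called once per robot whenever the tracker detects the blue LED signal,
// i.e. the robot is performing a pick-up or deposit action.
void onRobotSignal(const Point& robot, VirtualArea& source, VirtualArea& dest) {
    if (source.contains(robot)) {
        // Pick-up: removing one virtual flower shrinks the source field's
        // diameter by 1 cm (never below zero).
        source.diameter_cm = std::max(0.0, source.diameter_cm - 1.0);
    } else if (dest.contains(robot)) {
        // Deposit: the flower enlarges the nest's diameter by 1 cm.
        dest.diameter_cm += 1.0;
    }
}

int main() {
    VirtualArea source{{20.0, 50.0}, 30.0};  // green flower field
    VirtualArea dest{{80.0, 50.0}, 10.0};    // yellow nest

    onRobotSignal({21.0, 51.0}, source, dest);  // robot signals inside source
    onRobotSignal({80.5, 50.0}, source, dest);  // robot signals inside nest

    std::cout << "source diameter: " << source.diameter_cm << " cm\n"
              << "dest diameter:   " << dest.diameter_cm << " cm\n";
    // Prints 29 cm and 11 cm: one flower moved from source to destination.
}
```

In ARK this bookkeeping would run in the tracking loop and the updated area sizes would be broadcast back to the swarm; the sketch above only captures the resizing rule described in the video.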
For further information, check the paper or contact A.Reina@sheffield.ac.uk.
Source code for Demos A, B, and C
The source code to run Demos A, B, and C is available here: Experiments Source Code