ARK
Latest revision as of 13:12, 30 October 2020

The ARK will save all Kilobots from their simplicity!

==Source code==

Source code of ARK is available on GitHub at: https://github.com/DiODeProject/KilobotArena

Source code of the Camera Calibration Software is available on GitHub at: https://github.com/DiODeProject/KilobotArenaCalibration

Source code for experiments and specific ARK functionalities, such as motor calibration and ID assignment, can be found on the page [[Experiments Source Code]].

==Publication==

The ARK system has been published in:
A. Reina, A. J. Cope, E. Nikolaidis, J. A. R. Marshall and C. Sabo. [http://diode.group.shef.ac.uk/extra_resources/iros17_reina.pdf ARK: Augmented Reality for Kilobots]. ''IEEE Robotics and Automation Letters'' '''2'''(3): 1755-1761, 2017.

If you use the ARK technology in your experiments, please cite this work in your paper.

===Supplementary material===

Here, we provide the supplementary video of the article A. Reina et al., ARK: Augmented Reality for Kilobots, ''IEEE Robot. Autom. Lett.'', 2017.

<youtube>K0KvPzhOSDo</youtube>

YouTube link: https://www.youtube.com/watch?v=K0KvPzhOSDo

The supplementary video showcases ARK's functionalities through three demos. In Demo A, ARK automatically assigns unique IDs to a swarm of 100 Kilobots. Demo B shows the possibility of employing ARK for the automatic positioning of 50 Kilobots, which is one of the typical preliminary operations in swarm robotics experiments. These operations are typically tedious and time-consuming when done manually. ARK saves researchers' time and makes operating large swarms considerably easier. Additionally, automating the operation gives more accurate control of the robots' start positions and removes undesired biases in comparative experiments.

Demo C shows a simple foraging scenario where 50 Kilobots collect material from a source location and deposit it at a destination. The robots are programmed to pick up one virtual flower inside the source area (green flower field), carry it to the destination (yellow nest), and deposit the flower there. When performing actions in the virtual environment, a robot signals by lighting its LED in blue. When picking up a virtual flower from the source, the robot reduces the source's size for the rest of the robots (by reducing the area's diameter by 1 cm). Similarly, when a robot deposits a flower at the destination, the area's diameter increases by 1 cm. This demo shows that robots can perceive (and navigate) a virtual gradient, can modify the virtual environment by moving material from one location to another, and can autonomously decide when to change the virtual environment that they sense (either the source or the destination).
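To illustrate the virtual-environment bookkeeping described for Demo C, the sketch below shows how a tracker could shrink the source and grow the nest as robots pick up and deposit virtual flowers. This is a minimal, hypothetical Python sketch, not ARK's actual implementation; the class name, coordinates, and diameters are all assumptions for illustration:

```python
import math

class VirtualArea:
    """A circular virtual region whose diameter changes as robots move material."""
    def __init__(self, x, y, diameter_cm):
        self.x, self.y = x, y
        self.diameter_cm = diameter_cm

    def contains(self, rx, ry):
        # A robot is inside the area if its distance to the centre
        # is at most the area's radius.
        return math.hypot(rx - self.x, ry - self.y) <= self.diameter_cm / 2.0

# Hypothetical arena layout (positions in cm, diameters are illustrative).
source = VirtualArea(x=0.0, y=0.0, diameter_cm=30.0)    # green flower field
nest = VirtualArea(x=100.0, y=0.0, diameter_cm=30.0)    # yellow nest

def pick_up_flower(robot_xy):
    """If the robot is inside the source, it takes one virtual flower and
    the source shrinks by 1 cm in diameter for the rest of the robots."""
    if source.contains(*robot_xy):
        source.diameter_cm -= 1.0
        return True
    return False

def deposit_flower(robot_xy):
    """If the robot is inside the nest, it deposits its flower and
    the nest grows by 1 cm in diameter."""
    if nest.contains(*robot_xy):
        nest.diameter_cm += 1.0
        return True
    return False
```

In the real system this bookkeeping would live on the tracking side, with the updated area sizes broadcast back to the swarm so that every robot senses the modified virtual environment.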

For further information, check the paper or contact A.Reina@sheffield.ac.uk.

===Source code for Demos A, B, and C===

The source code to run Demos A, B, and C is available here: [[Experiments Source Code]]