OS: Operating Systems and Real-Time Operating Systems

General information

Project of Fall 2020: "Warehouse game"

This year, the robot competition is replaced by a robot challenge with only one robot at a time in the stadium.


Mindstorms NXT(TM)/EV3 robots are made of a main unit and a set of sensors and actuators. They have been sold by Lego for several years now, starting with the NXT version and now the EV3 version. Many very elaborate designs have been built using the standard kit (see for example "Mindstorms: Not just a kid's toy"). You may also watch interesting videos on youtube, such as a breakfast machine, a marble conveyor, a car factory, a walking dog, a scorpion, a forklift truck, the "holonomic drive", a Rubik's cube solver, a printer, and a sumo game. You can also find some other nice systems on battlebricks.

The project you will work on might not be as complex as some aforementioned systems, yet it requires spending quite a lot of design and programming time on the robot and its interfaces (sensors, actuators).

Once you have received your robot box, test your robot, motors, and sensors as soon as possible! We may be able to repair/order parts with a few weeks' notice, but there is very little we can do the day before the competition.

Project description

The robot you have to build shall be the winner of a warehouse game. This section includes a description of the game and then a specification of the robot.

Note that the rules might evolve during the next weeks until the final competition. I will always inform you whenever a rule evolves.

General description

The main objective of the game is to score points by transferring balls from one container to another. A game proceeds as follows.
  • A robot plays alone in the stadium. It starts from inside the starting area, and may be positioned in this area as the team desires, as long as it fits entirely inside the starting area. Once the game has started, the robot is free to navigate wherever it wants in the stadium.

  • The robot has at most 5 minutes - the game ends as soon as the robot is stuck - to score points by transferring balls from origin containers to destination containers. The overall game area is delimited by a wooden fence on all four sides (see figure below).

  • Three containers are placed at predefined positions: an origin cube containing 4 balls, and two destination containers (one cube and one pyramid). You score 1 point for each ball present in the destination cube at the end of a round, and 3 points for each ball present in the destination pyramid at the end of a round.

  • An additional cube container with 4 balls is placed at a random location, in one of the orange areas of the stadium (see figure below). Just like the other origin cube, the four balls it contains are expected to be transferred into one of the destination containers.

  • The robot can be freely re-programmed between game rounds, but it is forbidden to remotely control your robot.

Tournaments will be organized, and medals (master, diamond, platinum, gold, silver and bronze medals) will be awarded to robots according to their ranking. The ranking is determined first by the score, then by the average time to transfer a ball. This average is computed by dividing the time taken by the robot (maximum: 5 minutes) by the number of balls correctly transferred. The lower this average is, the better.

Stadium


The stadium is explained in the following picture.
Specification of the stadium



Containers


Containers are as follows.
  1. Cube: 15 cm on each dimension. 3D model in openscad format, 3D model in STL format
  2. Pyramid: 15 cm on the x/y dimensions, 8 cm height; the top hole is a 7 cm square. 3D model in openscad format, 3D model in STL format
A few pictures: cube, cube height, cube and pyramid, pyramid, top of pyramid, pyramid height.

Specification of the robot

  • The robot must fit, at start-up, within a cube of at most 35 cm on each dimension.

  • Robots can use up to four sensors and up to four motors. You are free to use any sensors you want among the following: touch sensor, light sensor, color sensor, ultrasonic sensor (i.e., distance sensor), compass sensor, gyroscope sensor, magnetic sensor.

  • The robot must carry a flag on which the number of your group is clearly readable from at least two sides of your robot. The flag is at most 10x10 cm. It may also contain a logo, a drawing, and the name of the robot.

  • A robot may change its shape (by deploying elements, or withdrawing them) during game phases, but it should fit in a 35cm cube at the start of the game.

  • Destructive weapons are NOT allowed. For your information, an EMP can be constructed for ~300 dollars, as shown here.

  • It is forbidden to send orders - remotely or not - to your robot while it is playing: it must be fully autonomous from the moment the game starts until it ends. Any kind of cheating will result in a grade of 0, just like for all exams at Eurecom.



Competitions and reports

There are two "competitions", as described below. Your project grade takes into account both competitions (the one in December and the one in January).
  1. The 14th of December, 2020: your website should be started, with at least the name of your robot (I don't expect more on the website for this deadline). The following six tests will be performed in the stadium. The tests exercise "basic" functions in order to assess how well they work. They are not taken into account for the final grade, unless the final competition fails. You can validate multiple tests with one execution if you wish.

    • Test #1. Be able to grab a ball or a whole cube located in front of your robot.
    • Test #2. Be able to go from the starting area to within 10 cm of the origin cube.
    • Test #3. Be able to grab a ball located inside a cube placed in front of your robot. If you decide to grab the whole cube, this test does not apply.
    • Test #4. Be able to put a ball (or a set of balls) in a cube located in front of your robot. Balls are pre-added to your robot before the test.
    • Test #5. Be able to put a ball (or a set of balls) in a pyramid located in front of your robot. Balls are pre-added to your robot before the test.
    • Test #6. Be able to find the randomly placed cube. By "finding", we mean stopping within 10 cm of this randomly placed cube.


  2. The 18th of January, 2021: final competition. For the final competition, the website must be fully completed. Also, during the final competition, I may interview each member of the group, so as to understand each member's contribution.


Your report consists of a website and the source code of your robot. The website shall contain the following information:
  • Description of the architecture of the robot: sensors and actuators, etc. Pictures of your robot on the web site would be appreciated.

  • Algorithms used for the most important parts: strategy to drop or throw balls, to find and pick up balls, competition with other robots, etc. Don't provide source code here; just describe the algorithms using a pseudo-C language. Also, comment those algorithms and explain why you think they are efficient.

  • Source code, and instructions on how to use it: how to compile it, how to download it to the robot, and how to start the robot. To protect your work, you can set a password to access the code, but make sure to give us access so we can grade it (e.g., a private git repository). You could even put fake code on your website until the very last moment ;-). I strongly advise you to rely on a version control system (svn, git, hg) to work on the code. Also, back up your code frequently.

  • Videos / pictures of your robot in action [we may provide you with cameras and video recorders if necessary].

  • How you have worked in the group, i.e., who did what. Be clear about that. Each group member will get an individual grade. All members of a team must contribute to the source code. Your source code must clearly indicate, with comments, who programmed which function.
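To give an idea of the expected level of detail for the algorithm descriptions, here is a pseudo-C sketch of one possible search routine for the randomly placed cube (rotate_deg, sonar_distance_cm, forward_cm, and MAX_SCAN_CM are hypothetical names, not part of any provided API):

```c
/* Pseudo-C sketch: locate the randomly placed cube with a sonar scan. */
void find_random_cube(void)
{
    int best_angle = 0;
    int best_dist  = MAX_SCAN_CM;   /* ignore echoes beyond this bound */

    /* Scan a full turn in 15-degree steps, remembering the closest echo. */
    for (int angle = 0; angle < 360; angle += 15) {
        rotate_deg(15);
        int d = sonar_distance_cm();
        if (d < best_dist) { best_dist = d; best_angle = angle; }
    }

    /* Turn back towards the closest echo and approach it,
       stopping about 10 cm away (Test #6 criterion). */
    rotate_deg(best_angle);
    if (best_dist > 10)
        forward_cm(best_dist - 10);
}
```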

Groups

(Remaining: 6 groups of 3)
(The group leader is listed first).
  1. Name of the members, name of robot and link to website.

  2. Paolo VOLPE, Raffaele DI PLACIDO, Guy ABI HANNA. Warhouser

  3. Florian LE MOUËL, Mathieu CHAMPION, Virgile RETAULT. GLaDOS.

  4. Romain BEURDOUCHE, Prithiraj DAS, Jacques de MATHAN. Fischer2020 Code

  5. Rakesh MUNDLAMURI, Massimo GISMONDI, Hend CHAKROUN. Ravattino

  6. Emilien VANNIER, Aymen BENAOUDI, Louis FARGE. Botspector Gadget Link to code

  7. Jean-Baptiste PEYRAT, Nicolas SERVOT and Brehima COULIBALI. REBOOT Code


How do I borrow a robot?

Please take care not to lose parts, especially cables and sensors. Also, a microSD card and a wifi dongle come with the robot. The procedure to borrow a robot is as follows:
  1. Make a group of 3 students, and decide on a group leader.
  2. The group leader sends me an email with the list of students of the group. List of: first_name, LAST_NAME, email. The leader shall be the first student in the list.
  3. I will validate the group. If the group is not validated, for a reason I will provide, go back to step 1. Otherwise, go to the next step. Note that I validate groups with a First Come First Served policy (date of the email).
  4. Once I've answered your group with "validated", the group leader can go to Franck Heurtematte's office (327), in the IT Department. You will be granted an ev3 box. Only the group leader has the right to enter Mr Heurtematte's office, and only during the morning or afternoon breaks (do not go during lunch break).

Access to stadium

  • The stadium is located in room 52. You are free to use it whenever you want as long as you respect the COVID rules (washing your hands before touching the arena, wearing a mask).



ev3

For the ev3 system, the idea is to flash a Debian GNU/Linux system onto the provided SD card. Then, you will be able to use the development language of your choice (C, python, etc.). But we ask you to develop in C.

Installing Linux on ev3

All information to install Linux on your ev3 is given here. This page also explains how to connect your ev3 to your PC via the usb cable, bluetooth, or wifi. I suggest using the Debian Jessie version, i.e., not the most recent version. You should find the correct version on this webpage by selecting the most recent ev3dev-jessie image.

Using and programming ev3

  • Connect to the EV3 by ssh
  • If you have difficulties connecting to your robot via wifi, Bluetooth, or USB, please contact us.
    Once your robot has access to the Internet, it should display its IP address in the upper left corner of its display (let's assume it is 192.168.2.2). Then, you can try to ssh to the robot (the default password is maker):
    $ ssh robot@192.168.2.2
    
    The robot user is a sudoer, i.e., it can execute commands with root privileges by prefixing them with sudo.

    Once you are logged on the robot, verify that you really have Internet access. For instance, try:
    $ ping www.ev3dev.org
    

  • Now, update the robot with the latest packages
  • $ sudo apt-get update && sudo apt-get upgrade
    

  • Install extra packages you will need for the project:
  • $ sudo apt-get install gcc make
    

  • You are now ready to compile a basic example, and to run it. To do so, you first need to install the development environment on the robot:
  • $ GIT_SSL_NO_VERIFY=true git clone https://github.com/in4lio/ev3dev-c.git
    $ cd ~/ev3dev-c/source/ev3
    $ make
    $ sudo make install
    

  • Now, you need to get the example. The tester file runs all 3 motors, then tests the color, touch, sonar, and compass sensors, and is provided on the project git. You can either clone the git with the code on the robot, or on your host computer.
  • $ GIT_SSL_NO_VERIFY=true git clone https://gitlab.eurecom.fr/ludovic.apvrille/OS_Robot_Project_Fall2020.git
    $ cd OS_Robot_Project_Fall2020/client
    
    If you are on your host computer, you then need to copy the example files to the robot:
    $ scp tester.c Makefile robot@192.168.2.2:
    

  • Then, on the robot, you can compile the example main file, and then run it. You need tachos and sensors to be connected to the brick if you want the program to correctly execute.
  • To compile:
    $ gcc -I./ev3dev-c/source/ev3 -O2 -std=gnu99 -W -Wall -Wno-comment -c tester.c -o tester.o
    $ gcc tester.o -Wall -lm -lev3dev-c -o tester
    
    or:
    $ make
    
    To run the code:
    $ ./tester
    
    or to compile and run:
    $ make tester
    
Note that the code and Makefile I provide are for the version of Debian Jessie that I have installed on my robot. The code and/or Makefile may fail for your robot: in that case, try to understand the error(s) you get and adapt the Makefile or the code accordingly.

Other examples

  • Sample code is also provided in the ev3c-master repository. Samples of how to control the motors and sensors are included in tester.c. The Ultrasonic Sensor is described here.

  • If your compass sensor does not work correctly with the test_sensor program, we provide code specific to the use of this sensor. Use the "i2c" target of the provided Makefile to compile the file.

  • Other examples can be downloaded from the ev3dev C library here. Examples are provided in eg/tacho and eg/sensor.

  • rfcomm-client.c is an example of how to communicate with the server over wifi. Compile with gcc rfcomm-client.c -o rfcomm-client.

  • If you can't get your robot working, bring it along to course staff.


Cross compilation

Basic

You may want to compile your source code on your laptop, e.g., to check the syntax of your code or to speed up the compilation process. Cross-compilation for ev3 is available as an ev3 docker image. Here is what I did to make this work on my Debian GNU/Linux computer:
  1. I first installed docker, created a docker group, added myself to the docker group, and started the docker service. Installing docker may require more than a simple apt-get, so please refer to the documentation for your OS, e.g., the one for ubuntu
  2. $ sudo apt-get install docker-engine
    $ sudo groupadd docker
    $ sudo gpasswd -a ${USER} docker
    $ sudo service docker start
    
    You may have to log-out for the group modification to be effective.
  3. Then, I have installed the ev3 docker image:
  4. $ docker pull ev3dev/debian-jessie-armel-cross:latest
    
  5. Once done, I've created a helloworld in C:
  6. $ cat > /tmp/hello.c
    #include <stdio.h>
    
    int main(int argc, const char *argv[])
    {
        printf("OS is fantastic!!\n");
    
        return 0;
    }
    
    (do "CTRL D" at the end of this command).
  7. I compiled the file using the cross compiler provided within the docker image, and tested the generated file. The following command assumes that hello.c is located in /tmp on your host system (but you can place it wherever you want: update the command accordingly). These instructions are the ones I used on my debian/jessie GNU/Linux PC.
  8. $ docker run -e LOCAL_USER_ID=`id -u $USER`  --rm -it -v /tmp:/src -w /src ev3dev/debian-jessie-cross
    [Shell in container]$ cd src
    [Shell in container]$ arm-linux-gnueabi-gcc -o hello hello.c
    
    Then, if you want to execute the generated ARM executable, you first need to install the ARM emulation environment. Then, you can execute "hello":
    $ sudo apt-get install qemu-user-static
    [Shell in container]$ ./hello
    OS is fantastic!!
    
To handle the compilation for your ev3, you need to perform additional steps (thanks to Paolo!):
  1. Tag the docker image:
  2. $ docker tag ev3dev/debian-jessie-cross ev3cc
    
  3. Install ev3 lib, put them in the project directory, and start the docker image:
  4. $ git clone https://github.com/in4lio/ev3dev-c
    $ docker run --rm -it -h ev3 -v PATH/TO/PROJECT/:/src -w /src ev3cc /bin/bash
    
  5. In the container, run the following commands:
  6. [Shell in container]$ cd ev3dev-c/source/ev3/ && make && sudo make install && make shared && sudo make shared-install
    
    In the Makefile for compiling your code, do not forget to substitute "gcc" with "arm-linux-gnueabi-gcc". For running your project, add "export LD_LIBRARY_PATH=~/ev3dev-c/lib" in the Makefile.
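Concretely, those two changes could look like this in a Makefile (the target and executable names below are placeholders):

```makefile
CC = arm-linux-gnueabi-gcc    # instead of gcc

run: myrobot
	export LD_LIBRARY_PATH=~/ev3dev-c/lib && ./myrobot
```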

Another way to do

(Way suggested by Fredrik Flornes Ellertsen. Note: on our side, for the tester.c file, we had to put the exact location of included files such as ev3.h. We also had to remove the lines that use 'ev3_brick_addr', which the compiler cannot understand.)

This has been tested on native Ubuntu 17.04 and on Windows using the Windows Subsystem for Linux.
  1. In an empty folder (here called robot_code/) on your computer, clone the ev3dev repository:
  2. $ git clone https://github.com/in4lio/ev3dev-c
    
  3. Install the cross compiler (source: https://www.acmesystems.it/arm9_toolchain):
  4. $ sudo apt-get install gcc-arm-linux-gnueabi
    
  5. In order to compile code that links to the ev3dev-c and/or other libraries, we need to use the robot's own versions of these libraries, e.g., libev3dev-c.a. By linking statically, we eliminate the need to install anything on the robot itself. Place these files in robot_code/libraries.

  6. Create a Makefile to simplify the compilation process. The following example Makefile expects the project to be organized as illustrated below, but will naturally depend on the group's project:
  7. 	robot_code/
    		Makefile
    		ev3dev-c/
    		libraries/
    			libev3dev-c.a
    		include/
    			movement.h
    			sensors.h
    		source/
    			movement.c
    			sensors.c
    			main.c
    
    Running 'make' in the robot_code/ directory should yield an executable called 'main' in the same directory. Using a file transfer tool like scp, copy this file to the robot and run it.
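For reference, a minimal Makefile matching that layout could look as follows (a sketch only: it assumes the static library sits in robot_code/libraries and headers in include/ and ev3dev-c/source/ev3; adapt paths and flags to your project):

```makefile
CC      = arm-linux-gnueabi-gcc
CFLAGS  = -O2 -std=gnu99 -W -Wall -Iinclude -Iev3dev-c/source/ev3
LDLIBS  = -Llibraries -lev3dev-c -lm

SRC = source/main.c source/movement.c source/sensors.c
OBJ = $(SRC:.c=.o)

# Link all objects into the 'main' executable mentioned above.
main: $(OBJ)
	$(CC) $(OBJ) $(LDLIBS) -o $@

# Compile each .c file into the matching .o file.
%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@

clean:
	rm -f $(OBJ) main
```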


Competitions results

First tests, December the 14th, 2020



  • Group #1. Tests 1, 2, 3, 4, 5, 6 validated. To be improved: the robot does not scan where the ball is located in front of it (test 1), nor the distance to the object into which to drop the ball (cube, pyramid). Also, balls in the cube can be grabbed only if they are correctly pre-positioned within the cube (e.g., not resting against the front side). Robot positioning could be improved by fusing motor information, time, sonar, and gyroscope.

  • Group #2. Test 2 validated. To be improved: use of compass + gyroscope to improve robot position. Test 4 done. Grabbing the ball mostly fails for several reasons (movement seems too fast, no protection on the side of the handle). Test 6: fine.

  • Group #3. Tests 1, 3 and 4 validated, with a fixed robot position. Test 2 was ok (but the ball was dropped outside of the box). Going forward and turning could be more reliable (use the gyroscope). Test 6 works, but the scan is not accurate enough.

  • Group #4. Test 1 validated (with the grabbing system outside of the robot). Test 2 ok (but improve turns with gyro and compass). Tests 4/5 are ok with the clamp control only.

  • Group #5. Test 1: perfect. Test 2: ok (in several parts). Test 4: Ok (failed the first time: too far from the box).

  • Group #6.




Final competition, January the 18th, 2021



Points and evaluation criteria could be updated during the competition to better reflect the capabilities of the robots.

Podium
    to come :-)