EU@HomeEdu 2019 - Preliminary activities

The preliminary activities in preparation for EU@HomeEdu 2019 are provided in the form of assignments. Each assignment contains a specific task to be carried out to make your team ready for the workshop and competition. Assignments will be published periodically, and teams will be informed by e-mail about new assignments published here. The Challenge organizers will be available to support teams that encounter problems in performing the assignments.

Technical questions may be sent to:

Assignment 1. Software installation (part 1)

Goal: obtain a first easy-to-install, ready-to-use installation of the software needed for the Challenge.

Requirements: a laptop or desktop computer with enough computational power, memory and disk space, and an already installed OS.
A Linux OS is preferred, but Microsoft Windows is also acceptable.

  1. Install Oracle VirtualBox and Oracle VM VirtualBox Extension Pack
  2. Download MARRtino VM image from MARRtino software web site and install it
  3. Download and install MATLAB (latest version), including the Robotics System Toolbox
    Go to the RoboCup support page and request a complimentary software license.
    The steps to install and activate MATLAB will be emailed to you once the license is approved; each team member should use the same license information.

     Toolboxes required to run examples:
    Robotics System Toolbox
    Image Processing Toolbox
    Computer Vision Toolbox
    Statistics and Machine Learning Toolbox
    Deep Learning Toolbox
     Additionally recommended:
    MATLAB Coder
    Simulink Coder
    Embedded Coder
    Automated Driving Toolbox
    Text Analytics Toolbox
    Control System Toolbox
    Simulink Control Design
    Signal Processing Toolbox

Note: MATLAB can be installed on the same host machine running the Virtual Machine or on a different one. MATLAB can also be installed within the virtual machine, but you will need to create a new virtual disk (suggested size 32 GB), to connect and mount it in the VM file system, and to install MATLAB on it.

Note: For network connection between the Virtual Machine and the Host machine (running a browser or MATLAB) use Bridged or Host-only adapter in the settings of the VirtualBox Virtual Machine.

Exercise 1. Run the virtual machine, run firefox in the virtual machine. Access the Bringup web page, connect to localhost, and start the simulated robot (Simrobot start button). Open a new firefox tab on the Programming web page and choose Commands, Blockly or Python. Connect to localhost, write your simple program (e.g., move the robot in a square) and see the robot moving in the simulator.
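The "move the robot in a square" program can be checked offline before running it in the simulator. In the sketch below, `forward` and `left` are hypothetical stand-ins for whatever motion commands the Programming web interface provides; here they only update a simulated pose, so the square logic can be verified without a robot.

```python
import math

# Simulated pose: x, y in meters, heading theta in radians.
pose = {"x": 0.0, "y": 0.0, "theta": 0.0}

def forward(distance):
    """Hypothetical motion command: drive straight by `distance` meters."""
    pose["x"] += distance * math.cos(pose["theta"])
    pose["y"] += distance * math.sin(pose["theta"])

def left(angle_deg):
    """Hypothetical motion command: turn counterclockwise by `angle_deg` degrees."""
    pose["theta"] += math.radians(angle_deg)

# Drive a 1 m x 1 m square: four sides, four 90-degree turns.
for _ in range(4):
    forward(1.0)
    left(90)

# The robot ends where it started (up to floating-point error).
print(pose["x"], pose["y"])
```

The same four-sides-four-turns loop, with the stubs replaced by the real interface calls, is what you would run against the simulated robot.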

Exercise 2. Repeat Exercise 1 using the MATLAB interface instead of the web programming interface, following the instructions for the scripts in the IntroROS section.

Assignment 2. Software installation (part 2)

Goal: obtain a more efficient software installation for the Challenge.

Requirements: a laptop with enough computational power, memory and disk space, and a free partition on which to install Linux.

  1. Install Linux Ubuntu 16.04 on a partition (Ubuntu Mate 16.04 suggested)
  2. Install ROS Kinetic (ros-kinetic-desktop-full and ros-kinetic-navigation)
  3. Install MATLAB as in Assignment 1 under Ubuntu OS

Exercise 1. Run the Stage simulator (follow the instructions in
Type the following shell command to move the robot:
    rostopic pub /cmd_vel geometry_msgs/Twist -r 3 -- '[0.5,0.0,0.0]' '[0.0, 0.0, 0.0]'

Exercise 2. Run the Stage simulator and move the robot using the MATLAB interface:

    rosinit                          % connect to the ROS master first
    [velPub,velMsg] = rospublisher('/cmd_vel');
    velMsg.Linear.X = 0.5;
    send(velPub,velMsg)              % publish the velocity command

Assignment 3. Navigation

Goal: test the mapping and navigation functionalities using the Stage simulator.

Requirements: software installed as in Assignment 1 or 2 above, including marrtino_apps, MATLAB and rc-home-edu-learn-matlab.
You need to update the installed software. The following commands are customized for the MARRtino VM:

cd ~/Downloads
wget -N
source update_rchomeedu_1.bash
cd ~/src/marrtino_apps
git pull

If you have a different installation of your own, please adapt these commands to your setup.

    RoboCup@Home Education MARRtino Guide

  1. Odometry navigation. Run the examples "Velocity Control" and "Odometry Reading" in Part 4 of MARRtino Guide.
  2. Mapping. Create a map of the environment with the instructions about "Building a Map with Simulator" in Part 4 of MARRtino Guide.
    In Step 3, when saving the map file, use the name of your team as name of the map to save.
    For example, if your team name is ABC, modify the command into
        rosrun map_server map_saver -f ABC
  3. Navigation with Rviz. Follow the instructions "Navigation with Simulator" in the same document.
    In Step 1, use the map name provided during the mapping phase.

    For example,

        ./simrobot_navigation.bash ABC

    Set several goals with Rviz to explore all the environment.
  4. Navigation with Python. Set goals with Python script
        cd src/marrtino_apps/navigation

        python 0 0 0
  5. Navigation with MATLAB. Use the script
    and modify it to change the goals.
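The goals above are given as (x, y, theta) in map coordinates, while ROS pose messages carry the heading as a quaternion. A minimal sketch of the conversion you would apply before filling a goal message (pure Python, no ROS required):

```python
import math

def yaw_to_quaternion(theta):
    """Convert a heading angle theta (radians, rotation about the z axis)
    into an (x, y, z, w) quaternion as used by ROS pose messages."""
    return (0.0, 0.0, math.sin(theta / 2.0), math.cos(theta / 2.0))

# A goal of (x=0, y=0, theta=0) has the identity orientation:
print(yaw_to_quaternion(0.0))  # (0.0, 0.0, 0.0, 1.0)

# A 90-degree heading:
qx, qy, qz, qw = yaw_to_quaternion(math.radians(90))
```

In a real goal script the four values would go into the `orientation` field of the goal pose.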

Exercise 1. Create the map of the Montreal 2018 @Home arena (or any other environment you want to try) that appears when you start the simulator on the virtual machine, start the navigation mode with the new map, and write a program for the robot to visit all the rooms in the apartment and then exit through the door on the bottom-right side.
Version 1A. Use Python
Version 1B. Use MATLAB
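One way to structure the Python version: keep the rooms as an ordered list of (name, x, y, theta) goals and send them one by one. In this sketch `goto` is a hypothetical stand-in for the real navigation call, and the coordinates are made-up placeholders to be replaced with goals read off your own map in Rviz; `goto` here only records the visit order so the plan can be checked offline.

```python
# Ordered tour of the arena; coordinates are placeholders --
# replace them with goals taken from your own map.
tour = [
    ("living_room", 2.0, 1.0, 0.0),
    ("kitchen",     4.0, 1.5, 90.0),
    ("bedroom",     4.0, 4.0, 180.0),
    ("exit_door",   6.0, 0.5, -90.0),
]

visited = []

def goto(name, x, y, theta):
    """Hypothetical navigation call; with a real robot this would
    send the goal and wait until it is reached."""
    visited.append(name)

for room in tour:
    goto(*room)

print(visited)  # ['living_room', 'kitchen', 'bedroom', 'exit_door']
```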

Note: all the tasks and exercises can be performed on real robots as well. If using a MARRtino robot, you just need to run the robot node instead of the simulator node. For mapping and navigation, you need a laser range finder mounted on the robot and the corresponding node running.

Assignment 4. Speech synthesis and recognition

Goal: test the speech functionalities integrated with navigation, using the Stage simulator and the LU4R Android app.

Requirements: software installed as in Assignment 1 or 2 above, including marrtino_apps, MATLAB, rc-home-edu-learn-matlab,
rc-home-edu-learn-ros, and the LU4R Android app for speech recognition.
Update the software installed on the MARRtino VM:

    cd ~/src/marrtino_apps
    git pull

    cd ~/src/rc-home-edu-learn-matlab
    git pull

    cd ~/src/rc-home-edu-learn-ros
    git pull


    RoboCup@Home Education MARRtino Guide


1. Voice-command control. Run the MARRtino simulator and the audio server (follow the instructions in Part 2 of the
RoboCup@Home Education MARRtino Guide). Install, configure and run the LU4R Android app for speech recognition. Run the program listed as Example 1 (speech control) in Part 3 of the RoboCup@Home Education MARRtino Guide.

2. Test the speech functionalities described in the last part of the RoboCup@Home Education MARRtino Guide.

Exercise 1. Define a set of locations in the simulated environment as pairs (location name, global map coordinates). For example: ('kitchen', 6, -5, -90), ('living room', 9, -4, -90), and so on. Use global map coordinates as described in Assignment 3. Run a simulated robot with navigation capabilities on a map (as described in Assignment 3).

Write a program for voice control of robot locations, where the user can say to the robot to go to a location (e.g., "go to the kitchen") and the robot will move there using the mapping between location names and map coordinates specified above.

Version 1A. Use Python
Version 1B. Use MATLAB
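The mapping from spoken location names to map coordinates can be kept as a simple dictionary. A sketch of matching a recognized sentence against it, using the example coordinates from the exercise text (in a real run the sentence would be the transcript returned by LU4R):

```python
# Location name -> (x, y, theta) in global map coordinates,
# using the example values from the exercise text.
locations = {
    "kitchen": (6, -5, -90),
    "living room": (9, -4, -90),
}

def goal_for(sentence):
    """Return (name, goal) for the first known location named in the
    recognized sentence, or None if no location matches."""
    sentence = sentence.lower()
    for name, goal in locations.items():
        if name in sentence:
            return name, goal
    return None

print(goal_for("go to the kitchen"))  # ('kitchen', (6, -5, -90))
print(goal_for("go to the garage"))   # None
```

The returned coordinates would then be sent as a navigation goal exactly as in Assignment 3.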

Assignment 5. Vision and object recognition

Goal: test the object recognition functionalities using the MARRtino VM.

Requirements: MARRtino VM (software installed as in Assignment 1 above).
Update the software installed on the MARRtino VM up to version 2.6.1:

    cd ~/install

Run the python update scripts until you reach version 2.6.1.


    RoboCup@Home Education MARRtino Guide


1. Read Part 4, Section "Vision functionalities", of the RoboCup@Home Education MARRtino Guide and test how to download images from the web and display them in the VM.

2. Test the object recognition functionalities on web images.

Exercise 1. Define a set of locations in the simulated environment and run a simulated robot with navigation capabilities on a map (as described in Assignment 4).

Write a program to bring objects to the proper location by implementing the following behavior: 1) get a web image (it will show a random object); 2) use object recognition to classify the object; 3) announce the recognized object with speech; 4) bring the object to the proper location (i.e., move to that location) according to the table below.

fruit (banana, orange, pineapple)  -> kitchen
cup, water_bottle, plastic_bag     -> living_room
volleyball, teddy                  -> bedroom

Version 1A. Use Python
Version 1B. Use MATLAB
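The object-to-location table can be written directly as a lookup, sketched here in Python. The labels are transcribed from the table; in a real run the label would be the one returned by the object recognizer, and defaulting unknown objects to the living room is an assumption of this sketch.

```python
# Recognized label -> destination, transcribed from the table above.
destination = {
    "banana": "kitchen", "orange": "kitchen", "pineapple": "kitchen",
    "cup": "living_room", "water_bottle": "living_room",
    "plastic_bag": "living_room",
    "volleyball": "bedroom", "teddy": "bedroom",
}

def location_for(label):
    """Where to bring an object with the given recognized label;
    unknown objects default to the living room (sketch assumption)."""
    return destination.get(label, "living_room")

print(location_for("banana"))  # kitchen
print(location_for("teddy"))   # bedroom
```

The returned location name would then be resolved to map coordinates and sent as a navigation goal, as in Assignment 4.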