Daniel Stonier edited this page May 14, 2013 · 45 revisions

Overview

We have two wrappers around wstool and catkin_xxx to help set up builds. I did this because ROS currently doesn't have anything to help much in three different cases:

  1. Multiple underlays (have to repeatedly source setup.bash scripts and remember which ones you sourced!)
  2. Handling the increasing number of cmake variables (use cmake caches!)
  3. Cross compiling

Installation

> sudo pip install -U yujin_tools
# Make sure you have the latest ros build environment tools
> sudo apt-get install python-rospkg python-catkin-pkg ros-groovy-catkin

Using the tools is a three-step process: 1) init workspace, 2) init build, 3) make.

Workspace Initialisation

Regular wstool-style initialisation with URLs:

> yujin_init_workspace ~/ecl_lite https://raw.github.com/stonier/ecl_core/groovy-devel/ecl_lite.rosinstall

You can also set and use a custom rosinstall database (e.g. yujin's database). These are subdivided by track/rosdistro. Usage:

# use set instead of get to configure the following
> yujin_tools_settings --get-rosinstall-database-uri  
> yujin_tools_settings --get-default-track
> yujin_init_workspace --track=groovy --list-rosinstalls
# Finally, initialise a workspace from your rosinstall database
> yujin_init_workspace ~/ecl_lite ecl_lite

Build Cases

The key concept here is the creation of build directories. A lot of customisation goes into a build directory; it needs to be distinct from other builds and remains constant from compile to compile, hence the separate yujin_init_build command.

For the following, we assume an ecl_lite workspace initialisation as above:

Single Build Folder

The default ROS style, suitable for most people.

> yujin_init_workspace ~/ecl_lite ecl_lite
> cd ~/ecl_lite
> yujin_init_build .
# Edit your config.cmake
> yujin_make
> yujin_make --tests
> yujin_make --run_tests
> yujin_make --install
> yujin_make --pre-clean

Note that you don't have to reset the configuration when pre-cleaning; it reuses the cache in config.cmake.
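For reference, config.cmake pre-seeds CMake cache variables for the build. A hypothetical fragment (the variable values here are illustrative only, not defaults of the tool):

```cmake
# Illustrative cache entries; adjust to your own build.
set(CMAKE_BUILD_TYPE "RelWithDebInfo" CACHE STRING "CMake build type.")
set(CMAKE_INSTALL_PREFIX "/opt/yujin/groovy" CACHE PATH "Install prefix.")
```

Because these are cache entries, they survive a pre-clean and get reapplied on the next compile.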

Parallel Build Folder

> yujin_init_workspace ~/ecl_lite ecl_lite
> cd ~/ecl_lite
> yujin_init_build native
> cd native; yujin_make

Cross Compiling

Toolchains and platform flags are selectable from a list (stored in the python module) or from the directories ~/.yujin_tools/toolchains and ~/.yujin_tools/platforms. Here we truly make use of parallel builds, without a lot of awkward sourcing as we jump from one build to another!

First off, let's download some cross compilers - CMake toolchain and platform modules are of no use unless you actually download those toolchains!

> sudo apt-get install g++-arm-linux-gnueabi g++-arm-linux-gnueabihf

Now the fun begins:

# Native build
> yujin_init_workspace ~/ecl_lite ecl_lite
> cd ~/ecl_lite
> yujin_init_build native
> cd native; yujin_make
# List the available cmake toolchain and platform modules
> yujin_init_build --list-toolchains
> yujin_init_build --list-platforms
> cd ~/ecl_lite
> yujin_init_build --toolchain=ubuntu/arm-linux-gnueabi --platform=arm/arm1176jzf-s arm
> cd arm; yujin_make

Note that you have to know which platform can be used with which toolchain. For example, the arm1176jzf-s settings used above don't work for the armhf compiler.
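A quick way to see which target triple each compiler emits is `-dumpmachine` (a sketch; the compiler names match the packages installed above, and the check degrades gracefully if they are absent):

```shell
# Print the target triple reported by each cross compiler, if present.
for cc in arm-linux-gnueabi-g++ arm-linux-gnueabihf-g++; do
  if command -v "$cc" >/dev/null 2>&1; then
    echo "$cc -> $("$cc" -dumpmachine)"
  else
    echo "$cc -> not installed"
  fi
done
```

The hf triple indicates the hard-float ABI, which is why soft-float platform settings don't carry across.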

Underlays

If you want to modularise your workspaces, you can set up underlays to the current workspace (note that we don't do any setup.bash sourcing here; all the necessary information comes from the underlay list). The example below assumes I've got the usual rosinstalls floating around in ~, but URLs could also be used.

> yujin_init_workspace ~/ecl ecl
> yujin_init_workspace ~/kobuki kobuki
> cd ~/ecl
> yujin_init_build .; yujin_make
> cd ~/kobuki
> yujin_init_build --underlays="~/ecl/devel" .; yujin_make
> yujin_make

When no catkin can be found in your sources or your underlays, it will automatically try to add the underlay specified by the currently set default track (yujin_tools_settings), e.g. /opt/ros/groovy for the groovy track.
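The fallback can be pictured with a small sketch (illustrative logic only, not the actual yujin_tools implementation):

```shell
# If nothing in the underlay list provides catkin, fall back to the
# default track's install space (illustrative only).
track="groovy"
underlays=""  # suppose no catkin was found in sources or underlays
if [ -z "$underlays" ]; then
  underlays="/opt/ros/${track}"
fi
echo "underlays: ${underlays}"
```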

You can also underlay against individual install spaces, or even a shared one, rather than the devel spaces.

> yujin_init_workspace ~/ecl ecl
> yujin_init_workspace ~/kobuki kobuki
> cd ~/ecl
> yujin_init_build --install=/opt/yujin/groovy .
> yujin_make --install
> cd ~/kobuki
> yujin_init_build --install=/opt/yujin/groovy --underlays="/opt/yujin/groovy;/opt/ros/groovy" .
> yujin_make --install
# Further workspaces can also use the shared /opt/yujin/groovy underlay as well.

Note that the underlays are ordered so that priority is given to the first in the list.
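For example, with the semicolon-separated list used above, entries are searched first to last (a small illustrative sketch, not the tool's internals):

```shell
# The first underlay in the list wins when the same resource appears twice.
underlays="/opt/yujin/groovy;/opt/ros/groovy"
priority=1
old_ifs=$IFS; IFS=';'
for d in $underlays; do
  echo "priority $priority: $d"
  priority=$((priority + 1))
done
IFS=$old_ifs
```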

Conveniences

yujin_init_build also sets up our usual gnome-terminal, konsole and eclipse launch scripts. These source the devel workspace's setup.bash if it is ready. Use them as launchers.

Other Notes

Rosinstalls

Previously, we used to fix fuerte or groovy information in the rosinstalls, e.g.:

[
{'setup-file': {'local-name': '/opt/ros/groovy/setup.sh'}},
{'other': {'local-name': '/opt/ros/groovy/share/ros'}},
{'other': {'local-name': '/opt/ros/groovy/share'}},
{'other': {'local-name': '/opt/ros/groovy/stacks'}},
{'git': {'local-name': 'rocon_multimaster', 'version': 'master', 'uri':'https://github.com/robotics-in-concert/rocon_multimaster.git'}},
{'git': {'local-name': 'rocon_msgs', 'version': 'master', 'uri': 'https://github.com/robotics-in-concert/rocon_msgs.git'}}
]

This is no longer needed. wstool is concerned with source directories and source directories only, so it should now be:

[
{'git': {'local-name': 'rocon_multimaster', 'version': 'master', 'uri':'https://github.com/robotics-in-concert/rocon_multimaster.git'}},
{'git': {'local-name': 'rocon_msgs', 'version': 'master', 'uri': 'https://github.com/robotics-in-concert/rocon_msgs.git'}}
]

If you think about it, this makes sense. Suppose you want to both natively compile and cross-compile: the same sources should be usable, but /opt/ros/groovy is of no use for the cross compile. Such configuration belongs in the build directory, not the source directory. Also note that the former rosinstall file is no longer compatible with newer versions of wstool.
