Hello everyone. This is ENS Rebecca Greenberg, and I am writing up how to get the Kinect sensor up and running using Ubuntu Linux 14.04 and ROS Indigo. This assumes that you already have these systems on the computer; if not, please either go through the tutorials on the ROS wiki or use the Robotics Computation wiki page Ubuntu 14.04, ROS Indigo and MATLAB 2015b. This setup also assumes that you have installed git.

 

There are two main up-to-date packages for running the Kinect: openni_launch and freenect_launch. For my thesis I chose openni_launch because I felt I could find more documentation and troubleshooting information. The setup for freenect_launch is very similar.

An easy setup was found in Installing Kinect on Just Sophie's Blog. It ensures that the libopenni libraries are installed, which the setup page for openni_launch does not mention.

 

The steps from Sophie's blog are copied here, edited to include cloning the rgbd_launch repository, which is also needed to run the openni.launch file.

1. Open a terminal, do an apt-cache search for libopenni, and install both the -dev and 0 libraries.

sudo apt-get install libopenni0 libopenni-dev
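
The search mentioned above, if you want to confirm the package names before installing, is simply:

apt-cache search libopenni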

2. Clone openni_camera, openni_launch, and rgbd_launch from GitHub into your catkin_ws/src, then run catkin_make in the workspace folder.

cd ~/catkin_ws/src
git clone https://github.com/ros-drivers/openni_launch
git clone https://github.com/ros-drivers/openni_camera

git clone https://github.com/ros-drivers/rgbd_launch.git
cd ..
catkin_make
catkin_make install
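
If roslaunch later cannot find the newly built packages, source the workspace setup file (a standard catkin step) in each new terminal:

source ~/catkin_ws/devel/setup.bash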

3. Connect the Kinect and run the openni.launch file.

roscore
roslaunch openni_launch openni.launch
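
To sanity-check that the driver is publishing, you can list the camera topics and check the depth stream rate in another terminal (the topic names here are the openni_launch defaults):

rostopic list | grep camera
rostopic hz /camera/depth/points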

If the above steps fail, clone the SensorKinect repo, cd into its Bin folder, extract the archive for your architecture (the commands below assume x64), and run the installer.

git clone https://github.com/avin2/SensorKinect
cd SensorKinect/Bin
tar xjf SensorKinect093-Bin-Linux-x64-v5.1.2.1.tar.bz2
cd Sensor-Bin-Linux-x64-v5.1.2.1
sudo ./install.sh

4. Test out your build by running

roslaunch openni_launch openni.launch

5. I also had to download the image_common git package in order to have the camera_info_manager that openni_launch uses.

cd ~/catkin_ws/src

git clone https://github.com/ros-perception/image_common

cd ..

catkin_make

catkin_make install
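
To verify that the camera_info_manager package now resolves, you can check with rospack:

rospack find camera_info_manager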

The openni_launch wiki page has a quickstart tutorial that allows for quick visualization of the Kinect sensor data once it is up and running. The rviz 3D visualization environment must be installed.
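
As a sketch of that quickstart (the image_view command is from the openni_launch tutorial; the topic names assume the default camera namespace), you can view the RGB stream and open rviz:

rosrun image_view image_view image:=/camera/rgb/image_color
rosrun rviz rviz

In rviz, set the Fixed Frame to camera_link and add a PointCloud2 display on /camera/depth/points to see the depth data.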

 

Setting up openni_tracker and issues seen

The openni_tracker package is a ROS node that broadcasts the OpenNI skeleton frames using tf transforms. The package is only listed as updated through Hydro; however, it does work with Indigo. A discussion on this can be found at this link.

 

cd ~/catkin_ws/src

git clone https://github.com/ros-drivers/openni_tracker

cd ..

catkin_make

catkin_make install

 

Next, the NITE library (v1.5.2.23) has to be installed; it can be found and downloaded from http://www.openni.ru/openni-sdk/openni-sdk-history-2/.

Use the terminal to move into the location where you saved the NITE download and unzip the file. Make sure the extracted files end up in a folder you can cd into.
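
For example, the archive name below is an assumption based on the x64 v1.5.2.23 download; adjust it to match your file:

tar xjf NITE-Bin-Dev-Linux-x64-v1.5.2.23.tar.bz2
cd NITE-Bin-Dev-Linux-x64-v1.5.2.23

Then run the install file: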

sudo ./install.sh

 

The tracker is a ROS node, not a launch file, so open a separate terminal and run

roscore

In the second terminal

rosrun openni_tracker openni_tracker
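
Stand in view of the sensor and hold the psi pose until calibration completes. To confirm that skeleton frames are actually being broadcast, the view_frames tool from the tf package generates a frames.pdf of the current transform tree:

rosrun tf view_frames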

 

In order to have the transforms published with the Kinect frame as parent, run the following lines, changing camera_link to whichever frame your Kinect data is using. Doing this makes it possible to see both the point cloud data and the tf transforms in rviz at the same time.

rosparam set /openni_tracker/camera_frame_id camera_link

rosrun openni_tracker openni_tracker
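
Once a user is calibrated, you can watch one of the skeleton frames relative to the new parent. The frame name torso_1 is my assumption based on openni_tracker's convention of suffixing frames with the user ID:

rosrun tf tf_echo camera_link torso_1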

 

For my thesis, the hope was to use the Kinect skeleton tracking to have a p3dx (Pioneer 3-wheel indoor robot) follow a person in an indoor environment. In simulation in rviz, with the p3dx on blocks, it seemed that this would be possible. However, when taken mobile, the robot's motion, as well as any clutter in the background, led the node to think that there were multiple people moving through the field of view. Even with the sensor stabilized and a filter used to stop any sudden movements, the tracker still consistently lost the person it was tracking. Another issue seen was that the node seemed to zero out its position when calibration occurred: if a person was calibrated and the robot was tracking, but the robot lost them due to motion and the person had to be recalibrated, the sensor would zero out the tf at the new location.
