ROS with Raspberry Pi and Xbox 360 Kinect [Question]

Hello,

I have been banging my head against this problem for a week or two now and decided it was time to ask for help. I am trying to run any type of SLAM on a robot using ROS. I will do my best to clearly outline the problem below.

Goal

The end goal is to be able to tell the robot to drive to an arbitrary XY location while avoiding obstacles.
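To make that goal concrete (this is not part of my setup yet, just the interface I am aiming for): in ROS the usual way to command "drive to XY" is the move_base action. A minimal rospy sketch, assuming a running navigation stack and a map frame, neither of which I have working yet:

```python
#!/usr/bin/env python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('send_goal')

# Connect to the move_base action server (assumes the nav stack is up).
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

# Ask the planner to drive to (1.0, 0.5) in the map frame,
# facing forward (identity orientation).
goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 1.0
goal.target_pose.pose.position.y = 0.5
goal.target_pose.pose.orientation.w = 1.0

client.send_goal(goal)
client.wait_for_result()
```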

Hardware

  • Xbox 360 Kinect

  • Raspberry Pi 3 (using the Ubiquity Robotics SD card image, which includes Ubuntu and ROS Kinetic)

  • Laptop with VMware running Ubuntu Xenial 16.04 (I have gotten it to talk to the Pi)

  • MPU6050 (common IMU)

  • Motors w/quadrature encoders

  • Differential drive robot

  • Eight ultrasonic range sensors

  • Eight IR range sensors

Attempted Solution

At the moment I am disregarding all sensor data except for the Kinect. I would like to perform some kind of SLAM using the Kinect data; I was thinking the other sensors could then be used to refine the localization with a Kalman filter or some other Bayesian estimator.

I am able to connect to the Kinect and view its data streams by running the freenect_launch driver on the Pi and rviz on the laptop (roscore also runs on the Pi). I have attempted to use both rtabmap and rgbdslam, and neither seems to work: the SLAM nodes apparently never receive the sensor stream from the Kinect. I find this odd, since I can visualize the same stream in rviz, I have checked that the subscribed topics are all correct, and rqt_graph shows the data flowing as expected. So far I have not changed any code; I have simply been running the launch files provided by freenect_launch and rgbdslam.
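One thing I should probably rule out (a common cause of exactly this symptom, though I have not confirmed it is mine): rtabmap subscribes to the RGB image, the depth image, and the camera info through a time synchronizer, so if the timestamps on those topics never line up, the node just waits silently even though each topic looks fine on its own in rviz. A minimal rospy sketch to check whether the three topics ever synchronize, assuming the default freenect topic names (these are an assumption; substitute whatever `rostopic list` actually shows):

```python
#!/usr/bin/env python
import rospy
import message_filters
from sensor_msgs.msg import Image, CameraInfo

def synced(rgb, depth, info):
    # Fires only when all three messages fall within the slop window.
    rospy.loginfo('synced: rgb %.3f depth %.3f',
                  rgb.header.stamp.to_sec(), depth.header.stamp.to_sec())

rospy.init_node('kinect_sync_check')

# Topic names below are the usual freenect defaults, not verified here.
rgb   = message_filters.Subscriber('/camera/rgb/image_color', Image)
depth = message_filters.Subscriber('/camera/depth_registered/image_raw', Image)
info  = message_filters.Subscriber('/camera/rgb/camera_info', CameraInfo)

# Allow up to 0.1 s of timestamp skew between the three streams.
sync = message_filters.ApproximateTimeSynchronizer([rgb, depth, info], 10, 0.1)
sync.registerCallback(synced)
rospy.spin()
```

If this approximate synchronizer fires but an exact one (message_filters.TimeSynchronizer) never does, that would point at timestamp skew, and enabling rtabmap's approx_sync parameter (or depth registration in the freenect launch file) would be the usual fix.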

If anyone could offer some insight into this, I would greatly appreciate it! I will also be sure to update the community once I am able to implement this.

Thank you!
