Simulating/testing a robot for later control in VR (Unity)

I’m in the early stages of my first design process for a robot that requires simulation, and I’m lost after looking through my options. The robot I’m designing will eventually be built and will navigate using SLAM or VSLAM, will get odometry data from its Kobuki (Turtlebot) base, and will have an arm mounted on top to interact with machines/objects (aided by a camera).

I’d like to use a simulation to work on/troubleshoot navigation and interaction with the physical environment. The simulation will take place in a Unity VR environment that someone else has built (a replica of the real workspace in which the robot will eventually operate). Ultimately, a user will be able to control the robot in this virtual environment and the real robot will respond in kind.

I know simulation can be done with ROS, and I’ve seen there are packages that let you set up a subscriber node in Unity to port ROS data into a Unity environment. I’ve also seen some projects where people use Unity exclusively to simulate a robot (I’ve only found a couple of papers, and there’s less information on how this is done or how well it would port to a real system later).
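To make the first option concrete, here’s roughly what I picture on the ROS side (a minimal rospy sketch, and the topic name is just the Turtlebot/Kobuki default, treat it as a placeholder). The Unity end would subscribe to the same topic through a bridge like rosbridge_suite/ROS# or the ROS-TCP-Connector rather than running rospy itself:

```python
#!/usr/bin/env python
# Minimal sketch: watch the Kobuki base's odometry so the same topic can be
# mirrored into Unity via a bridge (rosbridge/ROS# or ROS-TCP-Connector).
# "/odom" is the usual Turtlebot default but is an assumption here.
import rospy
from nav_msgs.msg import Odometry

def odom_callback(msg):
    # This is the pose the Unity replica would use to place the virtual robot.
    p = msg.pose.pose.position
    rospy.loginfo("base at x=%.2f y=%.2f", p.x, p.y)

if __name__ == "__main__":
    rospy.init_node("odom_monitor")
    rospy.Subscriber("/odom", Odometry, odom_callback)
    rospy.spin()
```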

I’m curious whether anyone has experience with either of these two methods, or knows of another method that would work better for this project.

My main priorities:

  1. Being able to design/test the navigation and arm control in a way that can be implemented in the real system.
  2. A low-latency connection between the simulated and real robot for when the user controls the real robot through the virtual environment (rough sketch of what I mean just after this list).
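For priority 2, the path I imagine is: the Unity/VR side publishes velocity commands over the bridge, and a small ROS node relays them to the real base. A hedged sketch, where /vr/cmd_vel is a hypothetical topic name I made up for the bridged Unity input, and /mobile_base/commands/velocity is the usual Kobuki command topic (but check your setup):

```python
#!/usr/bin/env python
# Sketch of the VR-to-real-robot path: relay whatever Twist commands the
# Unity environment publishes (bridged in on a hypothetical /vr/cmd_vel)
# to the Kobuki base's command topic.
import rospy
from geometry_msgs.msg import Twist

class VrTeleopRelay(object):
    def __init__(self):
        self.pub = rospy.Publisher("/mobile_base/commands/velocity",
                                   Twist, queue_size=1)
        rospy.Subscriber("/vr/cmd_vel", Twist, self.relay)

    def relay(self, msg):
        # Pass the command straight through; velocity clamping or other
        # safety limits would go here before touching the real hardware.
        self.pub.publish(msg)

if __name__ == "__main__":
    rospy.init_node("vr_teleop_relay")
    VrTeleopRelay()
    rospy.spin()
```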
