====== Using the Cram-KnowRob-VR package ======

Tested under CRAM version 0.7.0

This tutorial will introduce you to the ''cram_knowrob_vr (short: kvr)'' package, which uses the data recorded in the Virtual Reality environment using [[http://robcog.org/|RobCog]], extracts information from it using [[http://www.knowrob.org/|KnowRob]] and executes the CRAM high-level plans based on this data, either on the real robot or in the CRAM [[tutorials:advanced:bullet_world|bullet world]].
  
Launch a ''roscore'' first. Then, in a new terminal for each launch file, launch the bullet_world, json_prolog and roslisp_repl:
<code bash>
    $ roslaunch cram_pr2_pick_place_demo sandbox.launch
    $ roslaunch json_prolog json_prolog.launch
    $ roslisp_repl
</code>
  
=== Usage ===
Here we will first explain what needs to be done to get the robot to execute a pick and place plan in the simulated bullet world. In the next paragraph, we will take a closer look at the individual source files and explain their function.

Before you load the package, navigate to the ''init.lisp'' file and set the ''*episode-path*'' parameter to the path of your episode data. This is important; otherwise it won't be possible to load the episode data properly.
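
For illustration, the relevant line in ''init.lisp'' might look roughly like the following sketch. The parameter name ''*episode-path*'' comes from the package; how exactly it is defined and the directory shown here are only assumptions, and you need to replace the path with the location of your own episode data:
<code lisp>
;; Sketch only: replace the placeholder directory with the path to your recorded episodes.
(defparameter *episode-path* "/home/my-user/episode-data/")
</code>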
  
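Now load the package and run its initialization from the REPL. The following is only a rough sketch: the ASDF system name and the exact name and argument list of the init function are assumptions and may differ in your checkout, so check ''init.lisp'' for the actual entry point. The episodes are passed as a list of strings, e.g. "ep1":
<code lisp>
;; Sketch, assuming the ASDF system is called cram-knowrob-vr:
CL-USER> (asdf:load-system :cram-knowrob-vr)
;; Hypothetical init call; the real entry point is defined in init.lisp.
;; The episode names are passed as a list of strings.
CL-USER> (kvr::init-full-simulation '("ep1"))
</code>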
  
This will create a lisp ros node, clean up the belief-state, load the episodes that get passed to the init function as a list of strings, e.g. in our case "ep1", spawn the semantic map of the episode and the items, and initialize the location costmap. In the code section below it will be explained in more detail what is loaded and when. This process may take a while, so please have some patience. When the function has finished running, your bullet world should look like this:

Now, let's execute the pick and place plan:
<code lisp>
CL-USER> (cram-pr2-projection:with-simulated-robot (kvr::demo))
</code>
With this call we first say that we want to use the simulated bullet-world PR2 robot instead of the real one, and then we simply call the demo. The demo will read out the VR episode data and extract the positions of the objects that have been manipulated, which hand was used, and the positions of the human head and hand.
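
If you want to run the same plan on the real robot instead, the demo call would be wrapped in the real-robot counterpart of this macro. As a hedged sketch only: the package nickname and macro name below, ''pr2-pms:with-real-robot'', are an assumption about this CRAM setup and are not confirmed by this tutorial, so verify them in your installation:
<code lisp>
;; Assumed real-robot counterpart; check the macro name in your CRAM installation.
CL-USER> (pr2-pms:with-real-robot (kvr::demo))
</code>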
  
=== Code ===