====== Using the Cram-KnowRob-VR package ======

Tested under CRAM version 0.7.0.
 +
This tutorial will introduce you to the ''cram_knowrob_vr (short: kvr)'' package, which uses the data recorded in the Virtual Reality environment using [[http://robcog.org/|RobCog]], extracts information from it using [[http://www.knowrob.org/|KnowRob]] and executes CRAM high-level plans based on this data, either on the real robot or in the CRAM [[tutorials:advanced:bullet_world|bullet world]].
  
Launch a ''roscore'' first. Then, in a new terminal for each launch file, launch the bullet_world, json_prolog and roslisp_repl:
<code bash>
    $ roslaunch cram_pr2_pick_place_demo sandbox.launch
    $ roslaunch json_prolog json_prolog.launch
    $ roslisp_repl
</code>
The bullet world is needed for visualization. The json_prolog node allows us to access information in KnowRob from CRAM.
  
==== Usage and Code ====
The following describes what the different files and their functions do, when and how to use them, and why they are needed. The explanation follows the order of the files in the .asd file and is separated into a usage section and a code section. The usage section focuses on how to get everything to run and how to execute a demo, while the code section looks a bit more in depth into the code and explains what is going on there.

=== Usage ===
Here we will first explain what needs to be done to get the robot to execute a pick-and-place plan in the simulated bullet world. In the code section below, we will take a closer look at the individual source files and explain their function.

Before you load the package, navigate to the ''init.lisp'' file and set the ''*episode-path*'' parameter to the path of your episode data. This is important; otherwise it won't be possible to load the episode data properly.

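As a rough sketch of what this means (the actual definition in ''init.lisp'' may differ, and the path below is a placeholder you must replace with your own), the parameter is simply a special variable holding the directory of your recorded episodes:

<code lisp>
;; Placeholder path -- point this at the directory containing your
;; recorded RobCog episode data before loading the package.
(defparameter *episode-path* "/home/user/episode-data/")
</code>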
Now you can load the ''cram_knowrob_vr'' package with:
<code lisp>
CL-USER> (ros-load:load-system "cram_knowrob_vr" :cram-knowrob-vr)
</code>

To launch all the necessary components, simply execute:
<code lisp>
CL-USER> (kvr::init-full-simulation '("ep1"))
</code>

This will create a Lisp ROS node, clean up the belief state, load the episodes that are passed to the init function as a list of strings ("ep1" in our case), spawn the semantic map of the episode and the items, and initialize the location costmap. The code section below explains in more detail what is loaded and when. This process may take a while, so please have some patience. When the function has finished running through, your bullet world should look like this:

Now, let's execute the pick and place plan:
<code lisp>
CL-USER> (cram-pr2-projection:with-simulated-robot (kvr::demo))
</code>
With this call we first specify that we want to use the simulated bullet-world PR2 robot instead of the real one, and then we simply call the demo. The demo reads out the VR episode data and extracts the positions of the objects that were manipulated, which hand was used, and the positions of the human head and hand.

=== Code ===
== mesh-list.lisp ==
Contains a list of all the meshes we want to spawn, based on their locations in the semantic map. Some of them, e.g. walls, lamps and the objects we interact with, are commented out in order to keep the bullet world neat and clean. In Unreal, however, the walls and lamps are spawned; we simply do not currently need them in bullet.

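A hypothetical excerpt to illustrate the idea (the item names here are made up for illustration and are not the actual contents of ''mesh-list.lisp''):

<code lisp>
;; Illustrative only: keywords name meshes from the semantic map;
;; commented-out entries exist in Unreal but are not spawned in bullet.
(defparameter *mesh-files*
  '(:cup :bowl :spoon
    ;; :wall :lamp
    ))
</code>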
== mapping-urdf-semantic.lisp ==
Maps the URDF kitchen to the semantic map, since they differ in how some furniture is organized and named.

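Such a mapping can be pictured as a simple lookup table; the entries below are hypothetical and only illustrate the shape of the data, not the actual names used in the package:

<code lisp>
;; Hypothetical name mapping between the URDF kitchen and the
;; semantic map -- the real furniture names will differ.
(defparameter *urdf->semantic-map-names*
  '(("iai_fridge_main" . "IAIFridge")
    ("sink_area" . "IslandArea")))
</code>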
== init.lisp ==
TODO

== queries.lisp ==
TODO

== query-based-calculations.lisp ==
TODO

== designator-integration.lisp ==
TODO

== fetch-and-deliver-based-demo.lisp ==
TODO

==== Importing new episode data into MongoDB and KnowRob (Additional information) ====
In order for us to be able to query data for information, we first need to import that data into KnowRob and MongoDB.