====== Bullet world with Boxy ======
  
The starting point of this tutorial is an installation of CRAM including projection with PR2. If you followed the installation manual, you should have all the necessary packages already.


===== Robot URDF description =====

The first thing we actually need is the robot itself, more precisely its URDF description.
For Boxy it is located in a repo on GitHub, so let's clone it into our ROS workspace:

<code bash>
cd ROS_WORKSPACE_FOR_LISP_CODE
cd src
git clone https://github.com/code-iai/iai_robots.git
cd ..
catkin_make
</code>

CRAM takes the URDF descriptions of the robots from the ROS parameter server, i.e., you will need to upload the URDFs of your robots as a ROS parameter. For Boxy there is a launch file doing that; you will find it here:

<code bash>
roscd iai_boxy_description/launch/ && ls -l
</code>

It's called ''upload_boxy.launch'' and has the following contents:

<code xml>
<launch>
  <arg name="urdf-name" default="boxy_description.urdf.xacro"/>
  <arg name="urdf-path" default="$(find iai_boxy_description)/robots/$(arg urdf-name)"/>

  <param
    name="robot_description"
    command="$(find xacro)/xacro.py '$(arg urdf-path)'" />
</launch>
</code>
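
The ''command'' attribute runs xacro to expand the macros in ''boxy_description.urdf.xacro'' into a plain URDF, and the result is stored in the ''robot_description'' parameter.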

As we will need to know the names of the TF frames later, we will launch a more general file that uploads the URDF and also starts a robot state publisher:

<code bash>
roslaunch iai_boxy_description display.launch
</code>

(It also starts a GUI to play with the joint angles, but let's ignore that.)

Let's check if the URDF is on the parameter server using RViz:

<code bash>
rosrun rviz rviz
Add -> RobotModel -> Robot Description: robot_description
Add -> TF
</code>

To be able to see the TF frames, choose ''base_footprint'' as the global fixed frame.

{{ :tutorials:advanced:boxy_rviz.png?direct&800 |}}
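
You can also peek at the parameter directly from the command line (''head -c'' just truncates the long XML output):

<code bash>
rosparam get /robot_description | head -c 300
</code>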
  
===== Directory / file setup =====
  
Now let's set up a directory structure for our own packages: a directory (repo) with a metapackage inside. In the ''src'' of a catkin workspace create a new directory; we will call it ''cram_boxy''. Therein create a catkin metapackage with the same name:
  
<code bash>
mkdir cram_boxy
cd cram_boxy
# create the metapackage and an ordinary package for the robot knowledge
catkin_create_pkg cram_boxy --meta
catkin_create_pkg cram_boxy_knowledge
</code>
  
Now let's create the corresponding ASDF file called ''cram_boxy_knowledge/cram-boxy-knowledge.asd'':
  
<code lisp>
(defsystem cram-boxy-knowledge
  :depends-on (cram-prolog cram-robot-interfaces)
  :components
  ((:module "src"
            :components
            ((:file "boxy-knowledge")))))
</code>
  
Also, create an empty ''cram_boxy/cram_boxy_knowledge/src/boxy-knowledge.lisp''.
  
Now let's compile our new packages and load them through the REPL:
  
<code lisp>
CL-USER> (ros-load:load-system "cram_boxy_knowledge" :cram-boxy-knowledge)
</code>
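
(If no Lisp REPL is running yet, one way to start one, assuming a standard CRAM installation, is the ''roslisp_repl'' script:)

<code bash>
roslisp_repl
</code>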
  
  
===== Boxy Prolog description =====
  
Now let's describe our robot in Prolog, so that we can do some reasoning about it.
  
We fill the file ''cram_boxy/cram_boxy_knowledge/src/boxy-knowledge.lisp'' with some simple descriptions of Boxy: the ones relevant for navigation and vision. We will leave out manipulation for now, as it is too complex for the scope of this tutorial.
  
As can be seen from the TF tree, our robot has 3 camera frames: 1 depth and 1 RGB frame from a Kinect camera, and an RGB frame from a Kinect2 camera.
<code lisp>
(in-package :cl-user)

(defpackage :cram-boxy-knowledge
  (:use :common-lisp :cram-prolog :cram-robot-interfaces))

(in-package :cram-boxy-knowledge)

(def-fact-group boxy-metadata (robot camera-frame robot-pan-tilt-joints)
  (<- (robot boxy))
  ;; ... camera-frame facts for the Kinect and Kinect2 frames go here,
  ;; with the frame names taken from the TF tree ...
  (<- (robot-pan-tilt-joints boxy "head_pan_joint" "head_tilt_joint")))
</code>

Finally, compile the file (''Ctrl-c Ctrl-k'').
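
As a quick sanity check we can query the new facts right away; this sketch assumes the ''robot'' predicate comes from ''cram-robot-interfaces'' as above, and ''?robot'' should get bound to ''boxy'':

<code lisp>
CL-USER> (prolog:prolog '(cram-robot-interfaces:robot ?robot))
</code>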
  
===== Loading the world =====
  
Now let's load our robot into the Bullet world as well.
We will spawn the floor, the robot, and a couple of household objects.
  
For that, we
  - load the Bullet reasoning package in the REPL
  - start a ROS node
  - spawn the robot and the floor
  - spawn a big box for a table and a couple of household objects to go on top
  
<code lisp>
;; step 1
(asdf:load-system :cram-bullet-reasoning)
(in-package :btr)
  
;; step 2
(roslisp:start-ros-node "cram_bullet_boxy")

;; step 3
(let ((robot-urdf (cl-urdf:parse-urdf (roslisp:get-param "robot_description"))))
  (prolog
   `(and
     (clear-bullet-world)
     (bullet-world ?w)
     (debug-window ?w)
     (robot ?robot)
     (assert (object ?w :static-plane floor ((0 0 0) (0 0 0 1))
                     :normal (0 0 1) :constant 0
                     :disable-collisions-with (?robot)))
     (assert (object ?w :urdf ?robot ((0 0 0) (0 0 0 1)) :urdf ,robot-urdf)))))

;; step 4
(prolog
 `(and
   (bullet-world ?w)
   (assert (object ?w :box table-0 ((2.1 0 0.5) (0 0 0 1))
                   :size (0.7 1.5 1) :mass 10.0))
   (assert (object ?w :mesh pot-0 ((2 0 1.1) (0 0 0 1)) :mass 0.5 :color (0.1 0.2 0.3)
                   :mesh :pot))
   (assert (object ?w :mesh mug-0 ((2.2 -0.2 1.07) (0 0 0 1)) :mass 0.2 :color (0.8 0.3 0)
                   :mesh :mug))
   (assert (object ?w :mesh mug-1 ((2.3 0 1.07) (0 0 0 1)) :mass 0.2 :color (0.5 0.8 0)
                   :mesh :mug))))
</code>
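
In these queries ''?w'' gets bound to the current Bullet world, and each ''assert (object ...)'' spawns an object with the given type, name, pose, and properties; ''debug-window'' opens the OpenGL visualization in which you should now see something like this: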

{{ :tutorials:advanced:boxy_bullet.png?direct&800 |}}

===== Reasoning with Boxy =====

Let's try some reasoning:

<code lisp>
BTR> (prolog '(visible ?world ?robot pot-0))
NIL
</code>

The query asks whether the object ''pot-0'' is visible to the robot.
The answer is "No", as our robot is very tall and looks straight ahead.
Now let's put the object higher up and try again:

<code lisp>
BTR> (prolog '(and
               (bullet-world ?w)
               (assert (object-pose ?w pot-0 ((2 0 2) (0 0 0 1))))))
BTR> (prolog '(visible ?world ?robot pot-0))
(((?WORLD . #<BT-REASONING-WORLD {100D78C5F3}>)
  (?ROBOT . CRAM-BOXY-KNOWLEDGE::BOXY))
 . #S(CRAM-UTILITIES::LAZY-CONS-ELEM :GENERATOR #<CLOSURE # {101275A8FB}>))
</code>

Now it is visible.
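
The result is a lazy list; to force all solutions into a regular list, you can wrap the query in ''cut:force-ll'' (''cut'' is the nickname of the ''cram_utilities'' package that the ''LAZY-CONS-ELEM'' above comes from):

<code lisp>
BTR> (cut:force-ll (prolog '(visible ?world ?robot pot-0)))
</code>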

Let's put the pot back down by simulating the world for a while; it will simply fall onto the table:

<code lisp>
BTR> (simulate *current-bullet-world* 50)
</code>
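
To check where the pot ended up, we can read back its pose with the ''object'' and ''pose'' helpers of the ''btr'' package:

<code lisp>
BTR> (pose (object *current-bullet-world* 'pot-0))
</code>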

Now let's do something more complex: let's generate a visibility costmap, which will give us poses from where Boxy is able to see the pot.
As we're going to be using costmaps, we need to define the costmap metadata first:

<code lisp>
BTR> (def-fact-group costmap-metadata ()
       (<- (location-costmap:costmap-size 12 12))
       (<- (location-costmap:costmap-origin -6 -6))
       (<- (location-costmap:costmap-resolution 0.05))

       (<- (location-costmap:costmap-padding 0.35))
       (<- (location-costmap:costmap-manipulation-padding 0.35))
       (<- (location-costmap:costmap-in-reach-distance 1.0))
       (<- (location-costmap:costmap-reach-minimal-distance 0.1)))
</code>
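
These values describe a 12 x 12 m costmap with its origin at (-6, -6), i.e. centered around the world origin, at a resolution of 5 cm per cell; the padding and reach distances are rough estimates of Boxy's footprint and arm reach.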

We should also load the ''cram_bullet_reasoning_designators'' package, as the visibility costmap is implemented there:

<code lisp>
BTR> (asdf:load-system :cram-bullet-reasoning-designators)
</code>

Now let's create a location designator for a pose from which to see the ''mug-0'' object, which is the orange mug in the corner, and try to resolve it:

<code lisp>
BTR> (let ((location-to-see (desig:make-designator :location '((:to :see)
                                                               (:object mug-0)))))
       (desig:reference location-to-see))
</code>
  
{{ :tutorials:advanced:boxy_visibility_cm.png?direct&800 |}}
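
''desig:reference'' samples one concrete pose from this costmap and caches it in the designator; ''desig:next-solution'' would give you a designator with the next sample.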