Bullet world demonstration

This tutorial demonstrates how the internal belief state of the robot is represented in CRAM, how to update the world, how to perform specialized reasoning in it and how to use lightweight simulation for quick testing of plans.

This tutorial assumes that you have a basic understanding of how the belief state is implemented internally. You can read about this in the documentation entry for geometric reasoning.

Environment setup

In order to demonstrate many different features of the belief state representation, we will set up the robot's environment similarly to how it is done for the real system: we will need an occupancy map for navigation, an IK solver for manipulation, a semantic map of the environment with the furniture, etc. Setting up the environment is out of the scope of this tutorial, so we will use the spatial_relations_demo package (from the cram_projection_demos repository / metapackage): it has a launch file that sets up the environment and a number of utility functions to start things up in Lisp. As the system we will be working with is quite complex, things might not always go as expected. If you run into trouble while following this tutorial, feel free to consult the Support page or contact the person who last edited this tutorial; you can find the username in the footer.

First, let's start the roscore and set up the environment in two different terminals:

$ roscore

$ roslaunch spatial_relations_demo demo.launch

The demo launch file uploads a PR2 robot URDF description to the ROS parameter server. Let's check that it's there:

$ rosparam get /robot_description_lowres | awk 'NR == 5' # read the fifth line of the URDF

It also uploads a URDF of the kitchen environment:

$ rosparam get /kitchen_description | awk 'NR == 5'

In addition to uploading the URDFs to the parameter server, the launch file starts MoveIt for manipulation, a JSON-based communication interface to KnowRob, an occupancy map publisher, etc.

Now, let's load the package in the REPL:

CL-USER> (asdf:load-system :spatial-relations-demo)
CL-USER> (in-package :spatial-relations-demo)

As we will be talking to the ROS master, let's first start a node:

SPATIAL-RELATIONS-DEMO> (roslisp-utilities:startup-ros)

This starts a ROS node and executes the ROS initialization functions that are registered in CRAM through roslisp-utilities:register-ros-init-function. You can see which functions are registered by executing the following:

SPATIAL-RELATIONS-DEMO> (alexandria:hash-table-keys roslisp-utilities::*ros-init-functions*)

One of these functions initializes the occupancy map for robot navigation from the /map topic, another one initializes a publisher for visualization markers in RViz, etc.
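
For illustration, this is roughly how you can register such an init function yourself. A minimal sketch — the function name and topic below are made up for this example; register-ros-init-function simply stores the function, and startup-ros calls it once the node is up:

SPATIAL-RELATIONS-DEMO> (defun init-my-marker-publisher ()
                          ;; hypothetical example: advertise a topic once the node is up
                          (roslisp:advertise "my_tutorial_markers" "visualization_msgs/Marker"))
SPATIAL-RELATIONS-DEMO> (roslisp-utilities:register-ros-init-function init-my-marker-publisher)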

CRAM is designed such that the belief state (the Bullet world) is only updated through Prolog queries: we consider the belief state to be "robot knowledge" that can be queried and that can have things asserted to it, so we manipulate this knowledge only through Prolog and not directly through Lisp. To learn more about using Prolog in CRAM, you can quickly go through the Prolog tutorial.

A new Bullet world instance is created when we load the bullet_reasoning package. It is stored in the *current-bullet-world* variable, which we can access through the following Prolog query:

SPATIAL-RELATIONS-DEMO> (prolog '(bullet-world ?world))
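
The query returns a lazy list of variable binding sets. If you need the actual world instance in Lisp, you can, for example, pull it out of the first solution with the lazy-list helpers from the cut package (a sketch; the returned instance is the same object that is stored in *current-bullet-world*):

SPATIAL-RELATIONS-DEMO> (cut:var-value '?world
                                       (cut:lazy-car (prolog '(bullet-world ?world))))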

Objects in the world

To see what is going on in the belief state and to ease debugging, we visualize the current world instance with OpenGL:

SPATIAL-RELATIONS-DEMO> (prolog '(and (bullet-world ?world)
                                      (debug-window ?world)))

Please note that the visualization window does not need to be open while the robot is operating and using its belief state; in fact, it should be closed to save processing power.

The “bullet visualization” window that appears is empty at the moment. Let's spawn some objects.

The first thing we need is the floor plane:

DEMO> (prolog '(and (bullet-world ?w)
                    (assert (object ?w :static-plane floor ((0 0 0) (0 0 0 1))
                                    :normal (0 0 1) :constant 0))))

The robot URDF description has been uploaded to the ROS parameter server under the name robot_description_lowres. Let's load it into the world:

DEMO> 
(let ((robot-urdf-lowres
        (cl-urdf:parse-urdf
         (roslisp:get-param "robot_description_lowres"))))
  (prolog `(and (bullet-world ?w)
                (robot ?robot)
                (assert (object ?w :urdf ?robot ((0 0 0) (0 0 0 1)) :urdf ,robot-urdf-lowres))
                (robot-arms-parking-joint-states ?robot ?joint-states)
                (assert (joint-state ?w ?robot ?joint-states))
                (assert (joint-state ?w ?robot (("torso_lift_joint" 0.16825d0)))))))

Finally, let's spawn the kitchen environment (it will take some time as we query KnowRob for the semantic environment map):

DEMO> 
(let ((kitchen-urdf (cl-urdf:parse-urdf (roslisp:get-param "kitchen_description"))))
  (prolog `(and (bullet-world ?w)
                (assert (object ?w :semantic-map my-kitchen
                                ((-3.45 -4.35 0) (0 0 1 0))
                                :urdf ,kitchen-urdf)))))

Now we can also spawn some household objects:

DEMO> 
(prolog '(and (bullet-world ?w)
              (assert (object ?w :mesh mug-1 ((0 0 2) (0 0 0 1))
                                 :mass 0.2 :color (1 0 0) :mesh :mug))))

A mug appears above the robot's head. We can also move objects:

DEMO> (prolog '(and (bullet-world ?w)
                    (assert (object-pose ?w mug-1 ((-1 1 1) (0 0 0 1))))))

Now the mug is above the kitchen island table.

To get the instance of the spawned Bullet object, we can use the function btr:object:

DEMO> (object *current-bullet-world* 'mug-1)

We can get the pose of that object through the function btr:pose:

DEMO> (pose (object *current-bullet-world* 'mug-1))
#<CL-TRANSFORMS:POSE 
   #<3D-VECTOR (-1.0d0 1.0d0 1.0d0)>
   #<QUATERNION (0.0d0 0.0d0 0.0d0 1.0d0)>>
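
The returned pose is an ordinary cl-transforms pose, so its components can be read with the usual cl-transforms accessors, for example:

DEMO> (let ((mug-pose (pose (object *current-bullet-world* 'mug-1))))
        (list (cl-transforms:x (cl-transforms:origin mug-pose))
              (cl-transforms:y (cl-transforms:origin mug-pose))
              (cl-transforms:z (cl-transforms:origin mug-pose))))
(-1.0d0 1.0d0 1.0d0)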

Geometric reasoning

Our mug is floating in the air. The belief state world in CRAM is static; to apply physics to it, we use the Bullet physics engine to simulate the world for a certain amount of time.

DEMO> (prolog '(and (bullet-world ?w)
                    (simulate ?w 10)))

This simulates the world for 1 second, and the mug falls onto the table.
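
You can convince yourself by reading the mug's pose again; the Z coordinate should now be roughly at table-top height instead of the 1 m at which we placed it:

DEMO> (pose (object *current-bullet-world* 'mug-1))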

We can also ask if the scene is stable, i.e., whether objects would change their positions if we simulated the world for a longer time:

DEMO> (prolog '(and (bullet-world ?w)
                    (stable ?w)))
(NIL)

This copies our current world, simulates the copy for a certain amount of time, and compares the poses of the objects before and after the simulation. The result (NIL) is a successful query with no variable bindings, i.e., the mug is standing stably on the table.
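
In plain Lisp code one typically just tests the query result for truth: a failed query returns NIL, a successful one returns a non-empty lazy list such as (NIL). A minimal sketch:

DEMO> (if (prolog '(and (bullet-world ?w) (stable ?w)))
          (format t "The scene is stable.~%")
          (format t "Some object would still move.~%"))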

Now, let's ask if the mug is currently visible to the robot:

DEMO> (prolog '(and (bullet-world ?w)
                    (robot ?robot)
                    (visible ?w ?robot mug-1)))
NIL

NIL here means the query failed: from its current pose the robot cannot see the mug.

Abstract entity descriptions

Now let's generate a distribution of locations from which the robot would be able to see the mug. For that, we create an abstract location description that we call a designator. The abstract description gets grounded into specific geometric coordinates with the reference function.

DEMO> 
(let* ((mug-pose (pose (object *current-bullet-world* 'mug-1)))
       (mug-pose-designator (make-designator :location `((:pose ,mug-pose))))
       (to-see-designator (make-designator :location `((:to :see)
                                                       (:location ,mug-pose-designator)))))
  (reference to-see-designator))
#<CL-TRANSFORMS-STAMPED:POSE-STAMPED 
   FRAME-ID: "map", STAMP: 0.0
   #<3D-VECTOR (-0.2800002098083496d0 1.6999998092651367d0 0.0d0)>
   #<QUATERNION (0.0d0 0.0d0 -0.9338500185757329d0 0.3576648470371437d0)>>

We create two symbolic descriptions: one is for the pose that we want to perceive, and the other one is a pose for the robot from which it would be able to see the mug.
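
reference grounds the designator into one specific solution and keeps it; to obtain further samples from the underlying pose distribution, you can ask the designator for its next solution. A small sketch (next-solution may return NIL when there are no further solutions):

DEMO> (let* ((mug-pose (pose (object *current-bullet-world* 'mug-1)))
             (mug-pose-designator (make-designator :location `((:pose ,mug-pose))))
             (to-see-designator (make-designator :location `((:to :see)
                                                             (:location ,mug-pose-designator)))))
        (list (reference to-see-designator)
              (let ((next (next-solution to-see-designator)))
                (when next (reference next)))))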

We can also abstract away from geometric poses completely (above we used the mug-pose variable, which was a quantitative value) by describing the mug as an object standing on the table called "kitchen_island" in the URDF. This is equivalent to telling the robot: "go stand in a position from which you would be able to perceive a mug standing on the kitchen island table top". Please note that we do not explicitly specify which mug we are searching for; we just say that somewhere on the table there should be a mug.

DEMO> 
(let* ((on-counter (make-designator :location `((:on "CounterTop")
                                                (:name "kitchen_island_counter_top"))))
       (the-object (make-designator :object `((:type :mug)
                                              (:at ,on-counter))))
       (pos-to-see (make-designator :location `((:to :see)
                                                (:object ,the-object)))))
  (reference pos-to-see))
#<CL-TRANSFORMS-STAMPED:POSE-STAMPED 
   FRAME-ID: "map", STAMP: 0.0
   #<3D-VECTOR (-0.07000017166137695d0 1.8199996948242188d0 0.0d0)>
   #<QUATERNION (0.0d0 0.0d0 -0.9072884056004601d0 0.42050891674609586d0)>>

Here, as we didn't say specifically which pose we are trying to perceive, the robot randomly samples one pose from all the poses on the kitchen island table and generates a distribution for perceiving that pose, i.e., it looks in the general direction of the table.

Lightweight simulation (projection of plans)

Now, let's use lightweight simulation to try to find our particular red mug:

DEMO> 
(with-projection-environment pr2-bullet-projection-environment
  (top-level
    (with-designators
        ((on-counter :location `((:on "CounterTop")
                                 (:name "kitchen_island_counter_top")))
         (an-object :object `((:type :mug)
                              (:at ,on-counter))))
      (reference on-counter)
      (format t "Searching for the mug~%")
      (let ((perceived-object (perceive-object 'a an-object)))
        (unless (desig-equal an-object perceived-object)
          (equate an-object perceived-object))      
        an-object))))

We execute our plan in a projection environment, which substitutes the real robot controllers and sensors with simplified models. We first define the object we are searching for in a symbolic, abstract way. Then we execute the perceive-object plan from the standard CRAM plan library. We search for "a mug", as opposed to "the mug"; the latter would signal an error if multiple mugs were found on the table. Finally, we state that the new perceived-object that the robot found while executing the plan is the same object that we described with the designator an-object, so that whenever we use the variable an-object from now on, the robot knows exactly which mug we are referring to.
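
To illustrate what equate and desig-equal do, here is a small sketch outside of the plan, with two made-up object descriptions: after equating them, both belong to the same designator chain, so desig-equal considers them to describe the same object.

DEMO> (let ((first-description (make-designator :object '((:type :mug))))
            (second-description (make-designator :object '((:type :mug) (:color :red)))))
        (equate first-description second-description)
        (desig-equal first-description second-description))
T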

To see an execution of a complete pick-and-place plan, try the following:

DEMO> (start-ros-and-bullet)
      (execute-demo :pancake-making)

This will execute the plan from the pancake making demo. The code can be found in the spatial_relations_demo package, in the pancake-making.lisp file.


The package cram_bullet_reasoning_utilities contains a number of utility functions that make rapid prototyping with the Bullet world faster and easier: there are functions such as spawn-object, move-object, kill-all-objects, move-robot, etc. that execute the corresponding Prolog queries with default values for the parameters that are not important, e.g. default colors for objects. Pressing . while holding Alt (Alt-.) while the Emacs cursor is on a function name takes you to the definition of that function, so you can see what exactly it does; Alt-, brings you back to the previous window. Try this for start-ros-and-bullet, for example.
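
A quick prototyping session using these utilities could look roughly like this (a sketch: the exact argument lists may differ between CRAM versions, so check them with Alt-. first, and prefix the functions with their package name if they are not directly accessible in your package):

DEMO> (spawn-object 'mug-2 :mug)
DEMO> (move-object 'mug-2 (cl-transforms:make-pose
                           (cl-transforms:make-3d-vector -1.0 1.2 1.0)
                           (cl-transforms:make-identity-rotation)))
DEMO> (kill-all-objects)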