**//Tested with Cram v0.8.0, ROS version: Noetic, Ubuntu 20.04//**
====== Bullet world demonstration ======
This tutorial demonstrates how the internal world state of the robot is represented in CRAM, how to update the world, how to perform specialized reasoning in it and how to use lightweight simulation for quick testing of plans.
This tutorial assumes that you have a basic understanding of how the world state is implemented internally. You can read about this in the [[doc:reasoning:overview|documentation entry for geometric reasoning]]. You can find the code of this tutorial on [[https://github.com/cram2/cram/tree/master/cram_tutorials/cram_bullet_world_tutorial|Github]].
==== Installing the projection project ====
As the system we will be working with will get quite complex, things might not go as expected. If you run into trouble while following this tutorial, feel free to consult the [[:support|Support]] page or contact the person who last edited this tutorial; you can find the username in the footer.
If you followed the installation instructions for the recommended CRAM version on the Installation page, you should be ready to go. Just in case, check the version with which this tutorial was last tested; it is listed at the very top of this page.
==== Environment setup ====
We will set up the environment of the robot similar to how it is done for the real system: we will need an occupancy map for navigation, an IK solver for manipulation, a representation of the environment with the furniture etc.
First, let's set up the environment in our terminal by calling the launch file. It already brings up a ''roscore'' from within:
$ roslaunch cram_bullet_world_tutorial world.launch
Let's look into that launchfile.
$ roscd cram_bullet_world_tutorial
$ cat launch/world.launch
The robot needs to know who it is and what body parts it has, therefore we upload a robot description to the ROS parameter server. To check whether PR2's URDF was uploaded to the server, we can execute the following:
$ rosparam get /robot_description | awk 'NR == 5' # read the fifth line of the URDF
The name of the parameter is "robot_description".
To simulate simple grasping tasks with the PR2 robot in the bullet-world we need an inverse kinematics solver for the arms. In the launch file those are the ''pr2_arm_kinematics'' package nodes.
In addition to the description of itself, the robot can also be told what its environment looks like.
This is very practical if the environment does not change often and a model thereof is available.
Then the robot will not have to perceive and classify the furniture in its surroundings (which is very difficult and time consuming).
We have a model of the laboratory kitchen of our Institute for AI, where our PR2 lives.
The kitchen is represented in the URDF format just like the PR2, as we can consider the kitchen to be a robot with translational, rotational and fixed joints: translational for drawers, rotational for doors, and fixed for, e.g., a handle rigidly mounted on the fridge door. We can thus open the fridge door by changing the joint state of the door joint.
To check if the kitchen URDF was uploaded to the ROS parameter server we can execute the following:
$ rosparam get /kitchen_description | awk 'NR == 5'
As you can see, the name of the kitchen parameter is "kitchen_description".
(The semantic map is used to reason about semantically described places in the kitchen, such as that an iai_fridge_main is an object of class Fridge, which is a container for perishable items. For that we query KnowRob. Thus, the semantic map can only be activated if you have KnowRob installed. By default CRAM does not come with KnowRob, so we will not discuss semantic maps in this tutorial.)
If the ''rosparam get'' commands return data, we can get going.
====REPL setup====
Now, let's load the package in the REPL (start the REPL with ''$ roslisp_repl''):
CL-USER> (ros-load:load-system "cram_bullet_world_tutorial" :cram-bullet-world-tutorial)
CL-USER> (in-package :cram-bullet-world-tutorial)
As we will be talking to the ROS master, let's first start a node:
BTW-TUT> (roslisp:start-ros-node "bullet_world")
First, let's do all the initializations:
BTW-TUT> (cram-occupancy-grid-costmap::init-occupancy-grid-costmap) ; Inits the occupancy map for navigation.
BTW-TUT> (cram-bullet-reasoning-belief-state::ros-time-init) ; Resets the simulated time.
BTW-TUT> (cram-location-costmap::location-costmap-vis-init) ; Inits publishing of markers in RViz.
BTW-TUT> (cram-tf::init-tf) ; Inits the TF listener object.
The functions above initialize and clean up the environment of the REPL.
Currently, we call those functions by hand for educational purposes.
However, there exists a function that does the initialization automatically and spawns the completely set up visualization window. It will be described later.
The way CRAM is designed is that the world state (the Bullet world) is only updated through Prolog queries, as we consider the world state to be "robot knowledge" that we can query and that can get things asserted to it, and we try to manipulate the knowledge only through Prolog and not through Lisp directly. To learn more about using Prolog in CRAM you can quickly go through the [[:tutorials:beginner:cram_prolog|Prolog tutorial]].
A new Bullet world instance is created when we load the ''cram-bullet-reasoning'' ASDF system (which is the dependency of our tutorial package). It is stored in the ''*current-bullet-world*'' variable, which we can access through the following Prolog query:
BTW-TUT> (prolog:prolog '(btr:bullet-world ?world))
==== Bullet world initialization ====
To be able to see what is going on in the world state and to ease the debugging process we visualize the current world instance with OpenGL:
BTW-TUT> (prolog:prolog '(and (btr:bullet-world ?world)
(btr:debug-window ?world)))
Please note that the visualization window does not necessarily need to be open when the robot is operating and using its world state; in fact, it should be closed to save processing power.
The "bullet visualization" window that appears should be empty at the moment. Let's spawn some objects.
The first thing we need is the floor plane:
BTW-TUT> (prolog:prolog '(and (btr:bullet-world ?world)
(assert (btr:object ?world :static-plane :floor ((0 0 0) (0 0 0 1))
:normal (0 0 1) :constant 0))))
This ''assert'' query must not be confused with the Common Lisp ''assert'' macro. To see where this ''assert'' query comes from, you can look up its definition. Since it is not a function but a Prolog fact, you have to search for it with ''rgrep'' in the ''cram_3d_world/cram_bullet_reasoning'' directory:
BTW-TUT> Alt-x
rgrep RET
(<- (assert RET
*.lisp RET
/path/to/your/workspace/src/cram_3d_world/cram_bullet_reasoning/src/ RET
There are quite a few definitions here, serving various queries. All of them have in common that they **change** something in the world. The one matched by the previous call is this one, located in ''world-facts.lisp'':
(<- (assert (object ?world ?object-type ?name ?pose . ?args))
(assert ?world (object ?object-type ?name ?pose . ?args)))
In the assert query, '':static-plane'' is the type of the object we are spawning (it can be '':static-plane'', '':sphere'', '':box'', '':mesh'', '':urdf'' etc.), '':floor'' is the name of the object, and ''( (0 0 0) (0 0 0 1) )'' is its pose, given as a position and an orientation quaternion.
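For example, a simple box could be spawned in a similar way. This is a minimal sketch: the keyword arguments of the '':box'' object type (assumed here to be '':mass'' and '':size'') are defined in ''cram_bullet_reasoning'', so check the object-type definitions there if the call does not work as expected. You can also simply skip this, since ''roslisp-utilities:startup-ros'' later in the tutorial resets the world anyway:
BTW-TUT> (prolog:prolog '(and (btr:bullet-world ?world)
                              (assert (btr:object ?world :box box-1 ((1.5 1.5 1) (0 0 0 1))
                                                  :mass 0.2 :size (0.1 0.1 0.1)))))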
Via the launch file at the beginning of this tutorial the robot URDF representation has been uploaded to the ROS parameter server under the name ''robot_description''. Let's load it into the world:
BTW-TUT> (let ((robot-urdf
(cl-urdf:parse-urdf
(roslisp:get-param "robot_description"))))
(prolog:prolog
`(and (btr:bullet-world ?world)
(cram-robot-interfaces:robot ?robot)
(assert (btr:object ?world :urdf ?robot ((0 0 0) (0 0 0 1)) :urdf ,robot-urdf))
(-> (rob-int:robot-joint-states ?robot :arm :left :park ?left-joint-states)
(assert (btr:joint-state ?world ?robot ?left-joint-states))
(true))
(-> (rob-int:robot-joint-states ?robot :arm :right :park ?right-joint-states)
(assert (btr:joint-state ?world ?robot ?right-joint-states))
(true)))))
The line ''(roslisp:get-param "robot_description")'' gets the URDF description of the robot as a string.
''(cl-urdf:parse-urdf ...)'' then parses this string into a ''cl-urdf:robot'' object.
The ''(cram-robot-interfaces:robot ?robot)'' predicate reads the name of the robot that was specified in a robot description Lisp package. For the PR2 it is the ''cram_pr2_description'' package, which defines the robot name as ''pr2'', so ''?robot'' gets bound to ''pr2''.
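You can also run this predicate on its own in the REPL to see the binding; for the PR2 description, ''?robot'' should get bound to the symbol ''cram-pr2-description:pr2'' (the exact printed form of the binding list may differ):
BTW-TUT> (prolog:prolog '(cram-robot-interfaces:robot ?robot))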
This line:
(assert (btr:object ?world :urdf ?robot ((0 0 0) (0 0 0 1)) :urdf ,robot-urdf))
spawns the robot.
Keep in mind that spawning a robot in the world creates a new object. The first '':urdf'' defines the type of the created object, while the second one is a keyword argument that provides the object's URDF.
Then we read out the hard-coded values for robot arm parking states, where the arms are not in the way, and assert those joint states to our PR2 object of the Bullet world.
Finally, let's spawn the kitchen environment:
BTW-TUT> (let ((kitchen-urdf
(cl-urdf:parse-urdf
(roslisp:get-param "kitchen_description"))))
(prolog:prolog
`(and (btr:bullet-world ?world)
(assert (btr:object ?world :urdf :kitchen ((0 0 0) (0 0 0 1))
:urdf ,kitchen-urdf
:collision-group :static-filter
:collision-mask (:default-filter :character-filter)
:compound T)))))
In the visualization window you can now see the kitchen loaded from the URDF with the semantic map information attached to it.
You can see the environment in RViz as well. Open up a terminal and type ''rosrun rviz rviz'' or just ''rviz''. Tweak your settings as shown in the image below.
{{:tutorials:advanced:bullet_world_rviz_iai_kitchen.png?800|}}
If any displays are missing (left panel), you can add them using the 'Add' button below it. Take the robot model of the kitchen for example:
{{:tutorials:advanced:bullet_world_rviz_robot_model.png?800|}}
To see the URDF of the kitchen in RViz you need its description name 'kitchen_description' and the TF prefix 'iai_kitchen'. In the launch file 'world.launch' you can see that the name of the kitchen URDF is remapped from 'robot_description' (the default) to 'kitchen_description', which is the name you need in your RViz configuration. When you open the robot model display, there are two checkboxes, 'Visual Enabled' and 'Collision Enabled'. The former shows a detailed mesh of each component in the kitchen, the latter shows the collision objects, which are usually less detailed (in computer games and also in robotics). We use the collision objects when navigating the PR2, since checking for potential collisions would be very expensive with the detailed visual meshes.
When you are done with the robot model of the kitchen, look at the 'TF' display in RViz. It tells you if any poses can't be resolved, e.g. when their parent link is not published in TF. Open up the 'Tree' property of 'TF' (left panel of RViz). This is a tree representing all the TF poses with their respective parents and children. If you change the pose of a frame, all of its children will change their poses as well.
As you can see, RViz visualizes all the TF frames of our environment that we can use in this tutorial. Get back to the debug window (the Bullet visualization window) to see the actual meshes of the objects. The debug window only shows the collision meshes, not the detailed visuals, since we want to keep the Bullet world fast and lightweight and concentrate on collisions, not visuals.
Please note that you will not be able to see the robot TF frames with our 'world.launch' launch file, only the TF frames of the kitchen.
The environment is now initialized. For convenience, there exists a function that will execute all of the above in one call.
If you call it now it will reinitialize everything, which is also fine:
BTW-TUT> (roslisp-utilities:startup-ros)
This starts a ROS node and executes the ROS initialization functions, which are registered in CRAM through ''roslisp-utilities:register-ros-init-function''. You can see which functions are registered by executing the following:
BTW-TUT> (alexandria:hash-table-keys roslisp-utilities::*ros-init-functions*)
You should find all of the functions that we manually called above and one additional, the ''(init-projection)'' function, which initializes the Bullet world with the robot and the kitchen environment.
==== Manipulating the kitchen ====
Let us inspect the kitchen now:
BTW-TUT> (btr:object btr:*current-bullet-world* :iai-kitchen)
If needed, change the name '':iai-kitchen'' to the name you used when spawning the kitchen (e.g., '':kitchen'' if you spawned it manually as shown above). This call should return a robot object in the REPL. Inspect it by right-clicking on the printed object and choosing ''Inspect''. You can look into all its joints and states, how they are linked and in what state they are (press ''L'' to go up in the inspector). Now we are about to change one of those joints, causing the kitchen to open the door of the fridge.
BTW-TUT>
(btr:set-robot-state-from-joints
'(("iai_fridge_door_joint" 0.3d0))
(btr:object btr:*current-bullet-world* :iai-kitchen))
If you did everything right, the fridge door should now be slightly open. Check the other joints and change them as you wish. Watch their limits though.
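For example, the following pulls open one of the drawers (the same joint is used again later in this tutorial); 0.4 is well within its limits:
BTW-TUT> (btr:set-robot-state-from-joints
          '(("sink_area_left_upper_drawer_main_joint" 0.4))
          (btr:object btr:*current-bullet-world* :iai-kitchen))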
==== Spawn and change objects in the world ====
From this point on let us use the fast initialization function to start the bullet-world:
BTW-TUT> (roslisp-utilities:startup-ros)
Now let us spawn some household objects:
BTW-TUT>
(prolog:prolog '(and (btr:bullet-world ?world)
(assert (btr:object ?world :mesh mug-1 ((0 0 2) (0 0 0 1))
:mass 0.2 :color (1 0 0) :mesh :mug))))
A mug appears above the robot's head. We can also move objects:
BTW-TUT> (prolog:prolog '(and (btr:bullet-world ?world)
(assert (btr:object-pose ?world mug-1 ((-1 1 1) (0 0 0 1))))))
Now the mug is above the kitchen island table.
To get the instance of the spawned Bullet object there is a function ''btr:object'':
BTW-TUT> (btr:object btr:*current-bullet-world* 'mug-1)
We can get the pose of that object through the function ''btr:pose'':
BTW-TUT> (btr:pose (btr:object btr:*current-bullet-world* 'mug-1))
(The REPL prints the ''cl-transforms:pose'' object of the mug, i.e. its current position and orientation.)
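If you only need the position part of that pose, the standard ''cl-transforms'' accessors can be used, e.g. ''cl-transforms:origin'':
BTW-TUT> (cl-transforms:origin
          (btr:pose (btr:object btr:*current-bullet-world* 'mug-1)))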
==== Geometric reasoning ====
Our mug is floating in the air. The world state representation in CRAM is static; to apply physics to it, we use the Bullet physics engine to simulate the world for a certain amount of time.
BTW-TUT> (prolog:prolog '(and (btr:bullet-world ?world)
(btr:simulate ?world 10)))
This simulates the world for 1 second and the mug falls onto the table.
We can also ask if the scene is stable:
BTW-TUT> (prolog:prolog '(and (btr:bullet-world ?world)
(btr:stable ?world)))
(((?WORLD . ...)))
This copies our current world, simulates the copy for a certain amount of time and compares the poses of the objects before and after the simulation. If the result is ''(NIL)'' or anything else except ''NIL'', then the mug is standing stably on the table.
Now, let's ask if the mug is currently visible for the robot:
BTW-TUT> (prolog:prolog '(and (btr:bullet-world ?world)
(rob-int:robot ?robot)
(btr:visible ?world ?robot mug-1)))
NIL
Let's move PR2 such that the cup is in its visibility range:
BTW-TUT> (prolog:prolog '(and (btr:bullet-world ?world)
(assert (btr:object-pose ?world cram-pr2-description:pr2
((0.5 1 0) (0 0 1 0))))))
and ask the visibility query again. Instead of typing it anew, put the cursor at the REPL prompt and press ''Ctrl-Up'' (repeatedly); this brings back previously executed commands.
==== Abstract entity descriptions ====
Now let's generate a distribution of locations from which the robot would be able to see the mug. Before we do anything, let's define the costmap metadata, which specifies the grid size and other parameters needed to generate the distribution:
BTW-TUT>
(def-fact-group costmap-metadata ()
(<- (costmap-size 12 12))
(<- (costmap-origin -6 -6))
(<- (costmap-resolution 0.04))
(<- (costmap-padding 0.3))
(<- (costmap-manipulation-padding 0.4))
(<- (costmap-in-reach-distance 0.7))
(<- (costmap-reach-minimal-distance 0.2))
(<- (visibility-costmap-size 2))
(<- (orientation-samples 2))
(<- (orientation-sample-step 0.1)))
Now, we create an abstract location description that we call a //designator//. The abstract description gets grounded into specific geometric coordinates with the ''reference'' function.
BTW-TUT> (let* ((mug-pose (btr:pose (btr:object btr:*current-bullet-world* 'mug-1)))
(?mug-pose-stamped (cl-transforms-stamped:pose->pose-stamped "map" 0 mug-pose))
(?mug-location-designator (desig:a location (pose ?mug-pose-stamped)))
(to-see-designator (desig:a location (visible-for pr2)
(location ?mug-location-designator))))
(desig:reference to-see-designator))
(The REPL prints a pose-stamped object: a concrete pose from which the robot should be able to see the mug.)
We create two symbolic descriptions: one is for the pose that we want to perceive, which we convert into a pose-stamped, and the other one is a pose for the robot from which it would be able to see the mug.
We can also completely abstract away from geometric poses (above we used the ''mug-pose'' variable, which was a quantitative value) by describing the mug as an object standing on the table called ''"kitchen-island-surface"'' in the URDF. This is equivalent to telling the robot: "go stand at a position from which you would be able to perceive a mug standing on the kitchen island table top". Please note that we do not explicitly specify which mug we are searching for; we just say that somewhere on the table there should be a mug.
BTW-TUT>
(let* ((?on-counter (desig:a location
(on (desig:an object
(type counter-top)
(urdf-name kitchen-island-surface)
(part-of iai-kitchen)))))
(location-to-see (desig:a location
(visible-for pr2)
(location ?on-counter))))
(desig:reference location-to-see))
(Again, a concrete pose-stamped for the robot to stand at is printed.)
Here, as we didn't specify exactly which pose we are trying to perceive, the robot randomly samples one pose from all the poses on the kitchen island table and generates a distribution for perceiving that pose, i.e., it looks in the general direction of the table. If you look closely at the Bullet window, you might be able to see the different steps that happen during the referencing:
* first an area, which corresponds to the location on the kitchen-island-surface, is highlighted red,
* then a sample is being drawn from that distribution, which is visualized as a red dot,
* then a Gaussian is generated around that dot to represent a location for the robot to see the object,
* and, finally, a sample is taken from the Gaussian visualized again as a red dot.
You can execute the above command multiple times, and each time you will get a different distribution of locations for the robot to stand. Sometimes you might get an error that the location cannot be referenced. This happens if the sample that the robot picks on the table is too close to the wall, so there is no place to stand and perceive that point without the robot colliding with the environment.
Let us move the robot to a location to perceive a mug on the table:
BTW-TUT> (btr-utils:move-robot *)
{{:tutorials:advanced:bullet_move_pr2.png?800|}}
==== Attaching objects to the robot ====
When manipulating objects in the environment, e.g., carrying a cup in the gripper, the simulation needs to be told that such objects should move when the respective robot part moves. If the PR2 robot grasps an object, the object is automatically attached to the gripper, more specifically, to the link of the gripper's rigid body. Via Prolog it is also possible to attach objects to the robot's parts manually. This comes in handy when placing objects into environment containers: e.g., we spawn a spoon in a drawer, and when the drawer closes, the spoon should follow.
In this part of the tutorial, we will place a fork inside a drawer and attach it to the link of the drawer.
Assuming that the cram_bullet_world_tutorial is already loaded, we need to reset the world:
(roslisp-utilities:startup-ros) ;; restart world
(add-objects-to-mesh-list) ;; tell Bullet where the fork mesh is stored on the hard drive
The function ''set-robot-state-from-joints'' can change the state of joints. A joint can be the hinge of a door, the rails of a drawer, the elbow of an arm of the PR2; basically anything that connects two parts and can be moved in some way. The state of such a joint indicates its current position, e.g., how far the elbow is bent or how far the drawer is pulled open. The function takes two arguments: a list of joints with their desired positions, and the corresponding robot, which in our case is the kitchen.
Let us first open the drawer:
(btr:set-robot-state-from-joints
'(("sink_area_left_upper_drawer_main_joint" 0.4))
(btr:object btr:*current-bullet-world* :iai-kitchen))
The drawer's joint is called ''"sink_area_left_upper_drawer_main_joint"'' and we would like to open the drawer by 0.4 meters.
{{ :tutorials:intermediate:btw-tut-opendrawer.png?600 |}}
Now let us spawn the fork inside the drawer:
(prolog:prolog '(and (btr:bullet-world ?world)
(assert (btr:object ?world :mesh fork-1 ((1.0 0.9 0.75) (0 0 0 1))
:mass 0.2 :color (0.5 0.5 0.5) :mesh :fork-plastic))))
If this throws an error, you probably forgot to call ''(add-objects-to-mesh-list)'' beforehand, since we need the fork mesh. The whole scene should now look like this:
{{ :tutorials:intermediate:btw-tut-drawerwfork.png?600 |}}
Try to move the drawer a bit with ''set-robot-state-from-joints''. You'll see that the fork stays in place, even if the scene is simulated with ''(btr:simulate btr:*current-bullet-world* 10)''. To solve this, the fork needs to be attached to this specific drawer. All parts and links of the kitchen can be inspected with ''(btr:object btr:*current-bullet-world* :iai-kitchen)'', but to save some time the following example already contains the correct link to attach to:
(prolog '(and (btr:bullet-world ?world)
(btr:%object ?world fork-1 ?fork)
(assert (btr:attached ?world :iai-kitchen "sink_area_left_upper_drawer_main" ?fork))))
Notice that the joint name differs from the link name. Now the fork moves when the drawer is moved:
(btr:set-robot-state-from-joints
'(("sink_area_left_upper_drawer_main_joint" 0.3))
(btr:object btr:*current-bullet-world* :iai-kitchen))
Every attachment can be checked with the following predicate:
(prolog '(and (btr:bullet-world ?world)
(btr:attached ?world :iai-kitchen ?_ fork-1)))
This checks whether there are any attachments between the kitchen and the fork. If needed, the name of a specific link can be given to be checked. Or ''?_'' can be replaced with ''?link'' to get the link(s) that the object is attached to, as shown below.
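For example, to retrieve the link(s) the fork is attached to:
(prolog '(and (btr:bullet-world ?world)
              (btr:attached ?world :iai-kitchen ?link fork-1)))
To detach an object, the ''retract'' predicate does the job.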
(prolog '(and (btr:bullet-world ?world)
(btr:%object ?world fork-1 ?fork-instance)
(btr:retract (btr:attached ?world :iai-kitchen ?fork-instance))))
This detaches the fork from all the links of the kitchen that it has been attached to.
If you only want to detach from a specific link and not others, this is also possible, but it is out of scope of this tutorial.
==== Using TF in the Bullet world ====
By default, the TF listener is not set up in the REPL when you are working with the Bullet world. To have it running, we need a TF context. It is possible to create one manually, but to not overcomplicate this tutorial, we will use the environment provided by the ''urdf-proj:with-simulated-robot'' macro. It will enable us to look up transforms between robot frames, kitchen frames etc.
For example, the following does not work (unless you have a real robot running in your ROS ecosystem):
BTW-TUT> (cl-tf:lookup-transform cram-tf:*transformer* "map" "l_gripper_tool_frame")
and the following does:
BTW-TUT> (urdf-proj:with-simulated-robot
(cl-tf:lookup-transform cram-tf:*transformer* "map" "l_gripper_tool_frame"))
Here, ''"l_gripper_tool_frame"'' is the tool frame of PR2's left hand.
==== Moving the robot in the Bullet world ====
In this part of the tutorial we will look into moving the robot and its body parts, as well as perceiving objects, through the Bullet world. We will use functions from the ''cram-urdf-projection'' package, which implements a simple robot simulator in the Bullet world. This robot simulator does not execute motions in a continuous manner, but teleports through key poses instead.
This teleporting is done by directly calling Prolog predicates that move objects in the world (to navigate the robot, it is simply teleported to the goal pose), change joint angles (to move the arm, the joints are simply set to the given values) etc. ''cram-urdf-projection'' also uses Prolog predicates for attaching and detaching objects to the robot, as we did with the fork and the drawer.
Another package that we will use in this part of the tutorial is ''cram_bullet_reasoning_utilities'', which has a number of utility functions to make rapid prototyping with the Bullet world faster and easier.
Until now we had to write lengthy Prolog queries to access the world state and assert changes to it.
From now on we will use the utility functions from ''cram_bullet_reasoning_utilities'' package, which wrap around Prolog and simplify the interface.
There are functions such as ''spawn-object'', ''move-object'', ''kill-all-objects'', ''move-robot'' etc. that execute the Prolog queries with default values of parameters that are not important, e.g. default colors for objects.
By pressing ''.'' while holding ''Alt'' (''Alt-.'') while the Emacs cursor is on the name of a function, e.g., ''btr-utils:spawn-object'', you will be taken to the definition of the function to see what exactly it does. ''Alt-,'' brings you back to where you came from.
We need a clean environment for this tutorial, so let's clean the world:
BTW-TUT> (roslisp-utilities:startup-ros)
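Alternatively, if you only want to remove previously spawned household objects without resetting the whole world, the ''btr-utils:kill-all-objects'' utility mentioned above can be used (assuming here that it takes no arguments; jump to its definition with ''Alt-.'' if in doubt):
BTW-TUT> (btr-utils:kill-all-objects)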
Let us try to perceive an object.
For that we will use a mesh of a bottle loaded from the ''resources'' subdirectory of the tutorial.
BTW-TUT> (add-objects-to-mesh-list)
Then we spawn an object of type bottle and move it onto the table.
BTW-TUT> (btr-utils:spawn-object 'bottle-1 :bottle)
BTW-TUT> (btr-utils:move-object 'bottle-1
(cl-transforms:make-pose
(cl-transforms:make-3d-vector -2 -1 0.83)
(cl-transforms:make-identity-rotation)))
Lastly, we simulate the world for 10 seconds to make sure nothing moves unexpectedly at runtime.
BTW-TUT> (btr:simulate btr:*current-bullet-world* 10)
Before we perceive the bottle, we need to move PR2's arms out of sight, navigate the base in front of the bottle and look at it. The pose we want the base to navigate to can be hard coded and saved temporarily.
BTW-TUT> (defparameter ?grasp-base-pose
(cl-transforms-stamped:make-pose-stamped
"map"
0.0
(cl-transforms:make-3d-vector -1.8547d0 -0.251d0 0.0d0)
(cl-transforms:axis-angle->quaternion (cl-transforms:make-3d-vector 0 0 1)
(/ pi -2))))
The same thing can be done with the point we want to look at.
BTW-TUT> (defparameter ?grasp-look-pose
(cl-transforms-stamped:make-pose-stamped
"base_footprint"
0.0
(cl-transforms:make-3d-vector 0.65335d0 0.076d0 0.758d0)
(cl-transforms:make-identity-rotation)))
Putting all these together we end up with the following:
BTW-TUT>
(urdf-proj:with-simulated-robot
(urdf-proj::move-joints '(1.9652919379395388d0
-0.26499816732737785d0
1.3837617139225473d0
-2.1224566064321584d0
16.99646118944817d0
-0.07350789589924167d0
0.0)
'(-1.712587449591307d0
-0.2567290370386635d0
-1.4633501125737374d0
-2.1221670650093913d0
1.7663253481913623d0
-0.07942669250968948d0
0.05106258161229582d0))
(urdf-proj::drive ?grasp-base-pose)
(urdf-proj::look-at ?grasp-look-pose nil))
As some of the functions in ''cram-urdf-projection'' package need a running TF listener object, we wrapped our calls in ''urdf-proj:with-simulated-robot''.
The function ''urdf-proj::move-joints'' moves the joints of both arms into the configuration given in its arguments, so that the arms do not hang in the field of view. ''urdf-proj::drive'' moves the robot by internally calling
(prolog:prolog '(btr:assert ?world (btr:object-pose ?robot ?target-pose)))
''urdf-proj::look-at'' calculates the pan and tilt angles of the robot's neck such that it ends up looking at the specified point, and asserts these angles to the neck joints.
Now, let us finally perceive the object and store the result in the ''*perceived-object*'' variable:
BTW-TUT>
(defvar *perceived-object* nil "Object designator returned from perception")
(urdf-proj:with-simulated-robot
(setf *perceived-object*
(urdf-proj::detect (desig:an object (type bottle)))))
With that resulting perceived object we could perform the picking up action. With the torso so far down we might not be able to reach the bottle, so we need to push the torso up:
(urdf-proj:with-simulated-robot
(urdf-proj::move-torso 0.3))
As there is no atomic motion for picking up an object (picking up is, in fact, composed of multiple move-arm motions), pick-up is implemented as a plan and invoked by performing an action designator. Performing motion and action designators in the Bullet world (or on the real robot) is explained in the [[http://cram-system.org/tutorials/intermediate/simple_mobile_manipulation_plan|next tutorial on writing simple mobile manipulation plans]].