**//Tested with Cram v0.8.0, ROS version: Noetic, Ubuntu 20.04//**
  
====== Bullet world demonstration ======
<code lisp>
                    (cram-robot-interfaces:robot ?robot)
                    (assert (btr:object ?world :urdf ?robot ((0 0 0) (0 0 0 1)) :urdf ,robot-urdf))
                    (-> (rob-int:robot-joint-states ?robot :arm :left :park ?left-joint-states)
                        (assert (btr:joint-state ?world ?robot ?left-joint-states))
                        (true))
                    (-> (rob-int:robot-joint-states ?robot :arm :right :park ?right-joint-states)
                        (assert (btr:joint-state ?world ?robot ?right-joint-states))
                        (true)))))
</code>
  
  
<code lisp>
BTW-TUT> (btr:object btr:*current-bullet-world* :iai-kitchen)
</code>
  
<code lisp>
(btr:set-robot-state-from-joints
 '(("iai_fridge_door_joint"  0.3d0))
 (btr:object btr:*current-bullet-world* :iai-kitchen))
</code>
  
<code lisp>
BTW-TUT> (prolog:prolog '(and (btr:bullet-world ?world)
                              (rob-int:robot ?robot)
                              (btr:visible ?world ?robot mug-1)))
NIL
</code>
<code lisp>
BTW-TUT>
(def-fact-group costmap-metadata ()
    (<- (costmap-size 12 12))
    (<- (costmap-origin -6 -6))
    (<- (costmap-resolution 0.04))

    (<- (costmap-padding 0.3))
    (<- (costmap-manipulation-padding 0.4))
    (<- (costmap-in-reach-distance 0.7))
    (<- (costmap-reach-minimal-distance 0.2))
    (<- (visibility-costmap-size 2))
    (<- (orientation-samples 2))
    (<- (orientation-sample-step 0.1)))
</code>
Now, we create an abstract location description that we call a //designator//. The abstract description gets grounded into specific geometric coordinates with the ''reference'' function.
<code lisp>
                                           (type counter-top)
                                           (urdf-name kitchen-island-surface)
                                           (part-of iai-kitchen)))))
        (location-to-see (desig:a location
                                  (visible-for pr2)
</code>
  
Here, as we didn't specify exactly which pose we are trying to perceive, the robot randomly samples one pose from all the poses on the kitchen island table and generates a distribution for perceiving that pose, i.e., it looks in the general direction of the table. If you look closely at the Bullet window, you might be able to see the different steps that happen during the referencing:

  * first an area, which corresponds to the location on the kitchen-island-surface, is highlighted in red,
  * then a sample is drawn from that distribution, which is visualized as a red dot,
  * then a Gaussian is generated around that dot to represent a location for the robot to stand in order to see the object,
  * and, finally, a sample is taken from the Gaussian, again visualized as a red dot.

You can execute the above command multiple times, and each time you will get a different distribution of locations for the robot to stand. Sometimes, you might get an error that the location cannot be referenced. This happens if the sample that the robot picks on the table is too close to the wall, so there is no place to stand from which the robot could perceive that point on the table without colliding with the environment.

Let us move the robot to a location to perceive a mug on the table:
  
<code lisp>
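;; A hedged sketch rather than an exact listing: assuming the pose returned by
;; (desig:reference location-to-see) above is stored in a variable ?pose-to-stand,
;; the robot base can simply be teleported there with the btr-utils helper.
(btr-utils:move-robot ?pose-to-stand)
</code>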
==== Attaching objects to the robot ====
  
When manipulating objects in the environment, e.g., carrying a cup in the gripper, the simulation needs to be told that such objects should move when the respective robot part moves. If the PR2 robot grasps an object, the object is automatically attached to the gripper, more specifically, to the link of the rigid body of the gripper. Via Prolog it is possible to attach objects to robot parts manually. This comes in handy when placing objects into containers of the environment, e.g., if we spawn a spoon in a drawer, the spoon should follow when the drawer closes.
  
In this part of the tutorial, we will place a fork inside a drawer and attach it to the link of the drawer.
Assuming that the cram_bullet_world_tutorial is already loaded, we need to reset the world:
<code lisp>
(roslisp-utilities:startup-ros) ;; restart world
(add-objects-to-mesh-list) ;; tell Bullet where the fork mesh is stored on the hard drive
</code>
  
The function ''set-robot-state-from-joints'' can change the state of joints. A joint can be the hinge of a door, the rails of a drawer, the elbow of an arm of the PR2: basically anything that connects two parts and can be moved in some way. The state of such a joint indicates its current position, e.g., how far the elbow is bent or how far the drawer is pulled open. The function takes two arguments: a list of joints with their desired positions, and the corresponding robot object, which in our case will be the kitchen.
  
Let us first open the drawer:
<code lisp>
(btr:set-robot-state-from-joints
 '(("sink_area_left_upper_drawer_main_joint"  0.4))
 (btr:object btr:*current-bullet-world* :iai-kitchen))
</code>
The drawer's joint is called ''"sink_area_left_upper_drawer_main_joint"'' and we would like to open it to 0.4 meters.
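To double-check the new joint position, it can be read back from the kitchen object. This is a minimal sketch, assuming the ''btr:joint-state'' reader takes the object and the joint name:
<code lisp>
;; read back the current position of the drawer joint (sketch, see note above)
BTW-TUT> (btr:joint-state (btr:object btr:*current-bullet-world* :iai-kitchen)
                          "sink_area_left_upper_drawer_main_joint")
</code>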
  
{{ :tutorials:intermediate:btw-tut-opendrawer.png?600 |}}

Now let us spawn the fork inside the drawer:
<code lisp>
(prolog:prolog '(and (btr:bullet-world ?world)
                     (assert (btr:object ?world :mesh fork-1 ((1.0 0.9 0.75) (0 0 0 1))
                                         :mass 0.2 :color (0.5 0.5 0.5) :mesh :fork-plastic))))
</code>
  
If this throws an error, you probably forgot to use ''(add-objects-to-mesh-list)'' previously, since we need the fork mesh. The whole scene should now look like this:
  
{{ :tutorials:intermediate:btw-tut-drawerwfork.png?600 |}}
  
Try to move the drawer a bit with ''set-robot-state-from-joints''. You'll see that the fork stays in place, even if the scene is simulated with ''(btr:simulate btr:*current-bullet-world* 10)''. To solve this, the fork needs to be attached to this specific drawer. All parts and links of the kitchen can be inspected with ''(btr:object btr:*current-bullet-world* :iai-kitchen)'', but to save some time the following example already contains the correct link to attach to:
<code lisp>
(prolog '(and (btr:bullet-world ?world)
              (btr:%object ?world fork-1 ?fork)
              (assert (btr:attached ?world :iai-kitchen "sink_area_left_upper_drawer_main" ?fork))))
</code>
Notice that the joint name differs from the link name. Now the fork moves when the drawer is moved.
<code lisp>
(btr:set-robot-state-from-joints
 '(("sink_area_left_upper_drawer_main_joint"  0.3))
 (btr:object btr:*current-bullet-world* :iai-kitchen))
</code>
Every attachment can be checked with the following predicate:
<code lisp>
(prolog '(and (btr:bullet-world ?world)
              (btr:attached ?world :iai-kitchen ?_ fork-1)))
</code>
This checks whether there are any attachments between the kitchen and the fork. If needed, you can specify the name of a particular link to be checked, or replace ''?_'' with ''?link'' to get the list of links the object is attached to.
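For example, the same query with ''?link'' in place of ''?_'' (a sketch that differs from the listing above only in that variable) binds the name of the drawer link the fork is attached to:
<code lisp>
(prolog '(and (btr:bullet-world ?world)
              (btr:attached ?world :iai-kitchen ?link fork-1)))
</code>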
To detach an object, the ''retract'' predicate does the job:
<code lisp>
(prolog '(and (btr:bullet-world ?world)
              (btr:%object ?world fork-1 ?fork-instance)
              (btr:retract (btr:attached ?world :iai-kitchen ?fork-instance))))
</code>
This detaches the fork from all the links of the kitchen that it has been attached to.
If you only want to detach it from a specific link and not others, this is also possible, but that is out of the scope of this tutorial.
  
  
  
  
<html><!--
  
==== Visualizing coordinate frames of poses ====
</code>
  
{{ :tutorials:intermediate:axis_spawning_object.png?600 |}}

--></html>
                                    
                                                                    
  
  
==== Using TF in the Bullet world ====
  
By default, the TF listener is not set up in the REPL when you are working with the Bullet world.
To have it running, we need a TF context. It is possible to create one manually, but to not overcomplicate this tutorial,
we will use the environment provided by the ''urdf-proj:with-simulated-robot'' macro. It will enable us to look up transforms between robot frames, kitchen frames etc.

For example, the following does not work (unless you have a real robot running in your ROS ecosystem):
<code lisp>
BTW-TUT> (cl-tf:lookup-transform cram-tf:*transformer* "map" "l_gripper_tool_frame")
</code>
and the following does:
<code lisp>
BTW-TUT> (urdf-proj:with-simulated-robot
           (cl-tf:lookup-transform cram-tf:*transformer* "map" "l_gripper_tool_frame"))
</code>
Here, ''"l_gripper_tool_frame"'' is the tool frame of PR2's left hand.

==== Moving the robot in the Bullet world ====

In this part of the tutorial we will look into moving the robot and its body parts, as well as perceiving objects, through the Bullet world. We will use functions from the ''cram-urdf-projection'' package, which implements a simple robot simulator in the Bullet world. This robot simulator does not execute motions in a continuous manner, but teleports through key poses.
This teleporting is done by directly calling Prolog predicates that move objects in the world (to navigate the robot, it is simply teleported to the goal), change joint angles (to move an arm, it is simply teleported to the given joint values), etc. ''cram-urdf-projection'' also uses Prolog predicates for attaching objects to and detaching them from the robot, as we did with the fork and the drawer.

Another package that we will use in this part of the tutorial is ''cram_bullet_reasoning_utilities'', which has a number of utility functions that make rapid prototyping with the Bullet world faster and easier.
Until now we had to write lengthy Prolog queries to access the world state and assert changes to it.
From now on we will use the utility functions from the ''cram_bullet_reasoning_utilities'' package, which wrap around Prolog and simplify the interface.
There are functions such as ''spawn-object'', ''move-object'', ''kill-all-objects'', ''move-robot'' etc. that execute the Prolog queries with default values for parameters that are not important, e.g., default colors for objects.
By pressing ''.'' while holding ''Alt'' (''Alt-.'') while the Emacs cursor is on a function name, e.g., ''btr-utils:spawn-object'', you will be redirected to the definition of the function to see what exactly it does. ''Alt-,'' brings you back to the previous window.
  
We need a clean environment for this tutorial, so let's clean the world:
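<code lisp>
;; A hedged sketch of the clean-up, using the utility function mentioned above:
;; remove all objects that were spawned into the Bullet world.
BTW-TUT> (btr-utils:kill-all-objects)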
</code>
  
Let us try to perceive an object.
For that we will use a mesh of a bottle loaded from the ''resources'' subdirectory of the tutorial.
<code lisp>
BTW-TUT> (add-objects-to-mesh-list)
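;; Spawn the bottle first so that it can be repositioned below. This is a hedged
;; sketch: the :bottle mesh keyword is an assumption based on the loaded mesh list.
BTW-TUT> (btr-utils:spawn-object 'bottle-1 :bottle)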
BTW-TUT> (btr-utils:move-object 'bottle-1
                                (cl-transforms:make-pose
                                 (cl-transforms:make-3d-vector -2 -0.9 0.83)
                                 (cl-transforms:make-identity-rotation)))
</code>
Lastly, we simulate the world for 10 seconds to make sure nothing moves unexpectedly at runtime.
<code lisp>
BTW-TUT> (btr:simulate btr:*current-bullet-world* 10)
</code>
Before we perceive the bottle, we need to move PR2's arms out of sight, navigate the base in front of the bottle and look at it. The pose we want the base to navigate to can be hard-coded and saved temporarily.
<code lisp>
BTW-TUT> (defparameter ?grasp-base-pose
           (cl-transforms-stamped:make-pose-stamped
            "map"
</code>
The same thing can be done with the point we want to look at.
<code lisp>
BTW-TUT> (defparameter ?grasp-look-pose
           (cl-transforms-stamped:make-pose-stamped
            "base_footprint"
            (cl-transforms:make-identity-rotation)))
</code>

<html><!--
To execute any plan in CRAM, we need a top-level context. Besides that we also use a macro to specify that the demo should be executed in simulation, not on the real robot. Putting your plan under ''pr2-proj:with-simulated-robot'' will indicate that your robot is in the projection environment and it also has a top level call within it. The ''with-simulated-robot'' is a way to abstract out the robot details from your plans and its counterpart to execute on a real robot would be ''pr2-pms:with-real-robot''. Also note that without mentioning the robot that you want to execute on, the TF for it is not published and you'll run into errors.
We can execute some movements in parallel, if they use different joints of the robot. That's what ''cpl:par'' is for.
We have used a simple call to low level methods to achieve motions like move to the ''?grasp-base-pose'' and look at ''?grasp-look-pose''. These can be achieved by corresponding motion designators, which we will look at in later tutorials.
--></html>

Putting all these together we end up with the following:
<code lisp>
BTW-TUT>
(urdf-proj:with-simulated-robot
  (urdf-proj::move-joints '(1.9652919379395388d0
                            -0.26499816732737785d0
                            1.3837617139225473d0
                            -2.1224566064321584d0
                            16.99646118944817d0
                            -0.07350789589924167d0
                            0.0)
                          '(-1.712587449591307d0
                            -0.2567290370386635d0
                            -1.4633501125737374d0
                            -2.1221670650093913d0
                            1.7663253481913623d0
                            -0.07942669250968948d0
                            0.05106258161229582d0))
  (urdf-proj::drive ?grasp-base-pose)
  (urdf-proj::look-at ?grasp-look-pose nil))
</code>
As some of the functions in the ''cram-urdf-projection'' package need a running TF listener object, we wrapped our calls in ''urdf-proj:with-simulated-robot''.
  
The function ''urdf-proj::move-joints'' moves the joints of both arms into the specific configurations given in the arguments, so that the arms don't hang around in the field of view. ''urdf-proj::drive'' moves the robot by internally calling
<code lisp>
(prolog:prolog '(btr:assert ?world (btr:object-pose ?robot ?target-pose)))
</code>
''urdf-proj::look-at'' calculates the pan and tilt angles of the robot's neck such that it ends up looking at the specified point, and asserts these angles to the neck joints.
  
Now, let us finally perceive the object and store the result in the ''*perceived-object*'' variable:
<code lisp>
BTW-TUT>
(defvar *perceived-object* nil "Object designator returned from perception")
(urdf-proj:with-simulated-robot
  (setf *perceived-object*
        (urdf-proj::detect (desig:an object (type bottle)))))
</code>
  
With the resulting perceived object we could perform the picking-up action. With the torso so far down, we might not be able to reach the bottle, so we need to push the torso up:
  
<code lisp>
(urdf-proj:with-simulated-robot
  (urdf-proj::move-torso 0.3))
</code>
  
There is no atomic motion for picking up an object; in fact, picking up is comprised of multiple move-arm motions, so pick-up is implemented within a plan and called by performing an action designator. Performing motion and action designators in the Bullet world (or on the real robot) is explained in the [[http://cram-system.org/tutorials/intermediate/simple_mobile_manipulation_plan|next tutorial on writing simple mobile manipulation plans]].
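As a preview, performing such an action designator follows roughly this shape. This is a hedged sketch: the exact designator properties and the choice of arm are assumptions, and the details are covered in the next tutorial.
<code lisp>
;; Sketch only: pick up the previously perceived bottle with the right arm.
;; The property names (type, object, arm) follow the usual CRAM action
;; designator pattern; details are explained in the next tutorial.
(urdf-proj:with-simulated-robot
  (let ((?object-to-pick-up *perceived-object*))
    (exe:perform (desig:an action
                           (type picking-up)
                           (object ?object-to-pick-up)
                           (arm right)))))
</code>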