**//Tested with Cram v0.8.0, ROS version: Noetic, Ubuntu 20.04//**
  
====== Bullet world demonstration ======
                    (cram-robot-interfaces:robot ?robot)
                    (assert (btr:object ?world :urdf ?robot ((0 0 0) (0 0 0 1)) :urdf ,robot-urdf))
                    (-> (rob-int:robot-joint-states ?robot :arm :left :park ?left-joint-states)
                        (assert (btr:joint-state ?world ?robot ?left-joint-states))
                        (true))
                    (-> (rob-int:robot-joint-states ?robot :arm :right :park ?right-joint-states)
                        (assert (btr:joint-state ?world ?robot ?right-joint-states))
                        (true)))))
</code>
  
  
<code lisp>
BTW-TUT> (btr:object btr:*current-bullet-world* :iai-kitchen)
</code>
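The returned object instance can be inspected further. For example, its current pose can be read out (a small sketch, assuming ''btr:pose'' is the exported pose reader for Bullet world objects):
<code lisp>
BTW-TUT> (btr:pose (btr:object btr:*current-bullet-world* :iai-kitchen))
</code>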
  
(btr:set-robot-state-from-joints
 '(("iai_fridge_door_joint"  0.3d0))
 (btr:object btr:*current-bullet-world* :iai-kitchen))
</code>
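To verify that the joint moved, the joint value can be read back (a sketch, assuming ''btr:joint-state'' is also available as a function that returns a single joint value of an object):
<code lisp>
;; should return 0.3d0, the value we just asserted
BTW-TUT> (btr:joint-state (btr:object btr:*current-bullet-world* :iai-kitchen)
                          "iai_fridge_door_joint")
</code>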
  
<code lisp>
BTW-TUT> (prolog:prolog '(and (btr:bullet-world ?world)
                              (rob-int:robot ?robot)
                              (btr:visible ?world ?robot mug-1)))
NIL
BTW-TUT>
(def-fact-group costmap-metadata ()
    (<- (costmap-size 12 12))
    (<- (costmap-origin -6 -6))
    (<- (costmap-resolution 0.04))

    (<- (costmap-padding 0.3))
    (<- (costmap-manipulation-padding 0.4))
    (<- (costmap-in-reach-distance 0.7))
    (<- (costmap-reach-minimal-distance 0.2))
    (<- (visibility-costmap-size 2))
    (<- (orientation-samples 2))
    (<- (orientation-sample-step 0.1)))
</code>
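These facts behave like any other Prolog facts, so a quick query is an easy way to check that the metadata is registered (the single solution should bind ''?width'' and ''?height'' to 12):
<code lisp>
BTW-TUT> (prolog:prolog '(costmap-size ?width ?height))
</code>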
Now, we create an abstract location description that we call a //designator//. The abstract description gets grounded into specific geometric coordinates with the ''reference'' function.
                                           (type counter-top)
                                           (urdf-name kitchen-island-surface)
                                           (part-of iai-kitchen)))))
       (location-to-see (desig:a location
                                 (visible-for pr2)
(btr:set-robot-state-from-joints
 '(("sink_area_left_upper_drawer_main_joint"  0.4))
 (btr:object btr:*current-bullet-world* :iai-kitchen))
</code>
The drawer joint is called ''"sink_area_left_upper_drawer_main_joint"'' and we would like to open the drawer to 0.4 meters.
(prolog '(and (btr:bullet-world ?world)
              (btr:%object ?world fork-1 ?fork)
              (assert (btr:attached ?world :iai-kitchen "sink_area_left_upper_drawer_main" ?fork))))
</code>
Notice that the joint name differs from the link name. Now the fork moves when the drawer is moved.
(btr:set-robot-state-from-joints
 '(("sink_area_left_upper_drawer_main_joint"  0.3))
 (btr:object btr:*current-bullet-world* :iai-kitchen))
</code>
Every attachment can be checked with the following predicate:
<code lisp>
(prolog '(and (btr:bullet-world ?world)
              (btr:attached ?world :iai-kitchen ?_ fork-1)))
</code>
This checks if there are any attachments between the kitchen and the fork. If needed, it is possible to specify the name of a link to be checked, or to replace the ''?_'' with ''?link'' to get the list of links the object is attached to.
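For example, to also get the names of the links that the fork is attached to, replace the anonymous variable with ''?link'':
<code lisp>
;; same query as above, but ?link is now bound to the attachment links in the solution
(prolog '(and (btr:bullet-world ?world)
              (btr:attached ?world :iai-kitchen ?link fork-1)))
</code>
To detach an object, the ''retract'' predicate does the job.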
(prolog '(and (btr:bullet-world ?world)
              (btr:%object ?world fork-1 ?fork-instance)
              (btr:retract (btr:attached ?world :iai-kitchen ?fork-instance))))
</code>
This detaches the fork from all the links of the kitchen that it has been attached to.
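Running the attachment check from above once more should now return ''NIL'', since the fork is no longer attached to any link of the kitchen:
<code lisp>
(prolog '(and (btr:bullet-world ?world)
              (btr:attached ?world :iai-kitchen ?_ fork-1)))
</code>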
  
  
==== Using TF in the Bullet world ====
  
By default, the TF listener is not set up in the REPL when you are working with the Bullet world.
To have it running, we need a TF context. It is possible to create one manually, but to keep this tutorial simple
we will use the environment provided by the ''urdf-proj:with-simulated-robot'' macro. It will enable us to look up transforms between robot frames, kitchen frames etc.

For example, the following does not work (unless you have a real robot running in your ROS ecosystem):
<code lisp>
BTW-TUT> (cl-tf:lookup-transform cram-tf:*transformer* "map" "l_gripper_tool_frame")
</code>
and the following does:
<code lisp>
BTW-TUT> (urdf-proj:with-simulated-robot
           (cl-tf:lookup-transform cram-tf:*transformer* "map" "l_gripper_tool_frame"))
</code>
Here, ''"l_gripper_tool_frame"'' is the tool frame of PR2's left hand.
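The returned stamped transform can then be used like any other ''cl-transforms'' transform (a small sketch, assuming the usual ''cl-transforms'' readers apply to the stamped type), e.g. to get the gripper position in the map frame:
<code lisp>
BTW-TUT> (urdf-proj:with-simulated-robot
           ;; read out only the translation part of the map -> gripper transform
           (cl-transforms:translation
            (cl-tf:lookup-transform cram-tf:*transformer* "map" "l_gripper_tool_frame")))
</code>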

==== Moving the robot in the Bullet world ====

In this part of the tutorial we will look into moving the robot and its body parts, as well as perceiving objects, through the Bullet world. We will use functions from the ''cram-urdf-projection'' package, which implements a simple robot simulator in the Bullet world. This robot simulator does not execute motions in a continuous manner, but teleports through key poses.
This teleporting is done by directly calling Prolog predicates that move objects in the world (to navigate the robot, it is simply teleported to the goal pose), change joint angles (to move an arm, its joints are simply set to the given values), etc. ''cram-urdf-projection'' also uses Prolog predicates for attaching and detaching objects to and from the robot, as we did with the fork and the drawer.

Another package that we will use in this part of the tutorial is ''cram_bullet_reasoning_utilities'', which has a number of utility functions to make rapid prototyping with the Bullet world faster and easier.
Until now we had to write lengthy Prolog queries to access the world state and assert changes to it.
From now on we will use the utility functions from the ''cram_bullet_reasoning_utilities'' package, which wrap around Prolog and simplify the interface.
There are functions such as ''spawn-object'', ''move-object'', ''kill-all-objects'', ''move-robot'' etc. that execute the Prolog queries with default values for parameters that are not important, e.g. default colors for objects (see the short sketch below).
By pressing ''.'' while holding ''Alt'' (''Alt-.'') while the Emacs cursor is on the name of a function, e.g. ''btr-utils:spawn-object'', you will be redirected to the definition of the function to see exactly what it does. ''Alt-,'' brings back the previous window.
  
</code>
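For example, spawning a new object and moving it around with these utilities could look roughly like this (a sketch; it assumes that '':mug'' is one of the known mesh types and that ''btr-utils:spawn-object'' takes a name and a type, which you can confirm with ''Alt-.''):
<code lisp>
;; assumed signature: (btr-utils:spawn-object name type) -- check with Alt-.
BTW-TUT> (btr-utils:spawn-object 'mug-2 :mug)
;; the pose values below are arbitrary example coordinates
BTW-TUT> (btr-utils:move-object 'mug-2
                                (cl-transforms:make-pose
                                 (cl-transforms:make-3d-vector -0.8 1.0 0.9)
                                 (cl-transforms:make-identity-rotation)))
</code>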
  
Let us try to perceive an object.
For that we will use a mesh of a bottle loaded from the ''resources'' subdirectory of the tutorial.
<code lisp>
BTW-TUT> (add-objects-to-mesh-list)
BTW-TUT> (btr-utils:move-object 'bottle-1
                                (cl-transforms:make-pose
                                 (cl-transforms:make-3d-vector -2 -0.9 0.83)
                                 (cl-transforms:make-identity-rotation)))
</code>
Lastly, we simulate the world for 10 seconds to make sure nothing moves unexpectedly at runtime.
<code lisp>
BTW-TUT> (btr:simulate btr:*current-bullet-world* 10)
</code>
Before we perceive the bottle, we need to move PR2's arms out of sight, navigate the base in front of the bottle and look at it. The pose we want the base to navigate to can be hard-coded and saved temporarily.
<code lisp>
BTW-TUT> (defparameter ?grasp-base-pose
               (cl-transforms-stamped:make-pose-stamped
                "map"
The same thing can be done with the point we want to look at.
<code lisp>
BTW-TUT> (defparameter ?grasp-look-pose
               (cl-transforms-stamped:make-pose-stamped
                "base_footprint"
                (cl-transforms:make-identity-rotation)))
</code>
<html><!--
To execute any plan in CRAM, we need a top-level context. Besides that we also use a macro to specify that the demo should be executed in simulation, not on the real robot. Putting your plan under ''pr2-proj:with-simulated-robot'' will indicate that your robot is in the projection environment and it also has a top level call within it. The ''with-simulated-robot'' is a way to abstract out the robot details from your plans and its counterpart to execute on a real robot would be ''pr2-pms:with-real-robot''. Also note that without mentioning the robot that you want to execute on, the TF for it is not published and you'll run into errors.
We can execute some movements in parallel, if they use different joints of the robot. That's what ''cpl:par'' is for.
We have used a simple call to low level methods to achieve motions like move to the ''?grasp-base-pose'' and look at ''?grasp-look-pose''. These can be achieved by corresponding motion designators, which we will look at in later tutorials.
--></html>

Putting all these together we end up with the following:
<code lisp>
BTW-TUT>
(urdf-proj:with-simulated-robot
  (urdf-proj::move-joints '(1.9652919379395388d0
                            -0.26499816732737785d0
                            1.3837617139225473d0
                            -2.1224566064321584d0
                            16.99646118944817d0
                            -0.07350789589924167d0
                            0.0)
                          '(-1.712587449591307d0
                            -0.2567290370386635d0
                            -1.4633501125737374d0
                            -2.1221670650093913d0
                            1.7663253481913623d0
                            -0.07942669250968948d0
                            0.05106258161229582d0))
  (urdf-proj::drive ?grasp-base-pose)
  (urdf-proj::look-at ?grasp-look-pose nil))
</code>
As some of the functions in the ''cram-urdf-projection'' package need a running TF listener object, we wrapped our calls in ''urdf-proj:with-simulated-robot''.
  
The function ''urdf-proj::move-joints'' moves the joints of both arms into the specific configuration given in the arguments, so that the arms don't hang around in the field of view. ''urdf-proj::drive'' moves the robot by internally calling
<code lisp>
(prolog:prolog '(btr:assert ?world (btr:object-pose ?robot ?target-pose)))
</code>
''urdf-proj::look-at'' calculates the pan and tilt angles of the robot's neck such that it ends up looking at the specified point, and asserts these angles to the neck joints.
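Internally, this again boils down to a joint-state assertion, this time on the neck joints. For the PR2 it looks roughly like the following sketch (the joint names are PR2's ''head_pan_joint'' and ''head_tilt_joint''; the angle values are made up for illustration):
<code lisp>
;; roughly the kind of assertion urdf-proj::look-at ends up making;
;; 0.3 / 0.6 are made-up example angles, the real ones are calculated from the target point
(prolog '(and (btr:bullet-world ?world)
              (rob-int:robot ?robot)
              (assert (btr:joint-state ?world ?robot
                                       (("head_pan_joint" 0.3d0) ("head_tilt_joint" 0.6d0))))))
</code>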
  
Now, let us finally perceive the object and store the result in the ''*perceived-object*'' variable:
<code lisp>
BTW-TUT>
(defvar *perceived-object* nil "Object designator returned from perception")
(urdf-proj:with-simulated-robot
  (setf *perceived-object*
        (urdf-proj::detect (desig:an object (type bottle)))))
</code>
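The result is an object designator. Its resolved properties can be inspected in the REPL, e.g. (assuming ''desig:description'' is the reader for a designator's property list):
<code lisp>
BTW-TUT> (desig:description *perceived-object*)
</code>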
  
With the resulting perceived object we could now perform the picking-up action. With the torso so far down we might not be able to reach the bottle, so we need to push the torso up:
  
<code lisp>
(urdf-proj:with-simulated-robot
  (urdf-proj::move-torso 0.3))
</code>
  
As there is no atomic motion for picking up an object (picking up is, in fact, composed of multiple move-arm motions), pick-up is implemented as a plan and invoked by performing an action designator. Performing motion and action designators in the Bullet world (or on the real robot) is explained in the [[http://cram-system.org/tutorials/intermediate/simple_mobile_manipulation_plan|next tutorial on writing simple mobile manipulation plans]].