Reasoning about objects

Before starting this tutorial, you should have completed the following ones:

Load an environment specification

We have separated the semantic environment map from the set of (smaller) objects inside the environment, such as objects inside the cupboards and drawers. This is generally good practice since it allows the same (often mostly static) map to be reused in different settings. In real tasks, the set of objects is usually detected by the robot during operation and updated over time, but for testing purposes, we load some objects from a file.

With the following command, you can launch KnowRob including the semantic map of our laboratory kitchen and the OWL file containing a set of demo objects. You need to have the knowrob_tutorials repository in your catkin workspace.

 rosrun rosprolog rosprolog knowrob_basics_tutorial

Visualize the environment, including CAD models of objects

Once the map and the example objects have been loaded, you can start the visualization canvas, open the web-based visualization in your browser, and add the objects to this 3D canvas. If definitions for CAD models of objects are loaded, these models are automatically used for visualizing object instances. Otherwise, the system uses grey boxes as default visualization models.

?- visualisation_canvas.
% open your browser at http://localhost:1111
?- owl_individual_of(Map, knowrob:'SemanticEnvironmentMap'), add_object_with_children(Map).

Similarly, we can also visualize small objects in the kitchen:

 ?- (owl_individual_of(A, knowrob:'FoodUtensil');
     owl_individual_of(A, knowrob:'FoodVessel');
     owl_individual_of(A, knowrob:'FoodOrDrink')),
    add_object(A, _).

Query for objects with certain properties

Since all objects in the map are instances of the respective object classes, one can query for objects that have certain properties or belong to a certain class, for example:

 % perishable objects:
 ?- owl_individual_of(A, knowrob:'Perishable').
 A = knowrob:'butter1' ;
 A = knowrob:'buttermilk1' ;
 A = knowrob:'cheese1' ;
 A = knowrob:'milk1' ;
 A = knowrob:'yogurt1' ;
 A = knowrob:'sausage1'
 % all HandTools (e.g. silverware)
 ?- owl_individual_of(A, knowrob:'HandTool').
 A = knowrob:'knife1' ;
 A = knowrob:'fork1'
 % all FoodVessels (i.e. pieces of tableware)
 ?- owl_individual_of(A, knowrob:'FoodVessel').
 A = knowrob:'cup1' ;
 A = knowrob:'plate1' ;
 A = knowrob:'saucer1'
 % everything with a handle:
 ?- owl_has(A, knowrob:properPhysicalParts, H),
    owl_individual_of(H, knowrob:'Handle').
 A = knowrob:'Dishwasher37',
 H = knowrob:'Handle145'

Infer likely storage locations

For some tasks, robots need to reason about the nominal locations of objects, for example when cleaning up or when unpacking a shopping basket. There are different techniques for inferring the location where an object should be placed.

  • Using assertions of the storagePlaceFor property is a rather generic, though not very adaptive, technique that allows one to state, for example, that perishable items belong in the refrigerator. It does not require any knowledge about the environment, but since it works on the level of object classes, it cannot choose between containers of the same type, e.g. different cupboards.
  • A finer-grained solution is based on organizational principles that place objects at the location where semantically similar objects are stored. It requires some (partial) knowledge about the distribution of other objects in the environment.
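The class-level technique can be sketched in a language-neutral way. The following Python snippet is only an illustration of the idea behind storagePlaceFor-style assertions, not the KnowRob API; the class hierarchy and rules are hypothetical example data:

```python
# Illustrative sketch of class-level storage-place lookup (not KnowRob code).
# storagePlaceFor-style assertions map an object class to a container class;
# the lookup walks up the object's class hierarchy until a rule matches.

# Hypothetical class hierarchy: child class -> parent class
SUPERCLASS = {
    "Butter": "Perishable",
    "Milk": "Perishable",
    "Knife": "HandTool",
}

# Hypothetical storagePlaceFor assertions on the class level
STORAGE_RULES = {
    "Perishable": "Refrigerator",
    "HandTool": "Drawer",
}

def storage_place_for(obj_class):
    """Return the container class for obj_class, following superclasses."""
    cls = obj_class
    while cls is not None:
        if cls in STORAGE_RULES:
            return STORAGE_RULES[cls]
        cls = SUPERCLASS.get(cls)
    return None

print(storage_place_for("Butter"))  # inferred via the 'Perishable' superclass
```

Note that the rule fires on the class level: every instance of Butter inherits the Refrigerator rule through Perishable, which is exactly why the technique cannot distinguish between several containers of the same class.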

Query for likely storage location

The simple option based on the storagePlaceFor predicate can be queried as follows in order to determine where an object (instance or class) should be stored, or which known objects are to be stored in a given container:

 ?- storagePlaceFor(Place, map_obj:'butter1').
 Place = knowrob:'Refrigerator67'
 ?- storagePlaceFor(knowrob:'Refrigerator67', Obj).
 Obj = knowrob:'butter1' ;
 Obj = knowrob:'buttermilk1' ;
 Obj = knowrob:'cheese1' ;
 Obj = knowrob:'milk1'

An extension of the storagePlaceFor predicate also reports why the system came to the conclusion that something is to be stored at some place:

 ?- storagePlaceForBecause(Place, map_obj:'butter1', Because).
 Place = '',
 Because = ''

TODO: add orgprinciples calls

Check for objects that are not at their storage location

By combining the semantic map, giving the current object locations, with the knowledge about the locations where objects should be, the robot can determine which objects are mis-placed. This may for instance be useful when tidying up an environment.

The following query first computes which objects are contained in which other ones, then selects those that are food or drink, computes their most likely storage place (which, in this example, is usually the refrigerator), and compares the two places. As a result, the system finds that the butter and the buttermilk are misplaced in a drawer and in the oven, respectively.

 ?- rdf_triple(knowrob:'in-ContGeneric', Obj, ActualPlace),
    owl_individual_of(Obj, knowrob:'FoodOrDrink'),
    storagePlaceFor(StoragePlace, Obj),
    ActualPlace \= StoragePlace.
 Obj = '',
 ActualPlace = '',
 StoragePlace = '' ;
 Obj = '',
 ActualPlace = '',
 StoragePlace = ''
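The logic of this check can be sketched independently of Prolog. The following Python snippet uses hypothetical object locations (the instance names follow the example above, but the containment data is made up for illustration):

```python
# Sketch of the misplaced-object check (illustrative data, not KnowRob code):
# compare where each food item currently is against its nominal storage place.

actual_place = {            # from the semantic map (in-ContGeneric relation)
    "butter1": "Drawer12",
    "buttermilk1": "Oven5",
    "milk1": "Refrigerator67",
}

storage_place = {           # inferred nominal locations (storagePlaceFor)
    "butter1": "Refrigerator67",
    "buttermilk1": "Refrigerator67",
    "milk1": "Refrigerator67",
}

def misplaced_objects():
    """Objects whose actual container differs from their nominal one."""
    return sorted(obj for obj, place in actual_place.items()
                  if storage_place.get(obj) not in (None, place))

print(misplaced_objects())  # -> ['butter1', 'buttermilk1']
```

Objects without a known storage place are skipped rather than flagged, mirroring the fact that the Prolog query simply fails for objects where storagePlaceFor yields no solution.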

Read object component hierarchy

Composed objects and their parts are linked by an inverse part-of hierarchy, described using the properPhysicalParts property in OWL. This property is transitive, i.e. a part of a part of an object is also a part of the object itself. You can read all parts and sub-parts of an object using owl_has, which takes the transitivity into account. In the example below, Handle160 is a part of Door70, which by itself is part of Refrigerator67.

 ?- owl_has(knowrob:'Refrigerator67', knowrob:properPhysicalParts, P).
 P = '' ;
 P = '' ;
 P = ''
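The transitive lookup performed by owl_has can be sketched as a closure over the direct part-of assertions. The snippet below is a plain Python illustration, not the KnowRob implementation; the instance names follow the example (Handle160 as part of Door70, which is part of Refrigerator67):

```python
# Sketch of transitive part lookup: direct properPhysicalParts assertions
# plus a closure over them, mirroring how owl_has follows the transitive
# property to reach indirect parts.

DIRECT_PARTS = {
    "Refrigerator67": ["Door70"],
    "Door70": ["Handle160"],
}

def all_parts(obj):
    """All direct and indirect parts of obj (transitive closure)."""
    parts = []
    stack = list(DIRECT_PARTS.get(obj, []))
    while stack:
        part = stack.pop()
        parts.append(part)
        stack.extend(DIRECT_PARTS.get(part, []))
    return parts

print(sorted(all_parts("Refrigerator67")))  # -> ['Door70', 'Handle160']
```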

Special case: articulated objects such as cupboards

Articulated objects like cupboards that have doors or drawers are represented in a special way to describe, on the one hand, the component hierarchy and, on the other hand, the fixed and moving connections. As for other composed objects, there is a part-of hierarchy (properPhysicalParts). Joints are parts of the cupboard/parent object, and are fixed to (connectedTo-Rigidly) both the parent and the child (e.g. the door). In addition, the child is hingedTo or prismaticallyConnectedTo the parent.

Joints are described using the following properties, which are compatible with the representation used by the ROS articulation stack.

  • Type: At the moment, rotational and prismatic joints are supported (knowrob:'Hinge' and knowrob:'PrismaticJoint')
  • Parent, Child: the respective parent and child object instances
  • Pose: specified as for normal objects, e.g. using a SemanticMapPerception instance
  • Direction: vector giving the opening direction of a prismatic joint
  • Radius: radius of a rotational joint (e.g. between handle and hinge)
  • Qmin, Qmax: lower and upper joint limits
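The joint properties listed above can be pictured as a small record. The following dataclass is only an illustration of the representation, not KnowRob code; the field values mirror the fridge-door example from this tutorial:

```python
from dataclasses import dataclass
from typing import List, Optional

# Illustrative record of a joint description: type, parent, child, pose,
# opening direction, radius, and joint limits (Qmin/Qmax), as listed above.
@dataclass
class Joint:
    joint_type: str                           # 'Hinge' or 'PrismaticJoint'
    parent: str                               # parent object instance
    child: str                                # child object instance (e.g. a door)
    pose: List[float]                         # 4x4 pose matrix, row-major
    direction: Optional[List[float]] = None   # opening direction (prismatic only)
    radius: Optional[float] = None            # radius of a rotational joint
    q_min: float = 0.0                        # lower joint limit
    q_max: float = 0.0                        # upper joint limit

fridge_door = Joint(
    joint_type="Hinge",
    parent="Refrigerator67",
    child="Door70",
    pose=[1,0,0,1, 0,1,0,1, 0,0,1,1, 0,0,0,1],  # position (1,1,1), unit rotation
    radius=0.33,
    q_min=0.1,
    q_max=0.5,
)
print(fridge_door.joint_type)
```

For a prismatic joint, the direction field would carry a unit vector such as [0,0,1] and radius would remain unset, matching the distinction made by the creation predicates below.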

Read articulation information

There are some convenience predicates for reading, creating, updating and deleting joints of articulated objects. This task is, on the one hand, rather common and, on the other hand, somewhat complex, because the structure described above needs to be established.

To create a joint of type knowrob:'HingedJoint' between two object instances roboearth:'IkeaExpedit2x40' and roboearth:'IkeaExpeditDoor13' at position (1,1,1) with unit orientation, radius 0.33m and joint limits 0.1 and 0.5 respectively, one can use the following statement:

 create_joint_information('HingedJoint', roboearth:'IkeaExpedit2x40', roboearth:'IkeaExpeditDoor13', 
                           [1,0,0,1,0,1,0,1,0,0,1,1,0,0,0,1], [], '0.33', '0.1', '0.5', Joint).

If a prismatic joint is to be created instead, the empty list [] needs to be replaced with a unit vector describing the joint's opening direction, e.g. [0,0,1] for a joint opening in z-direction, and the joint type needs to be set to 'PrismaticJoint'.

Joint information can conveniently be read using the following predicate that requires a joint instance as argument:

 read_joint_information(Joint, Type, Parent, Child, Pose, Direction, Radius, Qmin, Qmax).

To update joint information, one can use the following predicate:

 update_joint_information(Joint, 'HingedJoint', [1,0,0,2,0,1,0,2,0,0,1,2,0,0,0,1], [1,2,3], 0.32, 1.2, 1.74).

Read and convert units of measure

All numerical values can optionally be annotated with a unit of measure. To keep the system backwards compatible, values without such an annotation are interpreted as being given in the respective SI unit (e.g. meter, second).

Full article incl. explanation of design choices and links to further information: Measurement_units

 $ roscd knowrob_common/owl
 $ rosrun rosprolog rosprolog ias_knowledge_base
 ?- owl_parse('knowrob_units.owl', false, false, true).
 ?- consult('../prolog/').
 % read information that is asserted for a test instance
 ?- rdf_has('',
            '', O).
 O = literal(type('','12.0')) .
 % manual conversion into other units
 ?- convert_to_unit($O, '', P).
 P = 0.00012.
 ?- convert_to_unit($O, '', P).
 P = 0.12.
 ?- convert_to_unit($O, '', P).
 P = 120.0.
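The behaviour of convert_to_unit can be sketched as a conversion through a common SI base unit. The factors below are the standard metric ones, but the function and table names are illustrative, not the KnowRob API:

```python
# Sketch of unit conversion via a common SI base unit (here: the metre).
# A value is first normalised to the base unit, then scaled to the target
# unit; values without a unit annotation would be treated as SI directly.

TO_METRE = {            # factor from unit to the SI base unit
    "metre": 1.0,
    "centimetre": 0.01,
    "millimetre": 0.001,
    "kilometre": 1000.0,
}

def convert(value, from_unit, to_unit):
    """Convert a length between units via the SI base unit."""
    return value * TO_METRE[from_unit] / TO_METRE[to_unit]

# Mirrors the example above: 12.0 cm in metres, kilometres, millimetres
print(convert(12.0, "centimetre", "metre"))       # 0.12
print(convert(12.0, "centimetre", "kilometre"))   # 0.00012
print(convert(12.0, "centimetre", "millimetre"))  # 120.0
```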

The integration with the rdf_triple computables allows values to be transparently converted into the desired unit of measure:

 % transparent conversion during the query  
 ?- rdf_triple('', 
                literal(type('', Val))).
 Val = 0.12 ;
 ?- rdf_triple('', 
                literal(type('', Val))).
 Val = 0.00012 ;

Query for qualitative spatial relations

Using computables that calculate qualitative spatial relations between objects, we can query e.g. in which container we expect to find sausage1, ask for the content of Refrigerator67, or ask what is on top of Dishwasher37:

 ?- rdf_triple(knowrob:'in-ContGeneric', map_obj:sausage1, C).
 C = ''
 ?- rdf_triple(knowrob:'in-ContGeneric', O, knowrob:'Refrigerator67').
 O = '' ;
 O = '' ;
 O = ''
 ?- rdf_triple(knowrob:'on-Physical', A, knowrob:'Dishwasher37').
 A = '' ;
 A = ''
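A computable such as in-ContGeneric derives the relation from object poses and dimensions rather than from asserted facts. One common way to approximate this is a bounding-box containment test; the snippet below is a minimal sketch with hypothetical, axis-aligned boxes, not the actual KnowRob computable:

```python
# Sketch of a qualitative 'in-Container' check: an object counts as inside
# a container if its axis-aligned bounding box lies entirely within the
# container's box. All coordinates are hypothetical example values.

objects = {  # name -> (min corner, max corner) in world coordinates
    "Refrigerator67": ((0.0, 0.0, 0.0), (0.6, 0.6, 1.8)),
    "sausage1":       ((0.1, 0.1, 0.5), (0.3, 0.2, 0.6)),
    "plate1":         ((2.0, 2.0, 0.9), (2.3, 2.3, 0.95)),
}

def inside(inner, outer):
    """True if box 'inner' is fully contained in box 'outer'."""
    (imin, imax), (omin, omax) = objects[inner], objects[outer]
    return all(o_lo <= i_lo and i_hi <= o_hi
               for i_lo, i_hi, o_lo, o_hi in zip(imin, imax, omin, omax))

print(inside("sausage1", "Refrigerator67"))  # True
print(inside("plate1", "Refrigerator67"))    # False
```

Real computables are more sophisticated (they handle rotated objects and partial containment), but the principle is the same: the qualitative relation is computed on demand from the geometric state instead of being stored explicitly.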