[Header photo]

Specifying Dual-Arm Robot Planning Problems Through Natural Language and Demonstration

[PDF, Video, Showcases, Bag files, Editable template, Source codes, STAAMS solver]

Authors

J. K. Behrens*(1), K. Stepanova*(2), R. Lange(1), R. Skoviera(2)

* Both authors contributed equally to this work
(1) Robert Bosch GmbH, Corporate Sector Research and Advance Engineering, Renningen, Germany, jan.behrens@de.bosch.com, ralph.lange@de.bosch.com
(2) Czech Technical University in Prague, Czech Institute of Informatics, Robotics, and Cybernetics, karla.stepanova@ciirc.cvut.cz, radoslav.skoviera@ciirc.cvut.cz

Abstract

Multi-modal robot programming with natural language and demonstration is a promising technique for efficient teaching of manipulation tasks in industrial environments. With modern dual-arm robots in particular, which are designed to quickly take over tasks at typical industrial workplaces, direct teaching of task sequences hardly utilizes the robots' capabilities. We therefore propose a two-staged approach that combines linguistic instructions and demonstration with simultaneous task allocation and motion scheduling. Instead of providing a task description and demonstration that is largely replayed, the user describes the tasks to be scheduled with all relevant constraints and demonstrates relevant locations and storage locations relative to workpieces and other objects. Constraint optimization is used to schedule task and motion sequences so as to minimize the makespan. Naming and grouping enable systematic reuse of sub-task ensembles and referencing of relevant locations. The proposed approach can generalize between different workspaces and is evaluated with gluing showcases from furniture assembly.
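The scheduling idea from the abstract can be illustrated with a minimal, self-contained sketch. This is not the constraint-programming formulation used by the actual STAAMS solver; it is a hypothetical brute-force search over assignments of tasks to the two arms, with made-up task durations, that minimizes the makespan (the time at which the slower arm finishes its sequential task list):

```python
from itertools import product

def min_makespan(durations):
    """Brute-force dual-arm task allocation: try every assignment of
    tasks to the two arms and return (makespan, assignment), where the
    makespan is the finishing time of the slower arm and each arm
    executes its assigned tasks sequentially."""
    best = (float("inf"), None)
    for assignment in product((0, 1), repeat=len(durations)):
        loads = [0.0, 0.0]
        for arm, d in zip(assignment, durations):
            loads[arm] += d
        best = min(best, (max(loads), assignment))
    return best

# Hypothetical durations (seconds) for five glue-application tasks.
makespan, assignment = min_makespan([4.0, 3.0, 2.0, 6.0, 1.0])
print(makespan)  # -> 8.0 (e.g. one arm gets 6+2, the other 4+3+1)
```

The real solver additionally handles motion scheduling, collision constraints, and ordering constraints, which this exhaustive sketch ignores.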

Video attachment

Video showing the transfer from demonstration and linguistic instructions to a setup with two KUKA robots.

Additional resources

Submitted manuscript [PDF]

Showcases (Bag files and transcripts)

1. Showcase 1 - applying glue at demonstrated locations (3 GB)

  • Bag file with video + audio + HTC Vive pose (controller1) + button press (joy2):
    [Download] (3 GB)

2. Showcase 2 - applying glue at demonstrated locations with respect to ordering constraints

3. Showcase 3 - applying glue and gluing objects (pick and place)

4. Showcase 4 - applying glue and gluing objects (pick and place) with respect to ordering constraints

Source codes

  • Sentence translator (Source code): translates a sentence obtained from the Google Speech API into a form that can be further processed by our algorithm; homophones and synonyms are substituted:
    [translator.py, dictionary.yaml, README.txt]
  • Source codes for the NLP processing will be released by the 31st of October 2018.
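A minimal sketch of the substitution step performed by the translator (the real mapping is loaded from the dictionary.yaml linked above; the dictionary entries below are hypothetical): each token returned by the speech recognizer is mapped through a homophone/synonym dictionary before further parsing.

```python
# Hypothetical homophone/synonym dictionary; the actual translator
# loads its mapping from dictionary.yaml.
SUBSTITUTIONS = {
    "clue": "glue",   # common speech-recognition homophone of "glue"
    "grab": "pick",
    "take": "pick",
    "put": "place",
}

def normalize_sentence(sentence):
    """Replace homophones and synonyms token by token so the sentence
    uses only the canonical vocabulary expected by the task parser."""
    tokens = sentence.lower().split()
    return " ".join(SUBSTITUTIONS.get(t, t) for t in tokens)

print(normalize_sentence("Take the leg and put clue here"))
# -> "pick the leg and place glue here"
```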

Editable template of the abstract task description

A template that serves as input to the STAAMS solver (created by our NLP processing algorithm).

STAAMS solver [PDF, GIT]

  • Manuscript submitted to ICRA 2019 [ONLY FOR REVIEW PROCESS]: J. K. Behrens, R. Lange, and M. Mansouri, “A constraint programming approach to simultaneous task allocation and motion scheduling for industrial dual-arm manipulation tasks,” in submission to ICRA2019, 2018 [PDF]
  • The code and setup details will be released on GitHub by the 31st of October 2018 at https://github.com/boschresearch/STAAMS-SOLVER

User study

  • Experiment outline:
    • Video tutorial: Part A
    • Showcase 1 - Showcase 3, each accompanied by short written instructions
    • Video tutorial: Part B
    • Showcase 3 - Showcase 4 (participants can use task templates and location denominators for repeated tasks instead of performing the whole action)
    • OVC 5-minute tutorial
    • Showcase 3 - textual programming with written instructions available
  • Video tutorial: Part A
  • Video tutorial: Part B