MATLAB implementation of PRobabilistically-Informed Motion Primitives (PRIMP), a learning-from-demonstration method on Lie groups, published in IEEE Transactions on Robotics (T-RO).
- Publication: T-RO
- Project page: https://chirikjianlab.github.io/primp-page/
- Python implementation is available here.
 
Sipu Ruan, Weixiao Liu, Xiaoli Wang, Xin Meng and Gregory S. Chirikjian
All test files are located in the `/test` folder. To run scripts for the LfD methods:
- Download the data from Google Drive. All the demonstrated datasets are located in the `/demonstrations` folder.
- Generate a `/data` folder that stores all demonstration data.
- Copy all the demonstration sets into the `/data` folder (only put the folders inside `/demonstrations` into `/data`; a copy-script sketch follows the folder tree below).
- (Optional) To generate data trials for real-world experiments, please also download the datasets from the `/experiments` folder (put the whole folder).
- Run scripts in the `/test` folder.
After data preparation, the structure of the `/data` folder should look like:
```
.
└───data
    ├───panda_arm
    │   ├───real
    │   │   └───trajectory
    │   │       └───...
    │   └───simulation
    │       └───...
    ├───lasa_handwriting
    │   └───pose_data
    │       └───...
    └───experiments
        └───...
```
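A minimal sketch of the copy step, assuming the download has been unpacked into a `demonstrations/` folder next to the repository root (paths and folder names here mirror the layout above but may need adjusting):

```matlab
% Copy every demonstration set from the downloaded folder into /data.
% Paths are illustrative; adjust to where the download was unpacked.
srcRoot = 'demonstrations';
dstRoot = 'data';
if ~isfolder(dstRoot), mkdir(dstRoot); end

sets = dir(srcRoot);
sets = sets([sets.isdir] & ~startsWith({sets.name}, '.'));
for i = 1:numel(sets)
    copyfile(fullfile(srcRoot, sets(i).name), fullfile(dstRoot, sets(i).name));
end

% (Optional) real-world experiment data: copy the whole folder.
% copyfile('experiments', fullfile(dstRoot, 'experiments'));
```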
To run the Orientation-KMP method:
- Clone the Orientation-KMP repository.
- Add the correct paths to its functions:
```matlab
addpath path-prefix/pbdlib-matlab/demos/m_fcts/
addpath path-prefix/robInfLib-matlab/fcts/
```
DTW is required to evaluate method performance. The C source is `/src/util/dtw_c.c`, which needs to be compiled into a MEX file:
- Go to `/src/util`.
- Compile the C code for DTW:

```matlab
mex dtw_c.c
```
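Once compiled, the MEX function can be called from MATLAB. The exact signature is defined in `dtw_c.c`; the call below assumes a common convention (two trajectories as matrices with one sample per row, returning a scalar distance) and is only a hedged sketch:

```matlab
% Assumed signature -- check /src/util/dtw_c.c for the actual interface.
traj1 = rand(100, 3);        % first trajectory, one sample per row
traj2 = rand(120, 3);        % second trajectory, possibly different length
dist  = dtw_c(traj1, traj2); % scalar DTW distance (assumed return value)
```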
- Plot the LfD dataset: `/test/demo_lfd_dataset.m` plots all demonstration trajectories and stores .png files in the `/test` folder.
- Encode demonstrations and condition on via poses: `/test/demo_primp_lfd.m` (the conditioning step is sketched after this list).
- LfD for the scooping task: `/test/demo_primp_lfd_scooping.m`
- Encode demonstrations with and without GORA: `/test/demo_primp_lfd_gora.m`
- Adaptation: pass through via poses with uncertainties: `/test/demo_primp_condition_via_poses.m`
- Adaptation: equivariance under a change of viewing frame: `/test/demo_primp_change_view.m`
- Adaptation: fusion with workspace density: `/test/demo_primp_lfd_fusion_wd.m`
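The via-pose demos rest on Gaussian conditioning of the learned trajectory distribution. Below is a minimal sketch in plain Euclidean coordinates (PRIMP performs the analogous update in exponential coordinates on SE(3)/PCG(3)); all variable names are illustrative, not the repository's API:

```matlab
% Condition a stacked trajectory distribution N(mu, Sigma) on a noisy via
% pose at one time step (standard Gaussian conditioning; illustrative only).
nSteps = 50; dimPose = 6;              % 6-DOF pose in exponential coordinates
mu = zeros(nSteps * dimPose, 1);       % stacked trajectory mean
Sigma = eye(nSteps * dimPose);         % stacked trajectory covariance

kVia = 25;                             % time index of the via pose
H = zeros(dimPose, nSteps * dimPose);  % selector for that time step
H(:, (kVia-1)*dimPose + (1:dimPose)) = eye(dimPose);

muVia = 0.1 * ones(dimPose, 1);        % desired via pose (illustrative)
SigmaVia = 1e-4 * eye(dimPose);        % uncertainty of the via pose

K = (Sigma * H') / (H * Sigma * H' + SigmaVia);  % Kalman-style gain
muNew = mu + K * (muVia - H * mu);               % conditioned mean
SigmaNew = Sigma - K * H * Sigma;                % conditioned covariance
```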
For Orientation-KMP:
- Encode demonstrations and condition on via poses: `/test/demo_kmp_lfd.m`
- Generate random via-point pose data for benchmarks (required before running any benchmark script; a typical session is sketched after this list): `/test/generate_benchmark_trials.m`
- Benchmark LfD for PRIMP: `/test/benchmark_lfd_primp.m`
- Benchmark LfD for PRIMP, storing the learned trajectory distribution: `/test/benchmark_lfd_primp_trajectory.m`
- Benchmark LfD for Orientation-KMP: `/test/benchmark_lfd_kmp.m`
- Benchmark PRIMP on synthetic data, comparing the SE(3) and PCG(3) formulations: `/test/benchmark_primp_se_pcg.m`
- Benchmark PRIMP on a single demonstration: `/test/benchmark_lfd_primp_single_demo.m`
- Benchmark PRIMP on equivariant change of view: `/test/benchmark_lfd_primp_change_view.m`
- Qualitative comparisons among LfD methods for extrapolation cases: `/test/comparison_lfd_extrapolation.m`
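A typical benchmark session, assuming the scripts are run from the `/test` folder and the `/data` folder is populated as described above:

```matlab
cd test
generate_benchmark_trials   % generate random via-point poses (run once)
benchmark_lfd_primp         % benchmark PRIMP
benchmark_lfd_kmp           % benchmark Orientation-KMP (requires the KMP setup above)
```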
- Ablation study for GORA as a pre-process: `/test/ablation_primp_gora.m`
- Ablation study for fusion with robot-specific workspace density: `/test/ablation_primp_wd.m`
- Generate reference trajectories using PRIMP for real-world experiments: `/test/generate_real_task_trials.m`
S. Ruan, W. Liu, X. Wang, X. Meng and G. S. Chirikjian, "PRIMP: PRobabilistically-Informed Motion Primitives for Efficient Affordance Learning from Demonstration," in IEEE Transactions on Robotics, doi: 10.1109/TRO.2024.3390052.
BibTeX:
```
@ARTICLE{10502164,
  author={Ruan, Sipu and Liu, Weixiao and Wang, Xiaoli and Meng, Xin and Chirikjian, Gregory S.},
  journal={IEEE Transactions on Robotics},
  title={PRIMP: PRobabilistically-Informed Motion Primitives for Efficient Affordance Learning from Demonstration},
  year={2024},
  volume={},
  number={},
  pages={1-20},
  keywords={Trajectory;Robots;Probabilistic logic;Planning;Affordances;Task analysis;Manifolds;Learning from Demonstration;Probability and Statistical Methods;Motion and Path Planning;Service Robots},
  doi={10.1109/TRO.2024.3390052}}
```