
Motion Style Transfer

Paper | Teaser | Video | Poster | SDD

This is the official implementation of the paper

Motion Style Transfer: Modular Low-Rank Adaptation for Deep Motion Forecasting
6th Conference on Robot Learning (CoRL), 2022.
Parth Kothari*, Danya Li*, Yuejiang Liu, Alexandre Alahi
École Polytechnique Fédérale de Lausanne (EPFL)

Overview

We propose efficiently adapting deep motion forecasting models, pretrained on a data-rich source domain, to new motion styles with only limited samples, through two designs:

  • a low-rank motion style adapter, which projects and adapts the style features at a low-dimensional bottleneck

  • a modular adapter strategy, which disentangles the features of scene context and motion history to facilitate a fine-grained choice of adaptation layers

Setup

Install pipenv

After installing pipenv, run the following from the repository root:

```sh
cd l5kit
pipenv install --dev -e .
pipenv shell
cd ..
```

Data Download

Due to licensing restrictions, we cannot redistribute the L5Kit dataset. Please follow the instructions from the L5Kit authors to download the data and set the path to the data directory.

Running adaptation scripts (using a pre-trained model)

Get Model

Run: sh get_pretrained_model.sh

Run Script

For full model fine-tuning: make scene_transfer_full_finetune

For partial fine-tuning (last layers only): make scene_transfer_partial_finetune

For adaptive layer normalization: make scene_transfer_adaptive_layernorm

For motion style adapters (ours): make scene_transfer_mosa

Model Pre-training (takes about 1 day)

make pretrain_l5kit

Note: a larger batch size speeds up training.

Acknowledgements

Our code is built upon the public L5Kit codebase.
