vita-epfl/trajnetplusplustools

Tools

  • summary table and plots: python -m trajnetplusplustools.summarize <dataset_files>
  • plot sample trajectories: python -m trajnetplusplustools.trajectories <dataset_file>
  • visualize interactions: python -m trajnetplusplustools.visualize_type <dataset_file>
  • obtain distribution of trajectory types: python -m trajnetplusplustools.dataset_stats <dataset_file>
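For example, to produce the summary table and plots for one training split (the file path below is only a placeholder; point the command at your own ndjson files):

python -m trajnetplusplustools.summarize data/train/biwi_hotel.ndjson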

APIs

  • trajnetplusplustools.Reader: class to read the dataset_file
  • trajnetplusplustools.show: module containing contexts for visualizing rows and paths
  • trajnetplusplustools.writers: module to write a trajnet dataset file
  • trajnetplusplustools.metrics: implementations of the unimodal metrics average_l2(), final_l2() and collision(), and the multimodal metrics topk() and nll()
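A minimal sketch of reading a dataset file and scoring a dummy prediction with these APIs. The scenes() iterator and the metric argument lists are assumptions based on typical TrajNet++ usage, not guaranteed by this README, and the file path is a placeholder:

# Hedged sketch: names marked "assumed" are not confirmed by this README.
import trajnetplusplustools
import trajnetplusplustools.metrics as metrics

reader = trajnetplusplustools.Reader('data/train/biwi_hotel.ndjson')  # placeholder path
for scene_id, paths in reader.scenes():      # assumed: yields (scene id, list of paths)
    primary = paths[0]                       # assumed: first path is the primary pedestrian
    ground_truth = primary[9:]               # e.g. treat the last rows as the prediction horizon
    prediction = list(ground_truth)          # stand-in for a model's output
    ade = metrics.average_l2(ground_truth, prediction)  # assumed signature
    fde = metrics.final_l2(ground_truth, prediction)    # assumed signature
    print(scene_id, ade, fde)
    break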

Dataset

Datasets are split into train, val and test sets. Every line is a self-contained JSON string (ndjson).

Scene:

{"scene": {"id": 266, "p": 254, "s": 10238, "e": 10358, "fps": 2.5, "tag": 2}}

Track:

{"track": {"f": 10238, "p": 248, "x": 13.2, "y": 5.85}}

with:

  • id: scene id
  • p: pedestrian id
  • s, e: start and end frame id
  • fps: frame rate
  • tag: trajectory type
  • f: frame id
  • x, y: x- and y-coordinate in meters
  • pred_number: (optional) prediction number for multiple output predictions
  • scene_id: (optional) corresponding scene_id for multiple output predictions

Frame numbers are kept as in the source data (not recomputed). Rows are resampled to roughly 2.5 rows per second.
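Because every line is a self-contained JSON object with the keys listed above, a dataset file can also be parsed without the Reader class. A minimal sketch using only the standard library (the file path is a placeholder):

import json

scenes, tracks = [], []
with open('data/train/biwi_hotel.ndjson') as f:  # placeholder path
    for line in f:
        row = json.loads(line)                   # each line is one self-contained JSON string
        if 'scene' in row:
            scenes.append(row['scene'])          # keys: id, p, s, e, fps, tag
        elif 'track' in row:
            tracks.append(row['track'])          # keys: f, p, x, y (optionally pred_number, scene_id)

# example: rows belonging to the primary pedestrian of the first scene
scene = scenes[0]
primary = [t for t in tracks
           if t['p'] == scene['p'] and scene['s'] <= t['f'] <= scene['e']]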

Dev

pylint trajnetplusplustools
python -m pytest
# optional: mypy trajnetplusplustools --disallow-untyped-defs

Dataset Summaries

biwi_hotel:

docs/train/biwi_hotel.ndjson.theta.png docs/train/biwi_hotel.ndjson.speed.png

crowds_students001:

docs/train/crowds_students001.ndjson.theta.png docs/train/crowds_students001.ndjson.speed.png

crowds_students003:

docs/train/crowds_students003.ndjson.theta.png docs/train/crowds_students003.ndjson.speed.png

crowds_zara02:

docs/train/crowds_zara02.ndjson.theta.png docs/train/crowds_zara02.ndjson.speed.png

crowds_zara03:

docs/train/crowds_zara03.ndjson.theta.png docs/train/crowds_zara03.ndjson.speed.png

dukemtmc:

docs/train/dukemtmc.ndjson.theta.png docs/train/dukemtmc.ndjson.speed.png

syi:

docs/train/syi.ndjson.theta.png docs/train/syi.ndjson.speed.png

wildtrack:

docs/train/wildtrack.ndjson.theta.png docs/train/wildtrack.ndjson.speed.png

Interactions

leader_follower:

docs/train/crowds_zara02.ndjson_1_9.png docs/train/crowds_zara02.ndjson_1_9_full.png

collision_avoidance:

docs/train/crowds_zara02.ndjson_2_25.png docs/train/crowds_zara02.ndjson_2_25_full.png

group:

docs/train/crowds_zara02.ndjson_3_9.png docs/train/crowds_zara02.ndjson_3_9_full.png

others:

docs/train/crowds_zara02.ndjson_4_13.png docs/train/crowds_zara02.ndjson_4_13_full.png

Citation

If you find this code useful in your research, please cite:

@inproceedings{Kothari2020HumanTF,
  title={Human Trajectory Forecasting in Crowds: A Deep Learning Perspective},
  author={Parth Kothari and Sven Kreiss and Alexandre Alahi},
  year={2020}
}