Add vllm decoder for model inference
#168
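A minimal sketch of what a vLLM-backed decoder for inference could look like. The class name, constructor arguments, and method names below are illustrative assumptions, not the PR's actual implementation; only the `vllm` calls (`LLM`, `SamplingParams`, `generate`) follow the library's public API.

```python
# Hypothetical sketch of a vLLM-backed decoder; names and structure are
# assumptions, not the code added in this PR.
from vllm import LLM, SamplingParams


class VLLMDecoder:
    """Thin wrapper that runs batched text generation through vLLM."""

    def __init__(self, model_name: str, max_tokens: int = 256, temperature: float = 0.0):
        # vLLM loads the model weights and manages the KV cache itself.
        self.llm = LLM(model=model_name)
        self.sampling_params = SamplingParams(
            temperature=temperature,
            max_tokens=max_tokens,
        )

    def decode(self, prompts: list[str]) -> list[str]:
        # generate() batches the prompts and returns one RequestOutput per prompt.
        outputs = self.llm.generate(prompts, self.sampling_params)
        return [out.outputs[0].text for out in outputs]


if __name__ == "__main__":
    decoder = VLLMDecoder("facebook/opt-125m")
    print(decoder.decode(["The capital of France is"]))
```

A wrapper like this keeps the rest of the pipeline unaware of the inference backend: callers pass plain prompt strings and get back decoded strings, while batching and scheduling are left to vLLM.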