- Download weights for each model;
- Change the paths in the corresponding code to your own.
Note: to download weights from Hugging Face, you can refer to utils/download_meta_llama2.py.
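As an illustration of the download step (this is a sketch, not the repo's utils/download_meta_llama2.py itself), the Hugging Face Hub checkpoints below can be fetched with `huggingface_hub.snapshot_download`; note that gated repos such as meta-llama/* additionally require logging in via `huggingface-cli login` or passing a token:

```python
# Sketch only -- the repo's own download script is utils/download_meta_llama2.py.
# The repo id and local directory here are examples, not fixed by this repo.

def download_args(repo_id: str, local_dir: str) -> dict:
    """Collect the snapshot_download arguments in one place,
    so switching models only means changing the repo id."""
    return {"repo_id": repo_id, "local_dir": local_dir}

if __name__ == "__main__":
    from huggingface_hub import snapshot_download  # pip install huggingface_hub

    args = download_args("meta-llama/Llama-2-7b-hf", "./weights/llama-2-7b-hf")
    snapshot_download(**args)  # downloads the full checkpoint to local_dir
```

The downloaded directory is then the path you point the code at in step 2 above.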
Download weights:
LLaMA-1-7B: decapoda-research/llama-7b-hf, LLaMA-1-13B: decapoda-research/llama-13b-hf
Download weights:
LLaMA-2-7B: meta-llama/Llama-2-7b-hf, LLaMA-2-13B: meta-llama/Llama-2-13b-hf
We refer to the official repo OpenFlamingo.
Download weights following the instructions there.
We refer to the official repo MiniGPT4.
Download weights: MiniGPT4 (Vicuna 13B) and Vicuna 13B.
We refer to the official repo mPLUG-Owl.
Download weights following the instructions there.
We refer to the official repo LLaMA-Adapter.
Download weights following the instructions there.
We refer to the official repo VPGTrans.
Download weights following the instructions there.
We refer to the official repo LLaVA.
Download weights for LLaVA-7B and LLaVA-13B following the repo's instructions.
Convert the delta weights by referring to the "Legacy Models (delta weights)" section.
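LLaVA's legacy checkpoints are distributed as delta weights: the released file stores only the difference from the base LLaMA weights, and the usable checkpoint is recovered by adding the two, parameter by parameter. The official repo provides the actual conversion tool; the sketch below only illustrates the recombination idea, with plain dicts of floats standing in for real torch state dicts:

```python
# Conceptual sketch of delta-weight recombination (not the official converter).
# Parameter names and values are made up for illustration.

def apply_delta(base: dict, delta: dict) -> dict:
    """Recover target weights: target[k] = base[k] + delta[k] for every parameter."""
    if base.keys() != delta.keys():
        raise KeyError("base and delta checkpoints must contain the same parameters")
    return {name: base[name] + delta[name] for name in base}

base = {"layer0.weight": 0.5, "layer0.bias": -0.1}
delta = {"layer0.weight": 0.25, "layer0.bias": 0.1}
target = apply_delta(base, delta)
```

This is also why both the base LLaMA weights and the delta release must be downloaded before conversion.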
We refer to the official repo Multimodal-GPT.
Download weights: llama-7b-hf, OpenFlamingo-9B, and mmgpt-lora-v0-release.pt.
We refer to the official repo LaVIN.
For LaVIN-7B, download weights: LLaMA-7B and sqa-llama-7b-lite.pth.
For LaVIN-13B, download weights: LLaMA-13B and sqa-llama-13b-lite.pth.
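The sqa-llama-*-lite.pth files appear to hold only LaVIN's lightweight adapter parameters rather than a full model (an assumption based on the "lite" naming and the paired full-LLaMA download; see the official repo for the exact loading code). Conceptually, such a partial checkpoint is overlaid on the full base state dict, sketched here with plain dicts:

```python
# Sketch: overlaying a small adapter checkpoint onto full base weights.
# Plain dicts stand in for torch state dicts; parameter names are illustrative.

def merge_adapter(base: dict, adapter: dict) -> dict:
    """Return base weights with adapter entries added to (or overriding) them."""
    merged = dict(base)     # keep every base parameter
    merged.update(adapter)  # adapter parameters take precedence
    return merged

base = {"attn.weight": 1.0, "mlp.weight": 2.0}
adapter = {"adapter.gate": 0.5}
merged = merge_adapter(base, adapter)
```

This is why both downloads are needed: neither file alone is a complete checkpoint.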
We refer to the official repo Lynx-llm.
Download weights following the instructions there.