0.12.1 typo in sample.env
matatonic committed Jun 19, 2024
1 parent 2beb5e0 commit d7b2b6a
Showing 1 changed file with 1 addition and 2 deletions.
vision.sample.env (3 changes: 1 addition & 2 deletions)
@@ -77,8 +77,7 @@ HF_HUB_ENABLE_HF_TRANSFER=1
#CLI_COMMAND="python vision.py -m internlm/internlm-xcomposer2-vl-7b --use-flash-attn --device-map cuda:0" # test pass✅, time: 25.6s, mem: 20.1GB, 12/12 tests passed.
#CLI_COMMAND="python vision.py -m internlm/internlm-xcomposer2-vl-7b-4bit --use-flash-attn" # test pass✅, time: 15.4s, mem: 10.8GB, 12/12 tests passed.
#CLI_COMMAND="python vision.py -m llava-hf/llava-1.5-13b-hf --use-flash-attn --device-map cuda:0 --load-in-4bit" # test pass✅, time: 13.6s, mem: 9.5GB, 12/12 tests passed.
#CLI_COMMAND="python vision.py -m llava-hf/llava-1.5-13b-hf --use-flash-attn --device-map cuda:0" # test pass✅, time: 9.8s, mem: 26.7GB, 12/12 test
s passed.
#CLI_COMMAND="python vision.py -m llava-hf/llava-1.5-13b-hf --use-flash-attn --device-map cuda:0" # test pass✅, time: 9.8s, mem: 26.7GB, 12/12 tests passed.
#CLI_COMMAND="python vision.py -m llava-hf/llava-1.5-7b-hf --use-flash-attn --device-map cuda:0 --load-in-4bit" # test pass✅, time: 9.5s, mem: 5.7GB, 12/12 tests passed.
#CLI_COMMAND="python vision.py -m llava-hf/llava-1.5-7b-hf --use-flash-attn --device-map cuda:0" # test pass✅, time: 8.2s, mem: 14.4GB, 12/12 tests passed.
#CLI_COMMAND="python vision.py -m llava-hf/llava-v1.6-34b-hf --use-flash-attn --load-in-4bit" # test pass✅, time: 63.8s, mem: 23.3GB, 12/12 tests passed.
