Update Hugging Face packages to latest versions (pytorch#4306)
Summary: Pull Request resolved: pytorch#4306

Reviewed By: kirklandsign

Differential Revision: D59977405

Pulled By: guangy10

fbshipit-source-id: 2ce889e6f49ade545a668244db8aab6e7f7bef01
Guang Yang authored and facebook-github-bot committed Jul 19, 2024
1 parent 7e417f4 commit 5865a57
Showing 2 changed files with 6 additions and 5 deletions.
examples/models/llava/install_requirements.sh (4 additions, 3 deletions)

```diff
@@ -20,11 +20,12 @@ pip install bitsandbytes -I
 # numpy needs to be pin to 1.24. 1.26.4 will error out
 pip install numpy==1.24
 
-# Newer transformer will give TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'
-pip install transformers==4.37.2
-
 # The deps of llava can have different versions than deps of ExecuTorch.
 # For example, torch version required from llava is older than ExecuTorch.
 # To make both work, recover ExecuTorch's original dependencies by rerunning
 # the install_requirements.sh.
 bash -x ./install_requirements.sh --pybind xnnpack
+
+# Newer transformer will give TypeError: LlavaLlamaForCausalLM.forward() got an unexpected keyword argument 'cache_position'
+pip install timm==0.6.13
+pip install transformers==4.38.2
```
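The llava script reinstalls its pinned Hugging Face packages after rerunning install_requirements.sh, so the pins applied last win. A minimal post-install sanity check for the pinned versions, using only the standard library (the `check_pins` helper and `EXPECTED` map are illustrative, not part of the commit):

```python
# Sketch: verify that the packages pinned by the llava install script
# actually resolved to the expected versions. Versions are taken from
# the diff above; the helper itself is a hypothetical convenience.
import importlib.metadata

EXPECTED = {"transformers": "4.38.2", "timm": "0.6.13"}

def check_pins(expected):
    """Return {pkg: (matches_expected, installed_version_or_None)}."""
    results = {}
    for pkg, want in expected.items():
        try:
            have = importlib.metadata.version(pkg)
        except importlib.metadata.PackageNotFoundError:
            have = None  # package not installed at all
        results[pkg] = (have == want, have)
    return results

for pkg, (ok, have) in check_pins(EXPECTED).items():
    status = "OK" if ok else f"got {have}, want {EXPECTED[pkg]}"
    print(f"{pkg}: {status}")
```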
install_requirements.sh (2 additions, 2 deletions)

```diff
@@ -141,10 +141,10 @@ DEVEL_REQUIREMENTS=(
 # pip packages needed to run examples.
 # TODO(dbort): Make each example publish its own requirements.txt
 EXAMPLES_REQUIREMENTS=(
-  timm==0.6.13
+  timm==1.0.7
   torchaudio=="2.4.0.${NIGHTLY_VERSION}"
   torchsr==1.0.4
-  transformers==4.38.2
+  transformers==4.42.4
 )
 
 # Assemble the list of requirements to actually install.
```
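The EXAMPLES_REQUIREMENTS hunk bumps two `pkg==version` pins. A small sketch of how such pins can be parsed and compared to confirm each change is an upgrade (`parse_pin` is a hypothetical helper, not part of the repository):

```python
# Sketch: parse "pkg==version" pins like those in EXAMPLES_REQUIREMENTS
# and compare versions as numeric tuples. Pre-release/nightly suffixes
# (e.g. torchaudio's ${NIGHTLY_VERSION}) are out of scope here.
def parse_pin(pin: str):
    name, _, version = pin.partition("==")
    parts = tuple(int(p) for p in version.split(".") if p.isdigit())
    return name, parts

old = dict(map(parse_pin, ["timm==0.6.13", "transformers==4.38.2"]))
new = dict(map(parse_pin, ["timm==1.0.7", "transformers==4.42.4"]))

for pkg in new:
    # Tuple comparison orders versions component by component.
    assert new[pkg] > old[pkg], f"{pkg} was not upgraded"
    print(f"{pkg}: {old[pkg]} -> {new[pkg]}")
```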
