[CI] Fix proxy & fix vllm (#97)
* fix proxy

* remove xformers
xwu99 authored Feb 4, 2024
1 parent 63464ed commit 8917434
Showing 3 changed files with 4 additions and 7 deletions.
4 changes: 2 additions & 2 deletions in .github/workflows/workflow_finetune.yml

@@ -11,10 +11,10 @@ on:
         default: '10.1.2.13:5000/llmray-build'
       http_proxy:
         type: string
-        default: 'http://proxy-chain.intel.com:911'
+        default: 'http://10.24.221.149:911'
       https_proxy:
         type: string
-        default: 'http://proxy-chain.intel.com:911'
+        default: 'http://10.24.221.149:911'
       runner_config_path:
         type: string
         default: '/home/ci/llm-ray-actions-runner'
4 changes: 2 additions & 2 deletions in .github/workflows/workflow_inference.yml

@@ -11,10 +11,10 @@ on:
         default: '10.1.2.13:5000/llmray-build'
       http_proxy:
         type: string
-        default: 'http://proxy-chain.intel.com:911'
+        default: 'http://10.24.221.149:911'
       https_proxy:
         type: string
-        default: 'http://proxy-chain.intel.com:911'
+        default: 'http://10.24.221.149:911'
       runner_config_path:
         type: string
         default: '/home/ci/llm-ray-actions-runner'
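
The two hunks above only swap the default proxy endpoint; for context, a minimal sketch of how such workflow inputs typically flow into a job's environment is shown below. The job name, runner label, and verification step are assumptions for illustration, not part of this commit:

```yaml
# Sketch only: how proxy inputs like the ones above are usually consumed.
# Job name, runner label, and the curl step are hypothetical.
on:
  workflow_dispatch:
    inputs:
      http_proxy:
        type: string
        default: 'http://10.24.221.149:911'
      https_proxy:
        type: string
        default: 'http://10.24.221.149:911'

jobs:
  ci:
    runs-on: self-hosted
    env:
      # Expose the inputs as standard proxy env vars for every step in the job.
      http_proxy: ${{ inputs.http_proxy }}
      https_proxy: ${{ inputs.https_proxy }}
    steps:
      - name: Verify outbound connectivity through the proxy
        run: curl -sI https://pypi.org > /dev/null
```

Setting the variables at the job's `env` level means tools that honor `http_proxy`/`https_proxy` (pip, curl, git) pick them up without per-step plumbing.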
3 changes: 0 additions & 3 deletions in dev/docker/Dockerfile.vllm

@@ -38,6 +38,3 @@ RUN --mount=type=cache,target=/root/.cache/pip pip install -e .[cpu] -f https://
 RUN --mount=type=cache,target=/root/.cache/pip \
     source /opt/conda/bin/activate base && ./install-vllm-cpu.sh
 
-# TODO: workaround, remove this when fixed in vllm-cpu upstream
-RUN --mount=type=cache,target=/root/.cache/pip \
-    pip install xformers