Add the logic for Energy Star #261

Merged: 264 commits, Nov 25, 2024

Commits
76af5d6
support text-to-image
IlyasMoutawwakil Apr 12, 2024
17bc822
adding yamls and preprocessing
sashavor Apr 12, 2024
4eea069
Merge branch 'energy_star_dev' of github.com:huggingface/optimum-benc…
sashavor Apr 12, 2024
764a3dd
trying ilyas' advice
sashavor Apr 12, 2024
87fb5d0
fixing text generation bug
sashavor Apr 12, 2024
cd1bcc0
image generation isn't working
sashavor Apr 12, 2024
bcdd097
adding image generation, testing object detection
sashavor Apr 15, 2024
55e671f
removing device isolation
sashavor Apr 16, 2024
7c76c65
changing experiment names
sashavor Apr 16, 2024
63084d3
trying to remove batching for now
sashavor Apr 16, 2024
13ea148
removing list
sashavor Apr 16, 2024
26713d5
or like this
sashavor Apr 16, 2024
a5ce3be
adding separate function for preprocessing object detection
sashavor Apr 16, 2024
e80c9e7
oops
sashavor Apr 16, 2024
29b425d
oops
sashavor Apr 16, 2024
75a9327
concentrate
sashavor Apr 16, 2024
2a24b21
trying RGB
sashavor Apr 16, 2024
6a7722a
trying to add text gen options
sashavor Apr 16, 2024
270b8ea
trying like this
sashavor Apr 16, 2024
f8f4dc2
indentation matters
sashavor Apr 16, 2024
e39a310
adding max tokens for summarization
sashavor Apr 16, 2024
e932a70
defining batch size for map
sashavor Apr 17, 2024
d29fad8
writer, not write
sashavor Apr 17, 2024
a36d293
oops
sashavor Apr 17, 2024
06926af
trying a bigger model
sashavor Apr 17, 2024
b3cd16e
ok trying another model
sashavor Apr 17, 2024
6a99ffa
trying this
sashavor Apr 17, 2024
e924d06
trying distilgpt2
sashavor Apr 17, 2024
e0ae7e3
trying opt 125m
sashavor Apr 17, 2024
7b9bf03
trying to hardcode max input length
sashavor Apr 17, 2024
797a3ca
trying olmo
sashavor Apr 17, 2024
e5266b3
trying to trust
sashavor Apr 17, 2024
05a20cc
ok nevermind, gotta hardcode max_length
sashavor Apr 17, 2024
6c6226e
hardcoding token type IDs to False
sashavor Apr 17, 2024
b145a35
device isolation?
sashavor Apr 17, 2024
fafc53b
nevermind
sashavor Apr 17, 2024
02b0c40
merge with main
sashavor Apr 17, 2024
e9278a7
oops
sashavor Apr 17, 2024
094e683
rebase deluxe
IlyasMoutawwakil Apr 17, 2024
d795d33
changing model
sashavor Apr 17, 2024
4db469b
trying phi 2
sashavor Apr 17, 2024
f4b6145
fix decode
IlyasMoutawwakil Apr 17, 2024
d36cb75
Merge branch 'main' into energy_star_dev
sashavor Apr 22, 2024
0a6702d
trying opt
sashavor Apr 22, 2024
008c759
trying opt 125m
sashavor Apr 22, 2024
a93067b
GPT2
sashavor Apr 22, 2024
a23949f
distilgpt2
sashavor Apr 22, 2024
d877590
mistral 7b
sashavor Apr 22, 2024
8d0e692
mistral 7b
sashavor Apr 22, 2024
2f3b8a3
mistral 8x22
sashavor Apr 22, 2024
1ef7275
trying aya
sashavor Apr 22, 2024
7f48dc6
gpt2 xl
sashavor Apr 22, 2024
656a11c
gpt2 med
sashavor Apr 22, 2024
bc2549c
starling
sashavor Apr 22, 2024
504296c
trying falcon
sashavor Apr 22, 2024
1265ce4
maybe I don't need the flag
sashavor Apr 22, 2024
bd277b7
llama 2
sashavor Apr 22, 2024
50a56a7
gemma
sashavor Apr 22, 2024
d3077af
olmo
sashavor Apr 22, 2024
de42741
olmo hf
sashavor Apr 22, 2024
c8bcb84
olmo hf
sashavor Apr 22, 2024
aaf9a62
llava
sashavor Apr 22, 2024
069fd9d
llama 3
sashavor Apr 22, 2024
dacb95e
trying text2text
sashavor Apr 22, 2024
785bebc
going to try 70b
sashavor Apr 22, 2024
4915ef2
ok, trying falcon 180b, will probably fail though
sashavor Apr 22, 2024
e5ce28f
ok, how about 40b
sashavor Apr 22, 2024
f4bf12f
ok, how about mixtral
sashavor Apr 22, 2024
95ee477
adding distributed options
sashavor Apr 23, 2024
0513abc
adding multi gpu
sashavor Apr 23, 2024
e5bb31f
trying deepspeed
sashavor Apr 23, 2024
e2b2ac9
testing sentence similarity
sashavor Apr 23, 2024
42cc563
or not
sashavor Apr 23, 2024
313b75f
another model
sashavor Apr 23, 2024
673bbbb
another model
sashavor Apr 23, 2024
d6ad023
bloom
sashavor Apr 23, 2024
b13faee
summarization
sashavor Apr 23, 2024
e438077
maybe like this?
sashavor Apr 23, 2024
149f7a7
trying another summarization model
sashavor Apr 23, 2024
085c796
oops
sashavor Apr 23, 2024
6db0095
another model
sashavor Apr 23, 2024
f8f81f6
another model
sashavor Apr 23, 2024
35f3f5f
oops
sashavor Apr 23, 2024
40195e0
nevermind
sashavor Apr 23, 2024
c6948cc
removing eos token?
sashavor Apr 23, 2024
dae56fb
adding max_length for all tasks
sashavor Apr 23, 2024
58500bd
trying bart large again (tomorrow)
sashavor Apr 23, 2024
7d76b78
maybe -1?
sashavor Apr 24, 2024
ef1bae8
trying something
sashavor Apr 24, 2024
a1b84d4
ok, how about this
sashavor Apr 24, 2024
85eb903
wth
sashavor Apr 24, 2024
7846de2
ok, giving up on Bart large
sashavor Apr 24, 2024
0390c6c
testing QA
sashavor Apr 24, 2024
cfa27be
trying electra
sashavor Apr 24, 2024
525d5a4
trying text classification
sashavor Apr 24, 2024
441090f
trying twitter model
sashavor Apr 24, 2024
bfd2adc
ok let's try this
sashavor Apr 24, 2024
7969275
ok and this
sashavor Apr 24, 2024
edd42a9
ok printing
sashavor Apr 24, 2024
bc7f095
oops
sashavor Apr 24, 2024
0d4fde7
ok
sashavor Apr 24, 2024
29e4877
trying what Ilyas proposed
sashavor Apr 24, 2024
7fb379a
ok adding config
sashavor Apr 24, 2024
05f3d72
ok this should do it
sashavor Apr 24, 2024
0ab86af
ok, trying this
sashavor Apr 24, 2024
48f0fb7
oops missed one
sashavor Apr 24, 2024
3e336ef
I don't get it
sashavor Apr 24, 2024
55b061d
nevermind
sashavor Apr 24, 2024
9ad2285
but why is it still crashing
sashavor Apr 24, 2024
5954526
ok what about we forget the tokenizer
sashavor Apr 24, 2024
59fe801
printing
sashavor Apr 24, 2024
6f9100e
wut
sashavor Apr 24, 2024
4383bc6
ugh
sashavor Apr 24, 2024
44bd44d
adding the pretrained config everywhere anyway
sashavor Apr 24, 2024
82a91b0
trying another model
sashavor Apr 24, 2024
1df2ab0
ooook
sashavor Apr 24, 2024
ed3c3c6
another model
sashavor Apr 24, 2024
95102e5
a last one
sashavor Apr 24, 2024
da0d19b
testing object detection
sashavor Apr 24, 2024
4df78a3
testing asr
sashavor Apr 24, 2024
ebea01b
hm, and if we remove the squeezing
sashavor Apr 24, 2024
61505d5
ok trying VB's approach
sashavor Apr 24, 2024
f9941d0
umm
sashavor Apr 24, 2024
8ef6fa6
nope
sashavor Apr 24, 2024
84f338a
trying image gen
sashavor Apr 24, 2024
4f33093
image captioning
sashavor Apr 24, 2024
d4971a3
gonna try this
sashavor Apr 24, 2024
259f21e
nevermind
sashavor Apr 24, 2024
a5f6e6d
fixing prefill volume
sashavor Apr 24, 2024
9dc2223
FFS
sashavor Apr 24, 2024
cb9127e
ok trying this
sashavor Apr 24, 2024
c3482b2
huh ok printing
sashavor Apr 24, 2024
706ffcf
I don't get why this isn't truncating
sashavor Apr 24, 2024
b772c43
ok, printing
sashavor Apr 24, 2024
8b733a7
more printing
sashavor Apr 24, 2024
2f4f2c4
oooopsie
sashavor Apr 24, 2024
7356255
making max_length more robust (thanks Zach)
sashavor Apr 24, 2024
8387446
ok printing here
sashavor Apr 24, 2024
78a0815
oopsie
sashavor Apr 24, 2024
f6779a8
ok this should be good
sashavor Apr 24, 2024
271c9e3
removing prints lololol
sashavor Apr 24, 2024
e45fd2f
Add fix for ASR when processor outputs are numpy arrays
regisss Apr 25, 2024
19756b2
trying to add more granularity to logging
sashavor Apr 26, 2024
fe76ebc
how about this
sashavor Apr 26, 2024
206d71c
adding granular run logging to all tasks
sashavor Apr 26, 2024
da990e5
changing tracking mode to get more info
sashavor Apr 29, 2024
36203d1
roberta base
sashavor Apr 29, 2024
c760545
distilroberta
sashavor Apr 29, 2024
e9e4f01
distilbert
sashavor Apr 29, 2024
cd69a1b
distilroberta base
sashavor Apr 29, 2024
75f589a
distilbert base
sashavor Apr 29, 2024
69d6807
toxic comment
sashavor Apr 29, 2024
f192a43
gpt2
sashavor Apr 29, 2024
02d8279
gpt2 large
sashavor Apr 29, 2024
100756b
roberta large
sashavor Apr 29, 2024
15d0986
roberta base
sashavor Apr 29, 2024
0ee6b5c
distilroberta
sashavor Apr 29, 2024
7a8ff95
distilbert
sashavor Apr 29, 2024
ed9f819
bert base
sashavor Apr 29, 2024
aaeee5f
bert large
sashavor Apr 29, 2024
6a7a10f
distilbert
sashavor Apr 29, 2024
8a06943
roberta base
sashavor Apr 29, 2024
e13368e
gpt2
sashavor Apr 29, 2024
19e4715
adding a folder to run structure
sashavor Apr 30, 2024
615a231
bert
sashavor Apr 30, 2024
c393fce
adding t5
sashavor Apr 30, 2024
54a9536
adding preprocessing for t5
sashavor Apr 30, 2024
1bb8b53
maybe like this
sashavor Apr 30, 2024
e21dc5b
or like this
sashavor Apr 30, 2024
fd3f0d3
or like this?
sashavor Apr 30, 2024
c00e163
grrr
sashavor Apr 30, 2024
1fe2d5e
or this
sashavor Apr 30, 2024
692d304
roberta
sashavor Apr 30, 2024
0894660
trying t5 small
sashavor Apr 30, 2024
be47dc4
smdh
sashavor Apr 30, 2024
4da3e47
nevermind
sashavor Apr 30, 2024
847c8b5
starting image classification
sashavor May 1, 2024
4dcc955
starting text gen- no deepspeed
sashavor May 2, 2024
cb5bf30
oops add model name
sashavor May 2, 2024
bdb7a16
gonna do gpt2 for now
sashavor May 2, 2024
f2663b8
distilbert
sashavor May 2, 2024
4813d62
ok gonna try QA instead
sashavor May 2, 2024
c14c280
attempting to add T5 QA}
sashavor May 2, 2024
64193a0
removing empty check for now
sashavor May 3, 2024
4a6bf1e
oops
sashavor May 3, 2024
9080852
ok how about dis
sashavor May 3, 2024
ec25538
ok fine, adding new config field
sashavor May 3, 2024
a67d1b6
maybe like this
sashavor May 3, 2024
103a07e
where is the list
sashavor May 3, 2024
6e3c79b
is this really it
sashavor May 3, 2024
b13f440
ok, making progress
sashavor May 3, 2024
ff6f3ef
ok I think I know how to fix this
sashavor May 3, 2024
04fa5ce
parentheses matter
sashavor May 3, 2024
d10df95
modifying preprocessing to subtract new tokens
sashavor May 7, 2024
7f3702a
ok and like this?
sashavor May 7, 2024
206e2af
how about this
sashavor May 7, 2024
5a4aa33
oops text2text needs to be fixed now
sashavor May 7, 2024
968df99
oops
sashavor May 7, 2024
153d828
Add possibility to specify processor name
regisss May 8, 2024
2e28552
trying diff tokenizer
sashavor May 8, 2024
a790548
starting summarization
sashavor May 8, 2024
5fc9aa5
adding t5 summarization
sashavor May 14, 2024
3c5241f
adding ASR
sashavor May 17, 2024
5e1761a
ITT
sashavor May 21, 2024
b750519
Adding object detection
sashavor May 23, 2024
442c3a6
Fix issue with ASR models that cannot generate
regisss May 24, 2024
941dc13
Fix Seamless m4t ASR preprocessing
regisss May 26, 2024
b6c7a6e
Hack for Stable Cascade
regisss May 29, 2024
b59ad73
adding XL backend option
sashavor May 30, 2024
dfb9c35
Merge branch 'energy_star_dev' of github.com:huggingface/optimum-benc…
sashavor May 30, 2024
c34fd9e
pushing fix for image captioning models
sashavor May 30, 2024
3108d82
Fix SDXL
regisss Jun 17, 2024
fb46d60
adding sdxl fix
sashavor Jun 20, 2024
2615a4e
testing cpu
sashavor Jul 3, 2024
8a296be
Add debug print
regisss Jul 23, 2024
6c2a07d
More
regisss Jul 23, 2024
a803e57
Remove debug prints
regisss Jul 23, 2024
ab9fa90
Update text_classification.yaml
sashavor Jul 25, 2024
ce25d98
Update text_generation.yaml
sashavor Jul 29, 2024
f30462f
Update text_generation.yaml
sashavor Jul 31, 2024
b34b5d4
Update text_generation.yaml
sashavor Aug 1, 2024
ec25640
Add number of iterations to benchmark
regisss Aug 8, 2024
1cd1b7c
Log all iterations
regisss Aug 20, 2024
0b4ac55
updating t5 text classification
sashavor Aug 30, 2024
fb3b2c7
fix for image captioning
sashavor Sep 10, 2024
8cf6001
fix for image captioning 2
sashavor Sep 10, 2024
7ed7539
nevermind
sashavor Sep 10, 2024
65531b1
Merge branch 'main' into energy_star_dev
regisss Sep 12, 2024
bd8574a
Update examples
regisss Sep 12, 2024
82f1ef6
Update code
regisss Sep 17, 2024
e9b5b32
Update optimum_benchmark/scenarios/energy_star/preprocessing_utils.py
IlyasMoutawwakil Nov 19, 2024
552e215
remove backend changes
IlyasMoutawwakil Nov 19, 2024
adfab8b
move and cleanup preprocessors
IlyasMoutawwakil Nov 19, 2024
330ae71
first iteration
IlyasMoutawwakil Nov 19, 2024
c3aa25d
Merge branch 'main' into energy_star_pr
IlyasMoutawwakil Nov 19, 2024
8c80668
configs
IlyasMoutawwakil Nov 22, 2024
85ff0b3
update energy star scenario
IlyasMoutawwakil Nov 22, 2024
1b79eac
update energy star examples
IlyasMoutawwakil Nov 22, 2024
5d267c0
ci/test energy star
IlyasMoutawwakil Nov 22, 2024
c18c9e5
support synonym tasks like sentence-similarity
IlyasMoutawwakil Nov 22, 2024
3f7f1c1
Merge branch 'main' into energy_star_pr
IlyasMoutawwakil Nov 22, 2024
cdd2f3e
rename
IlyasMoutawwakil Nov 22, 2024
a3d53ba
fix summarization
IlyasMoutawwakil Nov 22, 2024
cd4c135
add librosa for audio processing
IlyasMoutawwakil Nov 22, 2024
c15d75e
smaller sentence sim model
IlyasMoutawwakil Nov 22, 2024
e23808f
smaller text2text generation model
IlyasMoutawwakil Nov 22, 2024
2b6db1a
remove old energy star config
IlyasMoutawwakil Nov 22, 2024
41439ec
Merge branch 'main' into energy_star_pr
IlyasMoutawwakil Nov 25, 2024
249b652
fix
IlyasMoutawwakil Nov 25, 2024
3f1c0dd
fix
IlyasMoutawwakil Nov 25, 2024
Files changed
49 changes: 49 additions & 0 deletions .github/workflows/test_cli_energy_star.yaml
@@ -0,0 +1,49 @@
name: CLI CUDA Energy Star Tests

on:
  workflow_dispatch:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
    types:
      - opened
      - reopened
      - synchronize
      - labeled
      - unlabeled

concurrency:
  cancel-in-progress: true
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}

jobs:
  run_cli_energy_star_tests:
    if: ${{
      (github.event_name == 'push') ||
      (github.event_name == 'workflow_dispatch') ||
      contains( github.event.pull_request.labels.*.name, 'cli') ||
      contains( github.event.pull_request.labels.*.name, 'energy_star') ||
      contains( github.event.pull_request.labels.*.name, 'cli_energy_star')
      }}

    runs-on:
      group: aws-g5-4xlarge-plus

    container:
      image: ghcr.io/huggingface/optimum-benchmark:latest-cuda
      options: --ipc host --gpus all

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Install dependencies
        run: |
          pip install -e .[testing,diffusers,timm,codecarbon] librosa

      - name: Run tests
        run: |
          pytest tests/test_energy_star.py -x -s
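
Note: to reproduce this CI job locally, the same two steps can be run outside the container, assuming a checkout of the repo and a CUDA-capable machine (the job itself runs inside the ghcr.io/huggingface/optimum-benchmark:latest-cuda image):

    # install the test extras used by the workflow, plus librosa for the ASR example
    pip install -e .[testing,diffusers,timm,codecarbon] librosa
    # run the energy star test suite exactly as CI does
    pytest tests/test_energy_star.py -x -s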
20 changes: 20 additions & 0 deletions examples/energy_star/_base_.yaml
@@ -0,0 +1,20 @@
log_report: true
print_report: true

# hydra/cli specific settings
hydra:
  run:
    # define run directory
    dir: runs/${name}
  sweep:
    # define sweep directory
    dir: sweeps/${name}
    subdir: ${hydra.job.override_dirname}
  job:
    # change working directory to the job directory
    # so that artifacts are stored there
    chdir: true
    env_set:
      # set environment variable OVERRIDE_BENCHMARKS to 1
      # to not skip benchmarks that have been run before
      OVERRIDE_BENCHMARKS: 1
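
Every example below composes this base config via hydra, so runs are launched through the optimum-benchmark CLI with a config directory and a config name; a minimal sketch, assuming the package is installed and the repo root is the working directory:

    # launch the ASR example; because _base_.yaml sets hydra.run.dir to runs/${name},
    # artifacts for this run end up under runs/automatic_speech_recognition
    optimum-benchmark --config-dir examples/energy_star --config-name automatic_speech_recognition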
28 changes: 28 additions & 0 deletions examples/energy_star/automatic_speech_recognition.yaml
@@ -0,0 +1,28 @@
defaults:
  - benchmark
  - backend: pytorch
  - launcher: process
  - scenario: energy_star
  - _base_
  - _self_

name: automatic_speech_recognition

launcher:
  device_isolation: true
  device_isolation_action: warn

backend:
  device: cuda
  device_ids: 0
  no_weights: true
  model: openai/whisper-large-v3
  task: automatic-speech-recognition

scenario:
  dataset_name: EnergyStarAI/ASR
  audio_column_name: audio
  num_samples: 1000

  input_shapes:
    batch_size: 1
28 changes: 28 additions & 0 deletions examples/energy_star/image_classification.yaml
@@ -0,0 +1,28 @@
defaults:
  - benchmark
  - backend: pytorch
  - launcher: process
  - scenario: energy_star
  - _base_
  - _self_

name: image_classification

launcher:
  device_isolation: true
  device_isolation_action: warn

backend:
  device: cuda
  device_ids: 0
  no_weights: true
  task: image-classification
  model: google/vit-base-patch16-224

scenario:
  dataset_name: EnergyStarAI/image_classification
  image_column_name: image
  num_samples: 1000

  input_shapes:
    batch_size: 1
28 changes: 28 additions & 0 deletions examples/energy_star/image_to_text.yaml
@@ -0,0 +1,28 @@
defaults:
  - benchmark
  - backend: pytorch
  - launcher: process
  - scenario: energy_star
  - _base_
  - _self_

name: image_to_text

launcher:
  device_isolation: true
  device_isolation_action: warn

backend:
  device: cuda
  device_ids: 0
  no_weights: true
  task: image-to-text
  model: sashakunitsyn/vlrm-blip2-opt-2.7b

scenario:
  dataset_name: EnergyStarAI/image_captioning
  image_column_name: image
  num_samples: 1000

  input_shapes:
    batch_size: 1
@@ -6,21 +6,23 @@ defaults:
+  - _base_
   - _self_
 
-name: energy_star
+name: object_detection
+
+launcher:
+  device_isolation: true
+  device_isolation_action: warn
 
 backend:
-  model: gpt2
   device: cuda
   device_ids: 0
   no_weights: true
-  task: feature-extraction
-
-launcher:
-  device_isolation: true
+  task: object-detection
+  model: facebook/detr-resnet-50
 
 scenario:
-  dataset_name: wikitext
-  dataset_config: wikitext-2-raw-v1
-  num_samples: 10
+  dataset_name: EnergyStarAI/object_detection
+  image_column_name: image
+  num_samples: 1000
 
   input_shapes:
     batch_size: 1
29 changes: 29 additions & 0 deletions examples/energy_star/question_answering.yaml
@@ -0,0 +1,29 @@
defaults:
  - benchmark
  - backend: pytorch
  - launcher: process
  - scenario: energy_star
  - _base_
  - _self_

name: question_answering

launcher:
  device_isolation: true
  device_isolation_action: warn

backend:
  device: cuda
  device_ids: 0
  no_weights: true
  task: question-answering
  model: deepset/electra-base-squad2

scenario:
  dataset_name: EnergyStarAI/extractive_qa
  question_column_name: question
  context_column_name: context
  num_samples: 1000

  input_shapes:
    batch_size: 1
30 changes: 30 additions & 0 deletions examples/energy_star/sentence_similarity.yaml
@@ -0,0 +1,30 @@
defaults:
  - benchmark
  - backend: pytorch
  - launcher: process
  - scenario: energy_star
  - _base_
  - _self_

name: sentence_similarity_udever-bloom-7b1

launcher:
  device_isolation: true
  device_isolation_action: warn

backend:
  device: cuda
  device_ids: 0
  no_weights: true
  library: transformers
  task: sentence-similarity
  model: sentence-transformers/all-MiniLM-L6-v2

scenario:
  dataset_name: EnergyStarAI/sentence_similarity
  sentence1_column_name: sentence1
  sentence2_column_name: sentence2
  num_samples: 1000

  input_shapes:
    batch_size: 1
33 changes: 33 additions & 0 deletions examples/energy_star/summarization.yaml
@@ -0,0 +1,33 @@
defaults:
  - benchmark
  - backend: pytorch
  - launcher: process
  - scenario: energy_star
  - _base_
  - _self_

name: summarization

launcher:
  device_isolation: true
  device_isolation_action: warn

backend:
  device: cuda
  device_ids: 0
  no_weights: true
  task: summarization
  model: sshleifer/distilbart-cnn-12-6

scenario:
  dataset_name: EnergyStarAI/summarization
  text_column_name: text
  num_samples: 1000
  truncation: True

  input_shapes:
    batch_size: 1

  generate_kwargs:
    max_length: 10
    min_new_tokens: 10
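
Since these are hydra configs, any field can also be overridden from the command line instead of editing the yaml, which is handy for quick local smoke tests; a sketch using standard hydra override syntax, with keys taken from the config above:

    # same summarization benchmark, but only 10 samples and a longer generation budget
    optimum-benchmark --config-dir examples/energy_star --config-name summarization \
      scenario.num_samples=10 \
      scenario.generate_kwargs.max_length=20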
32 changes: 32 additions & 0 deletions examples/energy_star/t5_question_answering.yaml
@@ -0,0 +1,32 @@
defaults:
  - benchmark
  - backend: pytorch
  - launcher: process
  - scenario: energy_star
  - _base_
  - _self_

name: question_answering_t5

launcher:
  device_isolation: true
  device_isolation_action: warn

backend:
  device: cuda
  device_ids: 0
  no_weights: true
  model: google-t5/t5-large
  task: text2text-generation

scenario:
  dataset_name: EnergyStarAI/extractive_qa
  question_column_name: question
  context_column_name: context
  dataset_prefix1: "question: "
  dataset_prefix2: " context: "
  t5_task: question_answering
  num_samples: 1000

  input_shapes:
    batch_size: 1
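
The dataset_prefix1 / dataset_prefix2 fields presumably rebuild the text2text prompt that T5 expects, i.e. the question and context columns are concatenated behind the two prefixes before tokenization; a purely illustrative sketch with made-up sample text:

    # hypothetical assembled model input for one extractive_qa row
    printf 'question: %s context: %s\n' \
      'Who maintains the benchmark?' \
      'The benchmark is maintained by the optimum-benchmark contributors.'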
35 changes: 35 additions & 0 deletions examples/energy_star/t5_summarization.yaml
@@ -0,0 +1,35 @@
defaults:
  - benchmark
  - backend: pytorch
  - launcher: process
  - scenario: energy_star
  - _base_
  - _self_

name: summarization_t5

launcher:
  device_isolation: true
  device_isolation_action: warn

backend:
  device: cuda
  device_ids: 0
  no_weights: true
  model: google-t5/t5-large
  task: text2text-generation

scenario:
  dataset_name: EnergyStarAI/summarization
  dataset_prefix1: "summarize: "
  text_column_name: text
  t5_task: summarization
  num_samples: 1000
  truncation: True

  input_shapes:
    batch_size: 1

  generate_kwargs:
    max_new_tokens: 10
    min_new_tokens: 10
36 changes: 36 additions & 0 deletions examples/energy_star/t5_text_classification.yaml
@@ -0,0 +1,36 @@
defaults:
  - benchmark
  - backend: pytorch
  - launcher: process
  - scenario: energy_star
  - _base_
  - _self_

name: text_classification_t5

launcher:
  device_isolation: true
  device_isolation_action: warn

backend:
  device: cuda
  device_ids: 0
  no_weights: true
  model: google-t5/t5-large
  task: text2text-generation

scenario:
  dataset_name: EnergyStarAI/text_classification
  dataset_prefix1: "sst2 sentence: "
  t5_task: text_classification
  text_column_name: text

  num_samples: 1000
  truncation: True

  input_shapes:
    batch_size: 1

  generate_kwargs:
    max_new_tokens: 10
    min_new_tokens: 10
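
Because all of the examples share the same layout, the whole folder can be smoke-tested in one loop, e.g. with a reduced sample count so each task finishes quickly (a sketch; it reuses the CLI invocation shown earlier and skips the shared _base_.yaml fragment):

    # run every energy star example with only 10 samples each
    for cfg in examples/energy_star/*.yaml; do
      name=$(basename "$cfg" .yaml)
      [ "$name" = "_base_" ] && continue  # _base_.yaml is not a runnable config
      optimum-benchmark --config-dir examples/energy_star --config-name "$name" \
        scenario.num_samples=10
    done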