
KeyError: 'slots' #20

Open
bannayeva opened this issue May 23, 2021 · 12 comments

@bannayeva

Hi, I'm getting an error I cannot seem to get past. I've included it below:

Traceback (most recent call last):
  File "main.py", line 109, in <module>
    trn_loader = get_loader(args, "train", tokenizer, datasets, unified_meta)
  File "/content/ToD-BERT/utils/utils_general.py", line 58, in get_loader
    dataset = globals()["Dataset_"+task](data_info, tokenizer, args, unified_meta, mode, args["max_seq_length"])
  File "/content/ToD-BERT/utils/dataloader_dst.py", line 20, in __init__
    self.slots = list(unified_meta["slots"].keys())
KeyError: 'slots'

Is this to do with a specific dataset I need to include in the list of datasets? I don't want to use them all, just the MultiWOZ ones.

@jasonwu0731
Owner

Hi,

What was the command you ran?

Please check these few lines and print what you have for your unified_meta.
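
For example, something like this (a rough sketch; I'm assuming get_unified_meta lives in utils/utils_general.py next to get_loader, and that datasets has already been loaded as in main.py):

# sketch: rebuild unified_meta the way main.py does and inspect it
from utils.utils_general import get_unified_meta

unified_meta = get_unified_meta(datasets)
print(unified_meta.keys())        # for -task dst this should include "slots"
print(unified_meta.get("slots"))  # None here would explain the KeyError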

@bannayeva
Author

bannayeva commented May 30, 2021

Hi, running those lines gave this as output for unified_meta:

{'others': None, 'num_labels': 0, 'resp_cand_trn': {}}

In the definition of the function:

def get_unified_meta(datasets):
    unified_meta = {"others": None}
    for ds in datasets:
        for key, value in datasets[ds]["meta"].items():

What is considered "meta" within the datasets? Does it exist in some of them and not the others? I'm only using 5 of them, with this command:
!python main.py -task dst --do_train --dataset "['metalwoz','multiwoz','oos_intent','schema','taskmaster']" --model_name_or_path bert-base-uncased --my_model 'BertConfig'

@jasonwu0731
Owner

The problem here is that some of the datasets you used do not have the DST labels needed for training, defined the way they are in MultiWOZ.

You can first run with only MultiWOZ and check what unified_meta contains, or check the MultiWOZ DST data loader as well.
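
If it helps, a hypothetical guard (not in the repo) at the failing line in utils/dataloader_dst.py would make the failure clearer:

# hypothetical guard, placed just before the line from the traceback
if "slots" not in unified_meta:
    raise ValueError(
        "unified_meta has no 'slots' entry; the DST task needs at least one "
        "slot-annotated dataset (e.g. multiwoz) in --dataset."
    )
self.slots = list(unified_meta["slots"].keys())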

@Bosheng2020

Hi Jason,

I have encountered the same issue. I only ran the MultiWOZ dataset (--dataset='["multiwoz"]') and printed out unified_meta; here is the output: {'others': None, 'num_labels': 0}.

Could you kindly advise? I have tried MultiWOZ 2.0 and MultiWOZ 2.1 and ran into the same issue with both.

Here are the error messages:

Traceback (most recent call last):
  File "my_tod_pretraining.py", line 1010, in <module>
    main()
  File "my_tod_pretraining.py", line 973, in main
    trn_loader = get_loader(args_dict, "train", tokenizer, datasets, unified_meta, "train")
  File "/home/rich/ToD-BERT-master/utils/utils_general.py", line 58, in get_loader
    dataset = globals()["Dataset_"+task](data_info, tokenizer, args, unified_meta, mode, args["max_seq_length"])
  File "/home/rich/ToD-BERT-master/utils/dataloader_dst.py", line 21, in __init__
    self.slots = list(unified_meta["slots"].keys())
KeyError: 'slots'

Thanks a lot,

Rich

@jasonwu0731
Owner

@Bosheng2020

Can you provide the full command you were running?

@Bosheng2020

Bosheng2020 commented Jun 8, 2021

gpu=$1
model_type=$2
bert_dir=$3
output_dir=$4
add1=$5
add2=$6
add3=$7
add4=$8
add5=$9

CUDA_VISIBLE_DEVICES=$gpu python my_tod_pretraining.py \
    --task='dst' \
    --data_path='/home/rich/ToD-BERT-master/dialog_datasets' \
    --dataset='["multiwoz"]' \
    --model_type=${model_type} \
    --model_name_or_path=${bert_dir} \
    --output_dir=${output_dir} \
    --do_train \
    --do_eval \
    --mlm \
    --do_lower_case \
    --evaluate_during_training \
    --save_steps=2500 --logging_steps=1000 \
    --per_gpu_train_batch_size=8 --per_gpu_eval_batch_size=8 \
    ${add1} ${add2} ${add3} ${add4} ${add5}

@Bosheng2020

./run_tod_lm_pretraining.sh 2 bert bert-base-uncased save/pretrain/ToD-BERT-JNT --only_last_turn --add_rs_loss

@Bosheng2020

Hi Jason,

I have pasted the full command above.

Thanks a lot for your prompt reply.

Rich

@jasonwu0731
Owner

jasonwu0731 commented Jun 8, 2021

@Bosheng2020

Are you trying to run a pretraining task or a DST task?

If the pretraining task, you need to use task=usdl with run_tod_lm_pretraining.sh.

If the DST task, you need to run the command here.
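
In other words, for pretraining, the --task='dst' line in the script above should be --task='usdl'.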

@Bosheng2020

Oh I see... I am running a DST task. Thanks a lot for your help!

@bannayeva
Author

Hi Jason, I tried using only multiwoz, but unified_meta remained the same. Is there anything else I could check? Thank you

@jasonwu0731
Owner

@aliyabannaeva

Can you check if you can run this command?

If not, please copy and paste the error message here, thanks.
