diff --git a/nlp/README.md b/nlp/README.md index 9bcc5f3..3a9cc0d 100644 --- a/nlp/README.md +++ b/nlp/README.md @@ -2,40 +2,40 @@ Current content: -- [_Multilingual Sentence Embeddings_ (21/01/2021)](2021_01_21_multilingual_sentence_embeddings): +- [_Multilingual Sentence Embeddings_](multilingual_sentence_embeddings): Gives an overview of various current multilingual sentence embedding techniques and tools, and how they compare across various sequence lengths. -- [_Spacy 3.0_ (01/02/2021)](2021_02_01_spacy_3_projects): +- [_Spacy 3.0_](spacy_3_projects): Spacy 3.0 has just been released, and in this tip we'll have a look at some of the new features. We'll be training a German NER model and streamlining the end-to-end pipeline using the brand new spaCy projects! -- [_Compact transformers_ (26/02/2021)](2021_02_26_compact_transformers): +- [_Compact transformers_](compact_transformers): Bigger isn't always better. In this tip we look at some compact BERT-based models that provide a nice balance between computational resources and model accuracy. -- [_Keyword Extraction with pke_ (18/03/2021)](2021_03_18_pke_keyword_extraction): +- [_Keyword Extraction with pke_](pke_keyword_extraction): The KEYNG (read *king*) is dead, long live the KEYNG! In this tip we look at `pke`, an alternative to Gensim for keyword extraction. -- [_Explainable transformers using SHAP_ (22/04/2021)](2021_04_22_shap_for_huggingface_transformers): +- [_Explainable transformers using SHAP_](shap_for_huggingface_transformers): BERT, explain yourself! 📖 Up until recently, language model predictions have lacked transparency. In this tip we look at `SHAP`, a way to explain your latest transformer-based models. -- [_Transformer-based Data Augmentation_ (18/06/2021)](2021_06_18_data_augmentation): +- [_Transformer-based Data Augmentation_](data_augmentation): Ever struggled with having a limited non-English NLP dataset for a project? Fear not, data augmentation to the rescue ⛑️ In this week's tip, we look at backtranslation 🔀 and contextual word embedding insertions as data augmentation techniques for multilingual NLP. -- [_Long range transformers_ (14/07/2021)](2021_06_29_long_range_transformers): +- [_Long range transformers_](long_range_transformers): Beyond and above the 512! 🏅 In this week's tip, we look at novel long-range transformer architectures and compare them against the well-known RoBERTa model. -- [_Neural Keyword Extraction_ (10/09/2021)](2021_09_10_neural_keyword_extraction): +- [_Neural Keyword Extraction_](neural_keyword_extraction): Neural Keyword Extraction 🧠 In this week's tip, we look at neural keyword extraction methods and how they compare to classical methods. -- [_HuggingFace Optimum_ (12/10/2021)](2021_10_12_huggingface_optimum): +- [_HuggingFace Optimum_](huggingface_optimum): HuggingFace Optimum Quantization ✂️ In this week's tip, we take a look at the new HuggingFace Optimum package to check out some model quantization techniques. -- [ _Text Augmentation using large-scale LMs and prompt engineering_ (25/11/2021)](2021_11_25_augmentation_lm): +- [_Text Augmentation using large-scale LMs and prompt engineering_](augmentation_lm): Typically, the more data we have, the better performance we can achieve 🤙. However, it is sometimes difficult and/or expensive to annotate a large amount of training data 😞. In this tip, we leverage three large-scale LMs (GPT-3, GPT-J and GPT-Neo) to generate very realistic samples from a very small dataset.
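The last entry above describes the prompt-engineering idea in a nutshell: two real labelled samples are embedded in a prompt, and a large causal LM is asked to continue the pattern with a new, mixed sample (the approach is detailed in the `augmentation_lm` README and notebook renamed below). As a minimal, self-contained sketch of that idea, the prompt could be built roughly as follows. The prompt format mirrors the one visible in the notebook further down in this diff, but the helper names, the second example tweet and the completion-parsing logic are hypothetical assumptions, not the notebook's exact implementation.

```python
import re

def build_augmentation_prompt(sample_a, sample_b):
    """Embed two real (text, label) samples in a prompt for a causal LM.

    The LM is expected to continue the pattern with one more
    "Tweet: ... (Sentiment: ...)" line, which becomes a synthetic sample.
    """
    (text_a, label_a), (text_b, label_b) = sample_a, sample_b
    return (
        f"Tweet: {text_a} (Sentiment: {label_a})\n"
        f"Tweet: {text_b} (Sentiment: {label_b})\n"
        "Tweet:"
    )

def parse_completion(completion, allowed_labels=("joy", "anger", "surprise")):
    """Extract (text, label) from a completion such as
    ' some new tweet (Sentiment: joy)'; the exact completion format is an assumption."""
    match = re.search(r"^\s*(.*)\(Sentiment:\s*(\w+)\)", completion)
    if match and match.group(2) in allowed_labels:
        return match.group(1).strip(), match.group(2)
    return None  # discard malformed generations

# Hypothetical usage with two made-up labelled tweets:
prompt = build_augmentation_prompt(
    ("happy friday :)", "joy"),
    ("i cannot believe this happened again", "anger"),
)
print(prompt)
# The prompt is then sent to GPT-3, GPT-J or GPT-Neo, and the returned
# completion is parsed with parse_completion() before being added to the
# training set as a synthetic sample.
```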
diff --git a/nlp/2021_11_25_augmentation_lm/README.md b/nlp/augmentation_lm/README.md similarity index 94% rename from nlp/2021_11_25_augmentation_lm/README.md rename to nlp/augmentation_lm/README.md index fe727a2..9fc1fb4 100644 --- a/nlp/2021_11_25_augmentation_lm/README.md +++ b/nlp/augmentation_lm/README.md @@ -5,4 +5,4 @@ Typically, the more data we have, the better performance we can achieve 🤙. Ho Large-scale language models (LMs) are excellent few-shot learners, allowing them to be controlled via natural text prompts. In this tip, we leverage three large-scale LMs (GPT-3, GPT-J and GPT-Neo) and prompt engineering to generate very realistic samples from a very small dataset. The model takes as input two real samples from our dataset, embeds them in a carefully designed prompt, and generates an augmented mixed sample influenced by the sample sentences. We use the [Emotion](https://huggingface.co/datasets/emotion) dataset and a distilled BERT pre-trained model, and show that this augmentation method boosts the model performance and generates very realistic samples. For more information on text augmentation using large-scale LMs, check out [GPT3Mix](https://arxiv.org/pdf/2104.08826.pdf). We recommend opening the notebook using Colab for an interactive explainable experience and optimal rendering of the visuals 👇: -[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/ml6team/quick-tips/blob/main/nlp/2021_11_25_augmentation_lm/nlp_augmentation_lm.ipynb) +[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/ml6team/quick-tips/blob/main/nlp/augmentation_lm/nlp_augmentation_lm.ipynb) diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/10/dataset.arrow b/nlp/augmentation_lm/data/gpt-3/10/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/10/dataset.arrow rename to nlp/augmentation_lm/data/gpt-3/10/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/10/dataset_info.json b/nlp/augmentation_lm/data/gpt-3/10/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/10/dataset_info.json rename to nlp/augmentation_lm/data/gpt-3/10/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/10/state.json b/nlp/augmentation_lm/data/gpt-3/10/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/10/state.json rename to nlp/augmentation_lm/data/gpt-3/10/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/100/dataset.arrow b/nlp/augmentation_lm/data/gpt-3/100/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/100/dataset.arrow rename to nlp/augmentation_lm/data/gpt-3/100/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/100/dataset_info.json b/nlp/augmentation_lm/data/gpt-3/100/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/100/dataset_info.json rename to nlp/augmentation_lm/data/gpt-3/100/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/100/state.json b/nlp/augmentation_lm/data/gpt-3/100/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/100/state.json rename to nlp/augmentation_lm/data/gpt-3/100/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/200/dataset.arrow b/nlp/augmentation_lm/data/gpt-3/200/dataset.arrow similarity index 100% rename from
nlp/2021_11_25_augmentation_lm/data/gpt-3/200/dataset.arrow rename to nlp/augmentation_lm/data/gpt-3/200/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/200/dataset_info.json b/nlp/augmentation_lm/data/gpt-3/200/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/200/dataset_info.json rename to nlp/augmentation_lm/data/gpt-3/200/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/200/state.json b/nlp/augmentation_lm/data/gpt-3/200/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/200/state.json rename to nlp/augmentation_lm/data/gpt-3/200/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/300/dataset.arrow b/nlp/augmentation_lm/data/gpt-3/300/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/300/dataset.arrow rename to nlp/augmentation_lm/data/gpt-3/300/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/300/dataset_info.json b/nlp/augmentation_lm/data/gpt-3/300/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/300/dataset_info.json rename to nlp/augmentation_lm/data/gpt-3/300/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/300/state.json b/nlp/augmentation_lm/data/gpt-3/300/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/300/state.json rename to nlp/augmentation_lm/data/gpt-3/300/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/400/dataset.arrow b/nlp/augmentation_lm/data/gpt-3/400/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/400/dataset.arrow rename to nlp/augmentation_lm/data/gpt-3/400/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/400/dataset_info.json b/nlp/augmentation_lm/data/gpt-3/400/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/400/dataset_info.json rename to nlp/augmentation_lm/data/gpt-3/400/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/400/state.json b/nlp/augmentation_lm/data/gpt-3/400/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/400/state.json rename to nlp/augmentation_lm/data/gpt-3/400/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/50/dataset.arrow b/nlp/augmentation_lm/data/gpt-3/50/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/50/dataset.arrow rename to nlp/augmentation_lm/data/gpt-3/50/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/50/dataset_info.json b/nlp/augmentation_lm/data/gpt-3/50/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/50/dataset_info.json rename to nlp/augmentation_lm/data/gpt-3/50/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/50/state.json b/nlp/augmentation_lm/data/gpt-3/50/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/50/state.json rename to nlp/augmentation_lm/data/gpt-3/50/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/500/dataset.arrow b/nlp/augmentation_lm/data/gpt-3/500/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/500/dataset.arrow rename to nlp/augmentation_lm/data/gpt-3/500/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/500/dataset_info.json 
b/nlp/augmentation_lm/data/gpt-3/500/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/500/dataset_info.json rename to nlp/augmentation_lm/data/gpt-3/500/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-3/500/state.json b/nlp/augmentation_lm/data/gpt-3/500/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-3/500/state.json rename to nlp/augmentation_lm/data/gpt-3/500/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/10/dataset.arrow b/nlp/augmentation_lm/data/gpt-j/10/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/10/dataset.arrow rename to nlp/augmentation_lm/data/gpt-j/10/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/10/dataset_info.json b/nlp/augmentation_lm/data/gpt-j/10/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/10/dataset_info.json rename to nlp/augmentation_lm/data/gpt-j/10/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/10/state.json b/nlp/augmentation_lm/data/gpt-j/10/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/10/state.json rename to nlp/augmentation_lm/data/gpt-j/10/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/100/dataset.arrow b/nlp/augmentation_lm/data/gpt-j/100/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/100/dataset.arrow rename to nlp/augmentation_lm/data/gpt-j/100/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/100/dataset_info.json b/nlp/augmentation_lm/data/gpt-j/100/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/100/dataset_info.json rename to nlp/augmentation_lm/data/gpt-j/100/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/100/state.json b/nlp/augmentation_lm/data/gpt-j/100/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/100/state.json rename to nlp/augmentation_lm/data/gpt-j/100/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/200/dataset.arrow b/nlp/augmentation_lm/data/gpt-j/200/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/200/dataset.arrow rename to nlp/augmentation_lm/data/gpt-j/200/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/200/dataset_info.json b/nlp/augmentation_lm/data/gpt-j/200/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/200/dataset_info.json rename to nlp/augmentation_lm/data/gpt-j/200/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/200/state.json b/nlp/augmentation_lm/data/gpt-j/200/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/200/state.json rename to nlp/augmentation_lm/data/gpt-j/200/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/50/dataset.arrow b/nlp/augmentation_lm/data/gpt-j/50/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/50/dataset.arrow rename to nlp/augmentation_lm/data/gpt-j/50/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-j/50/dataset_info.json b/nlp/augmentation_lm/data/gpt-j/50/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/50/dataset_info.json rename to nlp/augmentation_lm/data/gpt-j/50/dataset_info.json diff --git 
a/nlp/2021_11_25_augmentation_lm/data/gpt-j/50/state.json b/nlp/augmentation_lm/data/gpt-j/50/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-j/50/state.json rename to nlp/augmentation_lm/data/gpt-j/50/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/10/dataset.arrow b/nlp/augmentation_lm/data/gpt-neo/10/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/10/dataset.arrow rename to nlp/augmentation_lm/data/gpt-neo/10/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/10/dataset_info.json b/nlp/augmentation_lm/data/gpt-neo/10/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/10/dataset_info.json rename to nlp/augmentation_lm/data/gpt-neo/10/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/10/state.json b/nlp/augmentation_lm/data/gpt-neo/10/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/10/state.json rename to nlp/augmentation_lm/data/gpt-neo/10/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/100/dataset.arrow b/nlp/augmentation_lm/data/gpt-neo/100/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/100/dataset.arrow rename to nlp/augmentation_lm/data/gpt-neo/100/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/100/dataset_info.json b/nlp/augmentation_lm/data/gpt-neo/100/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/100/dataset_info.json rename to nlp/augmentation_lm/data/gpt-neo/100/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/100/state.json b/nlp/augmentation_lm/data/gpt-neo/100/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/100/state.json rename to nlp/augmentation_lm/data/gpt-neo/100/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/200/dataset.arrow b/nlp/augmentation_lm/data/gpt-neo/200/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/200/dataset.arrow rename to nlp/augmentation_lm/data/gpt-neo/200/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/200/dataset_info.json b/nlp/augmentation_lm/data/gpt-neo/200/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/200/dataset_info.json rename to nlp/augmentation_lm/data/gpt-neo/200/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/200/state.json b/nlp/augmentation_lm/data/gpt-neo/200/state.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/200/state.json rename to nlp/augmentation_lm/data/gpt-neo/200/state.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/50/dataset.arrow b/nlp/augmentation_lm/data/gpt-neo/50/dataset.arrow similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/50/dataset.arrow rename to nlp/augmentation_lm/data/gpt-neo/50/dataset.arrow diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/50/dataset_info.json b/nlp/augmentation_lm/data/gpt-neo/50/dataset_info.json similarity index 100% rename from nlp/2021_11_25_augmentation_lm/data/gpt-neo/50/dataset_info.json rename to nlp/augmentation_lm/data/gpt-neo/50/dataset_info.json diff --git a/nlp/2021_11_25_augmentation_lm/data/gpt-neo/50/state.json b/nlp/augmentation_lm/data/gpt-neo/50/state.json similarity index 100% rename from 
nlp/2021_11_25_augmentation_lm/data/gpt-neo/50/state.json rename to nlp/augmentation_lm/data/gpt-neo/50/state.json diff --git a/nlp/2021_11_25_augmentation_lm/nlp_augmentation_lm.ipynb b/nlp/augmentation_lm/nlp_augmentation_lm.ipynb similarity index 98% rename from nlp/2021_11_25_augmentation_lm/nlp_augmentation_lm.ipynb rename to nlp/augmentation_lm/nlp_augmentation_lm.ipynb index 9af6ad6..01b598e 100644 --- a/nlp/2021_11_25_augmentation_lm/nlp_augmentation_lm.ipynb +++ b/nlp/augmentation_lm/nlp_augmentation_lm.ipynb @@ -65,15 +65,15 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "v-ItvjqP4Cxn" }, + "outputs": [], "source": [ "!pip install -q transformers==4.12.5 datasets==1.16.1 tokenizers==0.10.3 openai==0.11.3 requests==2.23.0 sentencepiece==0.1.96\n", "!pip install pandas==1.1.0" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", @@ -86,9 +86,11 @@ }, { "cell_type": "code", + "execution_count": 2, "metadata": { "id": "hM-09Chsj7wI" }, + "outputs": [], "source": [ "import re\n", "import os\n", @@ -125,18 +127,18 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "1ltsurmvaaGd" }, + "outputs": [], "source": [ "# load the dataset and filter on samples that have a token count less than 30 \n", "# to use only short tweets\n", "max_input_len = 30\n", "tokenizer = AutoTokenizer.from_pretrained(\"distilbert-base-uncased\")\n", "emotion_ds = load_dataset(\"emotion\").filter(lambda e: len(tokenizer.batch_encode_plus([e['text']]).input_ids[0]) < int(max_input_len))" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", @@ -149,9 +151,11 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "9YI6-pZdSqnI" }, + "outputs": [], "source": [ "# select 10 random train samples from each of the three emotions\n", "# SADNESS = 0\n", @@ -183,9 +187,7 @@ "\n", "# define the maping between emotions and labels\n", "mapping = ClassLabel(names=['joy', 'anger', 'surprise'])" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", @@ -198,6 +200,7 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" @@ -205,23 +208,10 @@ "id": "GD2xmdrg9OLY", "outputId": "bccebaf5-a76e-4dda-c29e-3c02de5dfd8b" }, - "source": [ - "print(\"Train set\")\n", - "print(f\"Total samples: {len(emotion_train_ds)}\\n\")\n", - "print(\"A random sample\")\n", - "print(f\"Text: {emotion_train_ds['text'][10]} \\nLabel: {mapping.int2str(emotion_train_ds['label'][10])}\")\n", - "print(\"\\n\")\n", - "\n", - "print(\"Test set\")\n", - "print(f\"Total samples: {len(emotion_test_ds)}\\n\")\n", - "print(\"A random sample\")\n", - "print(f\"Text: {emotion_test_ds['text'][10]} \\nLabel: {mapping.int2str(emotion_test_ds['label'][10])}\")" - ], - "execution_count": null, "outputs": [ { - "output_type": "stream", "name": "stdout", + "output_type": "stream", "text": [ "Train set\n", "Total samples: 30\n", @@ -239,6 +229,18 @@ "Label: joy\n" ] } + ], + "source": [ + "print(\"Train set\")\n", + "print(f\"Total samples: {len(emotion_train_ds)}\\n\")\n", + "print(\"A random sample\")\n", + "print(f\"Text: {emotion_train_ds['text'][10]} \\nLabel: {mapping.int2str(emotion_train_ds['label'][10])}\")\n", + "print(\"\\n\")\n", + "\n", + "print(\"Test set\")\n", + "print(f\"Total samples: {len(emotion_test_ds)}\\n\")\n", + "print(\"A random sample\")\n", + "print(f\"Text: {emotion_test_ds['text'][10]} \\nLabel: 
{mapping.int2str(emotion_test_ds['label'][10])}\")" ] }, { @@ -291,9 +293,11 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "zlI7GYwbGx5m" }, + "outputs": [], "source": [ "def get_two_random_samples():\n", " # define a function that returns two random samples from the train set.\n", @@ -309,9 +313,7 @@ " f\"Tweet: {text2} (Sentiment: {mapping.int2str(label2)})\\n\"\n", " f\"Tweet:\")\n", " return prompt" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", @@ -328,9 +330,11 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "BAzLn9t83YCS" }, + "outputs": [], "source": [ "# define the number of synthetic samples to generate\n", "n = 10\n", @@ -378,15 +382,15 @@ "# many api requests\n", "synthetic_ds = Dataset.from_dict({'text': new_texts, 'label': new_labels})\n", "synthetic_ds.save_to_disk('./data/gpt-3/' + str(n))" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "rPlF7dXKor6v" }, + "outputs": [], "source": [ "# load the synthetic datasets with 10, 50, 100 and 200 samples\n", "# run this if you do not want to generate samples on your own.\n", @@ -394,16 +398,14 @@ "if not os.path.isdir('./data'):\n", " !git clone https://github.com/ml6team/quick-tips.git\n", " !cd quick-tips \n", - " !mv quick-tips/nlp/2021_11_25_augmentation_lm/data ./data\n", + " !mv quick-tips/nlp/augmentation_lm/data ./data\n", " !rm -rf quick-tips\n", "\n", "synthetic_gpt3_10_ds = load_from_disk('./data/gpt-3/10')\n", "synthetic_gpt3_50_ds = load_from_disk('./data/gpt-3/50')\n", "synthetic_gpt3_100_ds = load_from_disk('./data/gpt-3/100')\n", "synthetic_gpt3_200_ds = load_from_disk('./data/gpt-3/200')" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", @@ -416,20 +418,21 @@ }, { "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "0ug42-3M0TeU" + }, + "outputs": [], "source": [ "def print_random_sample(ds):\n", " # function that prints a random sample from a dataset\n", " idx = random.randint(0, len(ds)-1)\n", " print(f\"Text: {ds['text'][idx]} \\nLabel: {mapping.int2str(ds['label'][idx])}\\n\")" - ], - "metadata": { - "id": "0ug42-3M0TeU" - }, - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", + "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" @@ -437,21 +440,10 @@ "id": "6irrOSPrKVU9", "outputId": "39eab1b3-ddf5-4cfc-a42c-a013e7af083e" }, - "source": [ - "print(\"Dataset of 10 synthetic samples:\")\n", - "print_random_sample(synthetic_gpt3_10_ds)\n", - "print(\"Dataset of 50 synthetic samples:\")\n", - "print_random_sample(synthetic_gpt3_50_ds)\n", - "print(\"Dataset of 100 synthetic samples:\")\n", - "print_random_sample(synthetic_gpt3_100_ds)\n", - "print(\"Dataset of 200 synthetic samples:\")\n", - "print_random_sample(synthetic_gpt3_200_ds)" - ], - "execution_count": null, "outputs": [ { - "output_type": "stream", "name": "stdout", + "output_type": "stream", "text": [ "Dataset of 10 synthetic samples:\n", "Text: even if ur not into these kind of things u have to admit it's pretty cool \n", @@ -471,6 +463,16 @@ "\n" ] } + ], + "source": [ + "print(\"Dataset of 10 synthetic samples:\")\n", + "print_random_sample(synthetic_gpt3_10_ds)\n", + "print(\"Dataset of 50 synthetic samples:\")\n", + "print_random_sample(synthetic_gpt3_50_ds)\n", + "print(\"Dataset of 100 synthetic samples:\")\n", + "print_random_sample(synthetic_gpt3_100_ds)\n", + 
"print(\"Dataset of 200 synthetic samples:\")\n", + "print_random_sample(synthetic_gpt3_200_ds)" ] }, { @@ -497,9 +499,11 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "XnYf0vZdHGiK" }, + "outputs": [], "source": [ "tokenizer = AutoTokenizer.from_pretrained('EleutherAI/gpt-neo-2.7B')\n", "model = AutoModelForCausalLM.from_pretrained('EleutherAI/gpt-neo-2.7B')\n", @@ -540,32 +544,31 @@ "# define the synthetic dataset and save it to disk \n", "synthetic_ds = Dataset.from_dict({'text': new_texts, 'label': new_labels})\n", "synthetic_ds.save_to_disk('./data/gpt-neo/' + str(n))" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "MncjIfEcHRsM" }, + "outputs": [], "source": [ "if not os.path.isdir('./data'):\n", " !git clone https://github.com/ml6team/quick-tips.git\n", " !cd quick-tips\n", - " !mv quick-tips/nlp/2021_11_25_augmentation_lm/data ./data\n", + " !mv quick-tips/nlp/augmentation_lm/data ./data\n", " !rm -rf quick-tips\n", "\n", "synthetic_gptneo_10_ds = load_from_disk('./data/gpt-neo/10')\n", "synthetic_gptneo_50_ds = load_from_disk('./data/gpt-neo/50')\n", "synthetic_gptneo_100_ds = load_from_disk('./data/gpt-neo/100')\n", "synthetic_gptneo_200_ds = load_from_disk('./data/gpt-neo/200')" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", + "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" @@ -573,21 +576,10 @@ "id": "uhdkFMCpHeuO", "outputId": "17e35fec-2119-4bac-de45-18d1d0d10233" }, - "source": [ - "print(\"Dataset of 10 synthetic samples:\")\n", - "print_random_sample(synthetic_gptneo_10_ds)\n", - "print(\"Dataset of 50 synthetic samples:\")\n", - "print_random_sample(synthetic_gptneo_50_ds)\n", - "print(\"Dataset of 100 synthetic samples:\")\n", - "print_random_sample(synthetic_gptneo_100_ds)\n", - "print(\"Dataset of 200 synthetic samples:\")\n", - "print_random_sample(synthetic_gptneo_200_ds)" - ], - "execution_count": null, "outputs": [ { - "output_type": "stream", "name": "stdout", + "output_type": "stream", "text": [ "Dataset of 10 synthetic samples:\n", "Text: happy friday :) \n", @@ -607,6 +599,16 @@ "\n" ] } + ], + "source": [ + "print(\"Dataset of 10 synthetic samples:\")\n", + "print_random_sample(synthetic_gptneo_10_ds)\n", + "print(\"Dataset of 50 synthetic samples:\")\n", + "print_random_sample(synthetic_gptneo_50_ds)\n", + "print(\"Dataset of 100 synthetic samples:\")\n", + "print_random_sample(synthetic_gptneo_100_ds)\n", + "print(\"Dataset of 200 synthetic samples:\")\n", + "print_random_sample(synthetic_gptneo_200_ds)" ] }, { @@ -633,9 +635,11 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "h7QKbaaekqCr" }, + "outputs": [], "source": [ "tokenizer = AutoTokenizer.from_pretrained(\"EleutherAI/gpt-j-6B\")\n", "model = AutoModelForCausalLM.from_pretrained(\"EleutherAI/gpt-j-6B\")\n", @@ -675,15 +679,15 @@ "# define the synthetic dataset and save it to disk \n", "synthetic_ds = Dataset.from_dict({'text': new_texts, 'label': new_labels})\n", "synthetic_ds.save_to_disk('./data/gpt-j/' + str(n))" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "Yu0JVPbQk1zB" }, + "outputs": [], "source": [ "# load the synthetic datasets with 10, 50, 100 and 200 samples\n", "# run this if you do not want to generate new samples on your own.\n", @@ -691,19 +695,18 @@ "if not os.path.isdir('./data'):\n", " !git clone 
https://github.com/ml6team/quick-tips.git\n", " !cd quick-tips\n", - " !mv quick-tips/nlp/2021_11_25_augmentation_lm/data ./data\n", + " !mv quick-tips/nlp/augmentation_lm/data ./data\n", " !rm -rf quick-tips\n", "\n", "synthetic_gptj_10_ds = load_from_disk('./data/gpt-j/10')\n", "synthetic_gptj_50_ds = load_from_disk('./data/gpt-j/50')\n", "synthetic_gptj_100_ds = load_from_disk('./data/gpt-j/100')\n", "synthetic_gptj_200_ds = load_from_disk('./data/gpt-j/200')" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", + "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" @@ -711,21 +714,10 @@ "id": "Mpq3vNSMFWx_", "outputId": "cc7a8607-81e5-4999-f05d-94916e691297" }, - "source": [ - "print(\"Dataset of 10 synthetic samples:\")\n", - "print_random_sample(synthetic_gptj_10_ds)\n", - "print(\"Dataset of 50 synthetic samples:\")\n", - "print_random_sample(synthetic_gptj_50_ds)\n", - "print(\"Dataset of 100 synthetic samples:\")\n", - "print_random_sample(synthetic_gptj_100_ds)\n", - "print(\"Dataset of 200 synthetic samples:\")\n", - "print_random_sample(synthetic_gptj_200_ds)" - ], - "execution_count": null, "outputs": [ { - "output_type": "stream", "name": "stdout", + "output_type": "stream", "text": [ "Dataset of 10 synthetic samples:\n", "Text: i think it s the easiest time of year to feel dissatisfied \n", @@ -745,6 +737,16 @@ "\n" ] } + ], + "source": [ + "print(\"Dataset of 10 synthetic samples:\")\n", + "print_random_sample(synthetic_gptj_10_ds)\n", + "print(\"Dataset of 50 synthetic samples:\")\n", + "print_random_sample(synthetic_gptj_50_ds)\n", + "print(\"Dataset of 100 synthetic samples:\")\n", + "print_random_sample(synthetic_gptj_100_ds)\n", + "print(\"Dataset of 200 synthetic samples:\")\n", + "print_random_sample(synthetic_gptj_200_ds)" ] }, { @@ -776,9 +778,11 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "0xqcEiJqBZsB" }, + "outputs": [], "source": [ "metric = load_metric(\"accuracy\")\n", "\n", @@ -792,42 +796,42 @@ "datasets_gpt3 = {'10': synthetic_gpt3_10_ds, '50': synthetic_gpt3_50_ds, '100': synthetic_gpt3_100_ds, '200': synthetic_gpt3_200_ds}\n", "datasets_gptneo = {'10': synthetic_gptneo_10_ds, '50': synthetic_gptneo_50_ds, '100': synthetic_gptneo_100_ds, '200': synthetic_gptneo_200_ds}\n", "datasets_gptj = {'10': synthetic_gptj_10_ds, '50': synthetic_gptj_50_ds, '100': synthetic_gptj_100_ds, '200': synthetic_gptj_200_ds}" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", - "source": [ - "**Disclaimer:** In case you don't want to train the models again, you can import our results by running only this cell and ignoring the other cells of this section." - ], "metadata": { "id": "hFQdGZMbqoot" - } + }, + "source": [ + "**Disclaimer:** In case you don't want to train the models again, you can import our results by running only this cell and ignoring the other cells of this section." 
+ ] }, { "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "Fwya_HG1qm0O" + }, + "outputs": [], "source": [ "# run this if you do not want to train the models on your own.\n", "if not os.path.isdir('./results'):\n", " !git clone https://github.com/ml6team/quick-tips.git\n", " !cd quick-tips\n", - " !mv quick-tips/nlp/2021_11_25_augmentation_lm/results ./results\n", + " !mv quick-tips/nlp/augmentation_lm/results ./results\n", " !rm -rf quick-tips\n", "\n", "run_dicts = pickle.load(open('./results/run_dicts.pkl', 'rb'))" - ], - "metadata": { - "id": "Fwya_HG1qm0O" - }, - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "o-q6U8DAx0Jp" }, + "outputs": [], "source": [ "def compute_metrics(eval_pred):\n", " \"\"\"\n", @@ -901,9 +905,7 @@ " metrics = trainer.evaluate()\n", " \n", " return metrics, logger.acc_logs" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", @@ -917,9 +919,11 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "5mDmH_lwU84r" }, + "outputs": [], "source": [ "# train our model on the baseline dataset without augmentation\n", "metrics, logs = train_and_evaluate(emotion_train_ds, emotion_test_ds, \"baseline\")\n", @@ -929,9 +933,7 @@ " \"metrics\": metrics,\n", " \"logs\": logs\n", "})" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", @@ -944,6 +946,11 @@ }, { "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "d9VpXbJUW_7O" + }, + "outputs": [], "source": [ "# train the model on the augmented datasets of GPT-3\n", "for i in datasets_gpt3:\n", @@ -955,24 +962,24 @@ " \"metrics\": metrics,\n", " \"logs\": logs\n", " })" - ], - "metadata": { - "id": "d9VpXbJUW_7O" - }, - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", - "source": [ - "### Model with augmented data of GPT-Neo" - ], "metadata": { "id": "B_zFaIeVhWiF" - } + }, + "source": [ + "### Model with augmented data of GPT-Neo" + ] }, { "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "cFCr4PLRhXc2" + }, + "outputs": [], "source": [ "# train our model on the augmented datasets of GPT-Neo\n", "for i in datasets_gptneo:\n", @@ -984,12 +991,7 @@ " \"metrics\": metrics,\n", " \"logs\": logs\n", " })" - ], - "metadata": { - "id": "cFCr4PLRhXc2" - }, - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", @@ -1002,6 +1004,11 @@ }, { "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "CzjoYX46egsA" + }, + "outputs": [], "source": [ "# train the model on the augmented datasets of GPT-J\n", "for i in datasets_gptj:\n", @@ -1013,12 +1020,7 @@ " \"metrics\": metrics,\n", " \"logs\": logs\n", " })" - ], - "metadata": { - "id": "CzjoYX46egsA" - }, - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", @@ -1044,9 +1046,11 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "6ROh9zOSfMwE" }, + "outputs": [], "source": [ "df = pd.DataFrame(run_dicts)\n", "\n", @@ -1059,9 +1063,7 @@ "df_gpt3 = df.loc[df['id'].isin(gpt3_names)]\n", "df_gptj = df.loc[df['id'].isin(gptj_names)]\n", "df_gptneo = df.loc[df['id'].isin(gptneo_names)]" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", @@ -1074,6 +1076,11 @@ }, { "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "Cp0rIiUl7dQb" + }, + "outputs": [], "source": [ "def plot_accuracy_curve(df, title):\n", " fig = 
go.Figure()\n", @@ -1096,18 +1103,11 @@ " title=title)\n", "\n", " fig.show()" - ], - "metadata": { - "id": "Cp0rIiUl7dQb" - }, - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", - "source": [ - "plot_accuracy_curve(df_gpt3, \"Accuracy of the model in different versions of the dataset (augmentation by GPT-3).\")" - ], + "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", @@ -1116,10 +1116,8 @@ "id": "xPlldKfg88KP", "outputId": "2aae733f-ee64-42c9-c0de-dca37430ff5a" }, - "execution_count": null, "outputs": [ { - "output_type": "display_data", "data": { "text/html": [ "\n", @@ -1173,24 +1171,26 @@ "" ] }, - "metadata": {} + "metadata": {}, + "output_type": "display_data" } - ] + ], + "source": [ + "plot_accuracy_curve(df_gpt3, \"Accuracy of the model in different versions of the dataset (augmentation by GPT-3).\")" + ] }, { "cell_type": "markdown", - "source": [ - "We observe that the accuracy of the model increases as we augment more and more data. After generating 200 extra synthetic samples, the accuracy exceeds 70% indicating that text augmentation can greatly improve the accuracy of our model." - ], "metadata": { "id": "ISMz0x0LCvJv" - } + }, + "source": [ + "We observe that the accuracy of the model increases as we augment more and more data. After generating 200 extra synthetic samples, the accuracy exceeds 70% indicating that text augmentation can greatly improve the accuracy of our model." + ] }, { "cell_type": "code", - "source": [ - "plot_accuracy_curve(df_gptneo, \"Accuracy of the model in different versions of the dataset (augmentation by GPT-Neo).\")" - ], + "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", @@ -1199,10 +1199,8 @@ "id": "9IEYdrC4nBIn", "outputId": "f7a0b80c-6e0e-4baa-854b-9a7b270cfa8f" }, - "execution_count": null, "outputs": [ { - "output_type": "display_data", "data": { "text/html": [ "\n", @@ -1256,21 +1254,26 @@ "" ] }, - "metadata": {} + "metadata": {}, + "output_type": "display_data" } + ], + "source": [ + "plot_accuracy_curve(df_gptneo, \"Accuracy of the model in different versions of the dataset (augmentation by GPT-Neo).\")" ] }, { "cell_type": "markdown", - "source": [ - "The results of GPT-Neo are worse ๐Ÿ˜ž. The performance of the model decreases when we generate synthetic data (except in the case of 50 synthetic samples)." - ], "metadata": { "id": "EXTnkEBOnBeO" - } + }, + "source": [ + "The results of GPT-Neo are worse ๐Ÿ˜ž. The performance of the model decreases when we generate synthetic data (except in the case of 50 synthetic samples)." 
+ ] }, { "cell_type": "code", + "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", @@ -1279,13 +1282,8 @@ "id": "iH90XkBgbcJj", "outputId": "0015d35d-b4e1-4bf2-e851-f3149c7b0a38" }, - "source": [ - "plot_accuracy_curve(df_gptj, \"Accuracy of the model in different versions of the dataset (augmentation by GPT-J).\")" - ], - "execution_count": null, "outputs": [ { - "output_type": "display_data", "data": { "text/html": [ "\n", @@ -1339,8 +1337,12 @@ "" ] }, - "metadata": {} + "metadata": {}, + "output_type": "display_data" } + ], + "source": [ + "plot_accuracy_curve(df_gptj, \"Accuracy of the model in different versions of the dataset (augmentation by GPT-J).\")" ] }, { @@ -1363,6 +1365,7 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", @@ -1371,49 +1374,8 @@ "id": "xHliv6hLd_y2", "outputId": "7c01d5bd-804e-4237-e903-76cfc3d2cbe8" }, - "source": [ - "# keep the accuracy of the last training step\n", - "acc_gpt3 = [df_baseline.iloc[0]['logs'][-1]['eval_accuracy']]\n", - "for i in range(4):\n", - " acc_gpt3.append(df_gpt3.iloc[i]['logs'][-1]['eval_accuracy'])\n", - "\n", - "acc_gptj = [df_baseline.iloc[0]['logs'][-1]['eval_accuracy']]\n", - "for i in range(4):\n", - " acc_gptj.append(df_gptj.iloc[i]['logs'][-1]['eval_accuracy'])\n", - "\n", - "acc_gptneo = [df_baseline.iloc[0]['logs'][-1]['eval_accuracy']]\n", - "for i in range(4):\n", - " acc_gptneo.append(df_gptneo.iloc[i]['logs'][-1]['eval_accuracy'])\n", - "\n", - "fig = go.Figure()\n", - "\n", - "fig.add_trace(go.Scatter(\n", - " x=[0, 10, 50, 100, 200],\n", - " y=acc_gpt3,\n", - " name='GPT-3'))\n", - "\n", - "\n", - "fig.add_trace(go.Scatter(\n", - " x=[0, 10, 50, 100, 200],\n", - " y=acc_gptj,\n", - " name='GPT-J'))\n", - "\n", - "fig.add_trace(go.Scatter(\n", - " x=[0, 10, 50, 100, 200],\n", - " y=acc_gptneo,\n", - " name='GPT-Neo'))\n", - "\n", - "fig.update_xaxes(title_text='number of synthetic samples')\n", - "fig.update_yaxes(title_text='accuracy')\n", - "fig.update_layout(\n", - " title=\"Comparison of GPT-3, GPT-J and GPT-Neo in text augmentation.\")\n", - "\n", - "fig.show()" - ], - "execution_count": null, "outputs": [ { - "output_type": "display_data", "data": { "text/html": [ "\n", @@ -1467,8 +1429,48 @@ "" ] }, - "metadata": {} + "metadata": {}, + "output_type": "display_data" } + ], + "source": [ + "# keep the accuracy of the last training step\n", + "acc_gpt3 = [df_baseline.iloc[0]['logs'][-1]['eval_accuracy']]\n", + "for i in range(4):\n", + " acc_gpt3.append(df_gpt3.iloc[i]['logs'][-1]['eval_accuracy'])\n", + "\n", + "acc_gptj = [df_baseline.iloc[0]['logs'][-1]['eval_accuracy']]\n", + "for i in range(4):\n", + " acc_gptj.append(df_gptj.iloc[i]['logs'][-1]['eval_accuracy'])\n", + "\n", + "acc_gptneo = [df_baseline.iloc[0]['logs'][-1]['eval_accuracy']]\n", + "for i in range(4):\n", + " acc_gptneo.append(df_gptneo.iloc[i]['logs'][-1]['eval_accuracy'])\n", + "\n", + "fig = go.Figure()\n", + "\n", + "fig.add_trace(go.Scatter(\n", + " x=[0, 10, 50, 100, 200],\n", + " y=acc_gpt3,\n", + " name='GPT-3'))\n", + "\n", + "\n", + "fig.add_trace(go.Scatter(\n", + " x=[0, 10, 50, 100, 200],\n", + " y=acc_gptj,\n", + " name='GPT-J'))\n", + "\n", + "fig.add_trace(go.Scatter(\n", + " x=[0, 10, 50, 100, 200],\n", + " y=acc_gptneo,\n", + " name='GPT-Neo'))\n", + "\n", + "fig.update_xaxes(title_text='number of synthetic samples')\n", + "fig.update_yaxes(title_text='accuracy')\n", + "fig.update_layout(\n", + " title=\"Comparison of 
GPT-3, GPT-J and GPT-Neo in text augmentation.\")\n", + "\n", + "fig.show()" ] }, { @@ -1493,43 +1495,17 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { - "id": "B0Zv0UIqfM8Z", "colab": { "base_uri": "https://localhost:8080/", "height": 542 }, + "id": "B0Zv0UIqfM8Z", "outputId": "c3ce0276-19b1-4b79-80c3-148822ac13f6" }, - "source": [ - "fig = make_subplots(rows=1, cols=4,\n", - " subplot_titles=(\"Baseline (30 samples)\", \"GPT-3 (200 synthetic samples)\", \"GPT-Neo (200 synthetic samples)\", \"GPT-J (200 synthetic samples)\"))\n", - "\n", - "trace0 = go.Histogram(x=[mapping.int2str(i) for i in emotion_train_ds[\"label\"]],\n", - " opacity=0.8)\n", - "\n", - "trace1 = go.Histogram(x=[mapping.int2str(i) for i in concatenate_datasets([emotion_train_ds, datasets_gpt3['200']])[\"label\"]],\n", - " opacity=0.8)\n", - "\n", - "trace2 = go.Histogram(x=[mapping.int2str(i) for i in concatenate_datasets([emotion_train_ds, datasets_gptneo['200']])[\"label\"]],\n", - " opacity=0.8)\n", - "\n", - "trace3 = go.Histogram(x=[mapping.int2str(i) for i in concatenate_datasets([emotion_train_ds, datasets_gptj['200']])[\"label\"]],\n", - " opacity=0.8)\n", - "\n", - "fig.append_trace(trace0, 1, 1)\n", - "fig.append_trace(trace1, 1, 2)\n", - "fig.append_trace(trace2, 1, 3)\n", - "fig.append_trace(trace2, 1, 4)\n", - "fig.update_layout(showlegend=False, title_text=\"Distribution of labels\", \n", - " bargap=0.30, width=1300)\n", - "\n", - "fig.show()" - ], - "execution_count": null, "outputs": [ { - "output_type": "display_data", "data": { "text/html": [ "\n", @@ -1583,8 +1559,34 @@ "" ] }, - "metadata": {} + "metadata": {}, + "output_type": "display_data" } + ], + "source": [ + "fig = make_subplots(rows=1, cols=4,\n", + " subplot_titles=(\"Baseline (30 samples)\", \"GPT-3 (200 synthetic samples)\", \"GPT-Neo (200 synthetic samples)\", \"GPT-J (200 synthetic samples)\"))\n", + "\n", + "trace0 = go.Histogram(x=[mapping.int2str(i) for i in emotion_train_ds[\"label\"]],\n", + " opacity=0.8)\n", + "\n", + "trace1 = go.Histogram(x=[mapping.int2str(i) for i in concatenate_datasets([emotion_train_ds, datasets_gpt3['200']])[\"label\"]],\n", + " opacity=0.8)\n", + "\n", + "trace2 = go.Histogram(x=[mapping.int2str(i) for i in concatenate_datasets([emotion_train_ds, datasets_gptneo['200']])[\"label\"]],\n", + " opacity=0.8)\n", + "\n", + "trace3 = go.Histogram(x=[mapping.int2str(i) for i in concatenate_datasets([emotion_train_ds, datasets_gptj['200']])[\"label\"]],\n", + " opacity=0.8)\n", + "\n", + "fig.append_trace(trace0, 1, 1)\n", + "fig.append_trace(trace1, 1, 2)\n", + "fig.append_trace(trace2, 1, 3)\n", + "fig.append_trace(trace2, 1, 4)\n", + "fig.update_layout(showlegend=False, title_text=\"Distribution of labels\", \n", + " bargap=0.30, width=1300)\n", + "\n", + "fig.show()" ] }, { @@ -1618,9 +1620,11 @@ }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "sSTaVSOfiUOQ" }, + "outputs": [], "source": [ "# load the synthetic datasets with 300, 400 and 500 samples.\n", "# run this if you do not want to generate new samples on your own.\n", @@ -1628,42 +1632,45 @@ "if not os.path.isdir('./data'):\n", " !git clone https://github.com/ml6team/quick-tips.git\n", " !cd quick-tips\n", - " !mv quick-tips/nlp/2021_11_25_gpt3mix/data ./data\n", + " !mv quick-tips/nlp/gpt3mix/data ./data\n", " !rm -rf quick-tips\n", "\n", "synthetic_gpt3_300_ds = load_from_disk('./data/gpt-3/300')\n", "synthetic_gpt3_400_ds = load_from_disk('./data/gpt-3/400')\n", "synthetic_gpt3_500_ds = 
load_from_disk('./data/gpt-3/500')" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "f5WBfDvpAP_F" + }, + "outputs": [], "source": [ "# prepare datasets\n", "max_size = 500\n", "steps = 5*int(max_size/batch_size) # 4 epochs in the large dataset\n", "\n", "datasets_gpt3_more = {'300': synthetic_gpt3_300_ds, '400': synthetic_gpt3_400_ds, '500': synthetic_gpt3_500_ds}" - ], - "metadata": { - "id": "f5WBfDvpAP_F" - }, - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", - "source": [ - "In case you loaded our results earlier, ignore the next cell that trains the model on the the new augmented datasets." - ], "metadata": { "id": "mi2x-sI9CzO1" - } + }, + "source": [ + "In case you loaded our results earlier, ignore the next cell that trains the model on the the new augmented datasets." + ] }, { "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "wIy1GV0vt7mk" + }, + "outputs": [], "source": [ "# train the model on the augmented datasets of GPT-3\n", "# in case you loaded our results earlier, ignore this cell\n", @@ -1677,30 +1684,26 @@ " \"metrics\": metrics,\n", " \"logs\": logs\n", " })\n" - ], - "metadata": { - "id": "wIy1GV0vt7mk" - }, - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", + "execution_count": null, "metadata": { "id": "lPWA5_9Cz5-7" }, + "outputs": [], "source": [ "df = pd.DataFrame(run_dicts)\n", "gpt3_names_more = ['augmented_gpt3_10', 'augmented_gpt3_50', 'augmented_gpt3_100', 'augmented_gpt3_200', \n", " 'augmented_gpt3_300', 'augmented_gpt3_400', 'augmented_gpt3_500']\n", "\n", "df_gpt3_more = df.loc[df['id'].isin(gpt3_names_more)]" - ], - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "code", + "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", @@ -1709,29 +1712,8 @@ "id": "qur5SW4CmGeH", "outputId": "3fe63e36-ce9a-4502-911d-61266eaaa538" }, - "source": [ - "acc_gpt3_more = [df_baseline.iloc[0]['logs'][-1]['eval_accuracy']]\n", - "for i in range(7):\n", - " acc_gpt3_more.append(df_gpt3_more.iloc[i]['logs'][-1]['eval_accuracy'])\n", - "\n", - "fig = go.Figure()\n", - "\n", - "fig.add_trace(go.Scatter(\n", - " x=[0, 10, 50, 100, 200, 300, 400, 500],\n", - " y=acc_gpt3_more,\n", - " name='GPT-3'))\n", - "\n", - "fig.update_xaxes(title_text='number of extra samples')\n", - "fig.update_yaxes(title_text='accuracy')\n", - "fig.update_layout(showlegend=True, \n", - " title=\"Accuracy of the model in different versions of the dataset (augmentation by GPT-3).\")\n", - "\n", - "fig.show()" - ], - "execution_count": null, "outputs": [ { - "output_type": "display_data", "data": { "text/html": [ "\n", @@ -1785,8 +1767,28 @@ "" ] }, - "metadata": {} + "metadata": {}, + "output_type": "display_data" } + ], + "source": [ + "acc_gpt3_more = [df_baseline.iloc[0]['logs'][-1]['eval_accuracy']]\n", + "for i in range(7):\n", + " acc_gpt3_more.append(df_gpt3_more.iloc[i]['logs'][-1]['eval_accuracy'])\n", + "\n", + "fig = go.Figure()\n", + "\n", + "fig.add_trace(go.Scatter(\n", + " x=[0, 10, 50, 100, 200, 300, 400, 500],\n", + " y=acc_gpt3_more,\n", + " name='GPT-3'))\n", + "\n", + "fig.update_xaxes(title_text='number of extra samples')\n", + "fig.update_yaxes(title_text='accuracy')\n", + "fig.update_layout(showlegend=True, \n", + " title=\"Accuracy of the model in different versions of the dataset (augmentation by GPT-3).\")\n", + "\n", + "fig.show()" ] }, { @@ -1822,5 +1824,35 @@ "\n" ] } - ] -} 
\ No newline at end of file + ], + "metadata": { + "accelerator": "GPU", + "colab": { + "collapsed_sections": [], + "name": "nlp_gpt3mix.ipynb", + "provenance": [] + }, + "interpreter": { + "hash": "b3ba2566441a7c06988d0923437866b63cedc61552a5af99d1f4fb67d367b25f" + }, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.6" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/nlp/2021_11_25_augmentation_lm/results/run_dicts.pkl b/nlp/augmentation_lm/results/run_dicts.pkl similarity index 100% rename from nlp/2021_11_25_augmentation_lm/results/run_dicts.pkl rename to nlp/augmentation_lm/results/run_dicts.pkl diff --git a/nlp/2021_02_26_compact_transformers/README.md b/nlp/compact_transformers/README.md similarity index 100% rename from nlp/2021_02_26_compact_transformers/README.md rename to nlp/compact_transformers/README.md diff --git a/nlp/2021_02_26_compact_transformers/compact_transformers.ipynb b/nlp/compact_transformers/compact_transformers.ipynb similarity index 100% rename from nlp/2021_02_26_compact_transformers/compact_transformers.ipynb rename to nlp/compact_transformers/compact_transformers.ipynb diff --git a/nlp/2021_06_18_data_augmentation/README.md b/nlp/data_augmentation/README.md similarity index 93% rename from nlp/2021_06_18_data_augmentation/README.md rename to nlp/data_augmentation/README.md index 6967174..dc0d244 100644 --- a/nlp/2021_06_18_data_augmentation/README.md +++ b/nlp/data_augmentation/README.md @@ -7,4 +7,4 @@ The training size will impact the performace of a model heavily, this notebook l We recommend to open the notebook using Colab for an interactive explainable experience and optimal rendering of the visuals ๐Ÿ‘‡: -[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/ml6team/quick-tips/blob/main/nlp/2021_06_18_data_augmentation/totw_nlp_dat_aug.ipynb) +[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/ml6team/quick-tips/blob/main/nlp/data_augmentation/totw_nlp_dat_aug.ipynb) diff --git a/nlp/2021_06_18_data_augmentation/totw_nlp_dat_aug.ipynb b/nlp/data_augmentation/totw_nlp_dat_aug.ipynb similarity index 100% rename from nlp/2021_06_18_data_augmentation/totw_nlp_dat_aug.ipynb rename to nlp/data_augmentation/totw_nlp_dat_aug.ipynb diff --git a/nlp/2021_10_12_huggingface_optimum/README.md b/nlp/huggingface_optimum/README.md similarity index 92% rename from nlp/2021_10_12_huggingface_optimum/README.md rename to nlp/huggingface_optimum/README.md index 1a3bfa3..66e26c0 100644 --- a/nlp/2021_10_12_huggingface_optimum/README.md +++ b/nlp/huggingface_optimum/README.md @@ -10,4 +10,4 @@ It also compares the performance of Optimum - LPOT quantization, ONNX/ONNX Runti It's recommended to run this notebook using Google Cloud AI Platform, using an N2-standard-4 machine. 
But for ease of use, you can follow this link for a Colab version ๐Ÿ‘‡: -[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/ml6team/quick-tips/blob/main/nlp/2021_10_12_huggingface_optimum/optimum.ipynb) +[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/ml6team/quick-tips/blob/main/nlp/huggingface_optimum/optimum.ipynb) diff --git a/nlp/2021_10_12_huggingface_optimum/optimum.ipynb b/nlp/huggingface_optimum/optimum.ipynb similarity index 100% rename from nlp/2021_10_12_huggingface_optimum/optimum.ipynb rename to nlp/huggingface_optimum/optimum.ipynb diff --git a/nlp/2021_10_12_huggingface_optimum/quantization.yml b/nlp/huggingface_optimum/quantization.yml similarity index 100% rename from nlp/2021_10_12_huggingface_optimum/quantization.yml rename to nlp/huggingface_optimum/quantization.yml diff --git a/nlp/2021_06_29_long_range_transformers/.gitignore b/nlp/long_range_transformers/.gitignore similarity index 100% rename from nlp/2021_06_29_long_range_transformers/.gitignore rename to nlp/long_range_transformers/.gitignore diff --git a/nlp/2021_06_29_long_range_transformers/LongRangeTransformers.ipynb b/nlp/long_range_transformers/LongRangeTransformers.ipynb similarity index 95% rename from nlp/2021_06_29_long_range_transformers/LongRangeTransformers.ipynb rename to nlp/long_range_transformers/LongRangeTransformers.ipynb index 6b977c8..80f9360 100644 --- a/nlp/2021_06_29_long_range_transformers/LongRangeTransformers.ipynb +++ b/nlp/long_range_transformers/LongRangeTransformers.ipynb @@ -1,20288 +1,338 @@ { - "nbformat": 4, - "nbformat_minor": 2, - "metadata": { - "accelerator": "GPU", - "colab": { - "name": "LongRangeTransformers.ipynb", - "provenance": [], - "collapsed_sections": [ - "7InJLnvld8zu" - ], - "toc_visible": true, - "machine_shape": "hm" + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "id": "7XEv8E2SALpN" + }, + "source": [ + "# ๐Ÿ‰ Go long! Long range transformers\n", + "\n", + "In recent research there has been an extensive study for improving the calculation of attention in transformer architectures. Mostly for improving their capacity to handle longer token sequences. ๐Ÿ‘Š\n", + "\n", + "The attention calculation is known to be quadratic in computation time with respect to the sequence length ๐Ÿ‘Ž. These recent advances, however, are able to perform attention calculation in near-linear time with respect to the sequence length. This allows us to scale the transformer architecture such that it can handle input sequences beyond the usual token length of 512 more efficiently. \n", + "\n", + "In this notebook, we compare traditional transformers with novel efficient transformers. We'll use roBERTa as a baseline to compare against LongFormer and BigBird. \n", + "\n", + "Let's put these architectures to the test and see which one comes out on top ๐Ÿ†! 
\n" + ] }, - "environment": { - "name": "common-cu101.m59", - "type": "gcloud", - "uri": "gcr.io/deeplearning-platform-release/base-cu101:m59" + { + "cell_type": "markdown", + "metadata": { + "id": "l8ymZTD_BSZS" + }, + "source": [ + "## ๐Ÿ› ๏ธ Getting started: Install packages & download models\n", + "\n", + "The below cells will setup everything that is required to get started with model training:\n", + "\n", + "* Install python specific packages\n", + "* Import required packages" + ] }, - "interpreter": { - "hash": "86993f52baa5a6752b4073e845d54786683e91310bee8dfb64ea6af9c3404bcd" + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "eDI558ScBWpZ" + }, + "outputs": [], + "source": [ + "!pip install -q sklearn transformers datasets torch plotly sentencepiece tqdm" + ] }, - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "x7x9s2l_VMLV" + }, + "outputs": [], + "source": [ + "import time \n", + "import sys \n", + "import json\n", + "import shutil\n", + "import pandas as pd\n", + "from enum import Enum\n", + "import math\n", + "import torch\n", + "\n", + "import plotly.express as px\n", + "import plotly.graph_objects as go\n", + "\n", + "\n", + "from sklearn.metrics import accuracy_score, precision_recall_fscore_support\n", + "from transformers import BigBirdTokenizerFast, BigBirdForSequenceClassification, RobertaTokenizer, RobertaForSequenceClassification, LongformerForSequenceClassification, TrainingArguments, Trainer, LongformerTokenizerFast\n", + "from datasets import load_dataset" + ] }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 + { + "cell_type": "markdown", + "metadata": { + "id": "UOdkK5pXBSZT" }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.7.8" + "source": [ + "## ๐Ÿ’พ Dataset & downstream task\n", + "\n", + "We will use the [Hyperpartisan news dataset](https://huggingface.co/datasets/hyperpartisan_news_detection) for binary sentiment classification. In the paper publication of LongFormer and BigBird, both architectures were compared against RoBERTa with this exact dataset.\n", + "\n", + "This dataset contains on average wordpieces, which is ideal to make our point ๐Ÿ’ช.\n", + "\n", + "We aim to gain more insight in when to use which architecture, therefore we will go *one step beyond ๐Ÿ”ฅ*, and evaluate the architectures on distinct subsets of the data, each time introducing sentences with more tokens!" 
+ ] }, - "widgets": { - "application/vnd.jupyter.widget-state+json": { - "55899ffd648e4247a473524510624500": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_4d57388a992d44b09d7eda48dd7e9a07", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_6d021402a64e4990aa5cb85a6be8b5e0", - "IPY_MODEL_8b4db93741974396a007b24cd29118c9", - "IPY_MODEL_e5c0ed77e8484e5084f1b9ba1d80ed6d" - ] - } - }, - "4d57388a992d44b09d7eda48dd7e9a07": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "6d021402a64e4990aa5cb85a6be8b5e0": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_8321e53c31bf4b77870dc240eee11321", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_619bb5f14a874943af67c1c090cb26c5" - } - }, - "8b4db93741974396a007b24cd29118c9": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_579df599346c42379509f8bb9fd42522", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 501200538, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 501200538, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_988a9bf6f9f245409d89ed7daed1ca64" - } - }, - "e5c0ed77e8484e5084f1b9ba1d80ed6d": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - 
"state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_4b2f2c688ed74a07950f19b1acd31c82", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 501M/501M [00:08<00:00, 58.4MB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_34c4d6b388d54cdda4458862047790c7" - } - }, - "8321e53c31bf4b77870dc240eee11321": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "619bb5f14a874943af67c1c090cb26c5": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "579df599346c42379509f8bb9fd42522": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "988a9bf6f9f245409d89ed7daed1ca64": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - 
"align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "4b2f2c688ed74a07950f19b1acd31c82": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "34c4d6b388d54cdda4458862047790c7": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "b63ad79c90de4be198e55dff6841c088": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_7b2d3cabb42644469d85c89876ff61d0", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_43de47548fa04353b2b10bc7e910b782", - "IPY_MODEL_f98a2a70e4324d549b61fa91506f0dc0", - "IPY_MODEL_b165dcc6a8164171827f6f592fb6ccf6" - ] - } - }, - "7b2d3cabb42644469d85c89876ff61d0": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": 
"LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "43de47548fa04353b2b10bc7e910b782": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_3a9ed80daead4101b181a87d4d3691e3", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_d470cb17ea994fe28452f9ae16c7f82d" - } - }, - "f98a2a70e4324d549b61fa91506f0dc0": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_fdd465843f6148439cb5a1f8337f16f8", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 898823, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 898823, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_d17637dd47de4d09bd546675ce6fde6f" - } - }, - "b165dcc6a8164171827f6f592fb6ccf6": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_067a8aae09414386945b2471587e72d3", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 899k/899k [00:00<00:00, 1.68MB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_09d95ce0097b447f8cead167efa3198d" - } - }, - "3a9ed80daead4101b181a87d4d3691e3": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "d470cb17ea994fe28452f9ae16c7f82d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - 
"_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fdd465843f6148439cb5a1f8337f16f8": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "d17637dd47de4d09bd546675ce6fde6f": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "067a8aae09414386945b2471587e72d3": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "09d95ce0097b447f8cead167efa3198d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": 
null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "672d1cdaf30f4935a91d8f34a5d17d59": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_bf89db6d45854ef1b79d15c57e8ffbd6", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_11f56af13d104a52b4370582c9de2f00", - "IPY_MODEL_34f8c2c170e3420781cebc3624f1cae1", - "IPY_MODEL_1b88aa00f5334a2fa68709294c0dacba" - ] - } - }, - "bf89db6d45854ef1b79d15c57e8ffbd6": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "11f56af13d104a52b4370582c9de2f00": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_5ae2a5ff3447425eb7fac499197fc803", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_ad8ac6b922c7405f8a7c65b819672085" - } - }, - "34f8c2c170e3420781cebc3624f1cae1": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_9df88c3cc59943b6968a70d1308fe369", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - 
"bar_style": "success", - "max": 456318, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 456318, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_723cca500fbe4b19b398823c8d4de30f" - } - }, - "1b88aa00f5334a2fa68709294c0dacba": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_d54392996ca7410da18d135f02da1457", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 456k/456k [00:00<00:00, 1.62MB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_32f36db3ef03419fbceea97713c38ee5" - } - }, - "5ae2a5ff3447425eb7fac499197fc803": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "ad8ac6b922c7405f8a7c65b819672085": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "9df88c3cc59943b6968a70d1308fe369": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "723cca500fbe4b19b398823c8d4de30f": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": 
"@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "d54392996ca7410da18d135f02da1457": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "32f36db3ef03419fbceea97713c38ee5": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "afb2dc2d78d14e42b55496ee958617fc": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_1c1deac85a7c4d429f58704506ed293c", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_5023a41d29714e1db9b311711865c40c", - "IPY_MODEL_427ef7ea59214c579bc83bbda47544ac", - "IPY_MODEL_1997e39f4889454ea2e82779bc716153" - ] - } - }, - "1c1deac85a7c4d429f58704506ed293c": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - 
"grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "5023a41d29714e1db9b311711865c40c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_5b1a5ed00cf14876bceb62c595173d87", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_4a65045961c140f19fe42b5b4d8c1f17" - } - }, - "427ef7ea59214c579bc83bbda47544ac": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_e76c6641872c4906aea39481fd91a530", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1355863, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1355863, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_b0984669e9d24a118cff1b52baaa443e" - } - }, - "1997e39f4889454ea2e82779bc716153": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_d9f2b66e0f1d4e2fac120ceb161a1183", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1.36M/1.36M [00:00<00:00, 4.92MB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_4357e0c77797493f9b97d981cd87ed47" - } - }, - "5b1a5ed00cf14876bceb62c595173d87": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - 
"_model_module": "@jupyter-widgets/controls" - } - }, - "4a65045961c140f19fe42b5b4d8c1f17": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "e76c6641872c4906aea39481fd91a530": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "b0984669e9d24a118cff1b52baaa443e": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "d9f2b66e0f1d4e2fac120ceb161a1183": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "4357e0c77797493f9b97d981cd87ed47": { - "model_module": 
"@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "7b3fc592e43749918f8981d87978cd96": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_cbfc043bfe8c4211a213751063a440ce", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_dc7e24c4675941d48717a0641a1da966", - "IPY_MODEL_2722a901b44945c1b0ff4112b7dc782b", - "IPY_MODEL_6de10a345ad0433eb188ddcfbba05e2a" - ] - } - }, - "cbfc043bfe8c4211a213751063a440ce": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "dc7e24c4675941d48717a0641a1da966": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_928571a3137844bb89ac565a63ceb341", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 
"Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_27acd10018714d309dbe039d93b8d961" - } - }, - "2722a901b44945c1b0ff4112b7dc782b": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_dd7018175ebd4830939cde69d44fbd71", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 694, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 694, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_874e58a5169046098bd5f727a9c38418" - } - }, - "6de10a345ad0433eb188ddcfbba05e2a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_5af03825393d4eb9a2c65b1d567a831c", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 694/694 [00:00<00:00, 30.4kB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_ba780f52be7c41ad9199b14cc1e8fa88" - } - }, - "928571a3137844bb89ac565a63ceb341": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "27acd10018714d309dbe039d93b8d961": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "dd7018175ebd4830939cde69d44fbd71": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - 
"_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "874e58a5169046098bd5f727a9c38418": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "5af03825393d4eb9a2c65b1d567a831c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "ba780f52be7c41ad9199b14cc1e8fa88": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "9cb348b5e883446e9c952fa2dd123b48": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": 
"1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_76d66e6d27734b739c974b256486f108", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_e1366573404d429cba62737029c3d823", - "IPY_MODEL_3029b03f4b084b148743d1097c69d27f", - "IPY_MODEL_6edb44e41f014fe98f81e61553313554" - ] - } - }, - "76d66e6d27734b739c974b256486f108": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "e1366573404d429cba62737029c3d823": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_843cf821df1e4195807159ce5d035af0", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_577cb7f496644abbab3f6216e0cb4ae5" - } - }, - "3029b03f4b084b148743d1097c69d27f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_ce6bb7b2614743d18f0fbf78f832f714", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 597257159, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 597257159, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f4c278e2b471457ca9888d6f33f80543" - } - }, - "6edb44e41f014fe98f81e61553313554": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_b5b01b231d8d4d0394a2f6b7d6f4f582", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 597M/597M [00:10<00:00, 64.5MB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - 
"description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f899d3939ec744ed8a80cd2fcf8e8478" - } - }, - "843cf821df1e4195807159ce5d035af0": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "577cb7f496644abbab3f6216e0cb4ae5": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "ce6bb7b2614743d18f0fbf78f832f714": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "f4c278e2b471457ca9888d6f33f80543": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": 
null, - "left": null - } - }, - "b5b01b231d8d4d0394a2f6b7d6f4f582": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "f899d3939ec744ed8a80cd2fcf8e8478": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "b38ef695beb7461ba55c7cb653fec48b": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_239a4b7c80c54a68a550d0b825a829e2", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_d440419e016e4befa04b54506470277c", - "IPY_MODEL_3ea0a931a699498a90c584086b80de4b", - "IPY_MODEL_4e80f5740aeb47ff9e36ba3a359e17dd" - ] - } - }, - "239a4b7c80c54a68a550d0b825a829e2": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, 
- "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "d440419e016e4befa04b54506470277c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_f4377b535d644daf89803743dc39925f", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_17a4614cb92943caa33ca9b8a5e7b486" - } - }, - "3ea0a931a699498a90c584086b80de4b": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_a80b826d6c044ac0b083305fb700d989", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 845731, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 845731, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_b78bc350d41845168111292b15d60760" - } - }, - "4e80f5740aeb47ff9e36ba3a359e17dd": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_7011beedb71a4fa0a06360fcc540ba2f", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 846k/846k [00:00<00:00, 14.4MB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_ddd1e98628ce48c2899df72673056929" - } - }, - "f4377b535d644daf89803743dc39925f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "17a4614cb92943caa33ca9b8a5e7b486": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, 
- "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "a80b826d6c044ac0b083305fb700d989": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "b78bc350d41845168111292b15d60760": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "7011beedb71a4fa0a06360fcc540ba2f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "ddd1e98628ce48c2899df72673056929": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - 
"max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "c096e371e0de430cb5ddfa9c5aff3c87": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_40d0906b51e44befb324edfeba7b5ee0", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_0bce30a1e9f7447d8e654b893e002102", - "IPY_MODEL_3354f587c2fd433b8e771bbccc464c95", - "IPY_MODEL_1463c0894c8a431d9dd1e092c9b5aa1c" - ] - } - }, - "40d0906b51e44befb324edfeba7b5ee0": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "0bce30a1e9f7447d8e654b893e002102": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_9838ea8da8624c7f914e0e5b282e92e5", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_8822943c12624ddba8621039022cc3a2" - } - }, - "3354f587c2fd433b8e771bbccc464c95": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_e9ee467f7d184bd1a8c16ae66db33afa", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 775, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 775, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f701c5107f694ce0b91eb9a764150422" - } 
- }, - "1463c0894c8a431d9dd1e092c9b5aa1c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_012628b907134e1f9283af807c649c41", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 775/775 [00:00<00:00, 32.8kB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_a2f8147c443646c5b95abbf0598d9537" - } - }, - "9838ea8da8624c7f914e0e5b282e92e5": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "8822943c12624ddba8621039022cc3a2": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "e9ee467f7d184bd1a8c16ae66db33afa": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "f701c5107f694ce0b91eb9a764150422": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": 
null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "012628b907134e1f9283af807c649c41": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "a2f8147c443646c5b95abbf0598d9537": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "5f9d7a14bbaa481f93c06b16d7928a0f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_b276f277ea32452e8dec3dd577c12cbd", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_018f1e518e1742f0a0bab7b217e82d87", - "IPY_MODEL_642fdb797f824abab4958679a5cc9ed9", - "IPY_MODEL_3825327aefe445898e0831aa2e6fe2f7" - ] - } - }, - "b276f277ea32452e8dec3dd577c12cbd": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - 
"overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "018f1e518e1742f0a0bab7b217e82d87": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_7f75729f788b49e98490c3f3fc7f7ad6", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_8197349fa532498996802976bcb74686" - } - }, - "642fdb797f824abab4958679a5cc9ed9": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_fb0acf54604743df9f705cd7101061c7", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1017, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1017, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_d7f5bc603c5345ae960c0e311a282863" - } - }, - "3825327aefe445898e0831aa2e6fe2f7": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_0ba1f9c9d8824e179c92ccde81c27cca", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1.02k/1.02k [00:00<00:00, 44.8kB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_4b606f990c3f47309c30e4f88885b3ba" - } - }, - "7f75729f788b49e98490c3f3fc7f7ad6": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "8197349fa532498996802976bcb74686": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": 
"1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fb0acf54604743df9f705cd7101061c7": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "d7f5bc603c5345ae960c0e311a282863": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "0ba1f9c9d8824e179c92ccde81c27cca": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "4b606f990c3f47309c30e4f88885b3ba": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - 
"align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fa8a7f1a82d3411c8ce7dd38d4fb9c33": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_4acb0e1bb77746648bfdf71026e38bdf", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_9bf5556edb0944a68862ea5bc0a64d20", - "IPY_MODEL_26c3f0c073964751aed3b8d3f2c72083", - "IPY_MODEL_fb35af599ea44c37898dccaf739e000a" - ] - } - }, - "4acb0e1bb77746648bfdf71026e38bdf": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "9bf5556edb0944a68862ea5bc0a64d20": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_6ae7e8bb3dfc4150a2ae94cf89205df7", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_09d5745cf90c457eb60caaaf613b8c2c" - } - }, - "26c3f0c073964751aed3b8d3f2c72083": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - 
"_view_name": "ProgressView", - "style": "IPY_MODEL_a3146a4a178c4443a06326aa7feb5c9e", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 760, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 760, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_461246088a834b2989c4f628e1a55f77" - } - }, - "fb35af599ea44c37898dccaf739e000a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_4272e4f5801040a0b83987be126ba5b6", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 760/760 [00:00<00:00, 28.9kB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_fd228fdca2e54bd0b1eed3cfd7256fec" - } - }, - "6ae7e8bb3dfc4150a2ae94cf89205df7": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "09d5745cf90c457eb60caaaf613b8c2c": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "a3146a4a178c4443a06326aa7feb5c9e": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "461246088a834b2989c4f628e1a55f77": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - 
"model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "4272e4f5801040a0b83987be126ba5b6": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "fd228fdca2e54bd0b1eed3cfd7256fec": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "ff04371f71c4455a903dafb1c200d130": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_187da9592d0b4e81a4d48b8a222e52e6", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_fa90fe70866d4cb586a8fb78bceb9ff1", - "IPY_MODEL_6bcbd08b2b814863b80fbc9aef79dfd6", - "IPY_MODEL_9bc97584c9cc4f599f729047158aa034" - ] - } - }, - 
"187da9592d0b4e81a4d48b8a222e52e6": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fa90fe70866d4cb586a8fb78bceb9ff1": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_5da7742f8eca4d1d8733b850a4b04647", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "Downloading: 100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_36d25db734514a8aac83310539fd44ef" - } - }, - "6bcbd08b2b814863b80fbc9aef79dfd6": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_fa0a423c094c44a09b7c55d1c6057dbc", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 512568261, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 512568261, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_30ac91ccd8e848a29679f865842f85d9" - } - }, - "9bc97584c9cc4f599f729047158aa034": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_47015be3b3cb4c01ba605e4f76f2956b", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 513M/513M [00:13<00:00, 41.8MB/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_05afbe8927ca4a7fa4e8a9fd435ce7ff" - } - }, - "5da7742f8eca4d1d8733b850a4b04647": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": 
"DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "36d25db734514a8aac83310539fd44ef": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fa0a423c094c44a09b7c55d1c6057dbc": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "30ac91ccd8e848a29679f865842f85d9": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "47015be3b3cb4c01ba605e4f76f2956b": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - 
"_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "05afbe8927ca4a7fa4e8a9fd435ce7ff": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "6cd875eef4254429baee29d5e0310ba5": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_ff44676aef784e9ea6e691e2b76fb230", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_578a69d1c3134fc6bb617abc8ee7110f", - "IPY_MODEL_b2342c31553e4152b658a06d472d3aa5", - "IPY_MODEL_86650a26ed1744268eeec356dfa3e10c" - ] - } - }, - "ff44676aef784e9ea6e691e2b76fb230": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "578a69d1c3134fc6bb617abc8ee7110f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": 
"IPY_MODEL_fb84b43f5e204034a1ec1d48e8f75e70", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_74526b61c5f74b9f8e50d2ca128a5ee9" - } - }, - "b2342c31553e4152b658a06d472d3aa5": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_b7731c1835b9489d9eaf83f28ae69383", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_9cc668c3019d436e9d33a362b21ce441" - } - }, - "86650a26ed1744268eeec356dfa3e10c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_53fdfe2620894f5b9a4dc9a3614cbcd3", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 4.62ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_331ed34ff1164bf185a40fc5e613894a" - } - }, - "fb84b43f5e204034a1ec1d48e8f75e70": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "74526b61c5f74b9f8e50d2ca128a5ee9": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } 
- }, - "b7731c1835b9489d9eaf83f28ae69383": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "9cc668c3019d436e9d33a362b21ce441": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "53fdfe2620894f5b9a4dc9a3614cbcd3": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "331ed34ff1164bf185a40fc5e613894a": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "e843c04a87ad4d8c8fdaca77ed14a22c": { - "model_module": "@jupyter-widgets/controls", - "model_name": 
"HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_aa66b7472c8c4a519bfe9d21a8d9274d", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_b49b7e1e7e68481cb18a87a67bcc8a00", - "IPY_MODEL_70be36b2bdc7408aa685e30fae200055", - "IPY_MODEL_2018348757cb429185abbd4af1bb7ad4" - ] - } - }, - "aa66b7472c8c4a519bfe9d21a8d9274d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "b49b7e1e7e68481cb18a87a67bcc8a00": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_3ca04f9f70b348d193d57e9b4e8f6095", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_cd45060a36f841aeaaaa8c28d4b10c82" - } - }, - "70be36b2bdc7408aa685e30fae200055": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_42188a87ae8c4b9d96553d5a1c47d313", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_827e125ab0db4f298a30531a3db46fcb" - } - }, - "2018348757cb429185abbd4af1bb7ad4": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_fe2a1c9b231a4980b584af79c70a9114", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - 
"_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:02<00:00, 2.88s/ba]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_444f4699b0424c37847b24cc73f8d8ef" - } - }, - "3ca04f9f70b348d193d57e9b4e8f6095": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "cd45060a36f841aeaaaa8c28d4b10c82": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "42188a87ae8c4b9d96553d5a1c47d313": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "827e125ab0db4f298a30531a3db46fcb": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": 
null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fe2a1c9b231a4980b584af79c70a9114": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "444f4699b0424c37847b24cc73f8d8ef": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "0f29b422508a484f98d8541e7c42c67d": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_5c9ac73dcb954adb95d47c3388234adc", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_fbb8d837f2c34e12b04992d70c950d89", - "IPY_MODEL_fffa29c189c7403384cb5d33cc8641e3", - "IPY_MODEL_f0e4d168b4024d10a7b619e6a8c5a685" - ] - } - }, - "5c9ac73dcb954adb95d47c3388234adc": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": 
null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fbb8d837f2c34e12b04992d70c950d89": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_a91260355b454352a8ed2f3745d6e3e5", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_2fabb3a1bfa9495295bee7a7b16e6ce7" - } - }, - "fffa29c189c7403384cb5d33cc8641e3": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_eebe8d4d80914c85a463ac6cb84c4ada", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_a364bbe1366d4ced863c05a4f0307412" - } - }, - "f0e4d168b4024d10a7b619e6a8c5a685": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_92c3e278463041a7a9b6c4014faf8064", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 13.66ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_a87d0e21def2485db74c54d003802920" - } - }, - "a91260355b454352a8ed2f3745d6e3e5": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "2fabb3a1bfa9495295bee7a7b16e6ce7": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - 
"_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "eebe8d4d80914c85a463ac6cb84c4ada": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "a364bbe1366d4ced863c05a4f0307412": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "92c3e278463041a7a9b6c4014faf8064": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "a87d0e21def2485db74c54d003802920": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": 
null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "565044fbf5a44f9cb25a182ccc7f7724": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_a14da623707f4e6b8264e64722feab63", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_a3509b53f5874df996beaa0ad8a2715f", - "IPY_MODEL_c449b4729cd3495989eed99d4c043674", - "IPY_MODEL_40db3074685e4c2b8bf95a690b913a49" - ] - } - }, - "a14da623707f4e6b8264e64722feab63": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "a3509b53f5874df996beaa0ad8a2715f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_2593f4f02e23475caa734f71b7ea0b90", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_7eeb3cb584c94389afaa56904300f1f3" - } - }, - "c449b4729cd3495989eed99d4c043674": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_23aecbc928de42f7b3da4388301da5d8", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, 
- "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f976b6294aca47d3ae05b06217a8e2ba" - } - }, - "40db3074685e4c2b8bf95a690b913a49": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_cb0a555e8e8e4b03b8ecc581b2c4932c", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 4.03ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_164d6b58433d4b53910a69588336c5da" - } - }, - "2593f4f02e23475caa734f71b7ea0b90": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "7eeb3cb584c94389afaa56904300f1f3": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "23aecbc928de42f7b3da4388301da5d8": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "f976b6294aca47d3ae05b06217a8e2ba": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": 
"@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "cb0a555e8e8e4b03b8ecc581b2c4932c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "164d6b58433d4b53910a69588336c5da": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "d2b93cf59d5849699938b996a19ede97": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_4c909f77c79e461783d362793729ffcc", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_56e0c690da5d4a0ab20be5e53c8000a2", - "IPY_MODEL_72482ed1385d46fcafabcc206991d511", - "IPY_MODEL_9e3c7fd88e5e401d99bd5ca9cbcdf9e8" - ] - } - }, - "4c909f77c79e461783d362793729ffcc": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": 
null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "56e0c690da5d4a0ab20be5e53c8000a2": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_80041bd67b8b4c83bc68935dd89edc17", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_262809e1a7af4e219ac58a5328df4f1b" - } - }, - "72482ed1385d46fcafabcc206991d511": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_249ba341beb94434b44d7ea9e3f332fe", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_9be286665d7943b7b293dcf66ad1e733" - } - }, - "9e3c7fd88e5e401d99bd5ca9cbcdf9e8": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_01090e6ce3c8467da56386bd11a0abaa", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 15.04ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f45f03683b41490d9be4b144014932ae" - } - }, - "80041bd67b8b4c83bc68935dd89edc17": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "262809e1a7af4e219ac58a5328df4f1b": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": 
null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "249ba341beb94434b44d7ea9e3f332fe": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "9be286665d7943b7b293dcf66ad1e733": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "01090e6ce3c8467da56386bd11a0abaa": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "f45f03683b41490d9be4b144014932ae": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - 
"_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "9219fec14efb4927b13db8f2ca729594": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_ec331cb6ad53456f8ef8497c87cde7e0", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_a3b0392dab454773b6935fe664bdd021", - "IPY_MODEL_5d613a83b4c94909b063f17f11bb7423", - "IPY_MODEL_959250ce4566416183d6e0a9bb74f10a" - ] - } - }, - "ec331cb6ad53456f8ef8497c87cde7e0": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "a3b0392dab454773b6935fe664bdd021": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_56a4273f29184dcc8d4c9501352a0b0b", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_924d88bec18843d8b647ae5e78b47a78" - } - }, - "5d613a83b4c94909b063f17f11bb7423": { - "model_module": 
"@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_ebcfec04262244c88f70c6667cbb0c27", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_1ebd834fbbc04a7da8761e507c9f6595" - } - }, - "959250ce4566416183d6e0a9bb74f10a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_bab3385944464a3b91b66baa8e475625", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 3.96ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_76e3dfff38f5424b816083853622c9cd" - } - }, - "56a4273f29184dcc8d4c9501352a0b0b": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "924d88bec18843d8b647ae5e78b47a78": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "ebcfec04262244c88f70c6667cbb0c27": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - 
"1ebd834fbbc04a7da8761e507c9f6595": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "bab3385944464a3b91b66baa8e475625": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "76e3dfff38f5424b816083853622c9cd": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "62f55f22a2e444e8948094001af7c68a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_5034bf9f757d4897b2edce291668aa70", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_9112f3060630497fb570123effc88a1f", - 
"IPY_MODEL_04f93ca698a34e4fba2a00aee3bd851d", - "IPY_MODEL_44d27c2305f34507958a62986a4460f8" - ] - } - }, - "5034bf9f757d4897b2edce291668aa70": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "9112f3060630497fb570123effc88a1f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_3e9b1492999f4cd2a8357677b2e55f70", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_27633b5d93324c83a8d5acbe3c0f5c8d" - } - }, - "04f93ca698a34e4fba2a00aee3bd851d": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_00ee1c3098244d8685994e77f2b0f073", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_8987784a2f9b44fcb1c54722c0aacbf4" - } - }, - "44d27c2305f34507958a62986a4460f8": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_6b5cb4c85f5a409daf009d407b30e62c", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 1.22ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_673d280ff2d54b4ca45ee21f3bf9f4ad" - } - }, - "3e9b1492999f4cd2a8357677b2e55f70": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": 
"1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "27633b5d93324c83a8d5acbe3c0f5c8d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "00ee1c3098244d8685994e77f2b0f073": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "8987784a2f9b44fcb1c54722c0aacbf4": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "6b5cb4c85f5a409daf009d407b30e62c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": 
"", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "673d280ff2d54b4ca45ee21f3bf9f4ad": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "158b609e29624b2298613648369b5858": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_f58578c4350e4385a15bd1470ca2aba0", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_c8fa606ee4b44f4cba93ed8e33c10aa1", - "IPY_MODEL_62371b81af1c4325a5686443f88f932f", - "IPY_MODEL_a8402ade283b401394891ffde56e77ae" - ] - } - }, - "f58578c4350e4385a15bd1470ca2aba0": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "c8fa606ee4b44f4cba93ed8e33c10aa1": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": 
"HTMLView", - "style": "IPY_MODEL_018d7d67086348938cbc994ca536e93e", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_fdb8731215944db4a778939b17766709" - } - }, - "62371b81af1c4325a5686443f88f932f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_8f3e89919aa84a86ba18900f0ea061af", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_d71558fa4225416cb263b2faa40e1522" - } - }, - "a8402ade283b401394891ffde56e77ae": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_3b6cb11afe98488fa07d21550450fecf", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:02<00:00, 2.63s/ba]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_7c842efce69341fca127e5a2c35507be" - } - }, - "018d7d67086348938cbc994ca536e93e": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "fdb8731215944db4a778939b17766709": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": 
null, - "left": null - } - }, - "8f3e89919aa84a86ba18900f0ea061af": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "d71558fa4225416cb263b2faa40e1522": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "3b6cb11afe98488fa07d21550450fecf": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "7c842efce69341fca127e5a2c35507be": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "48effe85b41246aeaf65721ad4e674d9": { - "model_module": "@jupyter-widgets/controls", 
- "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_6154b366079d43d68c26056dd39eda20", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_feebdfe28fac45448db79dfe4b5e1a2a", - "IPY_MODEL_7da4c3ac1a5f4b738879011aec514c24", - "IPY_MODEL_0eb30f5c62e743cbafa806aacdf63886" - ] - } - }, - "6154b366079d43d68c26056dd39eda20": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "feebdfe28fac45448db79dfe4b5e1a2a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_de53680b38054ab7ade667ec75ebbe1d", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_a8f49f5de9554526a2dce0e0ff73dd0f" - } - }, - "7da4c3ac1a5f4b738879011aec514c24": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_de110bad8390476787ad83f2d0e2e399", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_53a73caa81324ac28dc16ca5072ad41a" - } - }, - "0eb30f5c62e743cbafa806aacdf63886": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_2b64e5dde23147809b90fda52c26adf4", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": 
"โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 8.59ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_fe013083cf4c4d4ab5113fc9befba15d" - } - }, - "de53680b38054ab7ade667ec75ebbe1d": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "a8f49f5de9554526a2dce0e0ff73dd0f": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "de110bad8390476787ad83f2d0e2e399": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "53a73caa81324ac28dc16ca5072ad41a": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - 
"order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "2b64e5dde23147809b90fda52c26adf4": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "fe013083cf4c4d4ab5113fc9befba15d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "0020e39e4a994e8e9f2c6ff34a002e2b": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_f03ec80b91e040a7985a506344daf16c", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_fd9ebec8192041b1b90c2ae439153252", - "IPY_MODEL_6c098921a19a4181bbdce81224858136", - "IPY_MODEL_cb8b8200abc34886bd1ee6b43b9b6251" - ] - } - }, - "f03ec80b91e040a7985a506344daf16c": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - 
"padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fd9ebec8192041b1b90c2ae439153252": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_9443f9c1b0754ad8abda31d0c2266a14", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_c3d96ba9ca784215852fbb0a53db04a4" - } - }, - "6c098921a19a4181bbdce81224858136": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_dd55838850314f549c839e55f9f1ba6a", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_dde176109a464a52bf1ab64093ff3d33" - } - }, - "cb8b8200abc34886bd1ee6b43b9b6251": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_17c0c7751e36459dbf290e3ee11e1963", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 4.80ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_b104a392faae47b1b85bfc188d736c4b" - } - }, - "9443f9c1b0754ad8abda31d0c2266a14": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "c3d96ba9ca784215852fbb0a53db04a4": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - 
"_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "dd55838850314f549c839e55f9f1ba6a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "dde176109a464a52bf1ab64093ff3d33": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "17c0c7751e36459dbf290e3ee11e1963": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "b104a392faae47b1b85bfc188d736c4b": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": 
null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "dbc3c3d19db54d28b5b392ec3d18c24e": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_4a595aeb62e247249674840a4909b7af", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_b2dff5ec946b437baf545c139542a443", - "IPY_MODEL_73bf5c12a8ca4d9fb42ab9c1ab194e99", - "IPY_MODEL_d2daa497d20844e3b66bbbf9a5f10c38" - ] - } - }, - "4a595aeb62e247249674840a4909b7af": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "b2dff5ec946b437baf545c139542a443": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_f03ad206345746cf99769287b25577a2", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_5c46be69bfc642efba1607089e1d15d9" - } - }, - "73bf5c12a8ca4d9fb42ab9c1ab194e99": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_579fc23364e943fba13c21252d5c3653", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, 
- "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_6ba5fd8de2d94f3fac57780b8f13b548" - } - }, - "d2daa497d20844e3b66bbbf9a5f10c38": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_64aaab9dfd8043eea64409ef37bc5d64", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 7.76ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_59a398e80fb14ccf83a53f474e15d65b" - } - }, - "f03ad206345746cf99769287b25577a2": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "5c46be69bfc642efba1607089e1d15d9": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "579fc23364e943fba13c21252d5c3653": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "6ba5fd8de2d94f3fac57780b8f13b548": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": 
"@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "64aaab9dfd8043eea64409ef37bc5d64": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "59a398e80fb14ccf83a53f474e15d65b": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "bfa5ab8385a74effb393e6c8e26d7649": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_8982b8ae9461495fac55362facd7e50f", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_4a0d9b95dd3a40d6b3fd9b78b3ee192f", - "IPY_MODEL_a1d4b903830f4bba95bbeff4e190181a", - "IPY_MODEL_b8383c75ef6f44d9963651145fb1c1ae" - ] - } - }, - "8982b8ae9461495fac55362facd7e50f": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": 
null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "4a0d9b95dd3a40d6b3fd9b78b3ee192f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_4d699cd8eaae44e286dc32530ce9aead", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_ec7708db9c29486f85dd8e5dcdde93d1" - } - }, - "a1d4b903830f4bba95bbeff4e190181a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_823652cdd6f34cb3baa8dd1bf5733f02", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f6c1a6311cc14b64b595421fdca92d88" - } - }, - "b8383c75ef6f44d9963651145fb1c1ae": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_94dc0b767e524aa7a789bd26e7acd5b3", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 4.00ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_6c0b7b2c01db4699aa2a31417a2a10c6" - } - }, - "4d699cd8eaae44e286dc32530ce9aead": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "ec7708db9c29486f85dd8e5dcdde93d1": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": 
null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "823652cdd6f34cb3baa8dd1bf5733f02": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "f6c1a6311cc14b64b595421fdca92d88": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "94dc0b767e524aa7a789bd26e7acd5b3": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "6c0b7b2c01db4699aa2a31417a2a10c6": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - 
"_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "b1e5d35e320040ddad4650e448ffa21d": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_0f1b3a7bf38549769479bc665589ec7c", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_9ac1adeaa4c14f5c8db22950529935ed", - "IPY_MODEL_1d08a71ce10b428c884cbdf0bc18f112", - "IPY_MODEL_a8ea7b5a9b34400fbea09fa812d4c80b" - ] - } - }, - "0f1b3a7bf38549769479bc665589ec7c": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "9ac1adeaa4c14f5c8db22950529935ed": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_ddee46672cc44f208f8843b79b22ed52", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_6ce12da921184817a9ad7a114ec991fd" - } - }, - "1d08a71ce10b428c884cbdf0bc18f112": { - "model_module": 
"@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_21ff47f44fa742c6b54d3f20b952dfad", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_2a0f6d6387f141c88f41153e4d61a8a6" - } - }, - "a8ea7b5a9b34400fbea09fa812d4c80b": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_b3368e3647e0477a84341c2af7f1901e", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:02<00:00, 2.63s/ba]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_b1402545f4834bbebc49839210c1bbcf" - } - }, - "ddee46672cc44f208f8843b79b22ed52": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "6ce12da921184817a9ad7a114ec991fd": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "21ff47f44fa742c6b54d3f20b952dfad": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - 
"2a0f6d6387f141c88f41153e4d61a8a6": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "b3368e3647e0477a84341c2af7f1901e": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "b1402545f4834bbebc49839210c1bbcf": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "6fc84a7060f445a6891bf57b4c87951d": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_eae34b2b84a743af9a478f02effbbfdc", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_a78e4da5cec8417b9820cb689c0b51b1", - 
"IPY_MODEL_29e6c0a6d06646e4ba04e11c56ee18ea", - "IPY_MODEL_6d0b4f652a19450dbc77217c4dab46a5" - ] - } - }, - "eae34b2b84a743af9a478f02effbbfdc": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "a78e4da5cec8417b9820cb689c0b51b1": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_a4f1fdfb3a634b8e9f5549a972e6d4ce", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_a413d135c87d4eada799473e4ea2a298" - } - }, - "29e6c0a6d06646e4ba04e11c56ee18ea": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_3da67f635055454287625e6c70303c8b", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_abbb4dfe75244ecca48715b4a09b481a" - } - }, - "6d0b4f652a19450dbc77217c4dab46a5": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_33b1574be0dc4eadaafb6124a3837425", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:02<00:00, 2.87s/ba]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_eb977bea725843ccb7d7be5537b64f0b" - } - }, - "a4f1fdfb3a634b8e9f5549a972e6d4ce": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": 
"1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "a413d135c87d4eada799473e4ea2a298": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "3da67f635055454287625e6c70303c8b": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "abbb4dfe75244ecca48715b4a09b481a": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "33b1574be0dc4eadaafb6124a3837425": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": 
"", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "eb977bea725843ccb7d7be5537b64f0b": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "b07b810992314152a64788cdc100821d": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_6d3e1d22a7154b5cb1f438a227e553eb", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_1f05e7bef8c94a5ca483c6c0804be14a", - "IPY_MODEL_d4fb5d6f6ab04251896a32979efa8c45", - "IPY_MODEL_9ea7551c5cfa4b0a879463fe56da86c3" - ] - } - }, - "6d3e1d22a7154b5cb1f438a227e553eb": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "1f05e7bef8c94a5ca483c6c0804be14a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": 
"HTMLView", - "style": "IPY_MODEL_26ed6d00e40d481d930dd67528102206", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f3144cd2ec0a4011871cc4c2d0b46383" - } - }, - "d4fb5d6f6ab04251896a32979efa8c45": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_dcb2f9760abf4ff2847633d364cb4a3c", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_3862e070555b4147b86969edb492e2bd" - } - }, - "9ea7551c5cfa4b0a879463fe56da86c3": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_698f506c4b0d444791e52aeaf16ea650", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 3.51ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_7c006fc956c74f15a7963507021a6e49" - } - }, - "26ed6d00e40d481d930dd67528102206": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "f3144cd2ec0a4011871cc4c2d0b46383": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": 
null, - "left": null - } - }, - "dcb2f9760abf4ff2847633d364cb4a3c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "3862e070555b4147b86969edb492e2bd": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "698f506c4b0d444791e52aeaf16ea650": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "7c006fc956c74f15a7963507021a6e49": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "170586a83f2b4839a90a2ee1ce352ea2": { - "model_module": "@jupyter-widgets/controls", 
- "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_6ce337d64b38428298d9685593146431", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_fc5dcbebb1584ed89504b1450a1521b0", - "IPY_MODEL_4b4146ca992d4381b0973c2838edf064", - "IPY_MODEL_2717b933559945038cee2214516dbf6e" - ] - } - }, - "6ce337d64b38428298d9685593146431": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fc5dcbebb1584ed89504b1450a1521b0": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_f47e1474f8454daeb12d6d87304169ff", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_95a5a44ff78447739125e29e0b6af41e" - } - }, - "4b4146ca992d4381b0973c2838edf064": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_6d56e4aba7dd4a13904f7f964c58ba60", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f6769b1054bc4858b68affaf0fec9bdb" - } - }, - "2717b933559945038cee2214516dbf6e": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_3f8e4bca66fc4a8fa7f13553812abd01", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": 
"โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 4.33ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f52bb0ad7f8a4029bf8462bd7448f9cd" - } - }, - "f47e1474f8454daeb12d6d87304169ff": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "95a5a44ff78447739125e29e0b6af41e": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "6d56e4aba7dd4a13904f7f964c58ba60": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "f6769b1054bc4858b68affaf0fec9bdb": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - 
"order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "3f8e4bca66fc4a8fa7f13553812abd01": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "f52bb0ad7f8a4029bf8462bd7448f9cd": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "b31f3082260e4ba4bfd8340f85ed6105": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_b641069177974d40903b206672ed3cb6", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_9be75a3529f94d0a9b462d13d2419f39", - "IPY_MODEL_4e3ed92a934e415eb88573e2675fa2a6", - "IPY_MODEL_be219c11272d4ef9aee78cd7596a6aab" - ] - } - }, - "b641069177974d40903b206672ed3cb6": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - 
"padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "9be75a3529f94d0a9b462d13d2419f39": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_50456880ef2c44cbbf8e93f2ef3b2c14", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_0a367d699ca144db9933ac02916bafe8" - } - }, - "4e3ed92a934e415eb88573e2675fa2a6": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_16a74b3bdfd04dd9a38958e0770d6c45", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_8da78a7182de422381567716889175f8" - } - }, - "be219c11272d4ef9aee78cd7596a6aab": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_ec3fdc2d4a1340f1ae27975b60603149", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 3.63ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_9fdeaeddb32b4a048acf0fd24300808f" - } - }, - "50456880ef2c44cbbf8e93f2ef3b2c14": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "0a367d699ca144db9933ac02916bafe8": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - 
"_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "16a74b3bdfd04dd9a38958e0770d6c45": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "8da78a7182de422381567716889175f8": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "ec3fdc2d4a1340f1ae27975b60603149": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "9fdeaeddb32b4a048acf0fd24300808f": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": 
null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "c7725e9fd1cb43908aa8c4c82a8afdc1": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_1462d23ef56f494fb1519ce4a4b16cc6", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_c935b92f10f34b7fadeebf0194bf8cf7", - "IPY_MODEL_51c758f75d1e4dc385732d52e96a632d", - "IPY_MODEL_f5fc50c84b42427cb6af50aeda1c51a9" - ] - } - }, - "1462d23ef56f494fb1519ce4a4b16cc6": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "c935b92f10f34b7fadeebf0194bf8cf7": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_5b4a3ba087e74a2aa17bc63b5b9ccce8", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_8b415d2cb97b4a82acf84be5949e29e5" - } - }, - "51c758f75d1e4dc385732d52e96a632d": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_22ce3beefd2f43f68184e1dd0999b2bd", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, 
- "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_1875e0dde02142d1bbec9e7541ecb548" - } - }, - "f5fc50c84b42427cb6af50aeda1c51a9": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_ca0ca35c9d1340a0a9d241b179ac6ebd", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 4.02ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_cc82cd9a9f624af28e964440ed8651f7" - } - }, - "5b4a3ba087e74a2aa17bc63b5b9ccce8": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "8b415d2cb97b4a82acf84be5949e29e5": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "22ce3beefd2f43f68184e1dd0999b2bd": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "1875e0dde02142d1bbec9e7541ecb548": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": 
"@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "ca0ca35c9d1340a0a9d241b179ac6ebd": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "cc82cd9a9f624af28e964440ed8651f7": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "80913f1792cc44d6ad5599ecec511dae": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_7d584b83060649638b905aa5faf54478", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_f840baeacab74e2bbc2afb8caf761487", - "IPY_MODEL_e5ad7692806b4eb89da574b35b638324", - "IPY_MODEL_ffc524b0b5184d4d9b624738478c27e9" - ] - } - }, - "7d584b83060649638b905aa5faf54478": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": 
null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "f840baeacab74e2bbc2afb8caf761487": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_8df1039ada91449cb4a47565a2d41707", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_df661c0e64ed4369bbf356ae35c43d66" - } - }, - "e5ad7692806b4eb89da574b35b638324": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_20aea794682c4a329fc819e582fcf5a1", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_cdb04643e4254f2dbecca08bbbac88ac" - } - }, - "ffc524b0b5184d4d9b624738478c27e9": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_8ec9b7638a3e483e83bcd4499dac5192", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:05<00:00, 5.42s/ba]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_3a91b95948ec4b7c952e9f7de2163cb9" - } - }, - "8df1039ada91449cb4a47565a2d41707": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "df661c0e64ed4369bbf356ae35c43d66": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": 
null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "20aea794682c4a329fc819e582fcf5a1": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "cdb04643e4254f2dbecca08bbbac88ac": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "8ec9b7638a3e483e83bcd4499dac5192": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "3a91b95948ec4b7c952e9f7de2163cb9": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - 
"_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "94013c8635a34ede81df21046f858252": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_02bcef31f2564c6aab8c8fef2b31c54d", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_4ff948f6513043308cf0cc5535426579", - "IPY_MODEL_0b6c45bc9c1645c58b12f79f20178368", - "IPY_MODEL_15de4b4e424348f2b713e8d1c64c9da3" - ] - } - }, - "02bcef31f2564c6aab8c8fef2b31c54d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "4ff948f6513043308cf0cc5535426579": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_20d4213e61724014954a54c116c58a43", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_e19f73088f5a40e991ae3445ae94dd77" - } - }, - "0b6c45bc9c1645c58b12f79f20178368": { - "model_module": 
"@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_3cef696e02614fa5ae76b2dfecec3e33", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_a70e575c99044542b5aec5370d0a2f0d" - } - }, - "15de4b4e424348f2b713e8d1c64c9da3": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_35e00a46d23e4235866959e0e7c6a87a", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:02<00:00, 2.74s/ba]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_2c0318e6c31c4dbca9b2b64986a3498d" - } - }, - "20d4213e61724014954a54c116c58a43": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "e19f73088f5a40e991ae3445ae94dd77": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "3cef696e02614fa5ae76b2dfecec3e33": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - 
"a70e575c99044542b5aec5370d0a2f0d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "35e00a46d23e4235866959e0e7c6a87a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "2c0318e6c31c4dbca9b2b64986a3498d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "e746cb0ccb1046b1aaef424e7719ba76": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_6dce51725e3d4849ad220999b270b244", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_189e2af080184719ae45ea51b110e48d", - 
"IPY_MODEL_eeed04c1679f437390a01fc61e4681d0", - "IPY_MODEL_c6977ec48664458d9612315d324dc133" - ] - } - }, - "6dce51725e3d4849ad220999b270b244": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "189e2af080184719ae45ea51b110e48d": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_c54be3438fd64041b59f7f24c832382c", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_15b048a741384add8955fd1af18011bb" - } - }, - "eeed04c1679f437390a01fc61e4681d0": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_fc675ea7db01426cbc837d359cc255fc", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_00a1bc33b8224339ab039caf0a58c71d" - } - }, - "c6977ec48664458d9612315d324dc133": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_5a576d4882014e4e96e2f6f8d52d3bff", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 1.87ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_dab01f50282a430095bf270331841ba1" - } - }, - "c54be3438fd64041b59f7f24c832382c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": 
"1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "15b048a741384add8955fd1af18011bb": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fc675ea7db01426cbc837d359cc255fc": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "00a1bc33b8224339ab039caf0a58c71d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "5a576d4882014e4e96e2f6f8d52d3bff": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": 
"", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "dab01f50282a430095bf270331841ba1": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "91e7050e2a4f47449ea81e5f1be3ea3e": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_65e45d0f9374427c9b80290ad2d212c3", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_da435a8ce06c4bb693cd0bd78157fb6f", - "IPY_MODEL_32e6ced69ece4c28801a59e6abb5652c", - "IPY_MODEL_9375e0415ed54be8a1e49ecc42105b99" - ] - } - }, - "65e45d0f9374427c9b80290ad2d212c3": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "da435a8ce06c4bb693cd0bd78157fb6f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": 
"HTMLView", - "style": "IPY_MODEL_e4f17e8df92242d3a255bb7e73a48b3a", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_59629e754c4043d29e46dfffc3cfb383" - } - }, - "32e6ced69ece4c28801a59e6abb5652c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_0f67371a3d0f443a95ce8a5ecaa9164a", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_01ca01b970094be68397b02d1de9710d" - } - }, - "9375e0415ed54be8a1e49ecc42105b99": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_4f9ad3511eb646059730b88b61efca8c", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 3.86ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_bc7afadc638d4bef8df68ffa87bf1268" - } - }, - "e4f17e8df92242d3a255bb7e73a48b3a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "59629e754c4043d29e46dfffc3cfb383": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": 
null, - "left": null - } - }, - "0f67371a3d0f443a95ce8a5ecaa9164a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "01ca01b970094be68397b02d1de9710d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "4f9ad3511eb646059730b88b61efca8c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "bc7afadc638d4bef8df68ffa87bf1268": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "69796bea16a644a7988dd096e2efec1a": { - "model_module": "@jupyter-widgets/controls", 
- "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_886049e0a6154dbaa99ba79e55e5b772", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_a79e55afea2249ac9e36aaf54a892817", - "IPY_MODEL_ac35318d506444ca9654d1cf366aa478", - "IPY_MODEL_61845cc7d9e5497e9db1d3051afdc33d" - ] - } - }, - "886049e0a6154dbaa99ba79e55e5b772": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "a79e55afea2249ac9e36aaf54a892817": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_bbd0984b8b3c4cfbaf1802befdb63216", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_41640459f5644018ab673e219b9fa800" - } - }, - "ac35318d506444ca9654d1cf366aa478": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_785f1c8244774e6b8a5fba3cd2dded5e", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_9157008c7cb44d7aa941dd77d1a88a48" - } - }, - "61845cc7d9e5497e9db1d3051afdc33d": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_5a1f62538eed4642965ef9e391d1e133", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": 
"โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 1.71ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_253ee32c99d448aab163050e5b7aeabe" - } - }, - "bbd0984b8b3c4cfbaf1802befdb63216": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "41640459f5644018ab673e219b9fa800": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "785f1c8244774e6b8a5fba3cd2dded5e": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "9157008c7cb44d7aa941dd77d1a88a48": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - 
"order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "5a1f62538eed4642965ef9e391d1e133": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "253ee32c99d448aab163050e5b7aeabe": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "a2297d83eb284a8eaf04c974933a3c8f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_6cd8f06fc86f4835a975309a1af4951c", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_2c4d956ceb984da99c89ed8cdf95779c", - "IPY_MODEL_890236cc2b9340e1997832e31925643f", - "IPY_MODEL_e33b6e680647489481e3ffd48aabe36e" - ] - } - }, - "6cd8f06fc86f4835a975309a1af4951c": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - 
"padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "2c4d956ceb984da99c89ed8cdf95779c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_7b217ab5ca3545d29af984bad5fc43f7", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_47a596395ce04cd58470f9820441847c" - } - }, - "890236cc2b9340e1997832e31925643f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_afb663bf18e94410a56854b7c9c51391", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_2c2c561673e74aefb598281668abfe3b" - } - }, - "e33b6e680647489481e3ffd48aabe36e": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_241d88ad7d184065a6faeb66a97de9d7", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 3.37ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_88269755099d49c3a1bae2c6b590a021" - } - }, - "7b217ab5ca3545d29af984bad5fc43f7": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "47a596395ce04cd58470f9820441847c": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - 
"_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "afb663bf18e94410a56854b7c9c51391": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "2c2c561673e74aefb598281668abfe3b": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "241d88ad7d184065a6faeb66a97de9d7": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "88269755099d49c3a1bae2c6b590a021": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": 
null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "b80d4028e09e460780242a310b2ce881": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_ed4663b0921349489701827f30b0c0d1", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_4ec039133e574171a886bc110c456db3", - "IPY_MODEL_3d3f3f0ced3247648a82e6e6da0607db", - "IPY_MODEL_c18a03a0625446bbb0f1e0978768ac7f" - ] - } - }, - "ed4663b0921349489701827f30b0c0d1": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "4ec039133e574171a886bc110c456db3": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_13a127b9a9064dadbf1f3b26e5645abc", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_a34fa5a1646b4e439e21c743df021ba4" - } - }, - "3d3f3f0ced3247648a82e6e6da0607db": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_d1df3b7397714e86bec1feb171d5a288", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, 
- "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_726f1d2329e540709261bbe036cef023" - } - }, - "c18a03a0625446bbb0f1e0978768ac7f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_2bb9abf312274651b5b8199a8dc7b72b", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:07<00:00, 7.41s/ba]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_e5ba20a7fd2f4829af24a26f6943859c" - } - }, - "13a127b9a9064dadbf1f3b26e5645abc": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "a34fa5a1646b4e439e21c743df021ba4": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "d1df3b7397714e86bec1feb171d5a288": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "726f1d2329e540709261bbe036cef023": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": 
"@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "2bb9abf312274651b5b8199a8dc7b72b": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "e5ba20a7fd2f4829af24a26f6943859c": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "ce4f962f86a74a81b97b4de7d081a1c9": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_e7d922fe9caa41c4a35e7149fc1256c6", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_e0a509110c19437190c9c0c95d19aa30", - "IPY_MODEL_e030180976a94892b6e110295edd18f5", - "IPY_MODEL_d5808baf0ab74f368a7645385fe49854" - ] - } - }, - "e7d922fe9caa41c4a35e7149fc1256c6": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": 
null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "e0a509110c19437190c9c0c95d19aa30": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_f304060121e24a64836d31fbb0a9c626", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_412068117e3c4daeab119c6f317d8ce4" - } - }, - "e030180976a94892b6e110295edd18f5": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_8836b767ebe94dbf8cbceb1c3b3c617c", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_04384585a0c94ac4a4c737bb0e082127" - } - }, - "d5808baf0ab74f368a7645385fe49854": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_cf587e7e9ff343f09550efe21dde60f4", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:02<00:00, 2.63s/ba]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_699815cf05894f499a63923f9ba4b810" - } - }, - "f304060121e24a64836d31fbb0a9c626": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "412068117e3c4daeab119c6f317d8ce4": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": 
null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "8836b767ebe94dbf8cbceb1c3b3c617c": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "04384585a0c94ac4a4c737bb0e082127": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "cf587e7e9ff343f09550efe21dde60f4": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "699815cf05894f499a63923f9ba4b810": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - 
"_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "ecb7f32112aa457587e16f630402d093": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_81e0933c748245d5baa521289d95fb6e", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_28093ea068fe49e59d95d3d3846a727f", - "IPY_MODEL_a8634f6433294a4f88b89d010553a6ed", - "IPY_MODEL_88475b2359a249458645c1cd1c581eb3" - ] - } - }, - "81e0933c748245d5baa521289d95fb6e": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "28093ea068fe49e59d95d3d3846a727f": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_40b5dd29935a438ea0a5093cff6e4790", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f53c9207120942a5a4792d1a59b0be12" - } - }, - "a8634f6433294a4f88b89d010553a6ed": { - "model_module": 
"@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_8c9cad5b6117485b98e96bc67e6e12d4", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_f85beba18c43413f95768d949e806cc3" - } - }, - "88475b2359a249458645c1cd1c581eb3": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_387a2e2ab6594f198bc2b0efb0799082", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 1.14ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_80661aae47e64aec94a212d54a2b399d" - } - }, - "40b5dd29935a438ea0a5093cff6e4790": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "f53c9207120942a5a4792d1a59b0be12": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "8c9cad5b6117485b98e96bc67e6e12d4": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - 
"f85beba18c43413f95768d949e806cc3": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "387a2e2ab6594f198bc2b0efb0799082": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "80661aae47e64aec94a212d54a2b399d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "5a71bf3c4e954857828717e37540decd": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_996bcc4bf2fe424984e8d6932ce4bb9d", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_32521db36b074115a830ac7d37a94831", - 
"IPY_MODEL_463146ccd9ae45b6a9ec85146fbd4e72", - "IPY_MODEL_308994c31d9c478babb2978d056453bd" - ] - } - }, - "996bcc4bf2fe424984e8d6932ce4bb9d": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "32521db36b074115a830ac7d37a94831": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_0efc4a61cafb408eb8e17f8fe82cc4b2", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_2c6990430a69423e87f3bfd1513655b5" - } - }, - "463146ccd9ae45b6a9ec85146fbd4e72": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_1646af07355b49e28af8f59718e9cde8", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_9ed05970d2624dae8d60f227fa0146a6" - } - }, - "308994c31d9c478babb2978d056453bd": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_4180df71a2fa4359ac6d4d094119726a", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 3.05ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_a48a36317d0f4203b3f5c35446c6819c" - } - }, - "0efc4a61cafb408eb8e17f8fe82cc4b2": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": 
"1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "2c6990430a69423e87f3bfd1513655b5": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "1646af07355b49e28af8f59718e9cde8": { - "model_module": "@jupyter-widgets/controls", - "model_name": "ProgressStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "ProgressStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "bar_color": null, - "_model_module": "@jupyter-widgets/controls" - } - }, - "9ed05970d2624dae8d60f227fa0146a6": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "4180df71a2fa4359ac6d4d094119726a": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": 
"", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "a48a36317d0f4203b3f5c35446c6819c": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "d9e4e6a08e4c4d6baee6358643090800": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HBoxModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HBoxView", - "_dom_classes": [], - "_model_name": "HBoxModel", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.5.0", - "box_style": "", - "layout": "IPY_MODEL_97c26aa8d2304232a8f9df18f6923840", - "_model_module": "@jupyter-widgets/controls", - "children": [ - "IPY_MODEL_fe0605ce85d0452ebca4a8350dce0744", - "IPY_MODEL_96891675fbc84e1ab7ca4c75f13cad46", - "IPY_MODEL_315c3ae236c44884b8b0412b176db4d9" - ] - } - }, - "97c26aa8d2304232a8f9df18f6923840": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": null, - "left": null - } - }, - "fe0605ce85d0452ebca4a8350dce0744": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": 
"HTMLView", - "style": "IPY_MODEL_a9c7a18ab36b4236b3f13e1220e45364", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": "100%", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_dfe8a5987950418282678160d059a455" - } - }, - "96891675fbc84e1ab7ca4c75f13cad46": { - "model_module": "@jupyter-widgets/controls", - "model_name": "FloatProgressModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "ProgressView", - "style": "IPY_MODEL_3d7357402bbb4205ba6db979a8fd42c0", - "_dom_classes": [], - "description": "", - "_model_name": "FloatProgressModel", - "bar_style": "success", - "max": 1, - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": 1, - "_view_count": null, - "_view_module_version": "1.5.0", - "orientation": "horizontal", - "min": 0, - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_aeb8abc1a9264217a49ea17c974a441b" - } - }, - "315c3ae236c44884b8b0412b176db4d9": { - "model_module": "@jupyter-widgets/controls", - "model_name": "HTMLModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "HTMLView", - "style": "IPY_MODEL_8be5bb01b2ca4ba89d98c0dfe425f165", - "_dom_classes": [], - "description": "", - "_model_name": "HTMLModel", - "placeholder": "โ€‹", - "_view_module": "@jupyter-widgets/controls", - "_model_module_version": "1.5.0", - "value": " 1/1 [00:00<00:00, 1.05ba/s]", - "_view_count": null, - "_view_module_version": "1.5.0", - "description_tooltip": null, - "_model_module": "@jupyter-widgets/controls", - "layout": "IPY_MODEL_897e02adb91b44b8917f7510a8982439" - } - }, - "a9c7a18ab36b4236b3f13e1220e45364": { - "model_module": "@jupyter-widgets/controls", - "model_name": "DescriptionStyleModel", - "model_module_version": "1.5.0", - "state": { - "_view_name": "StyleView", - "_model_name": "DescriptionStyleModel", - "description_width": "", - "_view_module": "@jupyter-widgets/base", - "_model_module_version": "1.5.0", - "_view_count": null, - "_view_module_version": "1.2.0", - "_model_module": "@jupyter-widgets/controls" - } - }, - "dfe8a5987950418282678160d059a455": { - "model_module": "@jupyter-widgets/base", - "model_name": "LayoutModel", - "model_module_version": "1.2.0", - "state": { - "_view_name": "LayoutView", - "grid_template_rows": null, - "right": null, - "justify_content": null, - "_view_module": "@jupyter-widgets/base", - "overflow": null, - "_model_module_version": "1.2.0", - "_view_count": null, - "flex_flow": null, - "width": null, - "min_width": null, - "border": null, - "align_items": null, - "bottom": null, - "_model_module": "@jupyter-widgets/base", - "top": null, - "grid_column": null, - "overflow_y": null, - "overflow_x": null, - "grid_auto_flow": null, - "grid_area": null, - "grid_template_columns": null, - "flex": null, - "_model_name": "LayoutModel", - "justify_items": null, - "grid_row": null, - "max_height": null, - "align_content": null, - "visibility": null, - "align_self": null, - "height": null, - "min_height": null, - "padding": null, - "grid_auto_rows": null, - "grid_gap": null, - "max_width": null, - "order": null, - "_view_module_version": "1.2.0", - "grid_template_areas": null, - "object_position": null, - "object_fit": null, - "grid_auto_columns": null, - "margin": null, - "display": 
null - } - } - } - } - }, - "cells": [ - { - "cell_type": "markdown", - "source": [ - "# ๐Ÿ‰ Go long! Long range transformers\r\n", - "\r\n", - "Recent research has studied extensively how to improve the calculation of attention in transformer architectures, mostly to increase their capacity to handle longer token sequences. ๐Ÿ‘Š\r\n", - "\r\n", - "The attention calculation is known to be quadratic in computation time with respect to the sequence length ๐Ÿ‘Ž. These recent advances, however, are able to perform the attention calculation in near-linear time with respect to the sequence length. This allows us to scale the transformer architecture such that it can handle input sequences beyond the usual token length of 512 more efficiently.\r\n", - "\r\n", - "In this notebook, we compare traditional transformers with novel efficient transformers. We'll use RoBERTa as a baseline to compare against Longformer and BigBird.\r\n", - "\r\n", - "Let's put these architectures to the test and see which one comes out on top ๐Ÿ†!
\r\n" - ], - "metadata": { - "id": "7XEv8E2SALpN" - } - }, - { - "cell_type": "markdown", - "source": [ - "## ๐Ÿ› ๏ธ Getting started: Install packages & download models\n", - "\n", - "The cells below will set up everything that is required to get started with model training:\n", - "\n", - "* Install the required Python packages\n", - "* Import the required modules" - ], - "metadata": { - "id": "l8ymZTD_BSZS" - } - }, - { - "cell_type": "code", - "execution_count": null, - "source": [ - "!pip install -q scikit-learn transformers datasets torch plotly sentencepiece tqdm" - ], - "outputs": [], - "metadata": { - "id": "eDI558ScBWpZ" - } - }, - { - "cell_type": "code", - "execution_count": null, - "source": [ - "import time\r\n", - "import sys\r\n", - "import json\r\n", - "import shutil\r\n", - "import pandas as pd\r\n", - "from enum import Enum\r\n", - "import math\r\n", - "import torch\r\n", - "\r\n", - "import plotly.express as px\r\n", - "import plotly.graph_objects as go\r\n", - "\r\n", - "\r\n", - "from sklearn.metrics import accuracy_score, precision_recall_fscore_support\r\n", - "from transformers import BigBirdTokenizerFast, BigBirdForSequenceClassification, RobertaTokenizer, RobertaForSequenceClassification, LongformerForSequenceClassification, TrainingArguments, Trainer, LongformerTokenizerFast\r\n", - "from datasets import load_dataset" - ], - "outputs": [], - "metadata": { - "id": "x7x9s2l_VMLV" - } - }, - { - "cell_type": "markdown", - "source": [ - "## ๐Ÿ’พ Dataset & downstream task\n", - "\n", - "We will use the [Hyperpartisan news dataset](https://huggingface.co/datasets/hyperpartisan_news_detection) for binary classification: is a news article hyperpartisan or not? In the papers introducing Longformer and BigBird, both architectures were compared against RoBERTa on this exact dataset.\n", - "\n", - "The articles in this dataset cover a wide range of wordpiece lengths, which is ideal to make our point ๐Ÿ’ช.\n", - "\n", - "We aim to gain more insight into when to use which architecture, so we will go *one step beyond ๐Ÿ”ฅ* and evaluate the architectures on distinct subsets of the data, each time introducing articles with more tokens!"
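To make the quadratic-versus-near-linear claim from the introduction concrete, here is a minimal back-of-the-envelope sketch in plain Python. The window size of 512 is only an illustrative assumption (the notebook later configures Longformer with an attention window of 128); the point is how the two costs scale with sequence length:

```python
# Rough count of attention-score entries per layer (illustration only).
def full_attention_entries(seq_len: int) -> int:
    # Every token attends to every token: O(n^2).
    return seq_len * seq_len

def windowed_attention_entries(seq_len: int, window: int = 512) -> int:
    # Every token attends only to a fixed local window: O(n * w).
    return seq_len * window

for n in (512, 1024, 2048, 4096):
    print(n, full_attention_entries(n), windowed_attention_entries(n))
# At 4096 tokens: ~16.8M entries for full attention vs ~2.1M for a 512-token window.
```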
- ], - "metadata": { - "id": "UOdkK5pXBSZT" - } - }, - { - "cell_type": "code", - "execution_count": null, - "source": [ - "# Load the tokenizer\r\n", - "tokenizer = RobertaTokenizer.from_pretrained('roberta-base')" - ], - "outputs": [], - "metadata": { - "id": "9glZmD-TTLni" - } - }, - { - "cell_type": "code", - "execution_count": null, - "source": [ - "# Load the hyperpartisan dataset\r\n", - "ds = load_dataset('hyperpartisan_news_detection', 'byarticle')['train']\r\n", - "\r\n", - "# Rename the label column for uniformity\r\n", - "ds = ds.rename_column(\"hyperpartisan\", \"label\")\r\n", - "\r\n", - "# Remove unused columns\r\n", - "ds = ds.remove_columns(['title', 'url', 'published_at'])\r\n", - "\r\n", - "# Add token length column to filter on later\r\n", - "ds=ds.add_column(name = 'token_length', column=[len(tokenizer.batch_encode_plus([x['text']]).input_ids[0]) for x in ds])" - ], - "outputs": [], - "metadata": { - "id": "s63EKk7JVdsY" - } - }, - { - "cell_type": "markdown", - "source": [ - "Split into train and test set" - ], - "metadata": { - "id": "Rpj-LxJeb7p8" - } - }, - { - "cell_type": "code", - "execution_count": null, - "source": [ - "split_ds = ds.train_test_split(test_size=0.20)" - ], - "outputs": [], - "metadata": { - "id": "_x2ZmgVEIM8z" - } - }, - { - "cell_type": "markdown", - "source": [ - "Make various train partitions" - ], - "metadata": { - "id": "4PSVpBhGb94w" - } - }, - { - "cell_type": "code", - "execution_count": null, - "source": [ - "train_ds = split_ds['train']\r\n", - "test_ds = split_ds['test']\r\n", - "\r\n", - "train_ds_dict = {}\r\n", - "\r\n", - "for min_tok, max_tok in [(0,256), (0, 512), (0, 1024), (0, 2048), (0, 4096)]:\r\n", - " # Filter on the lengths\r\n", - " train_ds_dict[str(max_tok)] = train_ds.filter(lambda x : x['token_length'] <= max_tok).filter(lambda x : x['token_length'] > min_tok)\r\n", - "\r\n", - " # Select the closest even number\r\n", - " # the longformer optimizer cannot handle a single remaining datapoint at the end of the epoch\r\n", - " train_ds_dict[str(max_tok)] = train_ds_dict[str(max_tok)].select(range(\r\n", - " math.floor(\r\n", - " len(train_ds_dict[str(max_tok)]) / 2.\r\n", - " ) * 2\r\n", - " ))" - ], - "outputs": [], - "metadata": { - "id": "kE7FKkQ_THBR" - } - }, - { - "cell_type": "markdown", - "source": [ - "# ๐Ÿ’ฅ Models\n", - "\n", - "Run the cells below to redo the training of each model\n", - "\n", - "๐Ÿ›Ž๏ธ Disclaimer: this will take some time... So if you're a busy bee or a hurrying hippo, you can skip this section and load in the results from one of our runs smoothly and swiftly ๐Ÿ˜Š!" 
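Before kicking off the (lengthy) training runs, it can help to sanity-check what was just built. A minimal sketch, reusing the `ds` dataset with its freshly added `token_length` column and the `train_ds_dict` partitions from the cells above, to see the wordpiece-length distribution and how many articles land in each bucket:

```python
from statistics import mean

# Wordpiece-length distribution of the full dataset.
lengths = ds['token_length']
print('average wordpiece length:', mean(lengths))
print('longest article (wordpieces):', max(lengths))
print('articles longer than 512 wordpieces:', sum(l > 512 for l in lengths))

# Number of training articles available per maximum token length.
for max_tok, subset in train_ds_dict.items():
    print(f'up to {max_tok} tokens: {len(subset)} articles')
```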
- ], - "metadata": { - "id": "joa1qCMA_KpG" - } - }, - { - "cell_type": "markdown", - "source": [ - "## ๐Ÿ’ช Training" - ], - "metadata": { - "id": "Sju2Yw8n_KpH" - } - }, - { - "cell_type": "code", - "execution_count": null, - "source": [ - "def compute_metrics(pred):\r\n", - " labels = pred.label_ids\r\n", - " preds = pred.predictions.argmax(-1)\r\n", - " precision, recall, f1, _ = precision_recall_fscore_support(labels, preds, average='binary')\r\n", - " acc = accuracy_score(labels, preds)\r\n", - " return {\r\n", - " 'accuracy': acc,\r\n", - " 'f1': f1,\r\n", - " 'precision': precision,\r\n", - " 'recall': recall\r\n", - " }\r\n", - "\r\n", - "def train_and_evaluate(timing_run, model, identifier, tokenizer, train_ds, test_ds, max_length=1024):\r\n", - " def tokenization(batched_text):\r\n", - " return tokenizer(batched_text['text'], padding='max_length', truncation=True, max_length=min(max_length, tokenizer.model_max_length))\r\n", - "\r\n", - " # tokenizing both the training and test dataset \r\n", - " train_ds = train_ds.map(tokenization,\r\n", - " batched=True,\r\n", - " batch_size=len(train_ds),\r\n", - " remove_columns=['text'])\r\n", - "\r\n", - " test_ds = test_ds.map(tokenization,\r\n", - " batched=True,\r\n", - " batch_size=len(test_ds),\r\n", - " remove_columns=['text'])\r\n", - "\r\n", - " train_ds.set_format('torch', columns=['input_ids', 'attention_mask', 'label'])\r\n", - " test_ds.set_format('torch', columns=['input_ids', 'attention_mask', 'label'])\r\n", - "\r\n", - " parameters = model.num_parameters()\r\n", - "\r\n", - " # Set different number of steps for accuracy and timing runs\r\n", - " logging_steps = 1000 if timing_run else 10\r\n", - " eval_steps = 1000 if timing_run else 10\r\n", - " max_steps = 100 if timing_run else 250\r\n", - "\r\n", - " training_args = TrainingArguments(\r\n", - " # Set the batch sizes\r\n", - " per_device_train_batch_size=2,\r\n", - " per_device_eval_batch_size=4,\r\n", - "\r\n", - " # Apply efficiency tricks\r\n", - " gradient_accumulation_steps=8,\r\n", - " fp16=True,\r\n", - " \r\n", - " # steps paramters\r\n", - " evaluation_strategy=\"steps\",\r\n", - " warmup_steps=0,\r\n", - " eval_steps=eval_steps, \r\n", - " max_steps=max_steps,\r\n", - " logging_steps=logging_steps,\r\n", - "\r\n", - " # Optimizer parameters\r\n", - " learning_rate=2e-5,\r\n", - "\r\n", - " # Finalization\r\n", - " load_best_model_at_end=False if timing_run else True,\r\n", - " metric_for_best_model='accuracy', # default value is validation loss, we want the model with the highest accuracy \r\n", - " \r\n", - " # Output locations\r\n", - " output_dir='./{}'.format(identifier),\r\n", - " run_name='{}'.format(identifier),\r\n", - " logging_dir='./{}-logging'.format(identifier),\r\n", - " log_level='warning',\r\n", - " save_strategy=\"steps\"\r\n", - " )\r\n", - "\r\n", - " trainer = Trainer(\r\n", - " model=model,\r\n", - " args=training_args,\r\n", - " compute_metrics=compute_metrics,\r\n", - " train_dataset=train_ds,\r\n", - " eval_dataset=test_ds,\r\n", - " )\r\n", - " start_time = time.time()\r\n", - " trainer.train()\r\n", - " duration = time.time() - start_time\r\n", - "\r\n", - " shutil.rmtree('./{}'.format(identifier))\r\n", - "\r\n", - " metrics = None\r\n", - " if not timing_run:\r\n", - " metrics = trainer.evaluate()\r\n", - "\r\n", - " return trainer, parameters, metrics, duration" - ], - "outputs": [], - "metadata": { - "id": "PJ6mWoPjVstv" - } - }, - { - "cell_type": "code", - "execution_count": null, - "source": [ - "def run_training(timing_run, 
model, tokenizer, model_name, train_ds, test_ds, max_length=1024):\r\n", - " torch.cuda.empty_cache()\r\n", - "\r\n", - " try:\r\n", - " identifier = '{}-{}-{}'.format(model_name, max_length, 'timing' if timing_run else 'accuracy').lower()\r\n", - " trainer, parameters, metrics, duration = train_and_evaluate(timing_run, model, identifier, tokenizer, train_ds, test_ds, max_length)\r\n", - "\r\n", - " results_and_metrics = {\r\n", - " \"model_name\": model_name,\r\n", - " \"parameters\": parameters,\r\n", - " \"duration\": duration,\r\n", - " \"metrics\": metrics,\r\n", - " 'tokencount': str(max_length),\r\n", - " 'timing_run': timing_run\r\n", - " }\r\n", - " with open(f'{identifier}-results.json', \"w\") as fd:\r\n", - " json.dump(results_and_metrics, fd)\r\n", - "\r\n", - " except RuntimeError:\r\n", - " del trainer \r\n", - "\r\n", - " del model \r\n" - ], - "outputs": [], - "metadata": { - "id": "88g5S6z7BSZX" - } - }, - { - "cell_type": "markdown", - "source": [ - "### ๐ŸŽฏ Accuracy runs" - ], - "metadata": { - "id": "7InJLnvld8zu" - } - }, - { - "cell_type": "code", - "execution_count": null, - "source": [ - "timing = False\r\n", - "gradient_checkpointing = True\r\n", - "\r\n", - "for size in [256, 512, 1024, 2048, 4096]:\r\n", - "\r\n", - " for model_name, tokenizer, model in [\r\n", - " (\r\n", - " 'roberta',\r\n", - " RobertaTokenizer.from_pretrained('roberta-base'),\r\n", - " RobertaForSequenceClassification.from_pretrained('roberta-base',\r\n", - " gradient_checkpointing=gradient_checkpointing,\r\n", - " num_labels=2)\r\n", - " ),\r\n", - " (\r\n", - " 'longformer',\r\n", - " LongformerTokenizerFast.from_pretrained('allenai/longformer-base-4096'),\r\n", - " LongformerForSequenceClassification.from_pretrained('allenai/longformer-base-4096',\r\n", - " gradient_checkpointing=gradient_checkpointing,\r\n", - " attention_window=128,\r\n", - " num_labels=2)\r\n", - " ),\r\n", - " (\r\n", - " 'bigbird',\r\n", - " BigBirdTokenizerFast.from_pretrained('google/bigbird-roberta-base'),\r\n", - " BigBirdForSequenceClassification.from_pretrained('google/bigbird-roberta-base',\r\n", - " gradient_checkpointing=gradient_checkpointing,\r\n", - " num_labels=2)\r\n", - " )\r\n", - " ]:\r\n", - "\r\n", - " print(f\"Training {model_name} on size {size}\")\r\n", - "\r\n", - " run_training(timing, model, tokenizer, model_name, train_ds_dict[str(size)], test_ds, max_length=size)" - ], - "outputs": [ - { - "output_type": "display_data", - "data": { - "text/plain": [ - "Downloading: 0%| | 0.00/501M [00:00" - ], - "text/html": [ - "\n", - "
[250/250 07:20, Epoch 83/84] (roberta, max 256 tokens). Per-step evaluation log of Training Loss, Validation Loss, Accuracy, F1, Precision and Recall; final step 250: training loss 0.000300, validation loss 2.551310, accuracy 0.697674, F1 0.360656, precision 1.000000, recall 0.220000.
" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

[33/33 00:01]
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training longformer on size 256\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
[250/250 10:53, Epoch 83/84] (longformer, max 256 tokens). Per-step evaluation log of Training Loss, Validation Loss, Accuracy, F1, Precision and Recall; final step 250: training loss 0.000400, validation loss 2.519628, accuracy 0.697674, F1 0.417910, precision 0.823529, recall 0.280000.
" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

[33/33 00:02]
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training bigbird on size 256\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
[250/250 07:28, Epoch 83/84] (bigbird, max 256 tokens). Per-step evaluation log of Training Loss, Validation Loss, Accuracy, F1, Precision and Recall; final step 250: training loss 0.002100, validation loss 1.831296, accuracy 0.713178, F1 0.478873, precision 0.809524, recall 0.340000.
" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

[33/33 00:01]
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "Some weights of the model checkpoint at roberta-base were not used when initializing RobertaForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'roberta.pooler.dense.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'roberta.pooler.dense.weight', 'lm_head.bias']\n", - "- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", - "Some weights of the model checkpoint at allenai/longformer-base-4096 were not used when initializing LongformerForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.bias']\n", - "- This IS expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of LongformerForSequenceClassification were not initialized from the model checkpoint at allenai/longformer-base-4096 and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", - "Some weights of the model checkpoint at google/bigbird-roberta-base were not used when initializing BigBirdForSequenceClassification: ['cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.weight']\n", - "- This IS expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of BigBirdForSequenceClassification were not initialized from the model checkpoint at google/bigbird-roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.out_proj.weight', 'classifier.dense.bias', 'classifier.dense.weight']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n" - ] - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training roberta on size 512\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
[250/250 09:46, Epoch 27/28] (roberta, max 512 tokens). Per-step evaluation log of Training Loss, Validation Loss, Accuracy, F1, Precision and Recall; final step 250: training loss 0.000500, validation loss 1.295335, accuracy 0.813953, F1 0.700000, precision 0.933333, recall 0.560000.
" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

[33/33 00:02]
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training longformer on size 512\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
[250/250 14:49, Epoch 27/28] (longformer, max 512 tokens). Per-step evaluation log of Training Loss, Validation Loss, Accuracy, F1, Precision and Recall; final step 250: training loss 0.000500, validation loss 2.331703, accuracy 0.705426, F1 0.441176, precision 0.833333, recall 0.300000.
" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:03]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training bigbird on size 512\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
\n", - " \n", - " \n", - " [250/250 10:31, Epoch 27/28]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
| 10 | 0.555500 | 0.847979 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.485800 | 0.772949 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.515900 | 0.850025 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.495900 | 0.802247 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.386100 | 0.898066 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.426300 | 0.720003 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 70 | 0.408200 | 0.764110 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 80 | 0.301800 | 0.750921 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 90 | 0.238100 | 0.680616 | 0.627907 | 0.076923 | 1.000000 | 0.040000 |
| 100 | 0.215200 | 0.827965 | 0.651163 | 0.181818 | 1.000000 | 0.100000 |
| 110 | 0.149700 | 0.668503 | 0.767442 | 0.571429 | 1.000000 | 0.400000 |
| 120 | 0.099500 | 0.721288 | 0.790698 | 0.640000 | 0.960000 | 0.480000 |
| 130 | 0.042800 | 0.935791 | 0.744186 | 0.521739 | 0.947368 | 0.360000 |
| 140 | 0.034700 | 0.683401 | 0.837209 | 0.746988 | 0.939394 | 0.620000 |
| 150 | 0.022300 | 1.278935 | 0.705426 | 0.387097 | 1.000000 | 0.240000 |
| 160 | 0.012700 | 1.023040 | 0.775194 | 0.602740 | 0.956522 | 0.440000 |
| 170 | 0.012600 | 0.900957 | 0.821705 | 0.716049 | 0.935484 | 0.580000 |
| 180 | 0.009200 | 1.090641 | 0.775194 | 0.602740 | 0.956522 | 0.440000 |
| 190 | 0.009900 | 1.143551 | 0.759690 | 0.550725 | 1.000000 | 0.380000 |
| 200 | 0.007900 | 1.039337 | 0.806202 | 0.683544 | 0.931034 | 0.540000 |
| 210 | 0.005900 | 1.041801 | 0.806202 | 0.683544 | 0.931034 | 0.540000 |
| 220 | 0.005500 | 1.037000 | 0.806202 | 0.683544 | 0.931034 | 0.540000 |
| 230 | 0.006100 | 1.080174 | 0.806202 | 0.683544 | 0.931034 | 0.540000 |
| 240 | 0.004800 | 1.146307 | 0.790698 | 0.640000 | 0.960000 | 0.480000 |
| 250 | 0.005400 | 1.153242 | 0.790698 | 0.640000 | 0.960000 | 0.480000 |
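
Every block in this log follows the same pattern: a `Training <model> on size <n>` line, a progress bar that stops at 250 optimizer steps, and an evaluation row every 10 steps. Below is a rough, hedged sketch of the kind of loop that produces such output; the checkpoint names are the ones named in the warnings, while the toy dataset, label count, output paths and remaining hyperparameters are illustrative assumptions rather than the notebook's actual setup.

```python
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

CHECKPOINTS = {
    "roberta": "roberta-base",
    "longformer": "allenai/longformer-base-4096",
    "bigbird": "google/bigbird-roberta-base",
}

# Placeholder documents; the notebook uses a real long-document dataset instead.
raw = Dataset.from_dict({
    "text": ["a very long document ...", "another long document ..."] * 8,
    "label": [0, 1] * 8,
})

for size in (512, 1024, 2048):
    for name, checkpoint in CHECKPOINTS.items():
        print(f"Training {name} on size {size}")
        tokenizer = AutoTokenizer.from_pretrained(checkpoint)
        model = AutoModelForSequenceClassification.from_pretrained(
            checkpoint, num_labels=2
        )

        # Truncate/pad to the sequence length under test, capped at the model's
        # own maximum (roberta-base cannot go beyond 512 position embeddings).
        max_len = min(size, tokenizer.model_max_length)
        encoded = raw.map(
            lambda batch: tokenizer(batch["text"], truncation=True,
                                    padding="max_length", max_length=max_len),
            batched=True,
        )
        split = encoded.train_test_split(test_size=0.25)

        args = TrainingArguments(
            output_dir=f"runs/{name}_{size}",
            max_steps=250,                  # the runs above all stop at step 250
            evaluation_strategy="steps",
            eval_steps=10,                  # one evaluation table row every 10 steps
            logging_steps=10,
            report_to="none",
        )
        trainer = Trainer(
            model=model,
            args=args,
            tokenizer=tokenizer,
            train_dataset=split["train"],
            eval_dataset=split["test"],
            compute_metrics=compute_metrics,  # e.g. the sketch shown earlier
        )
        trainer.train()
```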

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:02]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "Some weights of the model checkpoint at roberta-base were not used when initializing RobertaForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'roberta.pooler.dense.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'roberta.pooler.dense.weight', 'lm_head.bias']\n", - "- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", - "Some weights of the model checkpoint at allenai/longformer-base-4096 were not used when initializing LongformerForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.bias']\n", - "- This IS expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of LongformerForSequenceClassification were not initialized from the model checkpoint at allenai/longformer-base-4096 and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", - "Some weights of the model checkpoint at google/bigbird-roberta-base were not used when initializing BigBirdForSequenceClassification: ['cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.weight']\n", - "- This IS expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of BigBirdForSequenceClassification were not initialized from the model checkpoint at google/bigbird-roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.out_proj.weight', 'classifier.dense.bias', 'classifier.dense.weight']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n" - ] - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training roberta on size 1024\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
\n", - " \n", - " \n", - " [250/250 09:29, Epoch 13/14]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
| 10 | 0.586200 | 0.831490 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.557100 | 0.671752 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.520500 | 0.813238 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.567600 | 0.658352 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.438000 | 0.797536 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.566600 | 0.618376 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 70 | 0.461400 | 0.594566 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 80 | 0.460100 | 0.599614 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 90 | 0.414800 | 0.759560 | 0.705426 | 0.387097 | 1.000000 | 0.240000 |
| 100 | 0.387200 | 0.474885 | 0.806202 | 0.698795 | 0.878788 | 0.580000 |
| 110 | 0.251400 | 0.587111 | 0.806202 | 0.675325 | 0.962963 | 0.520000 |
| 120 | 0.302700 | 0.445110 | 0.837209 | 0.752941 | 0.914286 | 0.640000 |
| 130 | 0.167300 | 0.655212 | 0.806202 | 0.666667 | 1.000000 | 0.500000 |
| 140 | 0.182000 | 0.453100 | 0.821705 | 0.752688 | 0.813953 | 0.700000 |
| 150 | 0.098800 | 0.544854 | 0.852713 | 0.771084 | 0.969697 | 0.640000 |
| 160 | 0.068800 | 0.600777 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 170 | 0.063700 | 0.724708 | 0.852713 | 0.771084 | 0.969697 | 0.640000 |
| 180 | 0.065700 | 0.704468 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 190 | 0.004800 | 0.851403 | 0.844961 | 0.756098 | 0.968750 | 0.620000 |
| 200 | 0.035200 | 0.800553 | 0.852713 | 0.776471 | 0.942857 | 0.660000 |
| 210 | 0.023600 | 0.784901 | 0.860465 | 0.795455 | 0.921053 | 0.700000 |
| 220 | 0.036300 | 0.906954 | 0.829457 | 0.731707 | 0.937500 | 0.600000 |
| 230 | 0.011200 | 0.931151 | 0.829457 | 0.731707 | 0.937500 | 0.600000 |
| 240 | 0.012600 | 0.837081 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 250 | 0.005700 | 0.820278 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. 
At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:02]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training longformer on size 1024\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
\n", - " \n", - " \n", - " [222/250 19:40 < 02:30, 0.19 it/s, Epoch 11.60/14]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
100.5801000.8557980.6124030.0000000.0000000.000000
200.5486000.7344800.6124030.0000000.0000000.000000
300.4956000.8862200.6124030.0000000.0000000.000000
400.5132000.6673670.6124030.0000000.0000000.000000
500.3483000.8080920.6124030.0000000.0000000.000000
600.4464000.6704740.6279070.0769231.0000000.040000
700.4044000.6904570.7209300.4375001.0000000.280000
800.3574000.8125860.7519380.5294121.0000000.360000
900.2684000.5704040.7674420.6739130.7380950.620000
1000.2268000.8022430.7441860.5714290.8148150.440000
1100.1566000.9707190.7441860.5479450.8695650.400000
1200.2071000.7912540.7596900.6172840.8064520.500000
1300.0983001.0002490.7364340.5526320.8076920.420000
1400.0648001.1780580.7596900.5974030.8518520.460000
1500.1233001.1176090.7829460.6410260.8928570.500000
1600.0395001.1939210.7751940.6027400.9565220.440000
1700.1363001.7650000.7131780.4126981.0000000.260000
1800.0405001.0535010.7674420.6341460.8125000.520000
1900.0600001.2812060.7829460.6216220.9583330.460000
2000.0191001.2842090.7829460.6315790.9230770.480000
2100.0495001.2976020.7751940.6233770.8888890.480000
2200.0027001.2993790.7674420.6153850.8571430.480000

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [250/250 22:29, Epoch 13/14]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
| 10 | 0.580100 | 0.855798 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.548600 | 0.734480 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.495600 | 0.886220 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.513200 | 0.667367 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.348300 | 0.808092 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.446400 | 0.670474 | 0.627907 | 0.076923 | 1.000000 | 0.040000 |
| 70 | 0.404400 | 0.690457 | 0.720930 | 0.437500 | 1.000000 | 0.280000 |
| 80 | 0.357400 | 0.812586 | 0.751938 | 0.529412 | 1.000000 | 0.360000 |
| 90 | 0.268400 | 0.570404 | 0.767442 | 0.673913 | 0.738095 | 0.620000 |
| 100 | 0.226800 | 0.802243 | 0.744186 | 0.571429 | 0.814815 | 0.440000 |
| 110 | 0.156600 | 0.970719 | 0.744186 | 0.547945 | 0.869565 | 0.400000 |
| 120 | 0.207100 | 0.791254 | 0.759690 | 0.617284 | 0.806452 | 0.500000 |
| 130 | 0.098300 | 1.000249 | 0.736434 | 0.552632 | 0.807692 | 0.420000 |
| 140 | 0.064800 | 1.178058 | 0.759690 | 0.597403 | 0.851852 | 0.460000 |
| 150 | 0.123300 | 1.117609 | 0.782946 | 0.641026 | 0.892857 | 0.500000 |
| 160 | 0.039500 | 1.193921 | 0.775194 | 0.602740 | 0.956522 | 0.440000 |
| 170 | 0.136300 | 1.765000 | 0.713178 | 0.412698 | 1.000000 | 0.260000 |
| 180 | 0.040500 | 1.053501 | 0.767442 | 0.634146 | 0.812500 | 0.520000 |
| 190 | 0.060000 | 1.281206 | 0.782946 | 0.621622 | 0.958333 | 0.460000 |
| 200 | 0.019100 | 1.284209 | 0.782946 | 0.631579 | 0.923077 | 0.480000 |
| 210 | 0.049500 | 1.297602 | 0.775194 | 0.623377 | 0.888889 | 0.480000 |
| 220 | 0.002700 | 1.299379 | 0.767442 | 0.615385 | 0.857143 | 0.480000 |
| 230 | 0.042000 | 1.231217 | 0.751938 | 0.609756 | 0.781250 | 0.500000 |
| 240 | 0.002500 | 1.366692 | 0.775194 | 0.613333 | 0.920000 | 0.460000 |
| 250 | 0.031400 | 1.350683 | 0.775194 | 0.613333 | 0.920000 | 0.460000 |

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:06]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training bigbird on size 1024\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
\n", - " \n", - " \n", - " [250/250 25:42, Epoch 13/14]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
| 10 | 0.577700 | 0.759369 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.599500 | 0.796641 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.541600 | 0.799303 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.571800 | 0.682290 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.403100 | 0.706493 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.489400 | 0.694978 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 70 | 0.506600 | 0.620515 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 80 | 0.467600 | 0.515016 | 0.806202 | 0.712644 | 0.837838 | 0.620000 |
| 90 | 0.390600 | 0.600830 | 0.775194 | 0.591549 | 1.000000 | 0.420000 |
| 100 | 0.352500 | 0.485372 | 0.806202 | 0.712644 | 0.837838 | 0.620000 |
| 110 | 0.317900 | 0.488410 | 0.821705 | 0.729412 | 0.885714 | 0.620000 |
| 120 | 0.309100 | 0.501511 | 0.821705 | 0.716049 | 0.935484 | 0.580000 |
| 130 | 0.180100 | 0.469360 | 0.829457 | 0.770833 | 0.804348 | 0.740000 |
| 140 | 0.145100 | 0.501140 | 0.829457 | 0.750000 | 0.868421 | 0.660000 |
| 150 | 0.163900 | 0.511560 | 0.844961 | 0.772727 | 0.894737 | 0.680000 |
| 160 | 0.082100 | 0.506037 | 0.852713 | 0.776471 | 0.942857 | 0.660000 |
| 170 | 0.132500 | 0.506590 | 0.852713 | 0.795699 | 0.860465 | 0.740000 |
| 180 | 0.095700 | 0.506332 | 0.844961 | 0.772727 | 0.894737 | 0.680000 |
| 190 | 0.048400 | 0.528276 | 0.844961 | 0.767442 | 0.916667 | 0.660000 |
| 200 | 0.051900 | 0.560470 | 0.844961 | 0.767442 | 0.916667 | 0.660000 |
| 210 | 0.067500 | 0.560767 | 0.852713 | 0.795699 | 0.860465 | 0.740000 |
| 220 | 0.041700 | 0.568242 | 0.852713 | 0.795699 | 0.860465 | 0.740000 |
| 230 | 0.040000 | 0.573302 | 0.852713 | 0.791209 | 0.878049 | 0.720000 |
| 240 | 0.039200 | 0.597674 | 0.844961 | 0.767442 | 0.916667 | 0.660000 |
| 250 | 0.031600 | 0.594771 | 0.844961 | 0.767442 | 0.916667 | 0.660000 |

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:07]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "Some weights of the model checkpoint at roberta-base were not used when initializing RobertaForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'roberta.pooler.dense.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'roberta.pooler.dense.weight', 'lm_head.bias']\n", - "- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", - "Some weights of the model checkpoint at allenai/longformer-base-4096 were not used when initializing LongformerForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.bias']\n", - "- This IS expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of LongformerForSequenceClassification were not initialized from the model checkpoint at allenai/longformer-base-4096 and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", - "Some weights of the model checkpoint at google/bigbird-roberta-base were not used when initializing BigBirdForSequenceClassification: ['cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.weight']\n", - "- This IS expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of BigBirdForSequenceClassification were not initialized from the model checkpoint at google/bigbird-roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.out_proj.weight', 'classifier.dense.bias', 'classifier.dense.weight']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n" - ] - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training roberta on size 2048\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
\n", - " \n", - " \n", - " [250/250 09:19, Epoch 8/9]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
| 10 | 0.674500 | 0.657507 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.630500 | 0.707385 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.673700 | 0.643604 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.608800 | 0.624107 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.584900 | 0.584006 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.494400 | 0.702376 | 0.666667 | 0.245614 | 1.000000 | 0.140000 |
| 70 | 0.503100 | 0.539003 | 0.782946 | 0.688889 | 0.775000 | 0.620000 |
| 80 | 0.467400 | 0.502834 | 0.782946 | 0.688889 | 0.775000 | 0.620000 |
| 90 | 0.430400 | 0.551373 | 0.821705 | 0.735632 | 0.864865 | 0.640000 |
| 100 | 0.388800 | 0.484905 | 0.782946 | 0.702128 | 0.750000 | 0.660000 |
| 110 | 0.355400 | 0.483531 | 0.790698 | 0.703297 | 0.780488 | 0.640000 |
| 120 | 0.322900 | 0.582527 | 0.829457 | 0.738095 | 0.911765 | 0.620000 |
| 130 | 0.240700 | 0.496317 | 0.798450 | 0.740000 | 0.740000 | 0.740000 |
| 140 | 0.170600 | 0.608089 | 0.829457 | 0.731707 | 0.937500 | 0.600000 |
| 150 | 0.194900 | 0.484311 | 0.837209 | 0.778947 | 0.822222 | 0.740000 |
| 160 | 0.167000 | 0.567786 | 0.852713 | 0.781609 | 0.918919 | 0.680000 |
| 170 | 0.116400 | 0.599604 | 0.837209 | 0.774194 | 0.837209 | 0.720000 |
| 180 | 0.119500 | 0.652496 | 0.837209 | 0.774194 | 0.837209 | 0.720000 |
| 190 | 0.135800 | 0.658174 | 0.837209 | 0.774194 | 0.837209 | 0.720000 |
| 200 | 0.076400 | 0.743575 | 0.829457 | 0.738095 | 0.911765 | 0.620000 |
| 210 | 0.039400 | 0.811144 | 0.829457 | 0.738095 | 0.911765 | 0.620000 |
| 220 | 0.083600 | 0.796613 | 0.821705 | 0.735632 | 0.864865 | 0.640000 |
| 230 | 0.054500 | 0.820997 | 0.829457 | 0.750000 | 0.868421 | 0.660000 |
| 240 | 0.054600 | 0.835619 | 0.837209 | 0.764045 | 0.871795 | 0.680000 |
| 250 | 0.070400 | 0.846155 | 0.837209 | 0.758621 | 0.891892 | 0.660000 |

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:02]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training longformer on size 2048\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
\n", - " \n", - " \n", - " [250/250 37:03, Epoch 8/9]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
| 10 | 0.687700 | 0.661641 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.616300 | 0.671856 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.669000 | 0.633520 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.591900 | 0.601620 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.517300 | 0.493386 | 0.775194 | 0.666667 | 0.783784 | 0.580000 |
| 60 | 0.463000 | 0.732943 | 0.728682 | 0.461538 | 1.000000 | 0.300000 |
| 70 | 0.387500 | 0.438826 | 0.821705 | 0.757895 | 0.800000 | 0.720000 |
| 80 | 0.383300 | 0.453931 | 0.775194 | 0.723810 | 0.690909 | 0.760000 |
| 90 | 0.337400 | 0.414173 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 100 | 0.262800 | 0.406023 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 110 | 0.285200 | 0.423326 | 0.844961 | 0.803922 | 0.788462 | 0.820000 |
| 120 | 0.256800 | 0.433759 | 0.875969 | 0.813953 | 0.972222 | 0.700000 |
| 130 | 0.332500 | 0.474997 | 0.821705 | 0.776699 | 0.754717 | 0.800000 |
| 140 | 0.223500 | 0.459550 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 150 | 0.141600 | 0.477792 | 0.852713 | 0.804124 | 0.829787 | 0.780000 |
| 160 | 0.141600 | 0.547014 | 0.860465 | 0.790698 | 0.944444 | 0.680000 |
| 170 | 0.167800 | 0.580668 | 0.844961 | 0.791667 | 0.826087 | 0.760000 |
| 180 | 0.105100 | 0.609861 | 0.875969 | 0.822222 | 0.925000 | 0.740000 |
| 190 | 0.153400 | 0.623364 | 0.844961 | 0.795918 | 0.812500 | 0.780000 |
| 200 | 0.029300 | 0.668298 | 0.860465 | 0.795455 | 0.921053 | 0.700000 |
| 210 | 0.029900 | 0.690829 | 0.868217 | 0.813187 | 0.902439 | 0.740000 |
| 220 | 0.077000 | 0.779887 | 0.829457 | 0.784314 | 0.769231 | 0.800000 |
| 230 | 0.057700 | 0.727203 | 0.860465 | 0.800000 | 0.900000 | 0.720000 |
| 240 | 0.085100 | 0.723960 | 0.875969 | 0.826087 | 0.904762 | 0.760000 |
| 250 | 0.021400 | 0.721029 | 0.875969 | 0.826087 | 0.904762 | 0.760000 |

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:11]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training bigbird on size 2048\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
\n", - " \n", - " \n", - " [250/250 47:01, Epoch 8/9]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
| 10 | 0.677300 | 0.654226 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.588400 | 0.610554 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.595100 | 0.520661 | 0.798450 | 0.682927 | 0.875000 | 0.560000 |
| 40 | 0.436000 | 0.440862 | 0.829457 | 0.792453 | 0.750000 | 0.840000 |
| 50 | 0.410100 | 0.373192 | 0.860465 | 0.812500 | 0.847826 | 0.780000 |
| 60 | 0.365100 | 0.421595 | 0.837209 | 0.746988 | 0.939394 | 0.620000 |
| 70 | 0.290800 | 0.362163 | 0.868217 | 0.828283 | 0.836735 | 0.820000 |
| 80 | 0.290700 | 0.359947 | 0.868217 | 0.824742 | 0.851064 | 0.800000 |
| 90 | 0.196000 | 0.357636 | 0.891473 | 0.851064 | 0.909091 | 0.800000 |
| 100 | 0.133700 | 0.395859 | 0.875969 | 0.818182 | 0.947368 | 0.720000 |
| 110 | 0.204900 | 0.402724 | 0.875969 | 0.818182 | 0.947368 | 0.720000 |
| 120 | 0.147100 | 0.410757 | 0.891473 | 0.840909 | 0.973684 | 0.740000 |
| 130 | 0.114600 | 0.477982 | 0.875969 | 0.813953 | 0.972222 | 0.700000 |
| 140 | 0.046500 | 0.456194 | 0.868217 | 0.821053 | 0.866667 | 0.780000 |
| 150 | 0.108200 | 0.446461 | 0.899225 | 0.853933 | 0.974359 | 0.760000 |
| 160 | 0.026500 | 0.458264 | 0.891473 | 0.844444 | 0.950000 | 0.760000 |
| 170 | 0.122800 | 0.502263 | 0.868217 | 0.824742 | 0.851064 | 0.800000 |
| 180 | 0.069800 | 0.493310 | 0.883721 | 0.838710 | 0.906977 | 0.780000 |
| 190 | 0.090500 | 0.493213 | 0.883721 | 0.838710 | 0.906977 | 0.780000 |
| 200 | 0.009200 | 0.465422 | 0.906977 | 0.866667 | 0.975000 | 0.780000 |
| 210 | 0.024900 | 0.509755 | 0.883721 | 0.838710 | 0.906977 | 0.780000 |
| 220 | 0.068400 | 0.551333 | 0.875969 | 0.836735 | 0.854167 | 0.820000 |
| 230 | 0.044000 | 0.465704 | 0.906977 | 0.866667 | 0.975000 | 0.780000 |
| 240 | 0.081500 | 0.474401 | 0.906977 | 0.866667 | 0.975000 | 0.780000 |
| 250 | 0.007500 | 0.470865 | 0.906977 | 0.866667 | 0.975000 | 0.780000 |

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:15]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "Some weights of the model checkpoint at roberta-base were not used when initializing RobertaForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'roberta.pooler.dense.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'roberta.pooler.dense.weight', 'lm_head.bias']\n", - "- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", - "Some weights of the model checkpoint at allenai/longformer-base-4096 were not used when initializing LongformerForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.bias']\n", - "- This IS expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of LongformerForSequenceClassification were not initialized from the model checkpoint at allenai/longformer-base-4096 and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", - "Some weights of the model checkpoint at google/bigbird-roberta-base were not used when initializing BigBirdForSequenceClassification: ['cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.weight']\n", - "- This IS expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", - "- This IS NOT expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", - "Some weights of BigBirdForSequenceClassification were not initialized from the model checkpoint at google/bigbird-roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.out_proj.weight', 'classifier.dense.bias', 'classifier.dense.weight']\n", - "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n" - ] - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training roberta on size 4096\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
\n", - " \n", - " \n", - " [250/250 09:22, Epoch 8/9]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
100.6635000.6588280.6124030.0000000.0000000.000000
200.7123000.6524970.6124030.0000000.0000000.000000
300.6483000.6384170.6124030.0000000.0000000.000000
400.6462000.6032880.6124030.0000000.0000000.000000
500.6309000.5552630.7441860.5925930.7741940.480000
600.5424000.5814190.7519380.5428570.9500000.380000
700.5093000.5027400.7906980.7032970.7804880.640000
800.4019000.5551120.8062020.7058820.8571430.600000
900.3354000.4721360.7906980.7096770.7674420.660000
1000.3271000.4580290.8139530.7272730.8421050.640000
1100.2899000.4906750.8294570.7380950.9117650.620000
1200.2446000.5332360.7984500.6750000.9000000.540000
1300.2438000.4955920.8449610.7727270.8947370.680000
1400.4435000.5853940.7984500.6486491.0000000.480000
1500.2112000.5051090.8294570.7380950.9117650.620000
1600.1506000.5536820.8527130.7865170.8974360.700000
1700.1176000.6315130.8139530.7209300.8611110.620000
1800.1583000.6734380.8294570.7800000.7800000.780000
1900.3698000.6275180.8294570.7500000.8684210.660000
2000.1237000.6486760.8217050.7415730.8461540.660000
2100.1373000.7427670.8139530.7000000.9333330.560000
2200.1648000.6804870.8372090.7469880.9393940.620000
2300.0893000.7018560.8372090.7469880.9393940.620000
2400.1636000.6322020.8372090.7586210.8918920.660000
2500.0445000.6228560.8217050.7415730.8461540.660000

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:02]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training longformer on size 4096\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
\n", - " \n", - " \n", - " [250/250 1:07:55, Epoch 8/9]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
100.6627000.6625020.6124030.0000000.0000000.000000
200.7011000.6349400.6124030.0000000.0000000.000000
300.6244000.6928170.6124030.0000000.0000000.000000
400.6195000.5712760.7364340.6792450.6428570.720000
500.5235000.5326420.7674420.6341460.8125000.520000
600.4123000.5375400.7829460.6818180.7894740.600000
700.4279000.4852780.7751940.7238100.6909090.760000
800.3466000.5216630.7906980.6966290.7948720.620000
900.2342000.4585160.7984500.7450980.7307690.760000
1000.2691000.4783830.8294570.7608700.8333330.700000
1100.2478000.5600490.7984500.6829270.8750000.560000
1200.2523000.4638210.8217050.7472530.8292680.680000
1300.1952000.4824560.8294570.7608700.8333330.700000
1400.1931000.6225490.8139530.7209300.8611110.620000
1500.1286000.6279150.8372090.7640450.8717950.680000
1600.1453000.6785510.8372090.7640450.8717950.680000
1700.0657000.8652290.8139530.7142860.8823530.600000
1800.1158000.7094490.8527130.7956990.8604650.740000
1900.1314000.8761560.8294570.7441860.8888890.640000
2000.0234000.7345280.8372090.7789470.8222220.740000
2100.1692000.7785480.8372090.7692310.8536590.700000
2200.0768000.8179040.8294570.7555560.8500000.680000
2300.0509000.8291820.8372090.7640450.8717950.680000
2400.0355000.8332720.8372090.7640450.8717950.680000
2500.0284000.8304700.8449610.7777780.8750000.700000

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:22]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stdout", - "text": [ - "Training bigbird on size 4096\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - " 0%| | 0/1 [00:00" - ], - "text/html": [ - "\n", - "
\n", - " \n", - " \n", - " [250/250 1:29:59, Epoch 8/9]\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
100.6333000.6639030.6124030.0000000.0000000.000000
200.6412000.6060360.6124030.0000000.0000000.000000
300.5808000.6284150.6124030.0000000.0000000.000000
400.5534000.4947540.8449610.7727270.8947370.680000
500.4913000.4559930.8294570.7800000.7800000.780000
600.3898000.4539850.8372090.7407410.9677420.600000
700.3491000.3884940.8217050.7472530.8292680.680000
800.3235000.4043260.8217050.7889910.7288140.860000
900.2506000.3785070.8372090.7789470.8222220.740000
1000.2102000.3921780.8604650.8085110.8636360.760000
1100.2176000.3936500.8682170.8089890.9230770.720000
1200.1668000.3905460.8759690.8181820.9473680.720000
1300.1817000.3478400.8914730.8510640.9090910.800000
1400.1764000.3637090.8837210.8421050.8888890.800000
1500.1241000.3983420.8837210.8314610.9487180.740000
1600.0962000.3509450.8914730.8600000.8600000.860000
1700.0585000.4323160.8682170.8172040.8837210.760000
1800.0896000.4170360.8759690.8333330.8695650.800000
1900.0719000.4281600.8837210.8387100.9069770.780000
2000.0660000.4212070.8837210.8453610.8723400.820000
2100.0583000.4217710.8759690.8333330.8695650.800000
2200.0522000.4493450.8837210.8387100.9069770.780000
2300.0604000.4551530.8759690.8333330.8695650.800000
2400.0123000.4629740.8759690.8333330.8695650.800000
2500.0142000.4644450.8759690.8333330.8695650.800000

" - ] - }, - "metadata": { - "tags": [] - } - }, - { - "output_type": "stream", - "name": "stderr", - "text": [ - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", - "\n", - "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n", - "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", - "\n", - "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", - "\n" - ] - }, - { - "output_type": "display_data", - "data": { - "text/plain": [ - "" - ], - "text/html": [ - "\n", - "

\n", - " \n", - " \n", - " [33/33 00:28]\n", - "
\n", - " " - ] - }, - "metadata": { - "tags": [] - } - } - ], + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "9glZmD-TTLni" + }, + "outputs": [], + "source": [ + "# Load the tokenizer\n", + "tokenizer = RobertaTokenizer.from_pretrained('roberta-base')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "s63EKk7JVdsY" + }, + "outputs": [], + "source": [ + "# Load the hyperpartisan dataset\n", + "ds = load_dataset('hyperpartisan_news_detection', 'byarticle')['train']\n", + "\n", + "# Rename the label column for uniformity\n", + "ds = ds.rename_column(\"hyperpartisan\", \"label\")\n", + "\n", + "# Remove unused columns\n", + "ds = ds.remove_columns(['title', 'url', 'published_at'])\n", + "\n", + "# Add token length column to filter on later\n", + "ds=ds.add_column(name = 'token_length', column=[len(tokenizer.batch_encode_plus([x['text']]).input_ids[0]) for x in ds])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Rpj-LxJeb7p8" + }, + "source": [ + "Split into train and test set" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "_x2ZmgVEIM8z" + }, + "outputs": [], + "source": [ + "split_ds = ds.train_test_split(test_size=0.20)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "4PSVpBhGb94w" + }, + "source": [ + "Make various train partitions" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "kE7FKkQ_THBR" + }, + "outputs": [], + "source": [ + "train_ds = split_ds['train']\n", + "test_ds = split_ds['test']\n", + "\n", + "train_ds_dict = {}\n", + "\n", + "for min_tok, max_tok in [(0,256), (0, 512), (0, 1024), (0, 2048), (0, 4096)]:\n", + " # Filter on the lengths\n", + " train_ds_dict[str(max_tok)] = train_ds.filter(lambda x : x['token_length'] <= max_tok).filter(lambda x : x['token_length'] > min_tok)\n", + "\n", + " # Select the closest even number\n", + " # the longformer optimizer cannot handle a single remaining datapoint at the end of the epoch\n", + " train_ds_dict[str(max_tok)] = train_ds_dict[str(max_tok)].select(range(\n", + " math.floor(\n", + " len(train_ds_dict[str(max_tok)]) / 2.\n", + " ) * 2\n", + " ))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "joa1qCMA_KpG" + }, + "source": [ + "# ๐Ÿ’ฅ Models\n", + "\n", + "Run the cells below to redo the training of each model\n", + "\n", + "๐Ÿ›Ž๏ธ Disclaimer: this will take some time... So if you're a busy bee or a hurrying hippo, you can skip this section and load in the results from one of our runs smoothly and swiftly ๐Ÿ˜Š!" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Sju2Yw8n_KpH" + }, + "source": [ + "## ๐Ÿ’ช Training" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "PJ6mWoPjVstv" + }, + "outputs": [], + "source": [ + "def compute_metrics(pred):\n", + " labels = pred.label_ids\n", + " preds = pred.predictions.argmax(-1)\n", + " precision, recall, f1, _ = precision_recall_fscore_support(labels, preds, average='binary')\n", + " acc = accuracy_score(labels, preds)\n", + " return {\n", + " 'accuracy': acc,\n", + " 'f1': f1,\n", + " 'precision': precision,\n", + " 'recall': recall\n", + " }\n", + "\n", + "def train_and_evaluate(timing_run, model, identifier, tokenizer, train_ds, test_ds, max_length=1024):\n", + " def tokenization(batched_text):\n", + " return tokenizer(batched_text['text'], padding='max_length', truncation=True, max_length=min(max_length, tokenizer.model_max_length))\n", + "\n", + " # tokenizing both the training and test dataset \n", + " train_ds = train_ds.map(tokenization,\n", + " batched=True,\n", + " batch_size=len(train_ds),\n", + " remove_columns=['text'])\n", + "\n", + " test_ds = test_ds.map(tokenization,\n", + " batched=True,\n", + " batch_size=len(test_ds),\n", + " remove_columns=['text'])\n", + "\n", + " train_ds.set_format('torch', columns=['input_ids', 'attention_mask', 'label'])\n", + " test_ds.set_format('torch', columns=['input_ids', 'attention_mask', 'label'])\n", + "\n", + " parameters = model.num_parameters()\n", + "\n", + " # Set different number of steps for accuracy and timing runs\n", + " logging_steps = 1000 if timing_run else 10\n", + " eval_steps = 1000 if timing_run else 10\n", + " max_steps = 100 if timing_run else 250\n", + "\n", + " training_args = TrainingArguments(\n", + " # Set the batch sizes\n", + " per_device_train_batch_size=2,\n", + " per_device_eval_batch_size=4,\n", + "\n", + " # Apply efficiency tricks\n", + " gradient_accumulation_steps=8,\n", + " fp16=True,\n", + " \n", + " # steps paramters\n", + " evaluation_strategy=\"steps\",\n", + " warmup_steps=0,\n", + " eval_steps=eval_steps, \n", + " max_steps=max_steps,\n", + " logging_steps=logging_steps,\n", + "\n", + " # Optimizer parameters\n", + " learning_rate=2e-5,\n", + "\n", + " # Finalization\n", + " load_best_model_at_end=False if timing_run else True,\n", + " metric_for_best_model='accuracy', # default value is validation loss, we want the model with the highest accuracy \n", + " \n", + " # Output locations\n", + " output_dir='./{}'.format(identifier),\n", + " run_name='{}'.format(identifier),\n", + " logging_dir='./{}-logging'.format(identifier),\n", + " log_level='warning',\n", + " save_strategy=\"steps\"\n", + " )\n", + "\n", + " trainer = Trainer(\n", + " model=model,\n", + " args=training_args,\n", + " compute_metrics=compute_metrics,\n", + " train_dataset=train_ds,\n", + " eval_dataset=test_ds,\n", + " )\n", + " start_time = time.time()\n", + " trainer.train()\n", + " duration = time.time() - start_time\n", + "\n", + " shutil.rmtree('./{}'.format(identifier))\n", + "\n", + " metrics = None\n", + " if not timing_run:\n", + " metrics = trainer.evaluate()\n", + "\n", + " return trainer, parameters, metrics, duration" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "88g5S6z7BSZX" + }, + "outputs": [], + "source": [ + "def run_training(timing_run, model, tokenizer, model_name, train_ds, test_ds, max_length=1024):\n", + " torch.cuda.empty_cache()\n", + "\n", + " try:\n", + " identifier = 
'{}-{}-{}'.format(model_name, max_length, 'timing' if timing_run else 'accuracy').lower()\n", + " trainer, parameters, metrics, duration = train_and_evaluate(timing_run, model, identifier, tokenizer, train_ds, test_ds, max_length)\n", + "\n", + " results_and_metrics = {\n", + " \"model_name\": model_name,\n", + " \"parameters\": parameters,\n", + " \"duration\": duration,\n", + " \"metrics\": metrics,\n", + " 'tokencount': str(max_length),\n", + " 'timing_run': timing_run\n", + " }\n", + " with open(f'{identifier}-results.json', \"w\") as fd:\n", + " json.dump(results_and_metrics, fd)\n", + "\n", + " except RuntimeError:\n", + " del trainer \n", + "\n", + " del model \n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "7InJLnvld8zu" + }, + "source": [ + "### ๐ŸŽฏ Accuracy runs" + ] + }, + { + "cell_type": "code", + "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", @@ -20741,142 +791,5991 @@ "8c93decc6aeb49f5bf21f47cd917c85a" ] }, - "id": "gBqkgqKld4-I", - "outputId": "6b9ee730-0768-474d-f5e2-2d3beae88367" - } + "id": "gBqkgqKld4-I", + "outputId": "6b9ee730-0768-474d-f5e2-2d3beae88367" + }, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "55899ffd648e4247a473524510624500", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "Downloading: 0%| | 0.00/501M [00:00\n", + " \n", + " \n", + " [250/250 07:20, Epoch 83/84]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + 
" \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
100.6642000.7619170.6124030.0000000.0000000.000000
200.5068000.9617460.6124030.0000000.0000000.000000
300.4043000.9413790.6124030.0000000.0000000.000000
400.2604001.2895190.6201550.0392161.0000000.020000
500.1290001.5491110.6589150.2142861.0000000.120000
600.0096001.4948550.7364340.5142860.9000000.360000
700.0023002.4640120.6356590.1132081.0000000.060000
800.0013002.5766930.6356590.1132081.0000000.060000
900.0008002.4359690.6821710.3050851.0000000.180000
1000.0007002.4241430.6899220.3333331.0000000.200000
1100.0006002.4389690.6976740.3606561.0000000.220000
1200.0006002.4575290.6976740.3606561.0000000.220000
1300.0005002.4768890.6976740.3606561.0000000.220000
1400.0005002.4923020.6976740.3606561.0000000.220000
1500.0005002.5057300.6976740.3606561.0000000.220000
1600.0004002.5173070.6976740.3606561.0000000.220000
1700.0004002.5266350.6976740.3606561.0000000.220000
1800.0004002.5313350.6976740.3606561.0000000.220000
1900.0004002.5351940.6976740.3606561.0000000.220000
2000.0004002.5400780.6976740.3606561.0000000.220000
2100.0004002.5451030.6976740.3606561.0000000.220000
2200.0004002.5478110.6976740.3606561.0000000.220000
2300.0003002.5493940.6976740.3606561.0000000.220000
2400.0003002.5507460.6976740.3606561.0000000.220000
2500.0003002.5513100.6976740.3606561.0000000.220000

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:01]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training longformer on size 256\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "0f29b422508a484f98d8541e7c42c67d", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 10:53, Epoch 83/84]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
100.6500000.8352490.6124030.0000000.0000000.000000
200.4681001.1130630.6124030.0000000.0000000.000000
300.2259001.5616310.6201550.0392161.0000000.020000
400.0742001.5010760.6899220.3939390.8125000.260000
500.0078002.4290080.6279070.0769231.0000000.040000
600.0021002.5967270.6434110.1481481.0000000.080000
700.0012002.0764510.7131780.5066670.7600000.380000
800.0012002.1613570.6976740.4935060.7037040.380000
900.0007002.7653820.6589150.2142861.0000000.120000
1000.0528002.5479760.6744190.2758621.0000000.160000
1100.0009002.2260270.7054260.4062500.9285710.260000
1200.0007002.3277050.6976740.4179100.8235290.280000
1300.0006002.3609900.6976740.4179100.8235290.280000
1400.0005002.3863560.6976740.4179100.8235290.280000
1500.0005002.4174680.6976740.4179100.8235290.280000
1600.0005002.4434670.6976740.4179100.8235290.280000
1700.0005002.4622830.6976740.4179100.8235290.280000
1800.0004002.4769860.6976740.4179100.8235290.280000
1900.0004002.4890600.6976740.4179100.8235290.280000
2000.0004002.4986560.6976740.4179100.8235290.280000
2100.0004002.5063510.6976740.4179100.8235290.280000
2200.0004002.5119280.6976740.4179100.8235290.280000
2300.0004002.5161280.6976740.4179100.8235290.280000
2400.0004002.5186930.6976740.4179100.8235290.280000
2500.0004002.5196280.6976740.4179100.8235290.280000

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:02]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training bigbird on size 256\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "d2b93cf59d5849699938b996a19ede97", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 07:28, Epoch 83/84]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
100.6298000.7492910.6124030.0000000.0000000.000000
200.5173000.8522470.6124030.0000000.0000000.000000
300.3887000.8704010.6124030.0000000.0000000.000000
400.2688001.0119610.6124030.0000000.0000000.000000
500.2162001.0144580.6744190.3000000.9000000.180000
600.0949000.8895750.7131780.6021510.6511630.560000
700.0331001.1210460.7364340.5853660.7500000.480000
800.0149001.3733600.6976740.4347830.7894740.300000
900.0086001.4575020.7209300.5000000.8181820.360000
1000.0066001.5336530.7131780.4931510.7826090.360000
1100.0049001.6133220.7209300.5000000.8181820.360000
1200.0043001.6532420.7209300.5000000.8181820.360000
1300.0037001.6837650.7209300.5000000.8181820.360000
1400.0035001.7110290.7209300.5000000.8181820.360000
1500.0030001.7414980.7131780.4788730.8095240.340000
1600.0028001.7681990.7131780.4788730.8095240.340000
1700.0025001.7865190.7131780.4788730.8095240.340000
1800.0024001.7995920.7131780.4788730.8095240.340000
1900.0025001.8076350.7131780.4788730.8095240.340000
2000.0023001.8106740.7131780.4788730.8095240.340000
2100.0022001.8148100.7131780.4788730.8095240.340000
2200.0022001.8217240.7131780.4788730.8095240.340000
2300.0021001.8267760.7131780.4788730.8095240.340000
2400.0021001.8300160.7131780.4788730.8095240.340000
2500.0021001.8312960.7131780.4788730.8095240.340000

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:01]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Some weights of the model checkpoint at roberta-base were not used when initializing RobertaForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'roberta.pooler.dense.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'roberta.pooler.dense.weight', 'lm_head.bias']\n", + "- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", + "Some weights of the model checkpoint at allenai/longformer-base-4096 were not used when initializing LongformerForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.bias']\n", + "- This IS expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of LongformerForSequenceClassification were not initialized from the model checkpoint at allenai/longformer-base-4096 and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", + "Some weights of the model checkpoint at google/bigbird-roberta-base were not used when initializing BigBirdForSequenceClassification: ['cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.weight']\n", + "- This IS expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of BigBirdForSequenceClassification were not initialized from the model checkpoint at google/bigbird-roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.out_proj.weight', 'classifier.dense.bias', 'classifier.dense.weight']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training roberta on size 512\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "62f55f22a2e444e8948094001af7c68a", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 09:46, Epoch 27/28]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
StepTraining LossValidation LossAccuracyF1PrecisionRecall
100.4999000.9839210.6124030.0000000.0000000.000000
200.5274000.8685350.6124030.0000000.0000000.000000
300.4750000.7876690.6124030.0000000.0000000.000000
400.4758000.7870850.6124030.0000000.0000000.000000
500.3321000.7716630.6124030.0000000.0000000.000000
600.3130001.1676880.6124030.0000000.0000000.000000
700.2499000.6919680.8139530.7209300.8611110.620000
800.1487000.7492250.8062020.7252750.8048780.660000
900.1215001.4531970.7131780.4126981.0000000.260000
1000.0878000.8184210.8294570.7500000.8684210.660000
1100.0125001.0005040.8217050.7160490.9354840.580000
1200.0016001.0885490.8217050.7160490.9354840.580000
1300.0011001.1476730.8217050.7160490.9354840.580000
1400.0009001.1848940.8217050.7160490.9354840.580000
1500.0008001.1981580.8217050.7160490.9354840.580000
1600.0007001.2181820.8217050.7160490.9354840.580000
1700.0006001.2360450.8217050.7160490.9354840.580000
1800.0006001.2455580.8217050.7160490.9354840.580000
1900.0006001.2581830.8217050.7160490.9354840.580000
2000.0005001.2682030.8139530.7000000.9333330.560000
2100.0005001.2790890.8139530.7000000.9333330.560000
2200.0005001.2887000.8139530.7000000.9333330.560000
2300.0005001.2939340.8139530.7000000.9333330.560000
2400.0005001.2951740.8139530.7000000.9333330.560000
2500.0005001.2953350.8139530.7000000.9333330.560000

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:02]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training longformer on size 512\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "48effe85b41246aeaf65721ad4e674d9", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 14:49, Epoch 27/28]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----------|-----------|--------|
| 10 | 0.565500 | 1.044518 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.533500 | 0.780434 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.441500 | 0.951957 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.423300 | 0.689905 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.306900 | 0.879322 | 0.604651 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.213000 | 1.281937 | 0.620155 | 0.039216 | 1.000000 | 0.020000 |
| 70 | 0.111800 | 0.962490 | 0.720930 | 0.550000 | 0.733333 | 0.440000 |
| 80 | 0.045100 | 1.798414 | 0.635659 | 0.175439 | 0.714286 | 0.100000 |
| 90 | 0.006600 | 1.567969 | 0.720930 | 0.550000 | 0.733333 | 0.440000 |
| 100 | 0.003200 | 2.055627 | 0.689922 | 0.393939 | 0.812500 | 0.260000 |
| 110 | 0.001700 | 2.187385 | 0.682171 | 0.369231 | 0.800000 | 0.240000 |
| 120 | 0.001300 | 2.215857 | 0.682171 | 0.388060 | 0.764706 | 0.260000 |
| 130 | 0.030300 | 2.195167 | 0.697674 | 0.434783 | 0.789474 | 0.300000 |
| 140 | 0.001000 | 2.063566 | 0.713178 | 0.493151 | 0.782609 | 0.360000 |
| 150 | 0.008800 | 2.483316 | 0.658915 | 0.266667 | 0.800000 | 0.160000 |
| 160 | 0.000800 | 2.715676 | 0.643411 | 0.206897 | 0.750000 | 0.120000 |
| 170 | 0.019200 | 2.128188 | 0.713178 | 0.493151 | 0.782609 | 0.360000 |
| 180 | 0.000700 | 1.986017 | 0.744186 | 0.582278 | 0.793103 | 0.460000 |
| 190 | 0.000700 | 2.002548 | 0.744186 | 0.582278 | 0.793103 | 0.460000 |
| 200 | 0.005500 | 2.018011 | 0.744186 | 0.571429 | 0.814815 | 0.440000 |
| 210 | 0.000600 | 2.284548 | 0.697674 | 0.450704 | 0.761905 | 0.320000 |
| 220 | 0.000600 | 2.314941 | 0.697674 | 0.434783 | 0.789474 | 0.300000 |
| 230 | 0.000600 | 2.328329 | 0.705426 | 0.441176 | 0.833333 | 0.300000 |
| 240 | 0.000600 | 2.330581 | 0.705426 | 0.441176 | 0.833333 | 0.300000 |
| 250 | 0.000500 | 2.331703 | 0.705426 | 0.441176 | 0.833333 | 0.300000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:03]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training bigbird on size 512\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "dbc3c3d19db54d28b5b392ec3d18c24e", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 10:31, Epoch 27/28]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----------|-----------|--------|
| 10 | 0.555500 | 0.847979 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.485800 | 0.772949 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.515900 | 0.850025 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.495900 | 0.802247 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.386100 | 0.898066 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.426300 | 0.720003 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 70 | 0.408200 | 0.764110 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 80 | 0.301800 | 0.750921 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 90 | 0.238100 | 0.680616 | 0.627907 | 0.076923 | 1.000000 | 0.040000 |
| 100 | 0.215200 | 0.827965 | 0.651163 | 0.181818 | 1.000000 | 0.100000 |
| 110 | 0.149700 | 0.668503 | 0.767442 | 0.571429 | 1.000000 | 0.400000 |
| 120 | 0.099500 | 0.721288 | 0.790698 | 0.640000 | 0.960000 | 0.480000 |
| 130 | 0.042800 | 0.935791 | 0.744186 | 0.521739 | 0.947368 | 0.360000 |
| 140 | 0.034700 | 0.683401 | 0.837209 | 0.746988 | 0.939394 | 0.620000 |
| 150 | 0.022300 | 1.278935 | 0.705426 | 0.387097 | 1.000000 | 0.240000 |
| 160 | 0.012700 | 1.023040 | 0.775194 | 0.602740 | 0.956522 | 0.440000 |
| 170 | 0.012600 | 0.900957 | 0.821705 | 0.716049 | 0.935484 | 0.580000 |
| 180 | 0.009200 | 1.090641 | 0.775194 | 0.602740 | 0.956522 | 0.440000 |
| 190 | 0.009900 | 1.143551 | 0.759690 | 0.550725 | 1.000000 | 0.380000 |
| 200 | 0.007900 | 1.039337 | 0.806202 | 0.683544 | 0.931034 | 0.540000 |
| 210 | 0.005900 | 1.041801 | 0.806202 | 0.683544 | 0.931034 | 0.540000 |
| 220 | 0.005500 | 1.037000 | 0.806202 | 0.683544 | 0.931034 | 0.540000 |
| 230 | 0.006100 | 1.080174 | 0.806202 | 0.683544 | 0.931034 | 0.540000 |
| 240 | 0.004800 | 1.146307 | 0.790698 | 0.640000 | 0.960000 | 0.480000 |
| 250 | 0.005400 | 1.153242 | 0.790698 | 0.640000 | 0.960000 | 0.480000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:02]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Some weights of the model checkpoint at roberta-base were not used when initializing RobertaForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'roberta.pooler.dense.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'roberta.pooler.dense.weight', 'lm_head.bias']\n", + "- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", + "Some weights of the model checkpoint at allenai/longformer-base-4096 were not used when initializing LongformerForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.bias']\n", + "- This IS expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of LongformerForSequenceClassification were not initialized from the model checkpoint at allenai/longformer-base-4096 and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", + "Some weights of the model checkpoint at google/bigbird-roberta-base were not used when initializing BigBirdForSequenceClassification: ['cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.weight']\n", + "- This IS expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of BigBirdForSequenceClassification were not initialized from the model checkpoint at google/bigbird-roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.out_proj.weight', 'classifier.dense.bias', 'classifier.dense.weight']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training roberta on size 1024\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "b1e5d35e320040ddad4650e448ffa21d", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 09:29, Epoch 13/14]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
Training roberta on size 1024
[250/250 09:29, Epoch 13/14]

| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----------|-----------|--------|
| 10 | 0.586200 | 0.831490 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.557100 | 0.671752 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.520500 | 0.813238 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.567600 | 0.658352 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.438000 | 0.797536 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.566600 | 0.618376 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 70 | 0.461400 | 0.594566 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 80 | 0.460100 | 0.599614 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 90 | 0.414800 | 0.759560 | 0.705426 | 0.387097 | 1.000000 | 0.240000 |
| 100 | 0.387200 | 0.474885 | 0.806202 | 0.698795 | 0.878788 | 0.580000 |
| 110 | 0.251400 | 0.587111 | 0.806202 | 0.675325 | 0.962963 | 0.520000 |
| 120 | 0.302700 | 0.445110 | 0.837209 | 0.752941 | 0.914286 | 0.640000 |
| 130 | 0.167300 | 0.655212 | 0.806202 | 0.666667 | 1.000000 | 0.500000 |
| 140 | 0.182000 | 0.453100 | 0.821705 | 0.752688 | 0.813953 | 0.700000 |
| 150 | 0.098800 | 0.544854 | 0.852713 | 0.771084 | 0.969697 | 0.640000 |
| 160 | 0.068800 | 0.600777 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 170 | 0.063700 | 0.724708 | 0.852713 | 0.771084 | 0.969697 | 0.640000 |
| 180 | 0.065700 | 0.704468 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 190 | 0.004800 | 0.851403 | 0.844961 | 0.756098 | 0.968750 | 0.620000 |
| 200 | 0.035200 | 0.800553 | 0.852713 | 0.776471 | 0.942857 | 0.660000 |
| 210 | 0.023600 | 0.784901 | 0.860465 | 0.795455 | 0.921053 | 0.700000 |
| 220 | 0.036300 | 0.906954 | 0.829457 | 0.731707 | 0.937500 | 0.600000 |
| 230 | 0.011200 | 0.931151 | 0.829457 | 0.731707 | 0.937500 | 0.600000 |
| 240 | 0.012600 | 0.837081 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 250 | 0.005700 | 0.820278 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. 
At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:02]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training longformer on size 1024\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "b07b810992314152a64788cdc100821d", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [222/250 19:40 < 02:30, 0.19 it/s, Epoch 11.60/14]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----------|-----------|--------|
| 10 | 0.580100 | 0.855798 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.548600 | 0.734480 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.495600 | 0.886220 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.513200 | 0.667367 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.348300 | 0.808092 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.446400 | 0.670474 | 0.627907 | 0.076923 | 1.000000 | 0.040000 |
| 70 | 0.404400 | 0.690457 | 0.720930 | 0.437500 | 1.000000 | 0.280000 |
| 80 | 0.357400 | 0.812586 | 0.751938 | 0.529412 | 1.000000 | 0.360000 |
| 90 | 0.268400 | 0.570404 | 0.767442 | 0.673913 | 0.738095 | 0.620000 |
| 100 | 0.226800 | 0.802243 | 0.744186 | 0.571429 | 0.814815 | 0.440000 |
| 110 | 0.156600 | 0.970719 | 0.744186 | 0.547945 | 0.869565 | 0.400000 |
| 120 | 0.207100 | 0.791254 | 0.759690 | 0.617284 | 0.806452 | 0.500000 |
| 130 | 0.098300 | 1.000249 | 0.736434 | 0.552632 | 0.807692 | 0.420000 |
| 140 | 0.064800 | 1.178058 | 0.759690 | 0.597403 | 0.851852 | 0.460000 |
| 150 | 0.123300 | 1.117609 | 0.782946 | 0.641026 | 0.892857 | 0.500000 |
| 160 | 0.039500 | 1.193921 | 0.775194 | 0.602740 | 0.956522 | 0.440000 |
| 170 | 0.136300 | 1.765000 | 0.713178 | 0.412698 | 1.000000 | 0.260000 |
| 180 | 0.040500 | 1.053501 | 0.767442 | 0.634146 | 0.812500 | 0.520000 |
| 190 | 0.060000 | 1.281206 | 0.782946 | 0.621622 | 0.958333 | 0.460000 |
| 200 | 0.019100 | 1.284209 | 0.782946 | 0.631579 | 0.923077 | 0.480000 |
| 210 | 0.049500 | 1.297602 | 0.775194 | 0.623377 | 0.888889 | 0.480000 |
| 220 | 0.002700 | 1.299379 | 0.767442 | 0.615385 | 0.857143 | 0.480000 |
| 230 | 0.042000 | 1.231217 | 0.751938 | 0.609756 | 0.781250 | 0.500000 |
| 240 | 0.002500 | 1.366692 | 0.775194 | 0.613333 | 0.920000 | 0.460000 |
| 250 | 0.031400 | 1.350683 | 0.775194 | 0.613333 | 0.920000 | 0.460000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:06]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training bigbird on size 1024\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "b31f3082260e4ba4bfd8340f85ed6105", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 25:42, Epoch 13/14]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----------|-----------|--------|
| 10 | 0.577700 | 0.759369 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.599500 | 0.796641 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.541600 | 0.799303 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.571800 | 0.682290 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.403100 | 0.706493 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.489400 | 0.694978 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 70 | 0.506600 | 0.620515 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 80 | 0.467600 | 0.515016 | 0.806202 | 0.712644 | 0.837838 | 0.620000 |
| 90 | 0.390600 | 0.600830 | 0.775194 | 0.591549 | 1.000000 | 0.420000 |
| 100 | 0.352500 | 0.485372 | 0.806202 | 0.712644 | 0.837838 | 0.620000 |
| 110 | 0.317900 | 0.488410 | 0.821705 | 0.729412 | 0.885714 | 0.620000 |
| 120 | 0.309100 | 0.501511 | 0.821705 | 0.716049 | 0.935484 | 0.580000 |
| 130 | 0.180100 | 0.469360 | 0.829457 | 0.770833 | 0.804348 | 0.740000 |
| 140 | 0.145100 | 0.501140 | 0.829457 | 0.750000 | 0.868421 | 0.660000 |
| 150 | 0.163900 | 0.511560 | 0.844961 | 0.772727 | 0.894737 | 0.680000 |
| 160 | 0.082100 | 0.506037 | 0.852713 | 0.776471 | 0.942857 | 0.660000 |
| 170 | 0.132500 | 0.506590 | 0.852713 | 0.795699 | 0.860465 | 0.740000 |
| 180 | 0.095700 | 0.506332 | 0.844961 | 0.772727 | 0.894737 | 0.680000 |
| 190 | 0.048400 | 0.528276 | 0.844961 | 0.767442 | 0.916667 | 0.660000 |
| 200 | 0.051900 | 0.560470 | 0.844961 | 0.767442 | 0.916667 | 0.660000 |
| 210 | 0.067500 | 0.560767 | 0.852713 | 0.795699 | 0.860465 | 0.740000 |
| 220 | 0.041700 | 0.568242 | 0.852713 | 0.795699 | 0.860465 | 0.740000 |
| 230 | 0.040000 | 0.573302 | 0.852713 | 0.791209 | 0.878049 | 0.720000 |
| 240 | 0.039200 | 0.597674 | 0.844961 | 0.767442 | 0.916667 | 0.660000 |
| 250 | 0.031600 | 0.594771 | 0.844961 | 0.767442 | 0.916667 | 0.660000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:07]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Some weights of the model checkpoint at roberta-base were not used when initializing RobertaForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'roberta.pooler.dense.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'roberta.pooler.dense.weight', 'lm_head.bias']\n", + "- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", + "Some weights of the model checkpoint at allenai/longformer-base-4096 were not used when initializing LongformerForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.bias']\n", + "- This IS expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of LongformerForSequenceClassification were not initialized from the model checkpoint at allenai/longformer-base-4096 and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", + "Some weights of the model checkpoint at google/bigbird-roberta-base were not used when initializing BigBirdForSequenceClassification: ['cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.weight']\n", + "- This IS expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of BigBirdForSequenceClassification were not initialized from the model checkpoint at google/bigbird-roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.out_proj.weight', 'classifier.dense.bias', 'classifier.dense.weight']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training roberta on size 2048\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "80913f1792cc44d6ad5599ecec511dae", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 09:19, Epoch 8/9]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
Training roberta on size 2048
[250/250 09:19, Epoch 8/9]

| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----------|-----------|--------|
| 10 | 0.674500 | 0.657507 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.630500 | 0.707385 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.673700 | 0.643604 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.608800 | 0.624107 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.584900 | 0.584006 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 60 | 0.494400 | 0.702376 | 0.666667 | 0.245614 | 1.000000 | 0.140000 |
| 70 | 0.503100 | 0.539003 | 0.782946 | 0.688889 | 0.775000 | 0.620000 |
| 80 | 0.467400 | 0.502834 | 0.782946 | 0.688889 | 0.775000 | 0.620000 |
| 90 | 0.430400 | 0.551373 | 0.821705 | 0.735632 | 0.864865 | 0.640000 |
| 100 | 0.388800 | 0.484905 | 0.782946 | 0.702128 | 0.750000 | 0.660000 |
| 110 | 0.355400 | 0.483531 | 0.790698 | 0.703297 | 0.780488 | 0.640000 |
| 120 | 0.322900 | 0.582527 | 0.829457 | 0.738095 | 0.911765 | 0.620000 |
| 130 | 0.240700 | 0.496317 | 0.798450 | 0.740000 | 0.740000 | 0.740000 |
| 140 | 0.170600 | 0.608089 | 0.829457 | 0.731707 | 0.937500 | 0.600000 |
| 150 | 0.194900 | 0.484311 | 0.837209 | 0.778947 | 0.822222 | 0.740000 |
| 160 | 0.167000 | 0.567786 | 0.852713 | 0.781609 | 0.918919 | 0.680000 |
| 170 | 0.116400 | 0.599604 | 0.837209 | 0.774194 | 0.837209 | 0.720000 |
| 180 | 0.119500 | 0.652496 | 0.837209 | 0.774194 | 0.837209 | 0.720000 |
| 190 | 0.135800 | 0.658174 | 0.837209 | 0.774194 | 0.837209 | 0.720000 |
| 200 | 0.076400 | 0.743575 | 0.829457 | 0.738095 | 0.911765 | 0.620000 |
| 210 | 0.039400 | 0.811144 | 0.829457 | 0.738095 | 0.911765 | 0.620000 |
| 220 | 0.083600 | 0.796613 | 0.821705 | 0.735632 | 0.864865 | 0.640000 |
| 230 | 0.054500 | 0.820997 | 0.829457 | 0.750000 | 0.868421 | 0.660000 |
| 240 | 0.054600 | 0.835619 | 0.837209 | 0.764045 | 0.871795 | 0.680000 |
| 250 | 0.070400 | 0.846155 | 0.837209 | 0.758621 | 0.891892 | 0.660000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:02]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training longformer on size 2048\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "e746cb0ccb1046b1aaef424e7719ba76", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 37:03, Epoch 8/9]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----|-----------|--------|
| 10 | 0.687700 | 0.661641 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.616300 | 0.671856 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.669000 | 0.633520 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.591900 | 0.601620 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.517300 | 0.493386 | 0.775194 | 0.666667 | 0.783784 | 0.580000 |
| 60 | 0.463000 | 0.732943 | 0.728682 | 0.461538 | 1.000000 | 0.300000 |
| 70 | 0.387500 | 0.438826 | 0.821705 | 0.757895 | 0.800000 | 0.720000 |
| 80 | 0.383300 | 0.453931 | 0.775194 | 0.723810 | 0.690909 | 0.760000 |
| 90 | 0.337400 | 0.414173 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 100 | 0.262800 | 0.406023 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 110 | 0.285200 | 0.423326 | 0.844961 | 0.803922 | 0.788462 | 0.820000 |
| 120 | 0.256800 | 0.433759 | 0.875969 | 0.813953 | 0.972222 | 0.700000 |
| 130 | 0.332500 | 0.474997 | 0.821705 | 0.776699 | 0.754717 | 0.800000 |
| 140 | 0.223500 | 0.459550 | 0.868217 | 0.804598 | 0.945946 | 0.700000 |
| 150 | 0.141600 | 0.477792 | 0.852713 | 0.804124 | 0.829787 | 0.780000 |
| 160 | 0.141600 | 0.547014 | 0.860465 | 0.790698 | 0.944444 | 0.680000 |
| 170 | 0.167800 | 0.580668 | 0.844961 | 0.791667 | 0.826087 | 0.760000 |
| 180 | 0.105100 | 0.609861 | 0.875969 | 0.822222 | 0.925000 | 0.740000 |
| 190 | 0.153400 | 0.623364 | 0.844961 | 0.795918 | 0.812500 | 0.780000 |
| 200 | 0.029300 | 0.668298 | 0.860465 | 0.795455 | 0.921053 | 0.700000 |
| 210 | 0.029900 | 0.690829 | 0.868217 | 0.813187 | 0.902439 | 0.740000 |
| 220 | 0.077000 | 0.779887 | 0.829457 | 0.784314 | 0.769231 | 0.800000 |
| 230 | 0.057700 | 0.727203 | 0.860465 | 0.800000 | 0.900000 | 0.720000 |
| 240 | 0.085100 | 0.723960 | 0.875969 | 0.826087 | 0.904762 | 0.760000 |
| 250 | 0.021400 | 0.721029 | 0.875969 | 0.826087 | 0.904762 | 0.760000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:11]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training bigbird on size 2048\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "69796bea16a644a7988dd096e2efec1a", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 47:01, Epoch 8/9]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----|-----------|--------|
| 10 | 0.677300 | 0.654226 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.588400 | 0.610554 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.595100 | 0.520661 | 0.798450 | 0.682927 | 0.875000 | 0.560000 |
| 40 | 0.436000 | 0.440862 | 0.829457 | 0.792453 | 0.750000 | 0.840000 |
| 50 | 0.410100 | 0.373192 | 0.860465 | 0.812500 | 0.847826 | 0.780000 |
| 60 | 0.365100 | 0.421595 | 0.837209 | 0.746988 | 0.939394 | 0.620000 |
| 70 | 0.290800 | 0.362163 | 0.868217 | 0.828283 | 0.836735 | 0.820000 |
| 80 | 0.290700 | 0.359947 | 0.868217 | 0.824742 | 0.851064 | 0.800000 |
| 90 | 0.196000 | 0.357636 | 0.891473 | 0.851064 | 0.909091 | 0.800000 |
| 100 | 0.133700 | 0.395859 | 0.875969 | 0.818182 | 0.947368 | 0.720000 |
| 110 | 0.204900 | 0.402724 | 0.875969 | 0.818182 | 0.947368 | 0.720000 |
| 120 | 0.147100 | 0.410757 | 0.891473 | 0.840909 | 0.973684 | 0.740000 |
| 130 | 0.114600 | 0.477982 | 0.875969 | 0.813953 | 0.972222 | 0.700000 |
| 140 | 0.046500 | 0.456194 | 0.868217 | 0.821053 | 0.866667 | 0.780000 |
| 150 | 0.108200 | 0.446461 | 0.899225 | 0.853933 | 0.974359 | 0.760000 |
| 160 | 0.026500 | 0.458264 | 0.891473 | 0.844444 | 0.950000 | 0.760000 |
| 170 | 0.122800 | 0.502263 | 0.868217 | 0.824742 | 0.851064 | 0.800000 |
| 180 | 0.069800 | 0.493310 | 0.883721 | 0.838710 | 0.906977 | 0.780000 |
| 190 | 0.090500 | 0.493213 | 0.883721 | 0.838710 | 0.906977 | 0.780000 |
| 200 | 0.009200 | 0.465422 | 0.906977 | 0.866667 | 0.975000 | 0.780000 |
| 210 | 0.024900 | 0.509755 | 0.883721 | 0.838710 | 0.906977 | 0.780000 |
| 220 | 0.068400 | 0.551333 | 0.875969 | 0.836735 | 0.854167 | 0.820000 |
| 230 | 0.044000 | 0.465704 | 0.906977 | 0.866667 | 0.975000 | 0.780000 |
| 240 | 0.081500 | 0.474401 | 0.906977 | 0.866667 | 0.975000 | 0.780000 |
| 250 | 0.007500 | 0.470865 | 0.906977 | 0.866667 | 0.975000 | 0.780000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:15]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Some weights of the model checkpoint at roberta-base were not used when initializing RobertaForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'roberta.pooler.dense.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'roberta.pooler.dense.weight', 'lm_head.bias']\n", + "- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", + "Some weights of the model checkpoint at allenai/longformer-base-4096 were not used when initializing LongformerForSequenceClassification: ['lm_head.decoder.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.bias']\n", + "- This IS expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing LongformerForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of LongformerForSequenceClassification were not initialized from the model checkpoint at allenai/longformer-base-4096 and are newly initialized: ['classifier.out_proj.bias', 'classifier.dense.weight', 'classifier.out_proj.weight', 'classifier.dense.bias']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", + "Some weights of the model checkpoint at google/bigbird-roberta-base were not used when initializing BigBirdForSequenceClassification: ['cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.weight']\n", + "- This IS expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", + "- This IS NOT expected if you are initializing BigBirdForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", + "Some weights of BigBirdForSequenceClassification were not initialized from the model checkpoint at google/bigbird-roberta-base and are newly initialized: ['classifier.out_proj.bias', 'classifier.out_proj.weight', 'classifier.dense.bias', 'classifier.dense.weight']\n", + "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training roberta on size 4096\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "b80d4028e09e460780242a310b2ce881", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 09:22, Epoch 8/9]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----|-----------|--------|
| 10 | 0.663500 | 0.658828 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.712300 | 0.652497 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.648300 | 0.638417 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.646200 | 0.603288 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 50 | 0.630900 | 0.555263 | 0.744186 | 0.592593 | 0.774194 | 0.480000 |
| 60 | 0.542400 | 0.581419 | 0.751938 | 0.542857 | 0.950000 | 0.380000 |
| 70 | 0.509300 | 0.502740 | 0.790698 | 0.703297 | 0.780488 | 0.640000 |
| 80 | 0.401900 | 0.555112 | 0.806202 | 0.705882 | 0.857143 | 0.600000 |
| 90 | 0.335400 | 0.472136 | 0.790698 | 0.709677 | 0.767442 | 0.660000 |
| 100 | 0.327100 | 0.458029 | 0.813953 | 0.727273 | 0.842105 | 0.640000 |
| 110 | 0.289900 | 0.490675 | 0.829457 | 0.738095 | 0.911765 | 0.620000 |
| 120 | 0.244600 | 0.533236 | 0.798450 | 0.675000 | 0.900000 | 0.540000 |
| 130 | 0.243800 | 0.495592 | 0.844961 | 0.772727 | 0.894737 | 0.680000 |
| 140 | 0.443500 | 0.585394 | 0.798450 | 0.648649 | 1.000000 | 0.480000 |
| 150 | 0.211200 | 0.505109 | 0.829457 | 0.738095 | 0.911765 | 0.620000 |
| 160 | 0.150600 | 0.553682 | 0.852713 | 0.786517 | 0.897436 | 0.700000 |
| 170 | 0.117600 | 0.631513 | 0.813953 | 0.720930 | 0.861111 | 0.620000 |
| 180 | 0.158300 | 0.673438 | 0.829457 | 0.780000 | 0.780000 | 0.780000 |
| 190 | 0.369800 | 0.627518 | 0.829457 | 0.750000 | 0.868421 | 0.660000 |
| 200 | 0.123700 | 0.648676 | 0.821705 | 0.741573 | 0.846154 | 0.660000 |
| 210 | 0.137300 | 0.742767 | 0.813953 | 0.700000 | 0.933333 | 0.560000 |
| 220 | 0.164800 | 0.680487 | 0.837209 | 0.746988 | 0.939394 | 0.620000 |
| 230 | 0.089300 | 0.701856 | 0.837209 | 0.746988 | 0.939394 | 0.620000 |
| 240 | 0.163600 | 0.632202 | 0.837209 | 0.758621 | 0.891892 | 0.660000 |
| 250 | 0.044500 | 0.622856 | 0.821705 | 0.741573 | 0.846154 | 0.660000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:02]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training longformer on size 4096\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "ecb7f32112aa457587e16f630402d093", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 1:07:55, Epoch 8/9]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----|-----------|--------|
| 10 | 0.662700 | 0.662502 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.701100 | 0.634940 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.624400 | 0.692817 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.619500 | 0.571276 | 0.736434 | 0.679245 | 0.642857 | 0.720000 |
| 50 | 0.523500 | 0.532642 | 0.767442 | 0.634146 | 0.812500 | 0.520000 |
| 60 | 0.412300 | 0.537540 | 0.782946 | 0.681818 | 0.789474 | 0.600000 |
| 70 | 0.427900 | 0.485278 | 0.775194 | 0.723810 | 0.690909 | 0.760000 |
| 80 | 0.346600 | 0.521663 | 0.790698 | 0.696629 | 0.794872 | 0.620000 |
| 90 | 0.234200 | 0.458516 | 0.798450 | 0.745098 | 0.730769 | 0.760000 |
| 100 | 0.269100 | 0.478383 | 0.829457 | 0.760870 | 0.833333 | 0.700000 |
| 110 | 0.247800 | 0.560049 | 0.798450 | 0.682927 | 0.875000 | 0.560000 |
| 120 | 0.252300 | 0.463821 | 0.821705 | 0.747253 | 0.829268 | 0.680000 |
| 130 | 0.195200 | 0.482456 | 0.829457 | 0.760870 | 0.833333 | 0.700000 |
| 140 | 0.193100 | 0.622549 | 0.813953 | 0.720930 | 0.861111 | 0.620000 |
| 150 | 0.128600 | 0.627915 | 0.837209 | 0.764045 | 0.871795 | 0.680000 |
| 160 | 0.145300 | 0.678551 | 0.837209 | 0.764045 | 0.871795 | 0.680000 |
| 170 | 0.065700 | 0.865229 | 0.813953 | 0.714286 | 0.882353 | 0.600000 |
| 180 | 0.115800 | 0.709449 | 0.852713 | 0.795699 | 0.860465 | 0.740000 |
| 190 | 0.131400 | 0.876156 | 0.829457 | 0.744186 | 0.888889 | 0.640000 |
| 200 | 0.023400 | 0.734528 | 0.837209 | 0.778947 | 0.822222 | 0.740000 |
| 210 | 0.169200 | 0.778548 | 0.837209 | 0.769231 | 0.853659 | 0.700000 |
| 220 | 0.076800 | 0.817904 | 0.829457 | 0.755556 | 0.850000 | 0.680000 |
| 230 | 0.050900 | 0.829182 | 0.837209 | 0.764045 | 0.871795 | 0.680000 |
| 240 | 0.035500 | 0.833272 | 0.837209 | 0.764045 | 0.871795 | 0.680000 |
| 250 | 0.028400 | 0.830470 | 0.844961 | 0.777778 | 0.875000 | 0.700000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:22]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training bigbird on size 4096\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "d9e4e6a08e4c4d6baee6358643090800", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + " 0%| | 0/1 [00:00\n", + " \n", + " \n", + " [250/250 1:29:59, Epoch 8/9]\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
| Step | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|------|---------------|-----------------|----------|----|-----------|--------|
| 10 | 0.633300 | 0.663903 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 20 | 0.641200 | 0.606036 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 30 | 0.580800 | 0.628415 | 0.612403 | 0.000000 | 0.000000 | 0.000000 |
| 40 | 0.553400 | 0.494754 | 0.844961 | 0.772727 | 0.894737 | 0.680000 |
| 50 | 0.491300 | 0.455993 | 0.829457 | 0.780000 | 0.780000 | 0.780000 |
| 60 | 0.389800 | 0.453985 | 0.837209 | 0.740741 | 0.967742 | 0.600000 |
| 70 | 0.349100 | 0.388494 | 0.821705 | 0.747253 | 0.829268 | 0.680000 |
| 80 | 0.323500 | 0.404326 | 0.821705 | 0.788991 | 0.728814 | 0.860000 |
| 90 | 0.250600 | 0.378507 | 0.837209 | 0.778947 | 0.822222 | 0.740000 |
| 100 | 0.210200 | 0.392178 | 0.860465 | 0.808511 | 0.863636 | 0.760000 |
| 110 | 0.217600 | 0.393650 | 0.868217 | 0.808989 | 0.923077 | 0.720000 |
| 120 | 0.166800 | 0.390546 | 0.875969 | 0.818182 | 0.947368 | 0.720000 |
| 130 | 0.181700 | 0.347840 | 0.891473 | 0.851064 | 0.909091 | 0.800000 |
| 140 | 0.176400 | 0.363709 | 0.883721 | 0.842105 | 0.888889 | 0.800000 |
| 150 | 0.124100 | 0.398342 | 0.883721 | 0.831461 | 0.948718 | 0.740000 |
| 160 | 0.096200 | 0.350945 | 0.891473 | 0.860000 | 0.860000 | 0.860000 |
| 170 | 0.058500 | 0.432316 | 0.868217 | 0.817204 | 0.883721 | 0.760000 |
| 180 | 0.089600 | 0.417036 | 0.875969 | 0.833333 | 0.869565 | 0.800000 |
| 190 | 0.071900 | 0.428160 | 0.883721 | 0.838710 | 0.906977 | 0.780000 |
| 200 | 0.066000 | 0.421207 | 0.883721 | 0.845361 | 0.872340 | 0.820000 |
| 210 | 0.058300 | 0.421771 | 0.875969 | 0.833333 | 0.869565 | 0.800000 |
| 220 | 0.052200 | 0.449345 | 0.883721 | 0.838710 | 0.906977 | 0.780000 |
| 230 | 0.060400 | 0.455153 | 0.875969 | 0.833333 | 0.869565 | 0.800000 |
| 240 | 0.012300 | 0.462974 | 0.875969 | 0.833333 | 0.869565 | 0.800000 |
| 250 | 0.014200 | 0.464445 | 0.875969 | 0.833333 | 0.869565 | 0.800000 |

" + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning:\n", + "\n", + "Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n", + "/usr/local/lib/python3.7/dist-packages/transformers/trainer.py:1310: FutureWarning:\n", + "\n", + "Non-finite norm encountered in torch.nn.utils.clip_grad_norm_; continuing anyway. Note that the default behavior will change in a future release to error out if a non-finite total norm is encountered. At that point, setting error_if_nonfinite=false will be required to retain the old behavior.\n", + "\n" + ] + }, + { + "data": { + "text/html": [ + "\n", + "

\n", + " \n", + " \n", + " [33/33 00:28]\n", + "
\n", + " " + ], + "text/plain": [ + "" + ] + }, + "metadata": { + "tags": [] + }, + "output_type": "display_data" + } + ], + "source": [ + "timing = False\n", + "gradient_checkpointing = True\n", + "\n", + "for size in [256, 512, 1024, 2048, 4096]:\n", + "\n", + " for model_name, tokenizer, model in [\n", + " (\n", + " 'roberta',\n", + " RobertaTokenizer.from_pretrained('roberta-base'),\n", + " RobertaForSequenceClassification.from_pretrained('roberta-base',\n", + " gradient_checkpointing=gradient_checkpointing,\n", + " num_labels=2)\n", + " ),\n", + " (\n", + " 'longformer',\n", + " LongformerTokenizerFast.from_pretrained('allenai/longformer-base-4096'),\n", + " LongformerForSequenceClassification.from_pretrained('allenai/longformer-base-4096',\n", + " gradient_checkpointing=gradient_checkpointing,\n", + " attention_window=128,\n", + " num_labels=2)\n", + " ),\n", + " (\n", + " 'bigbird',\n", + " BigBirdTokenizerFast.from_pretrained('google/bigbird-roberta-base'),\n", + " BigBirdForSequenceClassification.from_pretrained('google/bigbird-roberta-base',\n", + " gradient_checkpointing=gradient_checkpointing,\n", + " num_labels=2)\n", + " )\n", + " ]:\n", + "\n", + " print(f\"Training {model_name} on size {size}\")\n", + "\n", + " run_training(timing, model, tokenizer, model_name, train_ds_dict[str(size)], test_ds, max_length=size)" + ] }, { "cell_type": "markdown", - "source": [ - "### โฑ๏ธ Timing runs" - ], "metadata": { "id": "ICGDQyU3d5ki" - } + }, + "source": [ + "### โฑ๏ธ Timing runs" + ] }, { "cell_type": "code", "execution_count": null, - "source": [ - "timing = True\r\n", - "gradient_checkpointing = True\r\n", - "\r\n", - "for size in [256, 512, 1024, 2048, 4096]:\r\n", - "\r\n", - " for model_name, tokenizer, model in [\r\n", - " (\r\n", - " 'roberta',\r\n", - " RobertaTokenizer.from_pretrained('roberta-base'),\r\n", - " RobertaForSequenceClassification.from_pretrained('roberta-base',\r\n", - " gradient_checkpointing=gradient_checkpointing,\r\n", - " num_labels=2)\r\n", - " ),\r\n", - " (\r\n", - " 'longformer',\r\n", - " LongformerTokenizerFast.from_pretrained('allenai/longformer-base-4096'),\r\n", - " LongformerForSequenceClassification.from_pretrained('allenai/longformer-base-4096',\r\n", - " gradient_checkpointing=gradient_checkpointing,\r\n", - " attention_window=256,\r\n", - " num_labels=2)\r\n", - " ),\r\n", - " (\r\n", - " 'bigbird',\r\n", - " BigBirdTokenizerFast.from_pretrained('google/bigbird-roberta-base'),\r\n", - " BigBirdForSequenceClassification.from_pretrained('google/bigbird-roberta-base',\r\n", - " gradient_checkpointing=gradient_checkpointing,\r\n", - " num_labels=2)\r\n", - " )\r\n", - " ]:\r\n", - "\r\n", - " print(f\"Training {model_name} on size {size}\")\r\n", - "\r\n", - " run_training(timing, model, tokenizer, model_name, train_ds_dict[str(size)], test_ds, max_length=size)" - ], - "outputs": [], "metadata": { "id": "-YQXVGyYf0vm" - } + }, + "outputs": [], + "source": [ + "timing = True\n", + "gradient_checkpointing = True\n", + "\n", + "for size in [256, 512, 1024, 2048, 4096]:\n", + "\n", + " for model_name, tokenizer, model in [\n", + " (\n", + " 'roberta',\n", + " RobertaTokenizer.from_pretrained('roberta-base'),\n", + " RobertaForSequenceClassification.from_pretrained('roberta-base',\n", + " gradient_checkpointing=gradient_checkpointing,\n", + " num_labels=2)\n", + " ),\n", + " (\n", + " 'longformer',\n", + " LongformerTokenizerFast.from_pretrained('allenai/longformer-base-4096'),\n", + " 
LongformerForSequenceClassification.from_pretrained('allenai/longformer-base-4096',\n", + " gradient_checkpointing=gradient_checkpointing,\n", + " attention_window=256,\n", + " num_labels=2)\n", + " ),\n", + " (\n", + " 'bigbird',\n", + " BigBirdTokenizerFast.from_pretrained('google/bigbird-roberta-base'),\n", + " BigBirdForSequenceClassification.from_pretrained('google/bigbird-roberta-base',\n", + " gradient_checkpointing=gradient_checkpointing,\n", + " num_labels=2)\n", + " )\n", + " ]:\n", + "\n", + " print(f\"Training {model_name} on size {size}\")\n", + "\n", + " run_training(timing, model, tokenizer, model_name, train_ds_dict[str(size)], test_ds, max_length=size)" + ] }, { "cell_type": "markdown", - "source": [ - "# ๐Ÿ“ˆ Visualize" - ], "metadata": { "id": "qSrz1KcnZrKe" - } + }, + "source": [ + "# ๐Ÿ“ˆ Visualize" + ] }, { "cell_type": "code", "execution_count": null, - "source": [ - "import os \r\n", - "run_dict = []\r\n", - "\r\n", - "# loop to collect the results from each model\r\n", - "for root, dirs, files in os.walk('./'):\r\n", - " for name in files:\r\n", - " if name.endswith((\"results.json\")):\r\n", - " full_path = os.path.join(root, name)\r\n", - " with open(full_path) as f:\r\n", - " data = json.load(f)\r\n", - " run_dict.append(data)" - ], - "outputs": [], "metadata": { "id": "BbwOrTy3BSZY" - } + }, + "outputs": [], + "source": [ + "import os \n", + "run_dict = []\n", + "\n", + "# loop to collect the results from each model\n", + "for root, dirs, files in os.walk('./'):\n", + " for name in files:\n", + " if name.endswith((\"results.json\")):\n", + " full_path = os.path.join(root, name)\n", + " with open(full_path) as f:\n", + " data = json.load(f)\n", + " run_dict.append(data)" + ] }, { "cell_type": "code", "execution_count": null, - "source": [ - "df = pd.DataFrame(run_dict)\r\n", - "df2 = df.metrics.apply(pd.Series)\r\n", - "result = pd.concat([df.drop('metrics', axis=1), df2], axis=1)\r\n", - "result[\"iden\"] = result[\"model_name\"] + '-' + result[\"tokencount\"].astype(str)\r\n", - "result['parameters'] /= 1e6\r\n", - "result.sort_values(by=['model_name', 'tokencount'], )" - ], + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 1000 + }, + "id": "fX1rutdXBSZZ", + "outputId": "fdd485e7-a0c6-4b66-d762-721916dc2e04" + }, "outputs": [ { - "output_type": "execute_result", "data": { - "text/plain": [ - " model_name parameters ... epoch iden\n", - "21 bigbird 128.060930 ... NaN bigbird-1024\n", - "22 bigbird 128.060930 ... 13.15 bigbird-1024\n", - "9 bigbird 128.060930 ... NaN bigbird-2048\n", - "24 bigbird 128.060930 ... 8.92 bigbird-2048\n", - "13 bigbird 128.060930 ... NaN bigbird-256\n", - "23 bigbird 128.060930 ... 83.30 bigbird-256\n", - "16 bigbird 128.060930 ... NaN bigbird-4096\n", - "29 bigbird 128.060930 ... 8.06 bigbird-4096\n", - "6 bigbird 128.060930 ... NaN bigbird-512\n", - "7 bigbird 128.060930 ... 27.71 bigbird-512\n", - "18 longformer 148.660994 ... NaN longformer-1024\n", - "27 longformer 148.660994 ... 13.15 longformer-1024\n", - "14 longformer 148.660994 ... 8.92 longformer-2048\n", - "20 longformer 148.660994 ... NaN longformer-2048\n", - "4 longformer 148.660994 ... NaN longformer-256\n", - "5 longformer 148.660994 ... 83.30 longformer-256\n", - "1 longformer 148.660994 ... NaN longformer-4096\n", - "10 longformer 148.660994 ... 8.06 longformer-4096\n", - "15 longformer 148.660994 ... NaN longformer-512\n", - "17 longformer 148.660994 ... 27.71 longformer-512\n", - "3 roberta 124.647170 ... 
13.15 roberta-1024\n", - "25 roberta 124.647170 ... NaN roberta-1024\n", - "11 roberta 124.647170 ... NaN roberta-2048\n", - "12 roberta 124.647170 ... 8.92 roberta-2048\n", - "2 roberta 124.647170 ... 83.30 roberta-256\n", - "26 roberta 124.647170 ... NaN roberta-256\n", - "19 roberta 124.647170 ... NaN roberta-4096\n", - "28 roberta 124.647170 ... 8.06 roberta-4096\n", - "0 roberta 124.647170 ... NaN roberta-512\n", - "8 roberta 124.647170 ... 27.71 roberta-512\n", - "\n", - "[30 rows x 15 columns]" - ], "text/html": [ "
\n", "