Unverified Commit e52288a1 authored by Sylvain Gugger, committed by GitHub

Rework notebooks and move them to the Notebooks repo (#12471)

parent 2d1d9218
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% md\n"
}
},
"source": [
"## Tokenization doesn't have to be slow !\n",
"\n",
"### Introduction\n",
"\n",
"Before going deep into any Machine Learning or Deep Learning Natural Language Processing models, every practitioner\n",
"should find a way to map raw input strings to a representation understandable by a trainable model.\n",
"\n",
"One very simple approach would be to split inputs over every space and assign an identifier to each word. This approach\n",
"would look similar to the code below in python\n",
"\n",
"```python\n",
"s = \"very long corpus...\"\n",
"words = s.split(\" \") # Split over space\n",
"vocabulary = dict(enumerate(set(words))) # Map storing the word to it's corresponding id\n",
"```\n",
"\n",
"This approach might work well if your vocabulary remains small as it would store every word (or **token**) present in your original\n",
"input. Moreover, word variations like \"cat\" and \"cats\" would not share the same identifiers even if their meaning is \n",
"quite close.\n",
"\n",
"![tokenization_simple](https://cdn.analyticsvidhya.com/wp-content/uploads/2019/11/tokenization.png)\n",
"\n",
"### Subtoken Tokenization\n",
"\n",
"To overcome the issues described above, recent works have been done on tokenization, leveraging \"subtoken\" tokenization.\n",
"**Subtokens** extends the previous splitting strategy to furthermore explode a word into grammatically logicial sub-components learned\n",
"from the data.\n",
"\n",
"Taking our previous example of the words __cat__ and __cats__, a sub-tokenization of the word __cats__ would be [cat, ##s]. Where the prefix _\"##\"_ indicates a subtoken of the initial input. \n",
"Such training algorithms might extract sub-tokens such as _\"##ing\"_, _\"##ed\"_ over English corpus.\n",
"\n",
"As you might think of, this kind of sub-tokens construction leveraging compositions of _\"pieces\"_ overall reduces the size\n",
"of the vocabulary you have to carry to train a Machine Learning model. On the other side, as one token might be exploded\n",
"into multiple subtokens, the input of your model might increase and become an issue on model with non-linear complexity over the input sequence's length. \n",
" \n",
"![subtokenization](https://nlp.fast.ai/images/multifit_vocabularies.png)\n",
" \n",
"Among all the tokenization algorithms, we can highlight a few subtokens algorithms used in Transformers-based SoTA models : \n",
"\n",
"- [Byte Pair Encoding (BPE) - Neural Machine Translation of Rare Words with Subword Units (Sennrich et al., 2015)](https://arxiv.org/abs/1508.07909)\n",
"- [Word Piece - Japanese and Korean voice search (Schuster, M., and Nakajima, K., 2015)](https://research.google/pubs/pub37842/)\n",
"- [Unigram Language Model - Subword Regularization: Improving Neural Network Translation Models with Multiple Subword Candidates (Kudo, T., 2018)](https://arxiv.org/abs/1804.10959)\n",
"- [Sentence Piece - A simple and language independent subword tokenizer and detokenizer for Neural Text Processing (Taku Kudo and John Richardson, 2018)](https://arxiv.org/abs/1808.06226)\n",
"\n",
"Going through all of them is out of the scope of this notebook, so we will just highlight how you can use them.\n",
"\n",
"### @huggingface/tokenizers library \n",
"Along with the transformers library, we @huggingface provide a blazing fast tokenization library\n",
"able to train, tokenize and decode dozens of Gb/s of text on a common multi-core machine.\n",
"\n",
"The library is written in Rust allowing us to take full advantage of multi-core parallel computations in a native and memory-aware way, on-top of which \n",
"we provide bindings for Python and NodeJS (more bindings may be added in the future). \n",
"\n",
"We designed the library so that it provides all the required blocks to create end-to-end tokenizers in an interchangeable way. In that sense, we provide\n",
"these various components: \n",
"\n",
"- **Normalizer**: Executes all the initial transformations over the initial input string. For example when you need to\n",
"lowercase some text, maybe strip it, or even apply one of the common unicode normalization process, you will add a Normalizer. \n",
"- **PreTokenizer**: In charge of splitting the initial input string. That's the component that decides where and how to\n",
"pre-segment the origin string. The simplest example would be like we saw before, to simply split on spaces.\n",
"- **Model**: Handles all the sub-token discovery and generation, this part is trainable and really dependant\n",
" of your input data.\n",
"- **Post-Processor**: Provides advanced construction features to be compatible with some of the Transformers-based SoTA\n",
"models. For instance, for BERT it would wrap the tokenized sentence around [CLS] and [SEP] tokens.\n",
"- **Decoder**: In charge of mapping back a tokenized input to the original string. The decoder is usually chosen according\n",
"to the `PreTokenizer` we used previously.\n",
"- **Trainer**: Provides training capabilities to each model.\n",
"\n",
"For each of the components above we provide multiple implementations:\n",
"\n",
"- **Normalizer**: Lowercase, Unicode (NFD, NFKD, NFC, NFKC), Bert, Strip, ...\n",
"- **PreTokenizer**: ByteLevel, WhitespaceSplit, CharDelimiterSplit, Metaspace, ...\n",
"- **Model**: WordLevel, BPE, WordPiece\n",
"- **Post-Processor**: BertProcessor, ...\n",
"- **Decoder**: WordLevel, BPE, WordPiece, ...\n",
"\n",
"All of these building blocks can be combined to create working tokenization pipelines. \n",
"In the next section we will go over our first pipeline."
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
}
},
"source": [
"Alright, now we are ready to implement our first tokenization pipeline through `tokenizers`. \n",
"\n",
"For this, we will train a Byte-Pair Encoding (BPE) tokenizer on a quite small input for the purpose of this notebook.\n",
"We will work with [the file from Peter Norving](https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=2ahUKEwjYp9Ppru_nAhUBzIUKHfbUAG8QFjAAegQIBhAB&url=https%3A%2F%2Fnorvig.com%2Fbig.txt&usg=AOvVaw2ed9iwhcP1RKUiEROs15Dz).\n",
"This file contains around 130.000 lines of raw text that will be processed by the library to generate a working tokenizer.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
}
},
"outputs": [],
"source": [
"!pip install tokenizers"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
}
},
"outputs": [],
"source": [
"BIG_FILE_URL = 'https://raw.githubusercontent.com/dscape/spell/master/test/resources/big.txt'\n",
"\n",
"# Let's download the file and save it somewhere\n",
"from requests import get\n",
"with open('big.txt', 'wb') as big_f:\n",
" response = get(BIG_FILE_URL, )\n",
" \n",
" if response.status_code == 200:\n",
" big_f.write(response.content)\n",
" else:\n",
" print(\"Unable to get the file: {}\".format(response.reason))\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% md\n"
}
},
"source": [
" \n",
"Now that we have our training data we need to create the overall pipeline for the tokenizer\n",
" "
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
}
},
"outputs": [],
"source": [
"# For the user's convenience `tokenizers` provides some very high-level classes encapsulating\n",
"# the overall pipeline for various well-known tokenization algorithm. \n",
"# Everything described below can be replaced by the ByteLevelBPETokenizer class. \n",
"\n",
"from tokenizers import Tokenizer\n",
"from tokenizers.decoders import ByteLevel as ByteLevelDecoder\n",
"from tokenizers.models import BPE\n",
"from tokenizers.normalizers import Lowercase, NFKC, Sequence\n",
"from tokenizers.pre_tokenizers import ByteLevel\n",
"\n",
"# First we create an empty Byte-Pair Encoding model (i.e. not trained model)\n",
"tokenizer = Tokenizer(BPE())\n",
"\n",
"# Then we enable lower-casing and unicode-normalization\n",
"# The Sequence normalizer allows us to combine multiple Normalizer that will be\n",
"# executed in order.\n",
"tokenizer.normalizer = Sequence([\n",
" NFKC(),\n",
" Lowercase()\n",
"])\n",
"\n",
"# Our tokenizer also needs a pre-tokenizer responsible for converting the input to a ByteLevel representation.\n",
"tokenizer.pre_tokenizer = ByteLevel()\n",
"\n",
"# And finally, let's plug a decoder so we can recover from a tokenized input to the original one\n",
"tokenizer.decoder = ByteLevelDecoder()"
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
}
},
"source": [
"The overall pipeline is now ready to be trained on the corpus we downloaded earlier in this notebook."
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Trained vocab size: 25000\n"
]
}
],
"source": [
"from tokenizers.trainers import BpeTrainer\n",
"\n",
"# We initialize our trainer, giving him the details about the vocabulary we want to generate\n",
"trainer = BpeTrainer(vocab_size=25000, show_progress=True, initial_alphabet=ByteLevel.alphabet())\n",
"tokenizer.train(files=[\"big.txt\"], trainer=trainer)\n",
"\n",
"print(\"Trained vocab size: {}\".format(tokenizer.get_vocab_size()))"
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
}
},
"source": [
"Et voilà ! You trained your very first tokenizer from scratch using `tokenizers`. Of course, this \n",
"covers only the basics, and you may want to have a look at the `add_special_tokens` or `special_tokens` parameters\n",
"on the `Trainer` class, but the overall process should be very similar.\n",
"\n",
"We can save the content of the model to reuse it later."
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
}
},
"outputs": [
{
"data": {
"text/plain": [
"['./vocab.json', './merges.txt']"
]
},
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# You will see the generated files in the output.\n",
"tokenizer.model.save('.')"
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
}
},
"source": [
"Now, let load the trained model and start using out newly trained tokenizer"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Encoded string: ['Ġthis', 'Ġis', 'Ġa', 'Ġsimple', 'Ġin', 'put', 'Ġto', 'Ġbe', 'Ġtoken', 'ized']\n",
"Decoded string: this is a simple input to be tokenized\n"
]
}
],
"source": [
"# Let's tokenizer a simple input\n",
"tokenizer.model = BPE('vocab.json', 'merges.txt')\n",
"encoding = tokenizer.encode(\"This is a simple input to be tokenized\")\n",
"\n",
"print(\"Encoded string: {}\".format(encoding.tokens))\n",
"\n",
"decoded = tokenizer.decode(encoding.ids)\n",
"print(\"Decoded string: {}\".format(decoded))"
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
}
},
"source": [
"The Encoding structure exposes multiple properties which are useful when working with transformers models\n",
"\n",
"- normalized_str: The input string after normalization (lower-casing, unicode, stripping, etc.)\n",
"- original_str: The input string as it was provided\n",
"- tokens: The generated tokens with their string representation\n",
"- input_ids: The generated tokens with their integer representation\n",
"- attention_mask: If your input has been padded by the tokenizer, then this would be a vector of 1 for any non padded token and 0 for padded ones.\n",
"- special_token_mask: If your input contains special tokens such as [CLS], [SEP], [MASK], [PAD], then this would be a vector with 1 in places where a special token has been added.\n",
"- type_ids: If your input was made of multiple \"parts\" such as (question, context), then this would be a vector with for each token the segment it belongs to.\n",
"- overflowing: If your input has been truncated into multiple subparts because of a length limit (for BERT for example the sequence length is limited to 512), this will contain all the remaining overflowing parts."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.6"
},
"pycharm": {
"stem_cell": {
"cell_type": "raw",
"metadata": {
"collapsed": false
},
"source": []
}
}
},
"nbformat": 4,
"nbformat_minor": 1
}
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "YKdSeUmVSXah",
"pycharm": {
"is_executing": false,
"name": "#%% md\n"
}
},
"source": [
"## Introduction\n",
"The transformers library is an open-source, community-based repository to train, use and share models based on \n",
"the Transformer architecture [(Vaswani & al., 2017)](https://arxiv.org/abs/1706.03762) such as Bert [(Devlin & al., 2018)](https://arxiv.org/abs/1810.04805),\n",
"Roberta [(Liu & al., 2019)](https://arxiv.org/abs/1907.11692), GPT2 [(Radford & al., 2019)](https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf),\n",
"XLNet [(Yang & al., 2019)](https://arxiv.org/abs/1906.08237), etc. \n",
"\n",
"Along with the models, the library contains multiple variations of each of them for a large variety of \n",
"downstream-tasks like **Named Entity Recognition (NER)**, **Sentiment Analysis**, \n",
"**Language Modeling**, **Question Answering** and so on.\n",
"\n",
"## Before Transformer\n",
"\n",
"Back to 2017, most of the people using Neural Networks when working on Natural Language Processing were relying on \n",
"sequential processing of the input through [Recurrent Neural Network (RNN)](https://en.wikipedia.org/wiki/Recurrent_neural_network).\n",
"\n",
"![rnn](http://colah.github.io/posts/2015-09-NN-Types-FP/img/RNN-general.png) \n",
"\n",
"RNNs were performing well on large variety of tasks involving sequential dependency over the input sequence. \n",
"However, this sequentially-dependent process had issues modeling very long range dependencies and \n",
"was not well suited for the kind of hardware we're currently leveraging due to bad parallelization capabilities. \n",
"\n",
"Some extensions were provided by the academic community, such as Bidirectional RNN ([Schuster & Paliwal., 1997](https://www.researchgate.net/publication/3316656_Bidirectional_recurrent_neural_networks), [Graves & al., 2005](https://mediatum.ub.tum.de/doc/1290195/file.pdf)), \n",
"which can be seen as a concatenation of two sequential process, one going forward, the other one going backward over the sequence input.\n",
"\n",
"![birnn](https://miro.medium.com/max/764/1*6QnPUSv_t9BY9Fv8_aLb-Q.png)\n",
"\n",
"\n",
"And also, the Attention mechanism, which introduced a good improvement over \"raw\" RNNs by giving \n",
"a learned, weighted-importance to each element in the sequence, allowing the model to focus on important elements.\n",
"\n",
"![attention_rnn](https://3qeqpr26caki16dnhd19sv6by6v-wpengine.netdna-ssl.com/wp-content/uploads/2017/08/Example-of-Attention.png) \n",
"\n",
"## Then comes the Transformer \n",
"\n",
"The Transformers era originally started from the work of [(Vaswani & al., 2017)](https://arxiv.org/abs/1706.03762) who\n",
"demonstrated its superiority over [Recurrent Neural Network (RNN)](https://en.wikipedia.org/wiki/Recurrent_neural_network)\n",
"on translation tasks but it quickly extended to almost all the tasks RNNs were State-of-the-Art at that time.\n",
"\n",
"One advantage of Transformer over its RNN counterpart was its non sequential attention model. Remember, the RNNs had to\n",
"iterate over each element of the input sequence one-by-one and carry an \"updatable-state\" between each hop. With Transformer, the model is able to look at every position in the sequence, at the same time, in one operation.\n",
"\n",
"For a deep-dive into the Transformer architecture, [The Annotated Transformer](https://nlp.seas.harvard.edu/2018/04/03/attention.html#encoder-and-decoder-stacks) \n",
"will drive you along all the details of the paper.\n",
"\n",
"![transformer-encoder-decoder](https://nlp.seas.harvard.edu/images/the-annotated-transformer_14_0.png)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "TFHTP6CFSXai",
"pycharm": {
"name": "#%% md\n"
}
},
"source": [
"## Getting started with transformers\n",
"\n",
"For the rest of this notebook, we will use the [BERT (Devlin & al., 2018)](https://arxiv.org/abs/1810.04805) architecture, as it's the most simple and there are plenty of content about it\n",
"over the internet, it will be easy to dig more over this architecture if you want to.\n",
"\n",
"The transformers library allows you to benefits from large, pretrained language models without requiring a huge and costly computational\n",
"infrastructure. Most of the State-of-the-Art models are provided directly by their author and made available in the library \n",
"in PyTorch and TensorFlow in a transparent and interchangeable way. \n",
"\n",
"If you're executing this notebook in Colab, you will need to install the transformers library. You can do so with this command:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"id": "KnT3Jn6fSXai",
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
},
"scrolled": true
},
"outputs": [],
"source": [
"# !pip install transformers"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "UIQGDTIDSXai",
"outputId": "9851454a-c898-4fba-a389-9b16462a27c1",
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
}
},
"outputs": [
{
"data": {
"text/plain": [
"<torch.autograd.grad_mode.set_grad_enabled at 0x7ff0cc2a2c50>"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import torch\n",
"from transformers import AutoModel, AutoTokenizer, BertTokenizer\n",
"\n",
"torch.set_grad_enabled(False)"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"id": "1xMDTHQXSXai",
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
}
},
"outputs": [],
"source": [
"# Store the model we want to use\n",
"MODEL_NAME = \"bert-base-cased\"\n",
"\n",
"# We need to create the model and tokenizer\n",
"model = AutoModel.from_pretrained(MODEL_NAME)\n",
"tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "l6EcynhYSXai",
"pycharm": {
"name": "#%% md\n"
}
},
"source": [
"With only the above two lines of code, you're ready to use a BERT pre-trained model. \n",
"The tokenizers will allow us to map a raw textual input to a sequence of integers representing our textual input\n",
"in a way the model can manipulate. Since we will be using a PyTorch model, we ask the tokenizer to return to us PyTorch tensors."
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"input_ids:\n",
"\ttensor([[ 101, 1188, 1110, 1126, 7758, 1859, 102]])\n",
"token_type_ids:\n",
"\ttensor([[0, 0, 0, 0, 0, 0, 0]])\n",
"attention_mask:\n",
"\ttensor([[1, 1, 1, 1, 1, 1, 1]])\n"
]
}
],
"source": [
"tokens_pt = tokenizer(\"This is an input example\", return_tensors=\"pt\")\n",
"for key, value in tokens_pt.items():\n",
" print(\"{}:\\n\\t{}\".format(key, value))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The tokenizer automatically converted our input to all the inputs expected by the model. It generated some additional tensors on top of the IDs: \n",
"\n",
"- token_type_ids: This tensor will map every tokens to their corresponding segment (see below).\n",
"- attention_mask: This tensor is used to \"mask\" padded values in a batch of sequence with different lengths (see below).\n",
"\n",
"You can check our [glossary](https://huggingface.co/transformers/glossary.html) for more information about each of those keys. \n",
"\n",
"We can just feed this directly into our model:"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "XgkFg52fSXai",
"outputId": "94b569d4-5415-4327-f39e-c9541b0a53e0",
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Token wise output: torch.Size([1, 7, 768]), Pooled output: torch.Size([1, 768])\n"
]
}
],
"source": [
"outputs = model(**tokens_pt)\n",
"last_hidden_state = outputs.last_hidden_state\n",
"pooler_output = outputs.pooler_output\n",
"\n",
"print(\"Token wise output: {}, Pooled output: {}\".format(last_hidden_state.shape, pooler_output.shape))"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "lBbvwNKXSXaj",
"pycharm": {
"name": "#%% md\n"
}
},
"source": [
"As you can see, BERT outputs two tensors:\n",
" - One with the generated representation for every token in the input `(1, NB_TOKENS, REPRESENTATION_SIZE)`\n",
" - One with an aggregated representation for the whole input `(1, REPRESENTATION_SIZE)`\n",
" \n",
"The first, token-based, representation can be leveraged if your task requires to keep the sequence representation and you\n",
"want to operate at a token-level. This is particularly useful for Named Entity Recognition and Question-Answering.\n",
"\n",
"The second, aggregated, representation is especially useful if you need to extract the overall context of the sequence and don't\n",
"require a fine-grained token-level. This is the case for Sentiment-Analysis of the sequence or Information Retrieval."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "Pl2HIcwDSXal",
"outputId": "22e5d010-47a9-4a12-a67d-208e5016157e",
"pycharm": {
"is_executing": false
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Single segment token (str): ['[CLS]', 'This', 'is', 'a', 'sample', 'input', '[SEP]']\n",
"Single segment token (int): [101, 1188, 1110, 170, 6876, 7758, 102]\n",
"Single segment type : [0, 0, 0, 0, 0, 0, 0]\n",
"\n",
"Multi segment token (str): ['[CLS]', 'This', 'is', 'segment', 'A', '[SEP]', 'This', 'is', 'segment', 'B', '[SEP]']\n",
"Multi segment token (int): [101, 1188, 1110, 6441, 138, 102, 1188, 1110, 6441, 139, 102]\n",
"Multi segment type : [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1]\n"
]
}
],
"source": [
"# Single segment input\n",
"single_seg_input = tokenizer(\"This is a sample input\")\n",
"\n",
"# Multiple segment input\n",
"multi_seg_input = tokenizer(\"This is segment A\", \"This is segment B\")\n",
"\n",
"print(\"Single segment token (str): {}\".format(tokenizer.convert_ids_to_tokens(single_seg_input['input_ids'])))\n",
"print(\"Single segment token (int): {}\".format(single_seg_input['input_ids']))\n",
"print(\"Single segment type : {}\".format(single_seg_input['token_type_ids']))\n",
"\n",
"# Segments are concatened in the input to the model, with \n",
"print()\n",
"print(\"Multi segment token (str): {}\".format(tokenizer.convert_ids_to_tokens(multi_seg_input['input_ids'])))\n",
"print(\"Multi segment token (int): {}\".format(multi_seg_input['input_ids']))\n",
"print(\"Multi segment type : {}\".format(multi_seg_input['token_type_ids']))"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "1NtvWOgzSXam",
"outputId": "e66c47d0-e106-408d-d01c-9ac194ca3ec6",
"pycharm": {
"is_executing": false
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Tokens (int) : [101, 1188, 1110, 170, 6876, 102, 0, 0]\n",
"Tokens (str) : ['[CLS]', 'This', 'is', 'a', 'sample', '[SEP]', '[PAD]', '[PAD]']\n",
"Tokens (attn_mask): [1, 1, 1, 1, 1, 1, 0, 0]\n",
"\n",
"Tokens (int) : [101, 1188, 1110, 1330, 2039, 6876, 3087, 102]\n",
"Tokens (str) : ['[CLS]', 'This', 'is', 'another', 'longer', 'sample', 'text', '[SEP]']\n",
"Tokens (attn_mask): [1, 1, 1, 1, 1, 1, 1, 1]\n",
"\n"
]
}
],
"source": [
"# Padding highlight\n",
"tokens = tokenizer(\n",
" [\"This is a sample\", \"This is another longer sample text\"], \n",
" padding=True # First sentence will have some PADDED tokens to match second sequence length\n",
")\n",
"\n",
"for i in range(2):\n",
" print(\"Tokens (int) : {}\".format(tokens['input_ids'][i]))\n",
" print(\"Tokens (str) : {}\".format([tokenizer.convert_ids_to_tokens(s) for s in tokens['input_ids'][i]]))\n",
" print(\"Tokens (attn_mask): {}\".format(tokens['attention_mask'][i]))\n",
" print()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "vkRYm2HESXan"
},
"source": [
"## Frameworks interoperability\n",
"\n",
"One of the most powerfull feature of transformers is its ability to seamlessly move from PyTorch to Tensorflow\n",
"without pain for the user.\n",
"\n",
"We provide some convenient methods to load TensorFlow pretrained weight insinde a PyTorch model and opposite."
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"id": "Kubwm-wJSXan",
"pycharm": {
"is_executing": false
}
},
"outputs": [
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "3b971be3639d4fedb02778fb5c6898a0",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=526681800.0, style=ProgressStyle(descri…"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Some layers from the model checkpoint at bert-base-cased were not used when initializing TFBertModel: ['nsp___cls', 'mlm___cls']\n",
"- This IS expected if you are initializing TFBertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n",
"- This IS NOT expected if you are initializing TFBertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n",
"All the layers of TFBertModel were initialized from the model checkpoint at bert-base-cased.\n",
"If your task is similar to the task the model of the checkpoint was trained on, you can already use TFBertModel for predictions without further training.\n"
]
}
],
"source": [
"from transformers import TFBertModel, BertModel\n",
"\n",
"# Let's load a BERT model for TensorFlow and PyTorch\n",
"model_tf = TFBertModel.from_pretrained('bert-base-cased')\n",
"model_pt = BertModel.from_pretrained('bert-base-cased')"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "lJ13tlzOSXan",
"outputId": "1e4ac151-a8fc-4b34-946a-da0bc44ed0e6",
"pycharm": {
"is_executing": false
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"last_hidden_state differences: 1.2933e-05\n",
"pooler_output differences: 2.9691e-06\n"
]
}
],
"source": [
"# transformers generates a ready to use dictionary with all the required parameters for the specific framework.\n",
"input_tf = tokenizer(\"This is a sample input\", return_tensors=\"tf\")\n",
"input_pt = tokenizer(\"This is a sample input\", return_tensors=\"pt\")\n",
"\n",
"# Let's compare the outputs\n",
"output_tf, output_pt = model_tf(input_tf), model_pt(**input_pt)\n",
"\n",
"# Models outputs 2 values (The value for each tokens, the pooled representation of the input sentence)\n",
"# Here we compare the output differences between PyTorch and TensorFlow.\n",
"for name in [\"last_hidden_state\", \"pooler_output\"]:\n",
" print(\"{} differences: {:.5}\".format(name, (output_tf[name].numpy() - output_pt[name].numpy()).sum()))"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "CQf_fpApSXao",
"pycharm": {
"name": "#%% md\n"
}
},
"source": [
"## Want it lighter? Faster? Let's talk distillation! \n",
"\n",
"One of the main concerns when using these Transformer based models is the computational power they require. All over this notebook we are using BERT model as it can be run on common machines but that's not the case for all of the models.\n",
"\n",
"For example, Google released a few months ago **T5** an Encoder/Decoder architecture based on Transformer and available in `transformers` with no more than 11 billions parameters. Microsoft also recently entered the game with **Turing-NLG** using 17 billions parameters. This kind of model requires tens of gigabytes to store the weights and a tremendous compute infrastructure to run such models which makes it impracticable for the common man !\n",
"\n",
"![transformers-parameters](https://github.com/huggingface/notebooks/blob/master/examples/images/model_parameters.png?raw=true)\n",
"\n",
"With the goal of making Transformer-based NLP accessible to everyone we @huggingface developed models that take advantage of a training process called **Distillation** which allows us to drastically reduce the resources needed to run such models with almost zero drop in performances.\n",
"\n",
"Going over the whole Distillation process is out of the scope of this notebook, but if you want more information on the subject you may refer to [this Medium article written by my colleague Victor SANH, author of DistilBERT paper](https://medium.com/huggingface/distilbert-8cf3380435b5), you might also want to directly have a look at the paper [(Sanh & al., 2019)](https://arxiv.org/abs/1910.01108)\n",
"\n",
"Of course, in `transformers` we have distilled some models and made them available directly in the library ! "
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 185,
"referenced_widgets": [
"fcffccb427714665bec7d621d00d4ce3",
"7aa02ef05fe64489ad6c969dd92d1b07",
"817f53e3fa5c43e29ad1b4410c3df7db",
"50a441d7a43c4a809a09505b36e83375",
"497ba6a585a147459f1346c0661d5c94",
"a18c5319739141af9a255bccf25f6884",
"cf319210c8134cdba487525c49e4813b",
"f1f0272d9bea4e9fad3be8646b45d629",
"530b39d56f6b4e0caae3317855c4bcf4",
"c5e735694f2c4813a1d6f0d867119f67",
"8d53b8dc213f405d8187f3c1f005826d",
"d492afe626804d95a5cfac0550913190",
"a657a312068b43529afed2050bce572f",
"fe230ff13a82400f97cf6f292e8851ba",
"be97cf2269d748f3b1a916b5376f7736",
"74bd90a09da74db5bcbbe86f044bd664"
]
},
"id": "wfxMOXb-SXao",
"outputId": "fa667556-fbf2-4c86-fc7e-9e3d3ec9da88",
"pycharm": {
"is_executing": false
}
},
"outputs": [
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "fcffccb427714665bec7d621d00d4ce3",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=411.0, style=ProgressStyle(description_…"
]
},
"metadata": {
"tags": []
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n"
]
},
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "530b39d56f6b4e0caae3317855c4bcf4",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=263273408.0, style=ProgressStyle(descri…"
]
},
"metadata": {
"tags": []
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"CPU times: user 64.4 ms, sys: 0 ns, total: 64.4 ms\n",
"Wall time: 72.9 ms\n",
"CPU times: user 130 ms, sys: 124 µs, total: 130 ms\n",
"Wall time: 131 ms\n"
]
}
],
"source": [
"from transformers import DistilBertModel\n",
"\n",
"bert_distil = DistilBertModel.from_pretrained('distilbert-base-cased')\n",
"input_pt = tokenizer(\n",
" 'This is a sample input to demonstrate performance of distiled models especially inference time', \n",
" return_tensors=\"pt\"\n",
")\n",
"\n",
"\n",
"%time _ = bert_distil(input_pt['input_ids'])\n",
"%time _ = model_pt(input_pt['input_ids'])"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "7lSIc7FbSXao"
},
"source": [
"## Community provided models\n",
"\n",
"Last but not least, earlier in this notebook we introduced Hugging Face `transformers` as a repository for the NLP community to exchange pretrained models. We wanted to highlight this features and all the possibilities it offers for the end-user.\n",
"\n",
"To leverage community pretrained models, just provide the organisation name and name of the model to `from_pretrained` and it will do all the magic for you ! \n",
"\n",
"\n",
"We currently have more 50 models provided by the community and more are added every day, don't hesitate to give it a try !"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "cxLYnadGSXao",
"outputId": "70ab584a-e795-490a-8c6a-06e034b3df3d",
"pycharm": {
"is_executing": false
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Tokens (int) : [102, 12272, 9355, 5746, 30881, 215, 261, 5945, 4118, 212, 2414, 153, 1942, 232, 3532, 566, 103]\n",
"Tokens (str) : ['[CLS]', 'Hug', '##ging', 'Fac', '##e', 'ist', 'eine', 'französische', 'Firma', 'mit', 'Sitz', 'in', 'New', '-', 'York', '.', '[SEP]']\n",
"Tokens (attn_mask): [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]\n",
"\n",
"Token wise output: torch.Size([1, 17, 768]), Pooled output: torch.Size([1, 768])\n"
]
}
],
"source": [
"# Let's load German BERT from the Bavarian State Library\n",
"de_bert = BertModel.from_pretrained(\"dbmdz/bert-base-german-cased\")\n",
"de_tokenizer = BertTokenizer.from_pretrained(\"dbmdz/bert-base-german-cased\")\n",
"\n",
"de_input = de_tokenizer(\n",
" \"Hugging Face ist eine französische Firma mit Sitz in New-York.\",\n",
" return_tensors=\"pt\"\n",
")\n",
"print(\"Tokens (int) : {}\".format(de_input['input_ids'].tolist()[0]))\n",
"print(\"Tokens (str) : {}\".format([de_tokenizer.convert_ids_to_tokens(s) for s in de_input['input_ids'].tolist()[0]]))\n",
"print(\"Tokens (attn_mask): {}\".format(de_input['attention_mask'].tolist()[0]))\n",
"print()\n",
"\n",
"outputs_de = de_bert(**de_input)\n",
"last_hidden_state_de = outputs_de.last_hidden_state\n",
"pooler_output_de = outputs_de.pooler_output\n",
"\n",
"print(\"Token wise output: {}, Pooled output: {}\".format(last_hidden_state_de.shape, pooler_output_de.shape))"
]
}
],
"metadata": {
"colab": {
"name": "02-transformers.ipynb",
"provenance": []
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.9"
},
"pycharm": {
"stem_cell": {
"cell_type": "raw",
"metadata": {
"collapsed": false
},
"source": []
}
},
"widgets": {
"application/vnd.jupyter.widget-state+json": {
"497ba6a585a147459f1346c0661d5c94": {
"model_module": "@jupyter-widgets/controls",
"model_name": "ProgressStyleModel",
"state": {
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "ProgressStyleModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "StyleView",
"bar_color": null,
"description_width": "initial"
}
},
"50a441d7a43c4a809a09505b36e83375": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HTMLModel",
"state": {
"_dom_classes": [],
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "HTMLModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/controls",
"_view_module_version": "1.5.0",
"_view_name": "HTMLView",
"description": "",
"description_tooltip": null,
"layout": "IPY_MODEL_f1f0272d9bea4e9fad3be8646b45d629",
"placeholder": "​",
"style": "IPY_MODEL_cf319210c8134cdba487525c49e4813b",
"value": " 411/411 [00:19&lt;00:00, 21.5B/s]"
}
},
"530b39d56f6b4e0caae3317855c4bcf4": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HBoxModel",
"state": {
"_dom_classes": [],
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "HBoxModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/controls",
"_view_module_version": "1.5.0",
"_view_name": "HBoxView",
"box_style": "",
"children": [
"IPY_MODEL_8d53b8dc213f405d8187f3c1f005826d",
"IPY_MODEL_d492afe626804d95a5cfac0550913190"
],
"layout": "IPY_MODEL_c5e735694f2c4813a1d6f0d867119f67"
}
},
"74bd90a09da74db5bcbbe86f044bd664": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_model_module": "@jupyter-widgets/base",
"_model_module_version": "1.2.0",
"_model_name": "LayoutModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "LayoutView",
"align_content": null,
"align_items": null,
"align_self": null,
"border": null,
"bottom": null,
"display": null,
"flex": null,
"flex_flow": null,
"grid_area": null,
"grid_auto_columns": null,
"grid_auto_flow": null,
"grid_auto_rows": null,
"grid_column": null,
"grid_gap": null,
"grid_row": null,
"grid_template_areas": null,
"grid_template_columns": null,
"grid_template_rows": null,
"height": null,
"justify_content": null,
"justify_items": null,
"left": null,
"margin": null,
"max_height": null,
"max_width": null,
"min_height": null,
"min_width": null,
"object_fit": null,
"object_position": null,
"order": null,
"overflow": null,
"overflow_x": null,
"overflow_y": null,
"padding": null,
"right": null,
"top": null,
"visibility": null,
"width": null
}
},
"7aa02ef05fe64489ad6c969dd92d1b07": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_model_module": "@jupyter-widgets/base",
"_model_module_version": "1.2.0",
"_model_name": "LayoutModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "LayoutView",
"align_content": null,
"align_items": null,
"align_self": null,
"border": null,
"bottom": null,
"display": null,
"flex": null,
"flex_flow": null,
"grid_area": null,
"grid_auto_columns": null,
"grid_auto_flow": null,
"grid_auto_rows": null,
"grid_column": null,
"grid_gap": null,
"grid_row": null,
"grid_template_areas": null,
"grid_template_columns": null,
"grid_template_rows": null,
"height": null,
"justify_content": null,
"justify_items": null,
"left": null,
"margin": null,
"max_height": null,
"max_width": null,
"min_height": null,
"min_width": null,
"object_fit": null,
"object_position": null,
"order": null,
"overflow": null,
"overflow_x": null,
"overflow_y": null,
"padding": null,
"right": null,
"top": null,
"visibility": null,
"width": null
}
},
"817f53e3fa5c43e29ad1b4410c3df7db": {
"model_module": "@jupyter-widgets/controls",
"model_name": "FloatProgressModel",
"state": {
"_dom_classes": [],
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "FloatProgressModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/controls",
"_view_module_version": "1.5.0",
"_view_name": "ProgressView",
"bar_style": "success",
"description": "Downloading: 100%",
"description_tooltip": null,
"layout": "IPY_MODEL_a18c5319739141af9a255bccf25f6884",
"max": 411,
"min": 0,
"orientation": "horizontal",
"style": "IPY_MODEL_497ba6a585a147459f1346c0661d5c94",
"value": 411
}
},
"8d53b8dc213f405d8187f3c1f005826d": {
"model_module": "@jupyter-widgets/controls",
"model_name": "FloatProgressModel",
"state": {
"_dom_classes": [],
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "FloatProgressModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/controls",
"_view_module_version": "1.5.0",
"_view_name": "ProgressView",
"bar_style": "success",
"description": "Downloading: 100%",
"description_tooltip": null,
"layout": "IPY_MODEL_fe230ff13a82400f97cf6f292e8851ba",
"max": 263273408,
"min": 0,
"orientation": "horizontal",
"style": "IPY_MODEL_a657a312068b43529afed2050bce572f",
"value": 263273408
}
},
"a18c5319739141af9a255bccf25f6884": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_model_module": "@jupyter-widgets/base",
"_model_module_version": "1.2.0",
"_model_name": "LayoutModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "LayoutView",
"align_content": null,
"align_items": null,
"align_self": null,
"border": null,
"bottom": null,
"display": null,
"flex": null,
"flex_flow": null,
"grid_area": null,
"grid_auto_columns": null,
"grid_auto_flow": null,
"grid_auto_rows": null,
"grid_column": null,
"grid_gap": null,
"grid_row": null,
"grid_template_areas": null,
"grid_template_columns": null,
"grid_template_rows": null,
"height": null,
"justify_content": null,
"justify_items": null,
"left": null,
"margin": null,
"max_height": null,
"max_width": null,
"min_height": null,
"min_width": null,
"object_fit": null,
"object_position": null,
"order": null,
"overflow": null,
"overflow_x": null,
"overflow_y": null,
"padding": null,
"right": null,
"top": null,
"visibility": null,
"width": null
}
},
"a657a312068b43529afed2050bce572f": {
"model_module": "@jupyter-widgets/controls",
"model_name": "ProgressStyleModel",
"state": {
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "ProgressStyleModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "StyleView",
"bar_color": null,
"description_width": "initial"
}
},
"be97cf2269d748f3b1a916b5376f7736": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "DescriptionStyleModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "StyleView",
"description_width": ""
}
},
"c5e735694f2c4813a1d6f0d867119f67": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_model_module": "@jupyter-widgets/base",
"_model_module_version": "1.2.0",
"_model_name": "LayoutModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "LayoutView",
"align_content": null,
"align_items": null,
"align_self": null,
"border": null,
"bottom": null,
"display": null,
"flex": null,
"flex_flow": null,
"grid_area": null,
"grid_auto_columns": null,
"grid_auto_flow": null,
"grid_auto_rows": null,
"grid_column": null,
"grid_gap": null,
"grid_row": null,
"grid_template_areas": null,
"grid_template_columns": null,
"grid_template_rows": null,
"height": null,
"justify_content": null,
"justify_items": null,
"left": null,
"margin": null,
"max_height": null,
"max_width": null,
"min_height": null,
"min_width": null,
"object_fit": null,
"object_position": null,
"order": null,
"overflow": null,
"overflow_x": null,
"overflow_y": null,
"padding": null,
"right": null,
"top": null,
"visibility": null,
"width": null
}
},
"cf319210c8134cdba487525c49e4813b": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "DescriptionStyleModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "StyleView",
"description_width": ""
}
},
"d492afe626804d95a5cfac0550913190": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HTMLModel",
"state": {
"_dom_classes": [],
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "HTMLModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/controls",
"_view_module_version": "1.5.0",
"_view_name": "HTMLView",
"description": "",
"description_tooltip": null,
"layout": "IPY_MODEL_74bd90a09da74db5bcbbe86f044bd664",
"placeholder": "​",
"style": "IPY_MODEL_be97cf2269d748f3b1a916b5376f7736",
"value": " 263M/263M [00:06&lt;00:00, 43.5MB/s]"
}
},
"f1f0272d9bea4e9fad3be8646b45d629": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_model_module": "@jupyter-widgets/base",
"_model_module_version": "1.2.0",
"_model_name": "LayoutModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "LayoutView",
"align_content": null,
"align_items": null,
"align_self": null,
"border": null,
"bottom": null,
"display": null,
"flex": null,
"flex_flow": null,
"grid_area": null,
"grid_auto_columns": null,
"grid_auto_flow": null,
"grid_auto_rows": null,
"grid_column": null,
"grid_gap": null,
"grid_row": null,
"grid_template_areas": null,
"grid_template_columns": null,
"grid_template_rows": null,
"height": null,
"justify_content": null,
"justify_items": null,
"left": null,
"margin": null,
"max_height": null,
"max_width": null,
"min_height": null,
"min_width": null,
"object_fit": null,
"object_position": null,
"order": null,
"overflow": null,
"overflow_x": null,
"overflow_y": null,
"padding": null,
"right": null,
"top": null,
"visibility": null,
"width": null
}
},
"fcffccb427714665bec7d621d00d4ce3": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HBoxModel",
"state": {
"_dom_classes": [],
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "HBoxModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/controls",
"_view_module_version": "1.5.0",
"_view_name": "HBoxView",
"box_style": "",
"children": [
"IPY_MODEL_817f53e3fa5c43e29ad1b4410c3df7db",
"IPY_MODEL_50a441d7a43c4a809a09505b36e83375"
],
"layout": "IPY_MODEL_7aa02ef05fe64489ad6c969dd92d1b07"
}
},
"fe230ff13a82400f97cf6f292e8851ba": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_model_module": "@jupyter-widgets/base",
"_model_module_version": "1.2.0",
"_model_name": "LayoutModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "LayoutView",
"align_content": null,
"align_items": null,
"align_self": null,
"border": null,
"bottom": null,
"display": null,
"flex": null,
"flex_flow": null,
"grid_area": null,
"grid_auto_columns": null,
"grid_auto_flow": null,
"grid_auto_rows": null,
"grid_column": null,
"grid_gap": null,
"grid_row": null,
"grid_template_areas": null,
"grid_template_columns": null,
"grid_template_rows": null,
"height": null,
"justify_content": null,
"justify_items": null,
"left": null,
"margin": null,
"max_height": null,
"max_width": null,
"min_height": null,
"min_width": null,
"object_fit": null,
"object_position": null,
"order": null,
"overflow": null,
"overflow_x": null,
"overflow_y": null,
"padding": null,
"right": null,
"top": null,
"visibility": null,
"width": null
}
}
}
}
},
"nbformat": 4,
"nbformat_minor": 1
}
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.6"
},
"pycharm": {
"stem_cell": {
"cell_type": "raw",
"source": [],
"metadata": {
"collapsed": false
}
}
},
"colab": {
"name": "03-pipelines.ipynb",
"provenance": [],
"include_colab_link": true
},
"widgets": {
"application/vnd.jupyter.widget-state+json": {
"4bab5df43b3c46caadf48e264344ab42": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HBoxModel",
"state": {
"_view_name": "HBoxView",
"_dom_classes": [],
"_model_name": "HBoxModel",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.5.0",
"box_style": "",
"layout": "IPY_MODEL_9b426c68631f4bb288e2ca79aad9f9d9",
"_model_module": "@jupyter-widgets/controls",
"children": [
"IPY_MODEL_6902104f7ec143519fb1a6ab9363d4a0",
"IPY_MODEL_c133fb34fe2a4aba8a6b233671af8b04"
]
}
},
"9b426c68631f4bb288e2ca79aad9f9d9": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"6902104f7ec143519fb1a6ab9363d4a0": {
"model_module": "@jupyter-widgets/controls",
"model_name": "IntProgressModel",
"state": {
"_view_name": "ProgressView",
"style": "IPY_MODEL_e3f72d443a74414ca62c2b848d34b125",
"_dom_classes": [],
"description": "Downloading: 100%",
"_model_name": "IntProgressModel",
"bar_style": "success",
"max": 230,
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": 230,
"_view_count": null,
"_view_module_version": "1.5.0",
"orientation": "horizontal",
"min": 0,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_5462b581976e47048642aa6bc12435bd"
}
},
"c133fb34fe2a4aba8a6b233671af8b04": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HTMLModel",
"state": {
"_view_name": "HTMLView",
"style": "IPY_MODEL_ad84da685cf44abb90d17d9d2e023b48",
"_dom_classes": [],
"description": "",
"_model_name": "HTMLModel",
"placeholder": "​",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": " 230/230 [00:01&lt;00:00, 194B/s]",
"_view_count": null,
"_view_module_version": "1.5.0",
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_a246f9eea2d7440cb979e728741d2e32"
}
},
"e3f72d443a74414ca62c2b848d34b125": {
"model_module": "@jupyter-widgets/controls",
"model_name": "ProgressStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "ProgressStyleModel",
"description_width": "initial",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"bar_color": null,
"_model_module": "@jupyter-widgets/controls"
}
},
"5462b581976e47048642aa6bc12435bd": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"ad84da685cf44abb90d17d9d2e023b48": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"a246f9eea2d7440cb979e728741d2e32": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"451464c936444ba5a652b46c1b4f9931": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HBoxModel",
"state": {
"_view_name": "HBoxView",
"_dom_classes": [],
"_model_name": "HBoxModel",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.5.0",
"box_style": "",
"layout": "IPY_MODEL_279291efd2c14a9eb2c3b98efbf152ad",
"_model_module": "@jupyter-widgets/controls",
"children": [
"IPY_MODEL_b6e1a2e57f4948a39283f1370352612c",
"IPY_MODEL_9d4941ebdfa64978b47232f6e5908d97"
]
}
},
"279291efd2c14a9eb2c3b98efbf152ad": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"b6e1a2e57f4948a39283f1370352612c": {
"model_module": "@jupyter-widgets/controls",
"model_name": "IntProgressModel",
"state": {
"_view_name": "ProgressView",
"style": "IPY_MODEL_1006cc0fab1e4139bb7b135486261c92",
"_dom_classes": [],
"description": "Downloading: 100%",
"_model_name": "IntProgressModel",
"bar_style": "success",
"max": 230,
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": 230,
"_view_count": null,
"_view_module_version": "1.5.0",
"orientation": "horizontal",
"min": 0,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_691c0bae60364890ab74934261207d4d"
}
},
"9d4941ebdfa64978b47232f6e5908d97": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HTMLModel",
"state": {
"_view_name": "HTMLView",
"style": "IPY_MODEL_405afa5bb8b840d8bc0850e02f593ce4",
"_dom_classes": [],
"description": "",
"_model_name": "HTMLModel",
"placeholder": "​",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": " 230/230 [00:01&lt;00:00, 198B/s]",
"_view_count": null,
"_view_module_version": "1.5.0",
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_78c718e3d5fa4cb892217260bea6d540"
}
},
"1006cc0fab1e4139bb7b135486261c92": {
"model_module": "@jupyter-widgets/controls",
"model_name": "ProgressStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "ProgressStyleModel",
"description_width": "initial",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"bar_color": null,
"_model_module": "@jupyter-widgets/controls"
}
},
"691c0bae60364890ab74934261207d4d": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"405afa5bb8b840d8bc0850e02f593ce4": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"78c718e3d5fa4cb892217260bea6d540": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"7d66a4534c164d2f9493fc0467abebbd": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HBoxModel",
"state": {
"_view_name": "HBoxView",
"_dom_classes": [],
"_model_name": "HBoxModel",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.5.0",
"box_style": "",
"layout": "IPY_MODEL_7a15588f85b14f2b93e32b4c0442fa1b",
"_model_module": "@jupyter-widgets/controls",
"children": [
"IPY_MODEL_213567d815894ca08041f6d682ced3c9",
"IPY_MODEL_ee6c95e700e64d0a9ebec2c1545dd083"
]
}
},
"7a15588f85b14f2b93e32b4c0442fa1b": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"213567d815894ca08041f6d682ced3c9": {
"model_module": "@jupyter-widgets/controls",
"model_name": "IntProgressModel",
"state": {
"_view_name": "ProgressView",
"style": "IPY_MODEL_3e556abf5c4a4ee69d52366fd59471b2",
"_dom_classes": [],
"description": "Downloading: 100%",
"_model_name": "IntProgressModel",
"bar_style": "success",
"max": 230,
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": 230,
"_view_count": null,
"_view_module_version": "1.5.0",
"orientation": "horizontal",
"min": 0,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_876b2eba73fa46a6a941d2e3a8a975ad"
}
},
"ee6c95e700e64d0a9ebec2c1545dd083": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HTMLModel",
"state": {
"_view_name": "HTMLView",
"style": "IPY_MODEL_cd64e3f20b23483daa79712bde6622ea",
"_dom_classes": [],
"description": "",
"_model_name": "HTMLModel",
"placeholder": "​",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": " 230/230 [00:02&lt;00:00, 85.4B/s]",
"_view_count": null,
"_view_module_version": "1.5.0",
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_67cbaa1f55d24e62ad6b022af36bca56"
}
},
"3e556abf5c4a4ee69d52366fd59471b2": {
"model_module": "@jupyter-widgets/controls",
"model_name": "ProgressStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "ProgressStyleModel",
"description_width": "initial",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"bar_color": null,
"_model_module": "@jupyter-widgets/controls"
}
},
"876b2eba73fa46a6a941d2e3a8a975ad": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"cd64e3f20b23483daa79712bde6622ea": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"67cbaa1f55d24e62ad6b022af36bca56": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"58669943d3064f309436157270544c08": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HBoxModel",
"state": {
"_view_name": "HBoxView",
"_dom_classes": [],
"_model_name": "HBoxModel",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.5.0",
"box_style": "",
"layout": "IPY_MODEL_3eff293c2b554d85aefaea863e29b678",
"_model_module": "@jupyter-widgets/controls",
"children": [
"IPY_MODEL_d0b9925f3dde46008bf186cf5ef7722d",
"IPY_MODEL_427e07ce24a442af84ddc71f9463fdff"
]
}
},
"3eff293c2b554d85aefaea863e29b678": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"d0b9925f3dde46008bf186cf5ef7722d": {
"model_module": "@jupyter-widgets/controls",
"model_name": "IntProgressModel",
"state": {
"_view_name": "ProgressView",
"style": "IPY_MODEL_1eb2fa080ec44f8c8d5f6f52900277ab",
"_dom_classes": [],
"description": "Downloading: 100%",
"_model_name": "IntProgressModel",
"bar_style": "success",
"max": 230,
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": 230,
"_view_count": null,
"_view_module_version": "1.5.0",
"orientation": "horizontal",
"min": 0,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_23377596349e40a89ea57c8558660073"
}
},
"427e07ce24a442af84ddc71f9463fdff": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HTMLModel",
"state": {
"_view_name": "HTMLView",
"style": "IPY_MODEL_a35703cc8ff44e93a8c0eb413caddc40",
"_dom_classes": [],
"description": "",
"_model_name": "HTMLModel",
"placeholder": "​",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": " 230/230 [00:01&lt;00:00, 120B/s]",
"_view_count": null,
"_view_module_version": "1.5.0",
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_9df7014c99b343f3b178fa020ff56010"
}
},
"1eb2fa080ec44f8c8d5f6f52900277ab": {
"model_module": "@jupyter-widgets/controls",
"model_name": "ProgressStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "ProgressStyleModel",
"description_width": "initial",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"bar_color": null,
"_model_module": "@jupyter-widgets/controls"
}
},
"23377596349e40a89ea57c8558660073": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"a35703cc8ff44e93a8c0eb413caddc40": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"9df7014c99b343f3b178fa020ff56010": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"57e8c36594d043c581c766b434037771": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HBoxModel",
"state": {
"_view_name": "HBoxView",
"_dom_classes": [],
"_model_name": "HBoxModel",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.5.0",
"box_style": "",
"layout": "IPY_MODEL_82760185d5c14a808cbf6639b589f249",
"_model_module": "@jupyter-widgets/controls",
"children": [
"IPY_MODEL_f2a1b430594b4736879cdff4ec532098",
"IPY_MODEL_c81338551e60474fab9e9950fe5df294"
]
}
},
"82760185d5c14a808cbf6639b589f249": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"f2a1b430594b4736879cdff4ec532098": {
"model_module": "@jupyter-widgets/controls",
"model_name": "IntProgressModel",
"state": {
"_view_name": "ProgressView",
"style": "IPY_MODEL_98563b405bd043a9a301a43909e43157",
"_dom_classes": [],
"description": "Downloading: 100%",
"_model_name": "IntProgressModel",
"bar_style": "success",
"max": 230,
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": 230,
"_view_count": null,
"_view_module_version": "1.5.0",
"orientation": "horizontal",
"min": 0,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_8c0e1b7fb6ac4ee7bbbaf6020b40cc77"
}
},
"c81338551e60474fab9e9950fe5df294": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HTMLModel",
"state": {
"_view_name": "HTMLView",
"style": "IPY_MODEL_ad78042ee71a41fd989e4b4ce9d2e3c1",
"_dom_classes": [],
"description": "",
"_model_name": "HTMLModel",
"placeholder": "​",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": " 230/230 [00:01&lt;00:00, 128B/s]",
"_view_count": null,
"_view_module_version": "1.5.0",
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_40c8d2617f3d4c84b923b140456fa5da"
}
},
"98563b405bd043a9a301a43909e43157": {
"model_module": "@jupyter-widgets/controls",
"model_name": "ProgressStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "ProgressStyleModel",
"description_width": "initial",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"bar_color": null,
"_model_module": "@jupyter-widgets/controls"
}
},
"8c0e1b7fb6ac4ee7bbbaf6020b40cc77": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"ad78042ee71a41fd989e4b4ce9d2e3c1": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"40c8d2617f3d4c84b923b140456fa5da": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"311a65b811964ebfa2c064eb348b3ce9": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HBoxModel",
"state": {
"_view_name": "HBoxView",
"_dom_classes": [],
"_model_name": "HBoxModel",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.5.0",
"box_style": "",
"layout": "IPY_MODEL_5a2032c44d0e4f8cbaf512e6c29214cd",
"_model_module": "@jupyter-widgets/controls",
"children": [
"IPY_MODEL_54d1ff55e0094a4fa2b62ecdfb428328",
"IPY_MODEL_2e45f2d7d65246ecb8d6e666d026ac13"
]
}
},
"5a2032c44d0e4f8cbaf512e6c29214cd": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"54d1ff55e0094a4fa2b62ecdfb428328": {
"model_module": "@jupyter-widgets/controls",
"model_name": "IntProgressModel",
"state": {
"_view_name": "ProgressView",
"style": "IPY_MODEL_e05c0ec3b49e4d4990a943d428532fb0",
"_dom_classes": [],
"description": "Downloading: 100%",
"_model_name": "IntProgressModel",
"bar_style": "success",
"max": 230,
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": 230,
"_view_count": null,
"_view_module_version": "1.5.0",
"orientation": "horizontal",
"min": 0,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_39721262fc1e4456966d92fabe0f54ea"
}
},
"2e45f2d7d65246ecb8d6e666d026ac13": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HTMLModel",
"state": {
"_view_name": "HTMLView",
"style": "IPY_MODEL_4486f8a2efc34b9aab3864eb5ad2ba48",
"_dom_classes": [],
"description": "",
"_model_name": "HTMLModel",
"placeholder": "​",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": " 230/230 [00:01&lt;00:00, 126B/s]",
"_view_count": null,
"_view_module_version": "1.5.0",
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_d6228324f3444aa6bd1323d65ae4ff75"
}
},
"e05c0ec3b49e4d4990a943d428532fb0": {
"model_module": "@jupyter-widgets/controls",
"model_name": "ProgressStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "ProgressStyleModel",
"description_width": "initial",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"bar_color": null,
"_model_module": "@jupyter-widgets/controls"
}
},
"39721262fc1e4456966d92fabe0f54ea": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"4486f8a2efc34b9aab3864eb5ad2ba48": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"d6228324f3444aa6bd1323d65ae4ff75": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"3c86415352574190b71e1fe5a15d36f1": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HBoxModel",
"state": {
"_view_name": "HBoxView",
"_dom_classes": [],
"_model_name": "HBoxModel",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.5.0",
"box_style": "",
"layout": "IPY_MODEL_dd2c9dd935754cf2802233053554c21c",
"_model_module": "@jupyter-widgets/controls",
"children": [
"IPY_MODEL_8ae3be32d9c845e59fdb1c47884d48aa",
"IPY_MODEL_4dea0031f3554752ad5aad01fe516a60"
]
}
},
"dd2c9dd935754cf2802233053554c21c": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"8ae3be32d9c845e59fdb1c47884d48aa": {
"model_module": "@jupyter-widgets/controls",
"model_name": "FloatProgressModel",
"state": {
"_view_name": "ProgressView",
"style": "IPY_MODEL_1efb96d931a446de92f1930b973ae846",
"_dom_classes": [],
"description": "Downloading: 100%",
"_model_name": "FloatProgressModel",
"bar_style": "success",
"max": 230,
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": 230,
"_view_count": null,
"_view_module_version": "1.5.0",
"orientation": "horizontal",
"min": 0,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_6a4f5aab5ba949fd860b5a35bba7db9c"
}
},
"4dea0031f3554752ad5aad01fe516a60": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HTMLModel",
"state": {
"_view_name": "HTMLView",
"style": "IPY_MODEL_4b02b2e964ad49af9f7ce7023131ceb8",
"_dom_classes": [],
"description": "",
"_model_name": "HTMLModel",
"placeholder": "​",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": " 230/230 [00:00&lt;00:00, 8.69kB/s]",
"_view_count": null,
"_view_module_version": "1.5.0",
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_0ae8a68c3668401da8d8a6d5ec9cac8f"
}
},
"1efb96d931a446de92f1930b973ae846": {
"model_module": "@jupyter-widgets/controls",
"model_name": "ProgressStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "ProgressStyleModel",
"description_width": "initial",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"bar_color": null,
"_model_module": "@jupyter-widgets/controls"
}
},
"6a4f5aab5ba949fd860b5a35bba7db9c": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"4b02b2e964ad49af9f7ce7023131ceb8": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"0ae8a68c3668401da8d8a6d5ec9cac8f": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"fd44cf6ab17e4b768b2e1d5cb8ce5af9": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HBoxModel",
"state": {
"_view_name": "HBoxView",
"_dom_classes": [],
"_model_name": "HBoxModel",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.5.0",
"box_style": "",
"layout": "IPY_MODEL_b8c0ea31578d4eaaa69251d0004fd8c6",
"_model_module": "@jupyter-widgets/controls",
"children": [
"IPY_MODEL_2015cd9c1da9467290ecd9019af231eb",
"IPY_MODEL_17bacdaee55b43e8977c4dfe4f7245bb"
]
}
},
"b8c0ea31578d4eaaa69251d0004fd8c6": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"2015cd9c1da9467290ecd9019af231eb": {
"model_module": "@jupyter-widgets/controls",
"model_name": "IntProgressModel",
"state": {
"_view_name": "ProgressView",
"style": "IPY_MODEL_879ef9e1a0e94f3d96ed56fb4bae64b8",
"_dom_classes": [],
"description": "Downloading: 100%",
"_model_name": "IntProgressModel",
"bar_style": "success",
"max": 230,
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": 230,
"_view_count": null,
"_view_module_version": "1.5.0",
"orientation": "horizontal",
"min": 0,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_7ab70324d42647acac5020b387955caf"
}
},
"17bacdaee55b43e8977c4dfe4f7245bb": {
"model_module": "@jupyter-widgets/controls",
"model_name": "HTMLModel",
"state": {
"_view_name": "HTMLView",
"style": "IPY_MODEL_31d97ecf78fa412c99e6659196d82828",
"_dom_classes": [],
"description": "",
"_model_name": "HTMLModel",
"placeholder": "​",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": " 230/230 [00:01&lt;00:00, 129B/s]",
"_view_count": null,
"_view_module_version": "1.5.0",
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_c6be5d48ec3c4c799d1445607e5f1ac6"
}
},
"879ef9e1a0e94f3d96ed56fb4bae64b8": {
"model_module": "@jupyter-widgets/controls",
"model_name": "ProgressStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "ProgressStyleModel",
"description_width": "initial",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"bar_color": null,
"_model_module": "@jupyter-widgets/controls"
}
},
"7ab70324d42647acac5020b387955caf": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"31d97ecf78fa412c99e6659196d82828": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"c6be5d48ec3c4c799d1445607e5f1ac6": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"0bd407b4975f49c3827aede14c59501c": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DropdownModel",
"state": {
"_options_labels": [
"sentiment-analysis",
"ner",
"fill_mask"
],
"_view_name": "DropdownView",
"style": "IPY_MODEL_3f5406df699e44f5b60678c1c13500f5",
"_dom_classes": [],
"description": "Task:",
"_model_name": "DropdownModel",
"index": 1,
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_view_count": null,
"disabled": false,
"_view_module_version": "1.5.0",
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_17768469581445b68246ed308ce69326"
}
},
"3f5406df699e44f5b60678c1c13500f5": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"17768469581445b68246ed308ce69326": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"74cbcbae5cac4f12abf080a38390f05c": {
"model_module": "@jupyter-widgets/controls",
"model_name": "TextModel",
"state": {
"_view_name": "TextView",
"style": "IPY_MODEL_62b10ca525cc4ac68f3a006434eb7416",
"_dom_classes": [],
"description": "Your input:",
"_model_name": "TextModel",
"placeholder": "Enter something",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": "Peter Pan likes to go on holiday in Marseille, France.",
"_view_count": null,
"disabled": false,
"_view_module_version": "1.5.0",
"continuous_update": true,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_211109537fbe4e60b89a238c89db1346"
}
},
"62b10ca525cc4ac68f3a006434eb7416": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"211109537fbe4e60b89a238c89db1346": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"d79946ac16ea4855a0bbe2ca2a4d4bf5": {
"model_module": "@jupyter-widgets/controls",
"model_name": "TextareaModel",
"state": {
"_view_name": "TextareaView",
"style": "IPY_MODEL_ab5774ac19f84ab18ddf09a63433df00",
"rows": null,
"_dom_classes": [],
"description": "Context:",
"_model_name": "TextareaModel",
"placeholder": "Enter something",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": "Einstein is famous for the general theory of relativity",
"_view_count": null,
"disabled": false,
"_view_module_version": "1.5.0",
"continuous_update": true,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_a02164204f0f43668bc36a907e720af7"
}
},
"ab5774ac19f84ab18ddf09a63433df00": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"a02164204f0f43668bc36a907e720af7": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
},
"3b12aec414b14221ad2a11dfd975faa0": {
"model_module": "@jupyter-widgets/controls",
"model_name": "TextModel",
"state": {
"_view_name": "TextView",
"style": "IPY_MODEL_d305ba1662e3466c93ab5cca7ebf8f33",
"_dom_classes": [],
"description": "Question:",
"_model_name": "TextModel",
"placeholder": "Enter something",
"_view_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"value": "Why is Einstein famous for ?",
"_view_count": null,
"disabled": false,
"_view_module_version": "1.5.0",
"continuous_update": true,
"description_tooltip": null,
"_model_module": "@jupyter-widgets/controls",
"layout": "IPY_MODEL_879f7a3747ad455d810c7a29918648ee"
}
},
"d305ba1662e3466c93ab5cca7ebf8f33": {
"model_module": "@jupyter-widgets/controls",
"model_name": "DescriptionStyleModel",
"state": {
"_view_name": "StyleView",
"_model_name": "DescriptionStyleModel",
"description_width": "",
"_view_module": "@jupyter-widgets/base",
"_model_module_version": "1.5.0",
"_view_count": null,
"_view_module_version": "1.2.0",
"_model_module": "@jupyter-widgets/controls"
}
},
"879f7a3747ad455d810c7a29918648ee": {
"model_module": "@jupyter-widgets/base",
"model_name": "LayoutModel",
"state": {
"_view_name": "LayoutView",
"grid_template_rows": null,
"right": null,
"justify_content": null,
"_view_module": "@jupyter-widgets/base",
"overflow": null,
"_model_module_version": "1.2.0",
"_view_count": null,
"flex_flow": null,
"width": null,
"min_width": null,
"border": null,
"align_items": null,
"bottom": null,
"_model_module": "@jupyter-widgets/base",
"top": null,
"grid_column": null,
"overflow_y": null,
"overflow_x": null,
"grid_auto_flow": null,
"grid_area": null,
"grid_template_columns": null,
"flex": null,
"_model_name": "LayoutModel",
"justify_items": null,
"grid_row": null,
"max_height": null,
"align_content": null,
"visibility": null,
"align_self": null,
"height": null,
"min_height": null,
"padding": null,
"grid_auto_rows": null,
"grid_gap": null,
"max_width": null,
"order": null,
"_view_module_version": "1.2.0",
"grid_template_areas": null,
"object_position": null,
"object_fit": null,
"grid_auto_columns": null,
"margin": null,
"display": null,
"left": null
}
}
}
}
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/github/huggingface/transformers/blob/master/notebooks/03-pipelines.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
},
"id": "qUU7wy-brl_H",
"colab_type": "text"
},
"source": [
"## How can I leverage State-of-the-Art Natural Language Models with only one line of code ?"
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
},
"id": "-HLOHXuArl_L",
"colab_type": "text"
},
"source": [
"Newly introduced in transformers v2.3.0, **pipelines** provides a high-level, easy to use,\n",
"API for doing inference over a variety of downstream-tasks, including: \n",
"\n",
"- ***Sentence Classification _(Sentiment Analysis)_***: Indicate if the overall sentence is either positive or negative, i.e. *binary classification task* or *logitic regression task*.\n",
"- ***Token Classification (Named Entity Recognition, Part-of-Speech tagging)***: For each sub-entities _(*tokens*)_ in the input, assign them a label, i.e. classification task.\n",
"- ***Question-Answering***: Provided a tuple (`question`, `context`) the model should find the span of text in `content` answering the `question`.\n",
"- ***Mask-Filling***: Suggests possible word(s) to fill the masked input with respect to the provided `context`.\n",
"- ***Summarization***: Summarizes the ``input`` article to a shorter article.\n",
"- ***Translation***: Translates the input from a language to another language.\n",
"- ***Feature Extraction***: Maps the input to a higher, multi-dimensional space learned from the data.\n",
"\n",
"Pipelines encapsulate the overall process of every NLP process:\n",
" \n",
" 1. *Tokenization*: Split the initial input into multiple sub-entities with ... properties (i.e. tokens).\n",
" 2. *Inference*: Maps every tokens into a more meaningful representation. \n",
" 3. *Decoding*: Use the above representation to generate and/or extract the final output for the underlying task.\n",
"\n",
"The overall API is exposed to the end-user through the `pipeline()` method with the following \n",
"structure:\n",
"\n",
"```python\n",
"from transformers import pipeline\n",
"\n",
"# Using default model and tokenizer for the task\n",
"pipeline(\"<task-name>\")\n",
"\n",
"# Using a user-specified model\n",
"pipeline(\"<task-name>\", model=\"<model_name>\")\n",
"\n",
"# Using custom model/tokenizer as str\n",
"pipeline('<task-name>', model='<model name>', tokenizer='<tokenizer_name>')\n",
"```"
]
},
{
"cell_type": "code",
"metadata": {
"pycharm": {
"name": "#%% code\n"
},
"id": "4maAknWNrl_N",
"colab_type": "code",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 102
},
"outputId": "467e3cc8-a069-47da-8029-86e4142c7dde"
},
"source": [
"!pip install -q transformers"
],
"execution_count": 2,
"outputs": [
{
"output_type": "stream",
"text": [
"\u001b[K |████████████████████████████████| 645kB 4.4MB/s \n",
"\u001b[K |████████████████████████████████| 3.8MB 11.7MB/s \n",
"\u001b[K |████████████████████████████████| 890kB 51.5MB/s \n",
"\u001b[K |████████████████████████████████| 1.0MB 46.0MB/s \n",
"\u001b[?25h Building wheel for sacremoses (setup.py) ... \u001b[?25l\u001b[?25hdone\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "code",
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code \n"
},
"id": "uKaqzCh6rl_V",
"colab_type": "code",
"colab": {}
},
"source": [
"from __future__ import print_function\n",
"import ipywidgets as widgets\n",
"from transformers import pipeline"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
},
"id": "uDPZ42Uerl_b",
"colab_type": "text"
},
"source": [
"## 1. Sentence Classification - Sentiment Analysis"
]
},
{
"cell_type": "code",
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
},
"id": "AMRXHQw9rl_d",
"colab_type": "code",
"outputId": "a7a10851-b71e-4553-9afc-04066120410d",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 83,
"referenced_widgets": [
"4bab5df43b3c46caadf48e264344ab42",
"9b426c68631f4bb288e2ca79aad9f9d9",
"6902104f7ec143519fb1a6ab9363d4a0",
"c133fb34fe2a4aba8a6b233671af8b04",
"e3f72d443a74414ca62c2b848d34b125",
"5462b581976e47048642aa6bc12435bd",
"ad84da685cf44abb90d17d9d2e023b48",
"a246f9eea2d7440cb979e728741d2e32"
]
}
},
"source": [
"nlp_sentence_classif = pipeline('sentiment-analysis')\n",
"nlp_sentence_classif('Such a nice weather outside !')"
],
"execution_count": 0,
"outputs": [
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "4bab5df43b3c46caadf48e264344ab42",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"HBox(children=(IntProgress(value=0, description='Downloading', max=230, style=ProgressStyle(description_width=…"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "stream",
"text": [
"\n"
],
"name": "stdout"
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"[{'label': 'POSITIVE', 'score': 0.9997656}]"
]
},
"metadata": {
"tags": []
},
"execution_count": 3
}
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
},
"id": "RY8aUJTvrl_k",
"colab_type": "text"
},
"source": [
"## 2. Token Classification - Named Entity Recognition"
]
},
{
"cell_type": "code",
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
},
"id": "B3BDRX_Krl_n",
"colab_type": "code",
"outputId": "a6b90b11-a272-4ecb-960d-4c682551b399",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 185,
"referenced_widgets": [
"451464c936444ba5a652b46c1b4f9931",
"279291efd2c14a9eb2c3b98efbf152ad",
"b6e1a2e57f4948a39283f1370352612c",
"9d4941ebdfa64978b47232f6e5908d97",
"1006cc0fab1e4139bb7b135486261c92",
"691c0bae60364890ab74934261207d4d",
"405afa5bb8b840d8bc0850e02f593ce4",
"78c718e3d5fa4cb892217260bea6d540"
]
}
},
"source": [
"nlp_token_class = pipeline('ner')\n",
"nlp_token_class('Hugging Face is a French company based in New-York.')"
],
"execution_count": 0,
"outputs": [
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "451464c936444ba5a652b46c1b4f9931",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"HBox(children=(IntProgress(value=0, description='Downloading', max=230, style=ProgressStyle(description_width=…"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "stream",
"text": [
"\n"
],
"name": "stdout"
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"[{'entity': 'I-ORG', 'score': 0.9970937967300415, 'word': 'Hu'},\n",
" {'entity': 'I-ORG', 'score': 0.9345749020576477, 'word': '##gging'},\n",
" {'entity': 'I-ORG', 'score': 0.9787060022354126, 'word': 'Face'},\n",
" {'entity': 'I-MISC', 'score': 0.9981995820999146, 'word': 'French'},\n",
" {'entity': 'I-LOC', 'score': 0.9983047246932983, 'word': 'New'},\n",
" {'entity': 'I-LOC', 'score': 0.8913459181785583, 'word': '-'},\n",
" {'entity': 'I-LOC', 'score': 0.9979523420333862, 'word': 'York'}]"
]
},
"metadata": {
"tags": []
},
"execution_count": 4
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "qIvUFEVarl_s",
"colab_type": "text"
},
"source": [
"## 3. Question Answering"
]
},
{
"cell_type": "code",
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
},
"id": "ND_8LzQKrl_u",
"colab_type": "code",
"outputId": "c59ae695-c465-4de6-fa6e-181d8f1a3992",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 117,
"referenced_widgets": [
"7d66a4534c164d2f9493fc0467abebbd",
"7a15588f85b14f2b93e32b4c0442fa1b",
"213567d815894ca08041f6d682ced3c9",
"ee6c95e700e64d0a9ebec2c1545dd083",
"3e556abf5c4a4ee69d52366fd59471b2",
"876b2eba73fa46a6a941d2e3a8a975ad",
"cd64e3f20b23483daa79712bde6622ea",
"67cbaa1f55d24e62ad6b022af36bca56"
]
}
},
"source": [
"nlp_qa = pipeline('question-answering')\n",
"nlp_qa(context='Hugging Face is a French company based in New-York.', question='Where is based Hugging Face ?')"
],
"execution_count": 0,
"outputs": [
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "7d66a4534c164d2f9493fc0467abebbd",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"HBox(children=(IntProgress(value=0, description='Downloading', max=230, style=ProgressStyle(description_width=…"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "stream",
"text": [
"\n"
],
"name": "stdout"
},
{
"output_type": "stream",
"text": [
"convert squad examples to features: 100%|██████████| 1/1 [00:00<00:00, 142.60it/s]\n",
"add example index and unique id: 100%|██████████| 1/1 [00:00<00:00, 4341.93it/s]\n"
],
"name": "stderr"
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"{'answer': 'New-York.', 'end': 50, 'score': 0.9632969241603995, 'start': 42}"
]
},
"metadata": {
"tags": []
},
"execution_count": 5
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "9W_CnP5Zrl_2",
"colab_type": "text"
},
"source": [
"## 4. Text Generation - Mask Filling"
]
},
{
"cell_type": "code",
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
},
"id": "zpJQ2HXNrl_4",
"colab_type": "code",
"outputId": "3fb62e7a-25a6-4b06-ced8-51eb8aa6bf33",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 321,
"referenced_widgets": [
"58669943d3064f309436157270544c08",
"3eff293c2b554d85aefaea863e29b678",
"d0b9925f3dde46008bf186cf5ef7722d",
"427e07ce24a442af84ddc71f9463fdff",
"1eb2fa080ec44f8c8d5f6f52900277ab",
"23377596349e40a89ea57c8558660073",
"a35703cc8ff44e93a8c0eb413caddc40",
"9df7014c99b343f3b178fa020ff56010"
]
}
},
"source": [
"nlp_fill = pipeline('fill-mask')\n",
"nlp_fill('Hugging Face is a French company based in ' + nlp_fill.tokenizer.mask_token)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "58669943d3064f309436157270544c08",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"HBox(children=(IntProgress(value=0, description='Downloading', max=230, style=ProgressStyle(description_width=…"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "stream",
"text": [
"\n"
],
"name": "stdout"
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"[{'score': 0.23106741905212402,\n",
" 'sequence': '<s> Hugging Face is a French company based in Paris</s>',\n",
" 'token': 2201},\n",
" {'score': 0.08198167383670807,\n",
" 'sequence': '<s> Hugging Face is a French company based in Lyon</s>',\n",
" 'token': 12790},\n",
" {'score': 0.04769487306475639,\n",
" 'sequence': '<s> Hugging Face is a French company based in Geneva</s>',\n",
" 'token': 11559},\n",
" {'score': 0.04762246832251549,\n",
" 'sequence': '<s> Hugging Face is a French company based in Brussels</s>',\n",
" 'token': 6497},\n",
" {'score': 0.041305847465991974,\n",
" 'sequence': '<s> Hugging Face is a French company based in France</s>',\n",
" 'token': 1470}]"
]
},
"metadata": {
"tags": []
},
"execution_count": 6
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "Fbs9t1KvrzDy",
"colab_type": "text"
},
"source": [
"## 5. Summarization\n",
"\n",
"Summarization is currently supported by `Bart` and `T5`."
]
},
{
"cell_type": "code",
"metadata": {
"id": "8BaOgzi1u1Yc",
"colab_type": "code",
"outputId": "2168e437-cfba-4247-a38c-07f02f555c6e",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 88
}
},
"source": [
"TEXT_TO_SUMMARIZE = \"\"\" \n",
"New York (CNN)When Liana Barrientos was 23 years old, she got married in Westchester County, New York. \n",
"A year later, she got married again in Westchester County, but to a different man and without divorcing her first husband. \n",
"Only 18 days after that marriage, she got hitched yet again. Then, Barrientos declared \"I do\" five more times, sometimes only within two weeks of each other. \n",
"In 2010, she married once more, this time in the Bronx. In an application for a marriage license, she stated it was her \"first and only\" marriage. \n",
"Barrientos, now 39, is facing two criminal counts of \"offering a false instrument for filing in the first degree,\" referring to her false statements on the \n",
"2010 marriage license application, according to court documents. \n",
"Prosecutors said the marriages were part of an immigration scam. \n",
"On Friday, she pleaded not guilty at State Supreme Court in the Bronx, according to her attorney, Christopher Wright, who declined to comment further. \n",
"After leaving court, Barrientos was arrested and charged with theft of service and criminal trespass for allegedly sneaking into the New York subway through an emergency exit, said Detective \n",
"Annette Markowski, a police spokeswoman. In total, Barrientos has been married 10 times, with nine of her marriages occurring between 1999 and 2002. \n",
"All occurred either in Westchester County, Long Island, New Jersey or the Bronx. She is believed to still be married to four men, and at one time, she was married to eight men at once, prosecutors say. \n",
"Prosecutors said the immigration scam involved some of her husbands, who filed for permanent residence status shortly after the marriages. \n",
"Any divorces happened only after such filings were approved. It was unclear whether any of the men will be prosecuted. \n",
"The case was referred to the Bronx District Attorney\\'s Office by Immigration and Customs Enforcement and the Department of Homeland Security\\'s \n",
"Investigation Division. Seven of the men are from so-called \"red-flagged\" countries, including Egypt, Turkey, Georgia, Pakistan and Mali. \n",
"Her eighth husband, Rashid Rajput, was deported in 2006 to his native Pakistan after an investigation by the Joint Terrorism Task Force. \n",
"If convicted, Barrientos faces up to four years in prison. Her next court appearance is scheduled for May 18.\n",
"\"\"\"\n",
"\n",
"summarizer = pipeline('summarization')\n",
"summarizer(TEXT_TO_SUMMARIZE)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"Couldn't reach server at 'https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-cnn/modelcard.json' to download model card file.\n",
"Creating an empty model card.\n"
],
"name": "stderr"
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"[{'summary_text': 'Liana Barrientos has been married 10 times, sometimes within two weeks of each other. Prosecutors say the marriages were part of an immigration scam. She is believed to still be married to four men, and at one time, she was married to eight men at once. Her eighth husband was deported in 2006 to his native Pakistan.'}]"
]
},
"metadata": {
"tags": []
},
"execution_count": 7
}
]
},
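  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The example above uses the default checkpoint (a `Bart` model). A minimal sketch of explicitly selecting a `T5` checkpoint instead is shown below; the `t5-small` checkpoint choice is only an illustration, any summarization-capable checkpoint works:\n",
    "\n",
    "```python\n",
    "# Hedged sketch: request a T5 checkpoint for summarization instead of the default\n",
    "summarizer_t5 = pipeline('summarization', model='t5-small', tokenizer='t5-small')\n",
    "summarizer_t5(TEXT_TO_SUMMARIZE)\n",
    "```"
   ]
  },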
{
"cell_type": "markdown",
"metadata": {
"id": "u5JA6IJsr-G0",
"colab_type": "text"
},
"source": [
"## 6. Translation\n",
"\n",
"Translation is currently supported by `T5` for the language mappings English-to-French (`translation_en_to_fr`), English-to-German (`translation_en_to_de`) and English-to-Romanian (`translation_en_to_ro`)."
]
},
{
"cell_type": "code",
"metadata": {
"id": "8FwayP4nwV3Z",
"colab_type": "code",
"outputId": "66956816-c924-4718-fe58-cabef7d51974",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 83,
"referenced_widgets": [
"57e8c36594d043c581c766b434037771",
"82760185d5c14a808cbf6639b589f249",
"f2a1b430594b4736879cdff4ec532098",
"c81338551e60474fab9e9950fe5df294",
"98563b405bd043a9a301a43909e43157",
"8c0e1b7fb6ac4ee7bbbaf6020b40cc77",
"ad78042ee71a41fd989e4b4ce9d2e3c1",
"40c8d2617f3d4c84b923b140456fa5da"
]
}
},
"source": [
"# English to French\n",
"translator = pipeline('translation_en_to_fr')\n",
"translator(\"HuggingFace is a French company that is based in New York City. HuggingFace's mission is to solve NLP one commit at a time\")"
],
"execution_count": 0,
"outputs": [
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "57e8c36594d043c581c766b434037771",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"HBox(children=(IntProgress(value=0, description='Downloading', max=230, style=ProgressStyle(description_width=…"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "stream",
"text": [
"\n"
],
"name": "stdout"
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"[{'translation_text': 'HuggingFace est une entreprise française basée à New York et dont la mission est de résoudre les problèmes de NLP, un engagement à la fois.'}]"
]
},
"metadata": {
"tags": []
},
"execution_count": 8
}
]
},
{
"cell_type": "code",
"metadata": {
"colab_type": "code",
"id": "ra0-WfznwoIW",
"outputId": "278a3d5f-cc42-40bc-a9db-c92ec5a3a2f0",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 83,
"referenced_widgets": [
"311a65b811964ebfa2c064eb348b3ce9",
"5a2032c44d0e4f8cbaf512e6c29214cd",
"54d1ff55e0094a4fa2b62ecdfb428328",
"2e45f2d7d65246ecb8d6e666d026ac13",
"e05c0ec3b49e4d4990a943d428532fb0",
"39721262fc1e4456966d92fabe0f54ea",
"4486f8a2efc34b9aab3864eb5ad2ba48",
"d6228324f3444aa6bd1323d65ae4ff75"
]
}
},
"source": [
"# English to German\n",
"translator = pipeline('translation_en_to_de')\n",
"translator(\"The history of natural language processing (NLP) generally started in the 1950s, although work can be found from earlier periods.\")"
],
"execution_count": 0,
"outputs": [
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "311a65b811964ebfa2c064eb348b3ce9",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"HBox(children=(IntProgress(value=0, description='Downloading', max=230, style=ProgressStyle(description_width=…"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "stream",
"text": [
"\n"
],
"name": "stdout"
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"[{'translation_text': 'Die Geschichte der natürlichen Sprachenverarbeitung (NLP) begann im Allgemeinen in den 1950er Jahren, obwohl die Arbeit aus früheren Zeiten zu finden ist.'}]"
]
},
"metadata": {
"tags": []
},
"execution_count": 9
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "qPUpg0M8hCtB",
"colab_type": "text"
},
"source": [
"## 7. Text Generation\n",
"\n",
"Text generation is currently supported by GPT-2, OpenAi-GPT, TransfoXL, XLNet, CTRL and Reformer."
]
},
{
"cell_type": "code",
"metadata": {
"id": "5pKfxTxohXuZ",
"colab_type": "code",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 120,
"referenced_widgets": [
"3c86415352574190b71e1fe5a15d36f1",
"dd2c9dd935754cf2802233053554c21c",
"8ae3be32d9c845e59fdb1c47884d48aa",
"4dea0031f3554752ad5aad01fe516a60",
"1efb96d931a446de92f1930b973ae846",
"6a4f5aab5ba949fd860b5a35bba7db9c",
"4b02b2e964ad49af9f7ce7023131ceb8",
"0ae8a68c3668401da8d8a6d5ec9cac8f"
]
},
"outputId": "8705f6b4-2413-4ac6-f72d-e5ecce160662"
},
"source": [
"text_generator = pipeline(\"text-generation\")\n",
"text_generator(\"Today is a beautiful day and I will\")"
],
"execution_count": 5,
"outputs": [
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "3c86415352574190b71e1fe5a15d36f1",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "stream",
"text": [
"\n"
],
"name": "stdout"
},
{
"output_type": "stream",
"text": [
"Setting `pad_token_id` to 50256 (first `eos_token_id`) to generate sequence\n"
],
"name": "stderr"
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"[{'generated_text': 'Today is a beautiful day and I will celebrate my birthday!\"\\n\\nThe mother told CNN the two had planned their meal together. After dinner, she added that she and I walked down the street and stopped at a diner near her home. \"He'}]"
]
},
"metadata": {
"tags": []
},
"execution_count": 5
}
]
},
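  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The example above relies on the default checkpoint (GPT-2). A minimal sketch of selecting another checkpoint through the `model` argument is shown below; `distilgpt2` is only an illustrative choice, any of the architectures listed above can be used:\n",
    "\n",
    "```python\n",
    "# Hedged sketch: swap the default checkpoint for a smaller GPT-2 variant\n",
    "text_generator_small = pipeline('text-generation', model='distilgpt2')\n",
    "text_generator_small(\"Today is a beautiful day and I will\")\n",
    "```"
   ]
  },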
{
"cell_type": "markdown",
"metadata": {
"id": "Utmldmetrl_9",
"colab_type": "text"
},
"source": [
"## 8. Projection - Features Extraction "
]
},
{
"cell_type": "code",
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
},
"id": "O4SjR1QQrl__",
"colab_type": "code",
"outputId": "2ce966d5-7a89-4488-d48f-626d1c2a8222",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 83,
"referenced_widgets": [
"fd44cf6ab17e4b768b2e1d5cb8ce5af9",
"b8c0ea31578d4eaaa69251d0004fd8c6",
"2015cd9c1da9467290ecd9019af231eb",
"17bacdaee55b43e8977c4dfe4f7245bb",
"879ef9e1a0e94f3d96ed56fb4bae64b8",
"7ab70324d42647acac5020b387955caf",
"31d97ecf78fa412c99e6659196d82828",
"c6be5d48ec3c4c799d1445607e5f1ac6"
]
}
},
"source": [
"import numpy as np\n",
"nlp_features = pipeline('feature-extraction')\n",
"output = nlp_features('Hugging Face is a French company based in Paris')\n",
"np.array(output).shape # (Samples, Tokens, Vector Size)\n"
],
"execution_count": 0,
"outputs": [
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "fd44cf6ab17e4b768b2e1d5cb8ce5af9",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"HBox(children=(IntProgress(value=0, description='Downloading', max=230, style=ProgressStyle(description_width=…"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "stream",
"text": [
"\n"
],
"name": "stdout"
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"(1, 12, 768)"
]
},
"metadata": {
"tags": []
},
"execution_count": 10
}
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
},
"id": "02j8km8YrmAE",
"colab_type": "text"
},
"source": [
"Alright ! Now you have a nice picture of what is possible through transformers' pipelines, and there is more\n",
"to come in future releases. \n",
"\n",
"In the meantime, you can try the different pipelines with your own inputs"
]
},
{
"cell_type": "code",
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% code\n"
},
"id": "yFlBPQHtrmAH",
"colab_type": "code",
"outputId": "03cc3207-a7e8-49fd-904a-63a7a1d0eb7a",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 116,
"referenced_widgets": [
"0bd407b4975f49c3827aede14c59501c",
"3f5406df699e44f5b60678c1c13500f5",
"17768469581445b68246ed308ce69326",
"74cbcbae5cac4f12abf080a38390f05c",
"62b10ca525cc4ac68f3a006434eb7416",
"211109537fbe4e60b89a238c89db1346"
]
}
},
"source": [
"task = widgets.Dropdown(\n",
" options=['sentiment-analysis', 'ner', 'fill_mask'],\n",
" value='ner',\n",
" description='Task:',\n",
" disabled=False\n",
")\n",
"\n",
"input = widgets.Text(\n",
" value='',\n",
" placeholder='Enter something',\n",
" description='Your input:',\n",
" disabled=False\n",
")\n",
"\n",
"def forward(_):\n",
" if len(input.value) > 0: \n",
" if task.value == 'ner':\n",
" output = nlp_token_class(input.value)\n",
" elif task.value == 'sentiment-analysis':\n",
" output = nlp_sentence_classif(input.value)\n",
" else:\n",
" if input.value.find('<mask>') == -1:\n",
" output = nlp_fill(input.value + ' <mask>')\n",
" else:\n",
" output = nlp_fill(input.value) \n",
" print(output)\n",
"\n",
"input.on_submit(forward)\n",
"display(task, input)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "0bd407b4975f49c3827aede14c59501c",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"Dropdown(description='Task:', index=1, options=('sentiment-analysis', 'ner', 'fill_mask'), value='ner')"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "74cbcbae5cac4f12abf080a38390f05c",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"Text(value='', description='Your input:', placeholder='Enter something')"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "stream",
"text": [
"[{'word': 'Peter', 'score': 0.9935821294784546, 'entity': 'I-PER'}, {'word': 'Pan', 'score': 0.9901397228240967, 'entity': 'I-PER'}, {'word': 'Marseille', 'score': 0.9984904527664185, 'entity': 'I-LOC'}, {'word': 'France', 'score': 0.9998687505722046, 'entity': 'I-LOC'}]\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "code",
"metadata": {
"pycharm": {
"is_executing": false,
"name": "#%% Question Answering\n"
},
"id": "GCoKbBTYrmAN",
"colab_type": "code",
"outputId": "57c3a647-160a-4b3a-e852-e7a1daf1294a",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 143,
"referenced_widgets": [
"d79946ac16ea4855a0bbe2ca2a4d4bf5",
"ab5774ac19f84ab18ddf09a63433df00",
"a02164204f0f43668bc36a907e720af7",
"3b12aec414b14221ad2a11dfd975faa0",
"d305ba1662e3466c93ab5cca7ebf8f33",
"879f7a3747ad455d810c7a29918648ee"
]
}
},
"source": [
"context = widgets.Textarea(\n",
" value='Einstein is famous for the general theory of relativity',\n",
" placeholder='Enter something',\n",
" description='Context:',\n",
" disabled=False\n",
")\n",
"\n",
"query = widgets.Text(\n",
" value='Why is Einstein famous for ?',\n",
" placeholder='Enter something',\n",
" description='Question:',\n",
" disabled=False\n",
")\n",
"\n",
"def forward(_):\n",
" if len(context.value) > 0 and len(query.value) > 0: \n",
" output = nlp_qa(question=query.value, context=context.value) \n",
" print(output)\n",
"\n",
"query.on_submit(forward)\n",
"display(context, query)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "d79946ac16ea4855a0bbe2ca2a4d4bf5",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"Textarea(value='Einstein is famous for the general theory of relativity', description='Context:', placeholder=…"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "display_data",
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "3b12aec414b14221ad2a11dfd975faa0",
"version_minor": 0,
"version_major": 2
},
"text/plain": [
"Text(value='Why is Einstein famous for ?', description='Question:', placeholder='Enter something')"
]
},
"metadata": {
"tags": []
}
},
{
"output_type": "stream",
"text": [
"convert squad examples to features: 100%|██████████| 1/1 [00:00<00:00, 363.99it/s]\n",
"add example index and unique id: 100%|██████████| 1/1 [00:00<00:00, 5178.15it/s]\n"
],
"name": "stderr"
},
{
"output_type": "stream",
"text": [
"{'score': 0.40340594113729367, 'start': 27, 'end': 54, 'answer': 'general theory of relativity'}\n"
],
"name": "stdout"
}
]
}
]
}
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "jBasof3bv1LB"
},
"source": [
"<h1><center>How to export 🤗 Transformers Models to ONNX ?<h1><center>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"[ONNX](http://onnx.ai/) is open format for machine learning models. It allows to save your neural network's computation graph in a framework agnostic way, which might be particulary helpful when deploying deep learning models.\n",
"\n",
"Indeed, businesses might have other requirements _(languages, hardware, ...)_ for which the training framework might not be the best suited in inference scenarios. In that context, having a representation of the actual computation graph that can be shared accross various business units and logics across an organization might be a desirable component.\n",
"\n",
"Along with the serialization format, ONNX also provides a runtime library which allows efficient and hardware specific execution of the ONNX graph. This is done through the [onnxruntime](https://microsoft.github.io/onnxruntime/) project and already includes collaborations with many hardware vendors to seamlessly deploy models on various platforms.\n",
"\n",
"Through this notebook we'll walk you through the process to convert a PyTorch or TensorFlow transformers model to the [ONNX](http://onnx.ai/) and leverage [onnxruntime](https://microsoft.github.io/onnxruntime/) to run inference tasks on models from 🤗 __transformers__"
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "yNnbrSg-5e1s"
},
"source": [
"## Exporting 🤗 transformers model to ONNX\n",
"\n",
"---\n",
"\n",
"Exporting models _(either PyTorch or TensorFlow)_ is easily achieved through the conversion tool provided as part of 🤗 __transformers__ repository. \n",
"\n",
"Under the hood the process is sensibly the following: \n",
"\n",
"1. Allocate the model from transformers (**PyTorch or TensorFlow**)\n",
"2. Forward dummy inputs through the model this way **ONNX** can record the set of operations executed\n",
"3. Optionally define dynamic axes on input and output tensors\n",
"4. Save the graph along with the network parameters"
]
},
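  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Before using the conversion tool in the following cells, here is a minimal hand-written sketch of what those four steps look like with `torch.onnx.export`. The checkpoint name, dummy sentence, output path and axis names below are illustrative assumptions, not the exact code of the converter:\n",
    "\n",
    "```python\n",
    "import torch\n",
    "from transformers import BertModel, BertTokenizerFast\n",
    "\n",
    "# 1. Allocate the model (PyTorch in this sketch)\n",
    "tokenizer = BertTokenizerFast.from_pretrained(\"bert-base-cased\")\n",
    "model = BertModel.from_pretrained(\"bert-base-cased\").eval()\n",
    "\n",
    "# 2. Dummy inputs so ONNX can record the set of operations executed while tracing\n",
    "dummy = tokenizer(\"This is a sample input\", return_tensors=\"pt\")\n",
    "\n",
    "# 3. Dynamic axes so batch size and sequence length are not baked into the graph\n",
    "dynamic_axes = {name: {0: \"batch\", 1: \"sequence\"}\n",
    "                for name in (\"input_ids\", \"attention_mask\", \"token_type_ids\", \"output_0\")}\n",
    "dynamic_axes[\"output_1\"] = {0: \"batch\"}\n",
    "\n",
    "# 4. Save the graph along with the network parameters (the 'onnx/' folder must already exist)\n",
    "torch.onnx.export(\n",
    "    model,\n",
    "    (dummy[\"input_ids\"], dummy[\"attention_mask\"], dummy[\"token_type_ids\"]),\n",
    "    \"onnx/bert-base-cased.onnx\",\n",
    "    input_names=[\"input_ids\", \"attention_mask\", \"token_type_ids\"],\n",
    "    output_names=[\"output_0\", \"output_1\"],\n",
    "    dynamic_axes=dynamic_axes,\n",
    "    opset_version=11,\n",
    ")\n",
    "```"
   ]
  },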
{
"cell_type": "code",
"execution_count": 22,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Collecting git+https://github.com/huggingface/transformers\n",
" Cloning https://github.com/huggingface/transformers to /tmp/pip-req-build-9rvbp9p8\n",
" Running command git clone -q https://github.com/huggingface/transformers /tmp/pip-req-build-9rvbp9p8\n",
"Requirement already satisfied, skipping upgrade: numpy in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (1.18.1)\n",
"Requirement already satisfied, skipping upgrade: tokenizers==0.8.1.rc2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (0.8.1rc2)\n",
"Requirement already satisfied, skipping upgrade: packaging in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (20.4)\n",
"Requirement already satisfied, skipping upgrade: filelock in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (3.0.12)\n",
"Requirement already satisfied, skipping upgrade: requests in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (2.23.0)\n",
"Requirement already satisfied, skipping upgrade: tqdm>=4.27 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (4.46.1)\n",
"Requirement already satisfied, skipping upgrade: regex!=2019.12.17 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (2020.6.8)\n",
"Requirement already satisfied, skipping upgrade: sentencepiece!=0.1.92 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (0.1.91)\n",
"Requirement already satisfied, skipping upgrade: sacremoses in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (0.0.43)\n",
"Requirement already satisfied, skipping upgrade: pyparsing>=2.0.2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from packaging->transformers==3.0.2) (2.4.7)\n",
"Requirement already satisfied, skipping upgrade: six in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from packaging->transformers==3.0.2) (1.15.0)\n",
"Requirement already satisfied, skipping upgrade: chardet<4,>=3.0.2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers==3.0.2) (3.0.4)\n",
"Requirement already satisfied, skipping upgrade: idna<3,>=2.5 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers==3.0.2) (2.9)\n",
"Requirement already satisfied, skipping upgrade: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers==3.0.2) (1.25.9)\n",
"Requirement already satisfied, skipping upgrade: certifi>=2017.4.17 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers==3.0.2) (2020.6.20)\n",
"Requirement already satisfied, skipping upgrade: click in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from sacremoses->transformers==3.0.2) (7.1.2)\n",
"Requirement already satisfied, skipping upgrade: joblib in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from sacremoses->transformers==3.0.2) (0.15.1)\n",
"Building wheels for collected packages: transformers\n",
" Building wheel for transformers (setup.py) ... \u001b[?25ldone\n",
"\u001b[?25h Created wheel for transformers: filename=transformers-3.0.2-py3-none-any.whl size=883063 sha256=5f2caef76450921ae2e5b10abbbaab436e9c87c83486114fa08d305e4396d4cd\n",
" Stored in directory: /tmp/pip-ephem-wheel-cache-kftypcjz/wheels/42/68/45/c63edff61c292f2dfd4df4ef6522dcbecc603e7af82813c1d7\n",
"Successfully built transformers\n",
"Installing collected packages: transformers\n",
" Attempting uninstall: transformers\n",
" Found existing installation: transformers 3.0.2\n",
" Uninstalling transformers-3.0.2:\n",
" Successfully uninstalled transformers-3.0.2\n",
"Successfully installed transformers-3.0.2\n",
"Looking in links: https://download.pytorch.org/whl/torch_stable.html\n",
"Requirement already up-to-date: torch==1.6.0+cpu in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.6.0+cpu)\n",
"Requirement already up-to-date: torchvision==0.7.0+cpu in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (0.7.0+cpu)\n",
"Requirement already satisfied, skipping upgrade: numpy in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from torch==1.6.0+cpu) (1.18.1)\n",
"Requirement already satisfied, skipping upgrade: future in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from torch==1.6.0+cpu) (0.18.2)\n",
"Requirement already satisfied, skipping upgrade: pillow>=4.1.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from torchvision==0.7.0+cpu) (7.2.0)\n",
"Requirement already up-to-date: onnxruntime==1.4.0 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.4.0)\n",
"Requirement already satisfied, skipping upgrade: protobuf in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime==1.4.0) (3.12.2)\n",
"Requirement already satisfied, skipping upgrade: numpy>=1.16.6 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime==1.4.0) (1.18.1)\n",
"Requirement already satisfied, skipping upgrade: setuptools in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->onnxruntime==1.4.0) (47.1.1.post20200604)\n",
"Requirement already satisfied, skipping upgrade: six>=1.9 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->onnxruntime==1.4.0) (1.15.0)\n",
"Looking in indexes: https://test.pypi.org/simple/\n",
"Requirement already satisfied: ort-nightly in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.4.0.dev202008262)\n",
"Requirement already satisfied: protobuf in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from ort-nightly) (3.12.2)\n",
"Requirement already satisfied: numpy>=1.16.6 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from ort-nightly) (1.18.1)\n",
"Requirement already satisfied: setuptools in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->ort-nightly) (47.1.1.post20200604)\n",
"Requirement already satisfied: six>=1.9 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->ort-nightly) (1.15.0)\n",
"Requirement already up-to-date: onnxruntime-tools in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.4.2)\n",
"Requirement already satisfied, skipping upgrade: numpy in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (1.18.1)\n",
"Requirement already satisfied, skipping upgrade: coloredlogs in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (14.0)\n",
"Requirement already satisfied, skipping upgrade: py3nvml in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (0.2.6)\n",
"Requirement already satisfied, skipping upgrade: psutil in /home/mfuntowicz/.local/lib/python3.8/site-packages/psutil-5.7.0-py3.8-linux-x86_64.egg (from onnxruntime-tools) (5.7.0)\n",
"Requirement already satisfied, skipping upgrade: packaging in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (20.4)\n",
"Requirement already satisfied, skipping upgrade: py-cpuinfo in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (5.0.0)\n",
"Requirement already satisfied, skipping upgrade: onnx in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (1.7.0)\n",
"Requirement already satisfied, skipping upgrade: humanfriendly>=7.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from coloredlogs->onnxruntime-tools) (8.2)\n",
"Requirement already satisfied, skipping upgrade: xmltodict in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from py3nvml->onnxruntime-tools) (0.12.0)\n",
"Requirement already satisfied, skipping upgrade: six in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from packaging->onnxruntime-tools) (1.15.0)\n",
"Requirement already satisfied, skipping upgrade: pyparsing>=2.0.2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from packaging->onnxruntime-tools) (2.4.7)\n",
"Requirement already satisfied, skipping upgrade: typing-extensions>=3.6.2.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnx->onnxruntime-tools) (3.7.4.2)\n",
"Requirement already satisfied, skipping upgrade: protobuf in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnx->onnxruntime-tools) (3.12.2)\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Requirement already satisfied, skipping upgrade: setuptools in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->onnx->onnxruntime-tools) (47.1.1.post20200604)\r\n"
]
}
],
"source": [
"import sys\n",
"!{sys.executable} -m pip install --upgrade git+https://github.com/huggingface/transformers\n",
"!{sys.executable} -m pip install --upgrade torch==1.6.0+cpu torchvision==0.7.0+cpu -f https://download.pytorch.org/whl/torch_stable.html\n",
"!{sys.executable} -m pip install --upgrade onnxruntime==1.4.0\n",
"!{sys.executable} -m pip install -i https://test.pypi.org/simple/ ort-nightly\n",
"!{sys.executable} -m pip install --upgrade onnxruntime-tools"
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {
"colab": {},
"colab_type": "code",
"id": "PwAaOchY4N2-"
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-config.json from cache at /home/mfuntowicz/.cache/torch/transformers/b945b69218e98b3e2c95acf911789741307dec43c698d35fad11c1ae28bda352.9da767be51e1327499df13488672789394e2ca38b877837e52618a67d7002391\n",
"Model config BertConfig {\n",
" \"architectures\": [\n",
" \"BertForMaskedLM\"\n",
" ],\n",
" \"attention_probs_dropout_prob\": 0.1,\n",
" \"gradient_checkpointing\": false,\n",
" \"hidden_act\": \"gelu\",\n",
" \"hidden_dropout_prob\": 0.1,\n",
" \"hidden_size\": 768,\n",
" \"initializer_range\": 0.02,\n",
" \"intermediate_size\": 3072,\n",
" \"layer_norm_eps\": 1e-12,\n",
" \"max_position_embeddings\": 512,\n",
" \"model_type\": \"bert\",\n",
" \"num_attention_heads\": 12,\n",
" \"num_hidden_layers\": 12,\n",
" \"pad_token_id\": 0,\n",
" \"type_vocab_size\": 2,\n",
" \"vocab_size\": 28996\n",
"}\n",
"\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"ONNX opset version set to: 11\n",
"Loading pipeline (model: bert-base-cased, tokenizer: bert-base-cased)\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt from cache at /home/mfuntowicz/.cache/torch/transformers/5e8a2b4893d13790ed4150ca1906be5f7a03d6c4ddf62296c383f6db42814db2.e13dbb970cb325137104fb2e5f36fe865f27746c6b526f6352861b1980eb80b1\n",
"loading model card file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-modelcard.json from cache at /home/mfuntowicz/.cache/torch/transformers/72b46f187c40a666d54782e06684c2870e109350a3efe9aa5027253dec2e671d.455d944f3d1572ab55ed579849f751cf37f303e3388980a42d94f7cd57a4e331\n",
"Model card: {\n",
" \"caveats_and_recommendations\": {},\n",
" \"ethical_considerations\": {},\n",
" \"evaluation_data\": {},\n",
" \"factors\": {},\n",
" \"intended_use\": {},\n",
" \"metrics\": {},\n",
" \"model_details\": {},\n",
" \"quantitative_analyses\": {},\n",
" \"training_data\": {}\n",
"}\n",
"\n",
"loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-config.json from cache at /home/mfuntowicz/.cache/torch/transformers/b945b69218e98b3e2c95acf911789741307dec43c698d35fad11c1ae28bda352.9da767be51e1327499df13488672789394e2ca38b877837e52618a67d7002391\n",
"Model config BertConfig {\n",
" \"architectures\": [\n",
" \"BertForMaskedLM\"\n",
" ],\n",
" \"attention_probs_dropout_prob\": 0.1,\n",
" \"gradient_checkpointing\": false,\n",
" \"hidden_act\": \"gelu\",\n",
" \"hidden_dropout_prob\": 0.1,\n",
" \"hidden_size\": 768,\n",
" \"initializer_range\": 0.02,\n",
" \"intermediate_size\": 3072,\n",
" \"layer_norm_eps\": 1e-12,\n",
" \"max_position_embeddings\": 512,\n",
" \"model_type\": \"bert\",\n",
" \"num_attention_heads\": 12,\n",
" \"num_hidden_layers\": 12,\n",
" \"pad_token_id\": 0,\n",
" \"type_vocab_size\": 2,\n",
" \"vocab_size\": 28996\n",
"}\n",
"\n",
"loading weights file https://cdn.huggingface.co/bert-base-cased-pytorch_model.bin from cache at /home/mfuntowicz/.cache/torch/transformers/d8f11f061e407be64c4d5d7867ee61d1465263e24085cfa26abf183fdc830569.3fadbea36527ae472139fe84cddaa65454d7429f12d543d80bfc3ad70de55ac2\n",
"All model checkpoint weights were used when initializing BertModel.\n",
"\n",
"All the weights of BertModel were initialized from the model checkpoint at bert-base-cased.\n",
"If your task is similar to the task the model of the checkpoint was trained on, you can already use BertModel for predictions without further training.\n",
"/home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages/transformers/modeling_bert.py:201: TracerWarning: Converting a tensor to a Python index might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!\n",
" position_ids = self.position_ids[:, :seq_length]\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Creating folder onnx\n",
"Using framework PyTorch: 1.6.0\n",
"Found input input_ids with shape: {0: 'batch', 1: 'sequence'}\n",
"Found input token_type_ids with shape: {0: 'batch', 1: 'sequence'}\n",
"Found input attention_mask with shape: {0: 'batch', 1: 'sequence'}\n",
"Found output output_0 with shape: {0: 'batch', 1: 'sequence'}\n",
"Found output output_1 with shape: {0: 'batch'}\n",
"Ensuring inputs are in correct order\n",
"position_ids is not present in the generated input list.\n",
"Generated inputs order: ['input_ids', 'attention_mask', 'token_type_ids']\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"/home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages/transformers/modeling_utils.py:1570: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!\n",
" input_tensor.shape == tensor_shape for input_tensor in input_tensors\n"
]
}
],
"source": [
"!rm -rf onnx/\n",
"from pathlib import Path\n",
"from transformers.convert_graph_to_onnx import convert\n",
"\n",
"# Handles all the above steps for you\n",
"convert(framework=\"pt\", model=\"bert-base-cased\", output=Path(\"onnx/bert-base-cased.onnx\"), opset=11)\n",
"\n",
"# Tensorflow \n",
"# convert(framework=\"tf\", model=\"bert-base-cased\", output=\"onnx/bert-base-cased.onnx\", opset=11)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## How to leverage runtime for inference over an ONNX graph\n",
"\n",
"---\n",
"\n",
"As mentionned in the introduction, **ONNX** is a serialization format and many side projects can load the saved graph and run the actual computations from it. Here, we'll focus on the official [onnxruntime](https://microsoft.github.io/onnxruntime/). The runtime is implemented in C++ for performance reasons and provides API/Bindings for C++, C, C#, Java and Python.\n",
"\n",
"In the case of this notebook, we will use the Python API to highlight how to load a serialized **ONNX** graph and run inference workload on various backends through **onnxruntime**.\n",
"\n",
"**onnxruntime** is available on pypi:\n",
"\n",
"- onnxruntime: ONNX + MLAS (Microsoft Linear Algebra Subprograms)\n",
"- onnxruntime-gpu: ONNX + MLAS + CUDA\n"
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {
"pycharm": {
"name": "#%%\n"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Requirement already satisfied: transformers in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (3.0.2)\n",
"Requirement already satisfied: onnxruntime-gpu in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.3.0)\n",
"Requirement already satisfied: onnx in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.7.0)\n",
"Requirement already satisfied: psutil in /home/mfuntowicz/.local/lib/python3.8/site-packages/psutil-5.7.0-py3.8-linux-x86_64.egg (5.7.0)\n",
"Requirement already satisfied: matplotlib in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (3.3.1)\n",
"Requirement already satisfied: tqdm>=4.27 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (4.46.1)\n",
"Requirement already satisfied: numpy in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (1.18.1)\n",
"Requirement already satisfied: sacremoses in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (0.0.43)\n",
"Requirement already satisfied: regex!=2019.12.17 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (2020.6.8)\n",
"Requirement already satisfied: filelock in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (3.0.12)\n",
"Requirement already satisfied: sentencepiece!=0.1.92 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (0.1.91)\n",
"Requirement already satisfied: requests in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (2.23.0)\n",
"Requirement already satisfied: packaging in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (20.4)\n",
"Requirement already satisfied: tokenizers==0.8.1.rc2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (0.8.1rc2)\n",
"Requirement already satisfied: protobuf in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-gpu) (3.12.2)\n",
"Requirement already satisfied: six in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnx) (1.15.0)\n",
"Requirement already satisfied: typing-extensions>=3.6.2.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnx) (3.7.4.2)\n",
"Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.3 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (2.4.7)\n",
"Requirement already satisfied: kiwisolver>=1.0.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (1.2.0)\n",
"Requirement already satisfied: python-dateutil>=2.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (2.8.1)\n",
"Requirement already satisfied: cycler>=0.10 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (0.10.0)\n",
"Requirement already satisfied: pillow>=6.2.0 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (7.2.0)\n",
"Requirement already satisfied: certifi>=2020.06.20 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (2020.6.20)\n",
"Requirement already satisfied: click in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from sacremoses->transformers) (7.1.2)\n",
"Requirement already satisfied: joblib in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from sacremoses->transformers) (0.15.1)\n",
"Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers) (1.25.9)\n",
"Requirement already satisfied: chardet<4,>=3.0.2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers) (3.0.4)\n",
"Requirement already satisfied: idna<3,>=2.5 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers) (2.9)\n",
"Requirement already satisfied: setuptools in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->onnxruntime-gpu) (47.1.1.post20200604)\n"
]
}
],
"source": [
"!pip install transformers onnxruntime-gpu onnx psutil matplotlib"
]
},
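  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check (a minimal sketch, not required for the rest of the notebook), you can list which execution providers your **onnxruntime** installation exposes; the GPU package adds `CUDAExecutionProvider` on top of `CPUExecutionProvider`:\n",
    "\n",
    "```python\n",
    "import onnxruntime\n",
    "\n",
    "# Execution providers (CPU, CUDA, ...) available in this installation\n",
    "print(onnxruntime.get_available_providers())\n",
    "\n",
    "# Whether this build targets CPU or GPU\n",
    "print(onnxruntime.get_device())\n",
    "```"
   ]
  },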
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "-gP08tHfBvgY"
},
"source": [
"## Preparing for an Inference Session\n",
"\n",
"---\n",
"\n",
"Inference is done using a specific backend definition which turns on hardware specific optimizations of the graph. \n",
"\n",
"Optimizations are basically of three kinds: \n",
"\n",
"- **Constant Folding**: Convert static variables to constants in the graph \n",
"- **Deadcode Elimination**: Remove nodes never accessed in the graph\n",
"- **Operator Fusing**: Merge multiple instruction into one (Linear -> ReLU can be fused to be LinearReLU)\n",
"\n",
"ONNX Runtime automatically applies most optimizations by setting specific `SessionOptions`.\n",
"\n",
"Note:Some of the latest optimizations that are not yet integrated into ONNX Runtime are available in [optimization script](https://github.com/microsoft/onnxruntime/tree/master/onnxruntime/python/tools/transformers) that tunes models for the best performance."
]
},
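  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The benchmark cells further down keep track of `get_session_options().optimized_model_filepath`; for that attribute to point at an actual file on disk, the `SessionOptions` used to create the session has to set `optimized_model_filepath`. A minimal sketch, assuming the model exported earlier at `onnx/bert-base-cased.onnx` (the output path is an illustrative choice):\n",
    "\n",
    "```python\n",
    "from onnxruntime import GraphOptimizationLevel, InferenceSession, SessionOptions\n",
    "\n",
    "options = SessionOptions()\n",
    "# Apply all graph-level optimizations (constant folding, dead code elimination, operator fusing)\n",
    "options.graph_optimization_level = GraphOptimizationLevel.ORT_ENABLE_ALL\n",
    "# Also persist the optimized graph to disk so it can be inspected or reloaded later\n",
    "options.optimized_model_filepath = \"onnx/bert-base-cased.optimized.onnx\"\n",
    "\n",
    "session = InferenceSession(\"onnx/bert-base-cased.onnx\", options, providers=[\"CPUExecutionProvider\"])\n",
    "```"
   ]
  },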
{
"cell_type": "code",
"execution_count": 25,
"metadata": {
"pycharm": {
"name": "#%%\n"
}
},
"outputs": [],
"source": [
"# # An optional step unless\n",
"# # you want to get a model with mixed precision for perf accelartion on newer GPU\n",
"# # or you are working with Tensorflow(tf.keras) models or pytorch models other than bert\n",
"\n",
"# !pip install onnxruntime-tools\n",
"# from onnxruntime_tools import optimizer\n",
"\n",
"# # Mixed precision conversion for bert-base-cased model converted from Pytorch\n",
"# optimized_model = optimizer.optimize_model(\"bert-base-cased.onnx\", model_type='bert', num_heads=12, hidden_size=768)\n",
"# optimized_model.convert_model_float32_to_float16()\n",
"# optimized_model.save_model_to_file(\"bert-base-cased.onnx\")\n",
"\n",
"# # optimizations for bert-base-cased model converted from Tensorflow(tf.keras)\n",
"# optimized_model = optimizer.optimize_model(\"bert-base-cased.onnx\", model_type='bert_keras', num_heads=12, hidden_size=768)\n",
"# optimized_model.save_model_to_file(\"bert-base-cased.onnx\")\n",
"\n",
"\n",
"# optimize transformer-based models with onnxruntime-tools\n",
"from onnxruntime_tools import optimizer\n",
"from onnxruntime_tools.transformers.onnx_model_bert import BertOptimizationOptions\n",
"\n",
"# disable embedding layer norm optimization for better model size reduction\n",
"opt_options = BertOptimizationOptions('bert')\n",
"opt_options.enable_embed_layer_norm = False\n",
"\n",
"opt_model = optimizer.optimize_model(\n",
" 'onnx/bert-base-cased.onnx',\n",
" 'bert', \n",
" num_heads=12,\n",
" hidden_size=768,\n",
" optimization_options=opt_options)\n",
"opt_model.save_model_to_file('bert.opt.onnx')\n"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {
"pycharm": {
"name": "#%%\n"
}
},
"outputs": [],
"source": [
"from os import environ\n",
"from psutil import cpu_count\n",
"\n",
"# Constants from the performance optimization available in onnxruntime\n",
"# It needs to be done before importing onnxruntime\n",
"environ[\"OMP_NUM_THREADS\"] = str(cpu_count(logical=True))\n",
"environ[\"OMP_WAIT_POLICY\"] = 'ACTIVE'\n",
"\n",
"from onnxruntime import GraphOptimizationLevel, InferenceSession, SessionOptions, get_all_providers"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {
"colab": {},
"colab_type": "code",
"id": "2k-jHLfdcTFS"
},
"outputs": [],
"source": [
"from contextlib import contextmanager\n",
"from dataclasses import dataclass\n",
"from time import time\n",
"from tqdm import trange\n",
"\n",
"def create_model_for_provider(model_path: str, provider: str) -> InferenceSession: \n",
" \n",
" assert provider in get_all_providers(), f\"provider {provider} not found, {get_all_providers()}\"\n",
"\n",
" # Few properties that might have an impact on performances (provided by MS)\n",
" options = SessionOptions()\n",
" options.intra_op_num_threads = 1\n",
" options.graph_optimization_level = GraphOptimizationLevel.ORT_ENABLE_ALL\n",
"\n",
" # Load the model as a graph and prepare the CPU backend \n",
" session = InferenceSession(model_path, options, providers=[provider])\n",
" session.disable_fallback()\n",
" \n",
" return session\n",
"\n",
"\n",
"@contextmanager\n",
"def track_infer_time(buffer: [int]):\n",
" start = time()\n",
" yield\n",
" end = time()\n",
"\n",
" buffer.append(end - start)\n",
"\n",
"\n",
"@dataclass\n",
"class OnnxInferenceResult:\n",
" model_inference_time: [int] \n",
" optimized_model_path: str"
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "teJdG3amE-hR"
},
"source": [
"## Forwarding through our optimized ONNX model running on CPU\n",
"\n",
"---\n",
"\n",
"When the model is loaded for inference over a specific provider, for instance **CPUExecutionProvider** as above, an optimized graph can be saved. This graph will might include various optimizations, and you might be able to see some **higher-level** operations in the graph _(through [Netron](https://github.com/lutzroeder/Netron) for instance)_ such as:\n",
"- **EmbedLayerNormalization**\n",
"- **Attention**\n",
"- **FastGeLU**\n",
"\n",
"These operations are an example of the kind of optimization **onnxruntime** is doing, for instance here gathering multiple operations into bigger one _(Operator Fusing)_."
]
},
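  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "If you have persisted an optimized graph to disk (for instance through `SessionOptions.optimized_model_filepath`, as sketched earlier), you can also list the fused operators without opening Netron. A minimal sketch; the path below is an assumption and has to match wherever you saved the optimized graph:\n",
    "\n",
    "```python\n",
    "from collections import Counter\n",
    "import onnx\n",
    "\n",
    "# Load the optimized graph and count the operator types it contains\n",
    "optimized = onnx.load(\"onnx/bert-base-cased.optimized.onnx\")\n",
    "print(Counter(node.op_type for node in optimized.graph.node).most_common(10))\n",
    "```"
   ]
  },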
{
"cell_type": "code",
"execution_count": 28,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
},
"colab_type": "code",
"id": "dmC22kJfVGYe",
"outputId": "f3aba5dc-15c0-4f82-b38c-1bbae1bf112e"
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt from cache at /home/mfuntowicz/.cache/torch/transformers/5e8a2b4893d13790ed4150ca1906be5f7a03d6c4ddf62296c383f6db42814db2.e13dbb970cb325137104fb2e5f36fe865f27746c6b526f6352861b1980eb80b1\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Sequence output: (1, 6, 768), Pooled output: (1, 768)\n"
]
}
],
"source": [
"from transformers import BertTokenizerFast\n",
"\n",
"tokenizer = BertTokenizerFast.from_pretrained(\"bert-base-cased\")\n",
"cpu_model = create_model_for_provider(\"onnx/bert-base-cased.onnx\", \"CPUExecutionProvider\")\n",
"\n",
"# Inputs are provided through numpy array\n",
"model_inputs = tokenizer(\"My name is Bert\", return_tensors=\"pt\")\n",
"inputs_onnx = {k: v.cpu().detach().numpy() for k, v in model_inputs.items()}\n",
"\n",
"# Run the model (None = get all the outputs)\n",
"sequence, pooled = cpu_model.run(None, inputs_onnx)\n",
"\n",
"# Print information about outputs\n",
"\n",
"print(f\"Sequence output: {sequence.shape}, Pooled output: {pooled.shape}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Benchmarking PyTorch model\n",
"\n",
"_Note: PyTorch model benchmark is run on CPU_"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 51
},
"colab_type": "code",
"id": "PS_49goe197g",
"outputId": "0ef0f70c-f5a7-46a0-949a-1a93f231d193"
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-config.json from cache at /home/mfuntowicz/.cache/torch/transformers/b945b69218e98b3e2c95acf911789741307dec43c698d35fad11c1ae28bda352.9da767be51e1327499df13488672789394e2ca38b877837e52618a67d7002391\n",
"Model config BertConfig {\n",
" \"architectures\": [\n",
" \"BertForMaskedLM\"\n",
" ],\n",
" \"attention_probs_dropout_prob\": 0.1,\n",
" \"gradient_checkpointing\": false,\n",
" \"hidden_act\": \"gelu\",\n",
" \"hidden_dropout_prob\": 0.1,\n",
" \"hidden_size\": 768,\n",
" \"initializer_range\": 0.02,\n",
" \"intermediate_size\": 3072,\n",
" \"layer_norm_eps\": 1e-12,\n",
" \"max_position_embeddings\": 512,\n",
" \"model_type\": \"bert\",\n",
" \"num_attention_heads\": 12,\n",
" \"num_hidden_layers\": 12,\n",
" \"pad_token_id\": 0,\n",
" \"type_vocab_size\": 2,\n",
" \"vocab_size\": 28996\n",
"}\n",
"\n",
"loading weights file https://cdn.huggingface.co/bert-base-cased-pytorch_model.bin from cache at /home/mfuntowicz/.cache/torch/transformers/d8f11f061e407be64c4d5d7867ee61d1465263e24085cfa26abf183fdc830569.3fadbea36527ae472139fe84cddaa65454d7429f12d543d80bfc3ad70de55ac2\n",
"All model checkpoint weights were used when initializing BertModel.\n",
"\n",
"All the weights of BertModel were initialized from the model checkpoint at bert-base-cased.\n",
"If your task is similar to the task the model of the checkpoint was trained on, you can already use BertModel for predictions without further training.\n",
"Warming up: 100%|██████████| 10/10 [00:00<00:00, 39.30it/s]\n",
"Tracking inference time on PyTorch: 100%|██████████| 100/100 [00:02<00:00, 41.09it/s]\n"
]
}
],
"source": [
"from transformers import BertModel\n",
"\n",
"PROVIDERS = {\n",
" (\"cpu\", \"PyTorch CPU\"),\n",
"# Uncomment this line to enable GPU benchmarking\n",
"# (\"cuda:0\", \"PyTorch GPU\")\n",
"}\n",
"\n",
"results = {}\n",
"\n",
"for device, label in PROVIDERS:\n",
" \n",
" # Move inputs to the correct device\n",
" model_inputs_on_device = {\n",
" arg_name: tensor.to(device)\n",
" for arg_name, tensor in model_inputs.items()\n",
" }\n",
"\n",
" # Add PyTorch to the providers\n",
" model_pt = BertModel.from_pretrained(\"bert-base-cased\").to(device)\n",
" for _ in trange(10, desc=\"Warming up\"):\n",
" model_pt(**model_inputs_on_device)\n",
"\n",
" # Compute \n",
" time_buffer = []\n",
" for _ in trange(100, desc=f\"Tracking inference time on PyTorch\"):\n",
" with track_infer_time(time_buffer):\n",
" model_pt(**model_inputs_on_device)\n",
"\n",
" # Store the result\n",
" results[label] = OnnxInferenceResult(\n",
" time_buffer, \n",
" None\n",
" ) "
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "Kda1e7TkEqNR"
},
"source": [
"## Benchmarking PyTorch & ONNX on CPU\n",
"\n",
"_**Disclamer: results may vary from the actual hardware used to run the model**_"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 170
},
"colab_type": "code",
"id": "WcdFZCvImVig",
"outputId": "bfd779a1-0bc7-42db-8587-e52a485ec5e3"
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Tracking inference time on CPUExecutionProvider: 100%|██████████| 100/100 [00:01<00:00, 63.62it/s]\n"
]
}
],
"source": [
"PROVIDERS = {\n",
" (\"CPUExecutionProvider\", \"ONNX CPU\"),\n",
"# Uncomment this line to enable GPU benchmarking\n",
"# (\"CUDAExecutionProvider\", \"ONNX GPU\")\n",
"}\n",
"\n",
"\n",
"for provider, label in PROVIDERS:\n",
" # Create the model with the specified provider\n",
" model = create_model_for_provider(\"onnx/bert-base-cased.onnx\", provider)\n",
"\n",
" # Keep track of the inference time\n",
" time_buffer = []\n",
"\n",
" # Warm up the model\n",
" model.run(None, inputs_onnx)\n",
"\n",
" # Compute \n",
" for _ in trange(100, desc=f\"Tracking inference time on {provider}\"):\n",
" with track_infer_time(time_buffer):\n",
" model.run(None, inputs_onnx)\n",
"\n",
" # Store the result\n",
" results[label] = OnnxInferenceResult(\n",
" time_buffer,\n",
" model.get_session_options().optimized_model_filepath\n",
" )"
]
},
{
"cell_type": "code",
"execution_count": 31,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAABRoAAAPeCAYAAABjjKazAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/d3fzzAAAACXBIWXMAAA9hAAAPYQGoP6dpAABezklEQVR4nOzdeZjd8/3//8fJNklkEyIJQmxFbPWh1dDGFiL2rYQuUVsX1Fof+WgtpVJUpVW7fkI1GruqfuxLKtaKWtuqqBBrbEkkYZA5vz/6zfyMLDJeM2aG2+26znXlvM/7nPfznFmucfdeKtVqtRoAAAAAgALtWnoAAAAAAKDtExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAYA2ZeDAgdlnn30+8fNPP/30rLzyymnfvn2++MUvNtlcnyV33XVXKpVK7rrrrpYepYFZs2ZlmWWWybhx4z7V7R5zzDHZaKONGvWcz/r32bzvkauuuqqlR2kyF198cSqVSqZMmfKx65b+HgKAzyqhEQBaoXPOOSeVSqXRcYNFu+WWW3L00Udnk002ydixY3PKKae09Egt6pxzzsnFF1/c0mMstl/96lfp3r17RowY8alu97DDDsujjz6a66+/frHW930GAHxedWjpAQCA+Y0bNy4DBw7Mgw8+mMmTJ2fVVVdt6ZFajaeeeirt2n2y/1d6xx13pF27dvntb3+bTp06NfFkbc8555yTpZdeer49s4YMGZJ33nmnVX1G77//fn71q1/l8MMPT/v27T/Vbffr1y877bRTfvGLX2THHXf82PV9n7VN3/rWtzJixIjU1NS09CgA0GbZoxEAWplnn3029957b375y1+mT58+n/phoklSV1eXd99991Pf7uKoqalJx44dP9Fzp02bli5dujRp/JkzZ06TvVZr0a5du3Tu3PkTB93mcMMNN+S1117LHnvs0SLb32OPPTJx4sT8+9///th1m/r7rFqt5p133mmS1/osaK7fT+3bt0/nzp1TqVSa/LUX5YMPPsh77733qW4TAJpL6/nrEQBI8p+9GZdccslst9122X333RuExvfffz+9e/fOd77znfmeN3PmzHTu3DlHHXVU/bLa2tocf/zxWXXVVVNTU5MBAwbk6KOPTm1tbYPnViqVHHzwwRk3blzWWmut1NTU5KabbkqS/OIXv8jGG2+cpZZaKl26dMkGG2ywwPOyvfPOO/nhD3+YpZdeOt27d8+OO+6YF198MZVKJSeccEKDdV988cXsu+++6du3b2pqarLWWmvlf//3fxfr8/noudHmnVftnnvuyRFHHJE+ffpkiSWWyC677JLXXnutwXscO3ZsZs+enUqlkkql0uCw4d///vfZYIMN0qVLl/Tu3TsjRozI1KlTG2x7s802y9prr51JkyZlyJAh6dq1a/7nf/7nE33W1113XdZee+369z/v8/7o57Tffvtl2WWXTU1NTVZaaaV8//vfbxAlpk+fnsMOOywDBgxITU1NVl111Zx66qmpq6v72M/xySefzIQJE+o/j8022yzJgs/ROO+9P/bYY9l0003TtWvXrLrqqvXfCxMmTMhGG22ULl26ZPXVV89tt922wPfzSb/u1113XQYOHJhVVlmlwfJ99tkn3bp1y/PPP5/tt98+3bp1y3LLLZezzz47SfL4449niy22yBJLLJEVV1wxl112WYPnv//++znxxBOz2mqrpXPnzllqqaXy1a9+NbfeemuD9YYOHZok+eMf/7jIORf1ffbBBx/kpJNOyiqrrJKampoMHDgw//M//zPf98jAgQOz/fbb5+abb86GG26YLl265Pzzz1/kdh944IFss8026dmzZ7p27ZpNN90099xzT4N1nnvuufzgBz/I6quvni5dumSppZbK17/+9QWek3D69Ok5/PDDM3DgwNTU1GT55ZfPt7/97bz++usN1qurq8vPfvazLL/88uncuXO23HLLTJ48eZGzJskJJ5yQSqWSf/7zn9ljjz3So0ePLLXUUjn00EPni4iL+v30t7/9LcOHD0+PHj3SrVu3bLnllrn//vvrn/vQQw+lUqnkkksumW+Gm2++OZVKJTfccEOSBZ+jsVqt5uSTT87yyy+frl27ZvPNN8+TTz65wPe0OD+LU6ZMSaVSyS9+8YuMGTOm/nvh73//+8d+ZgDQFjh0GgBamXHjxmXXXXdNp06dstdee+Xcc8/NX//613zpS19Kx44ds8suu+Saa67J+eef32CPqeuuuy61tbX156+rq6vLjjvumIkTJ+bAAw/Mmmuumccffzxnnnlm/vWvf+W6665rsN077rgjV1xxRQ4++OAsvfTSGThwYJL/nBdvxx13zDe+8Y289957GT9+fL7+9a/nhhtuyHbbbVf//H322SdXXHFFvvWtb+UrX/lKJkyY0ODxeV599dV85StfqY8Hffr0yY033pj99tsvM2fOzGGHHfaJPrdDDjkkSy65ZI4//vhMmTIlY8aMycEHH5zLL788SXLppZfmggsuyIMPPpiLLrooSbLxxhsnSX72s5/lJz/5SfbYY4/sv//+ee2113LWWWdlyJAh+dvf/pZevXrVb+eNN97I8OHDM2LEiHzzm99M3759G/1ZT5w4Mddcc01+8IMfpHv37vn1r3+d3XbbLc8//3yWWmqpJMlLL72UL3/5y5k+fXoOPPDArLHGGnnxxRdz1VVXZc6cOenUqVPmzJmTTTfdNC+++GK++93vZoUVVsi9996bUaNG5eWXX86YMWMW+nmNGTMmhxxySLp165Zjjz02SdK3b99FfsZvvfVWtt9++4wYMSJf//rXc+6552bEiBEZN25cDjvssHzve9/L3nvvndNPPz277757pk6dmu7duycp/7rfe++9+a//+q8FPjZ37twMHz48Q4YMyWmnnZZx48bl4IMPzhJLLJFjjz023/jGN7LrrrvmvPPOy7e//e0MHjw4K620UpL/BK/Ro0dn//33z5e//OXMnDkzDz30UB5++OFstdVW9dvo2bNnVlllldxzzz05/PDDFzrnor7P9t9//1xyySXZfffdc+SRR+aBBx7I6NGj849//CPXXnttg9d56qmnstdee+W73/1uDjjggKy++uoL3eYdd9yR4cOHZ4MNNsjxxx+fdu3aZezYsdliiy1y991358tf/nKS5K9//WvuvffejBgxIssvv3ymTJmSc889N5tttln+/ve/p2vXrkn+c9Gdr33ta/nHP/6RfffdN//1X/+V119/Pddff31eeOGFLL300vXb/vnPf5527drlqKOOyowZM3LaaaflG9/4Rh544IGFzvthe+yxRwYOHJjRo0fn/vvvz69//eu89dZb+d3vfjffe/zo76cnn3wyX/va19KjR48cffTR6dixY84///xsttlm9eF7ww03zMorr5wrrrgiI0eObPCal19+eZZccskMGzZsofMdd9xxOfnkk7Pttttm2223zcMPP5ytt956vj0QG/uzO
Hbs2Lz77rs58MADU1NTk969ey/W5wUArV4VAGg1HnrooWqS6q233lqtVqvVurq66vLLL1899NBD69e5+eabq0mqf/rTnxo8d9ttt62uvPLK9fcvvfTSart27ap33313g/XOO++8apLqPffcU78sSbVdu3bVJ598cr6Z5syZ0+D+e++9V1177bWrW2yxRf2ySZMmVZNUDzvssAbr7rPPPtUk1eOPP75+2X777Vft379/9fXXX2+w7ogRI6o9e/acb3sfteKKK1ZHjhxZf3/s2LHVJNWhQ4dW6+rq6pcffvjh1fbt21enT59ev2zkyJHVJZZYosHrTZkypdq+ffvqz372swbLH3/88WqHDh0aLN90002rSarnnXdeg3Ub+1l36tSpOnny5Ppljz76aDVJ9ayzzqpf9u1vf7varl276l//+tf5PoN57/Okk06qLrHEEtV//etfDR4/5phjqu3bt68+//zz8z33w9Zaa63qpptuOt/yO++8s5qkeuedd8733i+77LL6Zf/85z/rv3fuv//++uXzvkfHjh1bv6zk6/7+++9XK5VK9cgjj5zvsZEjR1aTVE855ZT6ZW+99Va1S5cu1UqlUh0/fvx88374+3G99darbrfddgvd9odtvfXW1TXXXPNj11vQ99kjjzxSTVLdf//9Gyw/6qijqkmqd9xxR/2yFVdcsZqketNNN33sturq6qqrrbZaddiwYQ2+/+fMmVNdaaWVqltttVWDZR913333VZNUf/e739UvO+6446pJqtdcc80Ct1et/v/fI2uuuWa1tra2/vFf/epX1STVxx9/fJFzH3/88dUk1R133LHB8h/84AfVJNVHH320ftnCfj/tvPPO1U6dOlWfeeaZ+mUvvfRStXv37tUhQ4bULxs1alS1Y8eO1TfffLN+WW1tbbVXr17Vfffdt37ZvN8lzz77bLVarVanTZtW7dSpU3W77bZr8Nn+z//8TzVJg99Di/uz+Oyzz1aTVHv06FGdNm3aIj8jAGiLHDoNAK3IuHHj0rdv32y++eZJ/nPI4J577pnx48dn7ty5SZItttgiSy+9dP2eesl/9jS79dZbs+eee9Yvu/LKK7PmmmtmjTXWyOuvv15/22KLLZIkd955Z4Ntb7rpphk0aNB8M3Xp0qXBdmbMmJGvfe1refjhh+uXzzuM8Qc/+EGD5x5yyCEN7ler1Vx99dXZYYcdUq1WG8w1bNiwzJgxo8HrNsaBBx7Y4NxqX/va1zJ37tw899xzi3zeNddck7q6uuyxxx4N5unXr19WW221+T6nmpqa+Q5db+xnPXTo0AaHAK+77rrp0aNH/fn/6urqct1112WHHXbIhhtuON/M897nlVdema997WtZcsklG2x36NChmTt3bv7yl7983MfWKN26dWtwxefVV189vXr1ypprrtngCunz/j3v/ZR+3d98881Uq9UsueSSC11n//33r/93r169svrqq2eJJZZocE7HefN++DyLvXr1ypNPPpmnn376Y9//vM/5k/i///u/JMkRRxzRYPmRRx6ZJPnzn//cYPlKK620yD3t5nnkkUfy9NNPZ++9984bb7xR/7nOnj07W265Zf7yl7/UH7r74Z/l999/P2+88UZWXXXV9OrVq8Hnf/XVV2e99dbLLrvsMt/2Pnr+wu985zsN9qz+2te+liSLdS7LJDnooIMa3J/3O2Pe5zXPR38/zZ07N7fcckt23nnnrLzyyvXL+/fvn7333jsTJ07MzJkzkyR77rln3n///VxzzTX1691yyy2ZPn16g9+ZH3XbbbflvffeyyGHHNLgfS9o79vG/izutttu6dOnz0K3DQBtlUOnAaCVmDt3bsaPH5/NN988zz77bP3yjTbaKGeccUZuv/32bL311unQoUN22223XHbZZamtrU1NTU2uueaavP/++w3+o/npp5/OP/7xj4X+x+y0adMa3J93KOlH3XDDDTn55JPzyCOPNDiX3If/w/u5555Lu3bt5nuNj14t+7XXXsv06dNzwQUX5IILLlisuRbXCius0OD+vCj11ltvLfJ5Tz/9dKrValZbbbUFPv7RC88st9xy813ko7Gf9UdnnTfvvFlfe+21zJw5M2uvvfbHzv7YY48t9nZLLb/88vOFpp49e2bAgAHzLUvS4P00xde9Wq0ucHnnzp3n+wx69uy50Hk//D3x05/+NDvttFO+8IUvZO21184222yTb33rW1l33XUXuP1PeqGQeT8jH/2Z6NevX3r16jVfEF/Yz+NHzQukHz0s+MNmzJiRJZdcMu+8805Gjx6dsWPH5sUXX2zwec6YMaP+388880x22223xdr+J/25m+ejP3errLJK2rVrN995Iz/6ebz22muZM2fOAg8pX3PNNVNXV5epU6dmrbXWynrrrZc11lgjl19+efbbb78k/zlseumll67/nwELMu9r8tEZ+/TpM1/0buzP4uJ+fQGgrREaAaCVuOOOO/Lyyy9n/PjxGT9+/HyPjxs3LltvvXWSZMSIETn//PNz4403Zuedd84VV1yRNdZYI+utt179+nV1dVlnnXXyy1/+coHb+2gc+vDeTvPcfffd2XHHHTNkyJCcc8456d+/fzp27JixY8fOd1GNxTFvz6pvfvObCw0jCwo8i6N9+/YLXL6wOPXhmSqVSm688cYFvka3bt0a3F/Q59TYz/qTzrqg7W611VY5+uijF/j4F77whUa93sdZ2Nwf935Kv+69e/dOpVJZaLz6pHMlyZAhQ/LMM8/kj3/8Y2655ZZcdNFFOfPMM3Peeec12Esy+U88+/D5CT+JxQ2VC/o+W5B5n+3pp5+eL37xiwtcZ9738CGHHJKxY8fmsMMOy+DBg9OzZ89UKpWMGDHiYy8etDBN9b08z8I+n8X9PBZmzz33zM9+9rO8/vrr6d69e66//vrstdde6dChaf5zqLE/i6XvBwBaK6ERAFqJcePGZZlllqm/Wu6HXXPNNbn22mtz3nnnpUuXLhkyZEj69++fyy+/PF/96ldzxx131F/QY55VVlkljz76aLbccstPvBfW1Vdfnc6dO+fmm29OTU1N/fKxY8c2WG/FFVdMXV1dnn322QZ7/3z06rN9+vRJ9+7dM3fu3Pqr+La0VVZZJdVqNSuttNInDnNN8Vl/WJ8+fdKjR4888cQTH7vdWbNmfeLPsilmXRylX/cOHTpklVVWabCnb1OadyX373znO5k1a1aGDBmSE044Yb7Q+OyzzzaI+Y0x72fk6aefzpprrlm//NVXX8306dOz4oorfqLXnXcIfo8ePT72s73qqqsycuTInHHGGfXL3n333UyfPn2+1/y4772m8vTTTzfYu2/y5Mmpq6urvxjVwvTp0yddu3bNU089Nd9j//znP9OuXbsGgX/PPffMiSeemKuvvjp9+/bNzJkzG5wGYEHmfU2efvrpBodnv/baa/NF79KfRQD4rHCORgBoBd55551cc8012X777bP77rvPdzv44IPz9ttv5/rrr0+StGvXLrvvvnv+9Kc/5dJLL80HH3ww37nG9thjj7z44ou58MIL
F7i92bNnf+xc7du3T6VSqT8/ZJJMmTJlvqsozzuX3DnnnNNg+VlnnTXf6+222265+uqrFxgyXnvttY+dqantuuuuad++fU488cT59sKqVqt54403PvY1muKz/rB27dpl5513zp/+9Kc89NBD8z0+b8499tgj9913X26++eb51pk+fXo++OCDRW5niSWWmC8yNYem+LoPHjx4gZ9FqY9+fbt165ZVV121wWkCkv8cWvzMM8/UX0G6sbbddtskme/qw/P2gl3QFdoXxwYbbJBVVlklv/jFLzJr1qz5Hv/wZ9u+ffv5vsfPOuusBj/fyX/OH/joo4/OdyXs5JPvqbgwH/0fK/N+ZwwfPnyRz2vfvn223nrr/PGPf2xwmPWrr76ayy67LF/96lfTo0eP+uVrrrlm1llnnVx++eW5/PLL079//wwZMmSR2xg6dGg6duyYs846q8H7XtDV3Et/FgHgs8IejQDQClx//fV5++23s+OOOy7w8a985Svp06dPxo0bVx8U99xzz5x11lk5/vjjs8466zTYSypJvvWtb+WKK67I9773vdx5553ZZJNNMnfu3Pzzn//MFVdckZtvvnmBFxr5sO222y6//OUvs80222TvvffOtGnTcvbZZ2fVVVfNY489Vr/eBhtskN122y1jxozJG2+8ka985SuZMGFC/vWvfyVpuOfcz3/+89x5553ZaKONcsABB2TQoEF588038/DDD+e2227Lm2+++Yk+w09qlVVWycknn5xRo0ZlypQp2XnnndO9e/c8++yzufbaa3PggQfmqKOOWuRrNMVn/VGnnHJKbrnllmy66aY58MADs+aaa+bll1/OlVdemYkTJ6ZXr1750Y9+lOuvvz7bb7999tlnn2ywwQaZPXt2Hn/88Vx11VWZMmXKIg/13WCDDXLuuefm5JNPzqqrrppllllmkeesK1H6dd9pp51y6aWX5l//+leTHhI+aNCgbLbZZtlggw3Su3fvPPTQQ7nqqqty8MEHN1jvtttuS7VazU477fSJtrPeeutl5MiRueCCCzJ9+vRsuummefDBB3PJJZdk5513rr8AVGO1a9cuF110UYYPH5611lor3/nOd7LccsvlxRdfzJ133pkePXrkT3/6U5Jk++23z6WXXpqePXtm0KBBue+++3LbbbdlqaWWavCaP/rRj3LVVVfl61//evbdd99ssMEGefPNN3P99dfnvPPO+8R7dS7Is88+mx133DHbbLNN7rvvvvz+97/P3nvvvVjbOPnkk3Prrbfmq1/9an7wgx+kQ4cOOf/881NbW5vTTjttvvX33HPPHHfccencuXP222+/tGu36H0u+vTpk6OOOiqjR4/O9ttvn2233TZ/+9vfcuONN873c1X6swgAnxVCIwC0AuPGjUvnzp2z1VZbLfDxdu3aZbvttsu4cePyxhtvZKmllsrGG2+cAQMGZOrUqQu8cmq7du1y3XXX5cwzz8zvfve7XHvttenatWtWXnnlHHrooYsVa7bYYov89re/zc9//vMcdthhWWmllXLqqadmypQpDUJjkvzud79Lv3798oc//CHXXntthg4dmssvvzyrr756OnfuXL9e37598+CDD+anP/1prrnmmpxzzjlZaqmlstZaa+XUU09t5CfXNI455ph84QtfyJlnnpkTTzwxyX/Oq7j11lsvNP5+WFN81h+13HLL5YEHHshPfvKTjBs3LjNnzsxyyy2X4cOHp2vXrkmSrl27ZsKECTnllFNy5ZVX5ne/+1169OiRL3zhCznxxBPrL8qyMMcdd1yee+65nHbaaXn77bez6aabNltoLP2677DDDll66aVzxRVX5Mc//nGTzfXDH/4w119/fW655ZbU1tZmxRVXzMknn5wf/ehHDda78sor89WvfrXB1cIb66KLLsrKK6+ciy++ONdee2369euXUaNG5fjjjy96D5tttlnuu+++nHTSSfnNb36TWbNmpV+/ftloo43y3e9+t369X/3qV2nfvn3GjRuXd999N5tsskluu+22+a5u3a1bt9x99905/vjjc+211+aSSy7JMsssky233DLLL7980awfdfnll+e4447LMccckw4dOuTggw/O6aefvljPXWuttXL33Xdn1KhRGT16dOrq6rLRRhvl97//fYOroM+z55575sc//nHmzJmzyKtNf9jJJ5+czp0757zzzqsP5bfccst8e6CW/iwCwGdFpdrUxz8AAPw/jzzySNZff/38/ve/zze+8Y2WHoc27qSTTsrYsWPz9NNPL/QiJM3hlVdeyUorrZTx48d/4j0aaeiEE07IiSeemNdee82efgDwGeIcjQBAk3jnnXfmWzZmzJi0a9fuY8+FBovj8MMPz6xZsxZ4VfbmNGbMmKyzzjoiIwDAx3DoNADQJE477bRMmjQpm2++eTp06JAbb7wxN954Yw488MAGV3+FT6pbt26ZNm3ap77dn//855/6NgEA2iKhEQBoEhtvvHFuvfXWnHTSSZk1a1ZWWGGFnHDCCTn22GNbejQAAOBT0KLnaDz33HNz7rnnZsqUKUn+c0Ln4447LsOHD0+SvPvuuznyyCMzfvz41NbWZtiwYTnnnHPSt2/flhoZAAAAAFiAFg2Nf/rTn9K+ffusttpqqVarueSSS3L66afnb3/7W9Zaa618//vfz5///OdcfPHF6dmzZw4++OC0a9cu99xzT0uNDAAAAAAsQKu76nTv3r1z+umnZ/fdd0+fPn1y2WWXZffdd0+S/POf/8yaa66Z++67L1/5yldaeFIAAAAAYJ5Wc47GuXPn5sorr8zs2bMzePDgTJo0Ke+//36GDh1av84aa6yRFVZYoVGhsa6uLi+99FK6d++eSqXSXOMDAAAAwGdStVrN22+/nWWXXTbt2rVb6HotHhoff/zxDB48OO+++266deuWa6+9NoMGDcojjzySTp06pVevXg3W79u3b1555ZWFvl5tbW1qa2vr77/44osZNGhQc40PAAAAAJ8LU6dOzfLLL7/Qx1s8NK6++up55JFHMmPGjFx11VUZOXJkJkyY8Ilfb/To0TnxxBPnWz516tT06NGjZFQAAAAA+NyZOXNmBgwYkO7duy9yvVZ3jsahQ4dmlVVWyZ577pktt9wyb731VoO9GldcccUcdthhOfzwwxf4/I/u0Tjvg5gxY4bQCAAAAACNNHPmzPTs2fNj+9rCD6puIXV1damtrc0GG2yQjh075vbbb69/7Kmnnsrzzz+fwYMHL/T5NTU16dGjR4MbAAAAANC8WvTQ6VGjRmX48OFZYYUV8vbbb+eyyy7LXXfdlZtvvjk9e/bMfvvtlyOOOCK9e/dOjx49csghh2Tw4MGuOA0AAAAArUyLhsZp06bl29/+dl5++eX07Nkz6667bm6++eZstdVWSZIzzzwz7dq1y2677Zba2toMGzYs55xzTkuODAAAAAAsQKs7R2NTW9xjyAEAAACA+bXZczQCAAAAAG2P0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAA
AAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAK3c7NmzU6lUUqlUMnv27JYeBwAAABZIaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIq1aGgcPXp0vvSlL6V79+5ZZpllsvPOO+epp55qsM5mm22WSqXS4Pa9732vhSYGAAAAABakRUPjhAkTctBBB+X+++/Prbfemvfffz9bb711Zs+e3WC9Aw44IC+//HL97bTTTmuhiQEAAACABenQkhu/6aabGty/+OKLs8wyy2TSpEkZMmRI/fKuXbumX79+n/Z4AAAAAMBialXnaJwxY0aSpHfv3g2Wjxs3LksvvXTWXnvtjBo1KnPmzGmJ8QAAAACAhWjRPRo/rK6uLocddlg22WSTrL322vXL995776y44opZdtll89hjj+W///u/89RTT+Waa65Z4OvU1tamtra2/v7MmTObfXYAAAAA+LxrNaHxoIMOyhNPPJGJEyc2WH7ggQfW/3udddZJ//79s+WWW+aZZ57JKqusMt/rjB49OieeeGKzzwsAAAAA/P9axaHTBx98cG644YbceeedWX755Re57kYbbZQkmTx58gIfHzVqVGbMmFF/mzp1apPPCwAAAAA01KJ7NFar1RxyyCG59tprc9ddd2WllVb62Oc88sgjSZL+/fsv8PGamprU1NQ05ZgAAAAAwMdo0dB40EEH5bLLLssf//jHdO/ePa+88kqSpGfPnunSpUueeeaZXHbZZdl2222z1FJL5bHHHsvhhx+eIUOGZN11123J0QEAAACAD6lUq9Vqi228Ulng8rFjx2afffbJ1KlT881vfjNPPPFEZs+enQEDBmSXXXbJj3/84/To0WOxtjFz5sz07NkzM2bMWOznALQms2fPTrdu3ZIks2bNyhJLLNHCEwEAAPB5srh9rcUPnV6UAQMGZMKECZ/SNAAAAADAJ9UqLgYDAAAAALRtQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQrENLD0C5gcf8uaVHAJpR3Xvv1v97zZ/clHadOrfgNEBzmvLz7Vp6BAAA+MTs0QgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAA
AAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKtWhoHD16dL70pS+le/fuWWaZZbLzzjvnqaeearDOu+++m4MOOihLLbVUunXrlt122y2vvvpqC00MAAAAACxIi4bGCRMm5KCDDsr999+fW2+9Ne+//3623nrrzJ49u36dww8/PH/6059y5ZVXZsKECXnppZey6667tuDUAAAAAMBHdWjJjd90000N7l988cVZZpllMmnSpAwZMiQzZszIb3/721x22WXZYostkiRjx47Nmmuumfvvvz9f+cpXWmJsAAAAAOAjWtU5GmfMmJEk6d27d5Jk0qRJef/99zN06ND6ddZYY42ssMIKue+++1pkRgAAAABgfi26R+OH1dXV5bDDDssmm2yStddeO0nyyiuvpFOnTunVq1eDdfv27ZtXXnllga9TW1ub2tra+vszZ85stpkBAAAAgP9oNXs0HnTQQXniiScyfvz4otcZPXp0evbsWX8bMGBAE00IAAAAi2/27NmpVCqpVCoNrkUA8FnVKkLjwQcfnBtuuCF33nlnll9++frl/fr1y3vvvZfp06c3WP/VV19Nv379Fvhao0aNyowZM+pvU6dObc7RAQAAAIC0cGisVqs5+OCDc+211+aOO+7ISiut1ODxDTbYIB07dsztt99ev+ypp57K888/n8GDBy/wNWtqatKjR48GNwAAAACgebXoORoPOuigXHbZZfnjH/+Y7t271593sWfPnunSpUt69uyZ/fbbL0cccUR69+6dHj165JBDDsngwYNdcRoAAAAAWpEWDY3nnntukmSzzTZrsHzs2LHZZ599kiRnnnlm2rVrl9122y21tbUZNmxYzjnnnE95UoCW065T56z43ze09BgAAACwSC0aGqvV6seu07lz55x99tk5++yzP4WJAAAAAIBPolVcDAYAAAAAaNuERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKBYh8asPH369Fx77bW5++6789xzz2XOnDnp06dP1l9//QwbNiwbb7xxc80JAAAAALRii7VH40svvZT9998//fv3z8knn5x33nknX/ziF7Pllltm+eWXz5133pmtttoqgwYNyuWXX97cMwMAAAAArcxi7dG4/vrrZ+TIkZk0aVIGDRq0wHXeeeedXHfddRkzZkymTp2ao446qkkHBQAAAABar8UKjX//+9+z1FJLLXKdLl26ZK+99spee+2VN954o0mGAwAAAADahsU6dPrjImPp+gAAAABA29boq05fcskl+fOf/1x//+ijj06vXr2y8cYb57nnnmvS4QAAAACAtqHRofGUU05Jly5dkiT33Xdfzj777Jx22mlZeumlc/jhhzf5gAAAAABA67dY52j8sKlTp2bVVVdNklx33XXZbbfdcuCBB2aTTTbJZptt1tTzAQAAAABtQKP3aOzWrVv9xV5uueWWbLXVVkmSzp0755133mna6QAAAACANqHRezRutdVW2X///bP++uvnX//6V7bddtskyZNPPpmBAwc29XwAAAAAQBvQ6D0azz777AwePDivvfZarr766vorTE+aNCl77bVXkw8IAAAAALR+jd6jsVevXvnNb34z3/ITTzyxSQYCAAAAANqeRofGJHn33Xfz2GOPZdq0aamrq6tfXqlUssMOOzTZcAAAAABA29Do0HjTTTflW9/6Vv0FYT6sUqlk7ty5TTIYAAAAANB2NPocjYccckj22GOPvPzyy6mrq2twExkBAAAA4POp0aHx1VdfzRFHHJG+ffs2xzwAAAAAQBvU6NC4++6756677mqGUQAAAACAtqrR52j8zW9+k69//eu5++67s84666Rjx44NHv/hD3/YZMMBAAAAAG1Do0PjH/7wh9xyyy3p3Llz7rrrrlQqlfrHKpWK0AgAAAAAn0ONDo3HHntsTjzxxBxzzDFp167RR14DAAAAAJ9BjS6F7733Xvbcc0+REQAAAACo1+haOHLkyFx++eXNMQsAAAAA0EY1+tDpuXPn5rTTTsvNN9+cddddd76Lwfzyl79ssuEAAAAAgLah0aHx8ccfz/rrr58keeKJJxo89uELwwAAAAAAnx+NDo133nlnc8wBAAAAALRhrugCAAAAABRbrND4ve99Ly+88MJiveDll1+ecePGFQ0FAAAAALQti3XodJ8+fbLWWmtlk002yQ477JANN9wwyy67bDp37py33norf//73zNx4sSMHz8+yy67bC644ILmnhsAAAAAaEUWKzSedNJJOfjgg3PRRRflnHPOyd///vcGj3fv3j1Dhw7NBRdckG222aZZBgUAAAAAWq/FvhhM3759c+yxx+bYY4/NW2+9leeffz7vvPNOll566ayyyiquOA0AAAAAn2ONvup0kiy55JJZcsklm3oWAAAAAKCNctVpAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUOwThcYPPvggt912W84///y8/fbbSZKXXnops2bNatLhAAAAAIC2odFXnX7uueeyzTbb5Pnnn09tbW222mqrdO/ePaeeempqa2tz3nnnNcecAAAAAEAr1ug9Gg899NBsuOGGeeutt9KlS5f65bvssktuv/32Jh0OAAAAAGgbGr1H491335177703nTp1arB84MCBefHFF5tsMAAAAACg7Wj0Ho11dXWZO3fufMtfeOGFdO/evUmGAgAAAADalkaHxq233jpjxoypv1+pVDJr1qwcf/zx2XbbbZtyNgAAAACgjWj0odNnnHFGhg0blkGDB
uXdd9/N3nvvnaeffjpLL710/vCHPzTHjAAAAABAK9fo0Lj88svn0Ucfzfjx4/PYY49l1qxZ2W+//fKNb3yjwcVhAAAAAIDPj0aHxiTp0KFDvvnNbzb1LAAAAABAG/WJQuNLL72UiRMnZtq0aamrq2vw2A9/+MMmGQwAAAAAaDsaHRovvvjifPe7302nTp2y1FJLpVKp1D9WqVSERgAAAAD4HGp0aPzJT36S4447LqNGjUq7do2+aDUAAAAA8BnU6FI4Z86cjBgxQmQEAAAAAOo1uhbut99+ufLKK5tjFgAAAACgjWr0odOjR4/O9ttvn5tuuinrrLNOOnbs2ODxX/7yl002HAAAAADQNnyi0HjzzTdn9dVXT5L5LgYDAAAAAHz+NDo0nnHGGfnf//3f7LPPPs0wDgAAAADQFjX6HI01NTXZZJNNmmMWAAAAAKCNanRoPPTQQ3PWWWc1xywAAAAAQBvV6EOnH3zwwdxxxx254YYbstZaa813MZhrrrmmyYYDAAAAANqGRofGXr16Zdddd22OWQAAAACANqrRoXHs2LHNMQcAAAAA0IY1+hyNAAAAAAAftVh7NP7Xf/1Xbr/99iy55JJZf/31U6lUFrruww8/3GTDAQAAAABtw2KFxp122ik1NTX1/15UaAQAAAAAPn8WKzQef/zx9f8+4YQTmmsWAAAAAKCNavQ5GldeeeW88cYb8y2fPn16Vl555SYZCgAAAABoWxodGqdMmZK5c+fOt7y2tjYvvPBCkwwFAAAAALQti3XodJJcf/319f+++eab07Nnz/r7c+fOze23356VVlqpaacDAIDPuIHH/LmlRwCaSd1779b/e82f3JR2nTq34DRAc5vy8+1aeoQWt9ihceedd06SVCqVjBw5ssFjHTt2zMCBA3PGGWc06XAAAAAAQNuw2KGxrq4uSbLSSivlr3/9a5ZeeulmGwoAAAAAaFsWOzTO8+yzzzbHHAAAAABAG9boi8EAAAAAAHyU0AgAAAAAFBMaAQAAAIBiQiMAAAAAUOwThcZnnnkmP/7xj7PXXntl2rRpSZIbb7wxTz75ZJMOBwAAAAC0DY0OjRMmTMg666yTBx54INdcc01mzZqVJHn00Udz/PHHN/mAAAAAAEDr1+jQeMwxx+Tkk0/Orbfemk6dOtUv32KLLXL//fc36XAAAAAAQNvQ6ND4+OOPZ5dddplv+TLLLJPXX3+9SYYCAAAAANqWRofGXr165eWXX55v+d/+9rcst9xyTTIUAAAAANC2NDo0jhgxIv/93/+dV155JZVKJXV1dbnnnnty1FFH5dvf/nZzzAgAAAAAtHKNDo2nnHJK1lhjjQwYMCCzZs3KoEGDMmTIkGy88cb58Y9/3KjX+stf/pIddtghyy67bCqVSq677roGj++zzz6pVCoNbttss01jRwYAAAAAmlmHxj6hU6dOufDCC3Pcccfl8ccfz6xZs7L++utntdVWa/TGZ8+enfXWWy/77rtvdt111wWus80222Ts2LH192tqahq9HQAAAACgeTU6NM4zYMCADBgwoGjjw4cPz/Dhwxe5Tk1NTfr161e0HQAAAACgeTX60Onddtstp5566nzLTzvttHz9619vkqE+7K677soyyyyT1VdfPd///vfzxhtvNPk2AAAAAIAyjQ6Nf/nLX7LtttvOt3z48OH5y1/+0iRDzbPNNtvkd7/7XW6//faceuqpmTBhQoYPH565c+cu9Dm1tbWZOXNmgxsAAAAA0Lwafej0rFmz0qlTp/mWd+zYscmj3ogRI+r/vc4662TdddfNKquskrvuuitbbrnlAp8zevTonHjiiU06BwAAAACwaI3eo3GdddbJ5ZdfPt/y8ePHZ9CgQU0y1MKsvPLKWXrppTN58uSFrjNq1KjMmDGj/jZ16tRmnQkAAAAA+AR7NP7kJz/JrrvummeeeSZbbLFFkuT222/PH/7wh1x55ZVNPuCHvfDCC3njjTfSv3//ha5TU1PjytQAAAAA8ClrdGjcYYcdct111+WUU07JVVddlS5dumTdddfNbbfdlk033bRRrzVr1qwGeyc+++yzeeSRR9K7d+/07t07J554Ynbbbbf069cvzzzzTI4++uisuuqqGTZsWGPHBgAAAACaUaNDY5Jst9122W677Yo3/tBDD2XzzTevv3/EEUckSUaOHJlzzz03jz32WC655JJMnz49yy67bLbeeuucdNJJ9lgEAAAAgFbmE4XGJHnvvfcybdq01NXVNVi+wgorLPZrbLbZZqlWqwt9/Oabb/6k4wEAAAAAn6JGh8ann346++67b+69994Gy6vVaiqVSubOndtkwwEAAAAAbUOjQ+M+++yTDh065IYbbkj//v1TqVSaYy4AAAAAoA1pdGh85JFHMmnSpKyxxhrNMQ8AAAAA0Aa1a+wTBg0alNdff705ZgEAAAAA2qhGh8ZTTz01Rx99dO6666688cYbmTlzZoMbAAAAAPD50+hDp4cOHZok2XLLLRssdzEYAAAAAPj8anRovPPOO5tjDgAAAACgDWt0aNx0002bYw4AAAAAoA1r9Dkak+Tuu+/ON7/5zWy88cZ58cUXkySXXnppJk6c2KTDAQAAAABtQ6ND49VXX51hw4alS5cuefjhh1NbW5skmTFjRk455ZQmHxAAAAAAaP0aHRpPPvnknHfeebnwwgvTsWPH+uWbbLJJHn744SYdDgAAAABoGxodGp966qkMGTJkvuU9e/bM9OnTm2ImAAAAAKCNaXRo7NevXyZPnjzf8okTJ2bllVdukqEAAAAAgLal0aHxgAMOyKGHHpoHHngglUolL730UsaNG5ejjjoq3//+95tjRgAAAACglevQ2Cccc8wxqaury5Zbbpk5c+ZkyJAhqampyVFHHZVDDjmkOWYEAAAAAFq5RoXGuXPn5p577slBBx2UH/3oR5k8eXJmzZqVQYMGpVu3bs01IwAAAADQyjUqNLZv3z5bb711/vGPf6RXr14ZNGhQc80FAAAAALQhjT5H49prr51///vfzTELAAAAANBGNTo0nnzyyTnqqKNyww035OWXX87MmTMb3AAAAACAz59GXwxm2223TZLsuOOOqVQq9cur1WoqlUrmzp3bdNMBAAAAAG1Co0PjnXfe2RxzAAAAAABtWKND46abbtoccwAAAAAAbVijz9GYJHfffXe++c1vZuONN86LL76YJLn00kszceLEJh0OAAAAAGgbGh0ar7766gwbNixdunTJww8/nNra2iTJjBkzcsoppzT5gAAAAABA6/eJrjp93nnn5cILL0zHjh3rl2+yySZ5+OGHm3Q4AAAAAKBtaHRofOqppzJkyJD5lvfs2TPTp09vipkAAAAAgDam0aGxX79+mTx58nzLJ06cmJVXXrlJhgIAAAAA2pZGh8YDDjgghx56aB544IFUKpW89NJLGTduXI466qh8//vfb44ZAQAAAIBWrkNjn3DMMcekrq4uW265ZebMmZMhQ4akpqYmRx11VA455JDmmBEAAAAAaOUWKzQ+9thjWXvttdOuXbtUKpUce+yx
+dGPfpTJkydn1qxZGTRoULp169bcswIAAAAArdRiHTq9/vrr5/XXX0+SrLzyynnjjTfSqVOnDBo0KF/+8pdFRgAAAAD4nFus0NirV688++yzSZIpU6akrq6uWYcCAAAAANqWxTp0erfddsumm26a/v37p1KpZMMNN0z79u0XuO6///3vJh0QAAAAAGj9Fis0XnDBBdl1110zefLk/PCHP8wBBxyQ7t27N/dsAAAAAEAbsdhXnd5mm22SJJMmTcqhhx4qNAIAAAAA9RY7NM4zduzY5pgDAAAAAGjDGh0aZ8+enZ///Oe5/fbbM23atPkuDOMcjQAAAADw+dPo0Lj//vtnwoQJ+da3vlV/cRgAAAAA4POt0aHxxhtvzJ///OdssskmzTEPAAAAANAGtWvsE5Zccsn07t27OWYBAAAAANqoRofGk046Kccdd1zmzJnTHPMAAAAAAG1Qow+dPuOMM/LMM8+kb9++GThwYDp27Njg8YcffrjJhgMAAAAA2oZGh8add965GcYAAAAAANqyRofG448/vjnmAAAAAADasEafoxEAAAAA4KMWe4/GJZdcMpVK5WPXe/PNN4sGAgAAAADansUOjWPGjGnGMQAAAACAtmyxQ+PIkSObcw4AAAAAoA1zjkYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIot9lWn5zniiCMWuLxSqaRz585ZddVVs9NOO6V3797FwwEAAAAAbUOjQ+Pf/va3PPzww5k7d25WX331JMm//vWvtG/fPmussUbOOeecHHnkkZk4cWIGDRrU5AMDAAAAAK1Pow+d3mmnnTJ06NC89NJLmTRpUiZNmpQXXnghW221Vfbaa6+8+OKLGTJkSA4//PDmmBcAAAAAaIUaHRpPP/30nHTSSenRo0f9sp49e+aEE07Iaaedlq5du+a4447LpEmTmnRQAAAAAKD1anRonDFjRqZNmzbf8tdeey0zZ85MkvTq1Svvvfde+XQAAAAAQJvwiQ6d3nfffXPttdfmhRdeyAsvvJBrr702++23X3beeeckyYMPPpgvfOELTT0rAAAAANBKNfpiMOeff34OP/zwjBgxIh988MF/XqRDh4wcOTJnnnlmkmSNNdbIRRdd1LSTAgAAAACtVqNDY7du3XLhhRfmzDPPzL///e8kycorr5xu3brVr/PFL36xyQYEAAAAAFq/Rh86/fvf/z5z5sxJt27dsu6662bddddtEBkBAAAAgM+fRofGww8/PMsss0z23nvv/N///V/mzp3bHHMBAAAAAG1Io0Pjyy+/nPHjx6dSqWSPPfZI//79c9BBB+Xee+9tjvkAAAAAgDag0aGxQ4cO2X777TNu3LhMmzYtZ555ZqZMmZLNN988q6yySnPMCAAAAAC0co2+GMyHde3aNcOGDctbb72V5557Lv/4xz+aai4AAAAAoA1p9B6NSTJnzpyMGzcu2267bZZbbrmMGTMmu+yyS5588smmng8AAAAAaAMavUfjiBEjcsMNN6Rr167ZY4898pOf/CSDBw9ujtkAAAAAgDai0aGxffv2ueKKKzJs2LC0b9++wWNPPPFE1l577SYbDgAAAABoGxodGseNG9fg/ttvv50//OEPueiiizJp0qTMnTu3yYYDAAAAANqGT3SOxiT5y1/+kpEjR6Z///75xS9+kS222CL3339/U84GAAAAALQRjdqj8ZVXXsnFF1+c3/72t5k5c2b22GOP1NbW5rrrrsugQYOaa0YAAAAAoJVb7D0ad9hhh6y++up57LHHMmbMmLz00ks566yzmnM2AAAAAKCNWOw9Gm+88cb88Ic/zPe///2sttpqzTkTAAAAANDGLPYejRMnTszbb7+dDTbYIBtttFF+85vf5PXXX2/O2QAAAACANmKxQ+NXvvKVXHjhhXn55Zfz3e9+N+PHj8+yyy6burq63HrrrXn77bebc04AAAAAoBVr9FWnl1hiiey7776ZOHFiHn/88Rx55JH5+c9/nmWWWSY77rhjc8wIAAAAALRyjQ6NH7b66qvntNNOywsvvJA//OEPTTUTAAAAANDGFIXGedq3b5+dd945119/fVO8HAAAAADQxjRJaAQAAAAAPt+ERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAinVo6QEAAADgs6hdp85Z8b9vaOkxAD419mgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKtWho/Mtf/pIddtghyy67bCqVSq677roGj1er1Rx33HHp379/unTpkqFDh+bpp59umWEBAAAAgIVq0dA4e/bsrLfeejn77LMX+Phpp52WX//61znvvPPywAMPZIkllsiwYcPy7rvvfsqTAgAAAACL0qElNz58+PAMHz58gY9Vq9WMGTMmP/7xj7PTTjslSX73u9+lb9++ue666zJixIhPc1QAAAAAYBFa7Tkan3322bzyyisZOnRo/bKePXtmo402yn333bfQ59XW1mbmzJkNbgAAAABA82q1ofGVV15JkvTt27fB8r59+9Y/tiCjR49Oz549628DBgxo1jkBAAAAgFYcGj+pUaNGZcaMGfW3qVOntvRIAAAAAPCZ12pDY79+/ZIkr776aoPlr776av1jC1JTU5MePXo0uAEAAAAAzavVhsaVVlop/fr1y+23316/bObMmXnggQcyePDgFpwMAAAAAPioFr3q9KxZszJ58uT6+88++2weeeSR9O7dOyussEIOO+ywnHzyyVlttdWy0kor5Sc/+UmWXXbZ7Lzzzi03NAAAAAAwnxYNjQ899FA233zz+vtHHHFEkmTkyJG5+OKLc/TRR2f27Nk58MADM3369Hz1q1/NTTfdlM6dO7fUyAAAAADAArRoaNxss81SrVYX+nilUslPf/rT/PSnP/0UpwIAAAAAGqvVnqMRAAAAAGg7hEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaE
RAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAECxVh0aTzjhhFQqlQa3NdZYo6XHAgAAAAA+okNLD/Bx1lprrdx222319zt0aPUjAwAAAMDnTquvdh06dEi/fv1aegwAAAAAYBFa9aHTSfL0009n2WWXzcorr5xvfOMbef755xe5fm1tbWbOnNngBgAAAAA0r1YdGjfaaKNcfPHFuemmm3Luuefm2Wefzde+9rW8/fbbC33O6NGj07Nnz/rbgAEDPsWJAQAAAODzqVWHxuHDh+frX/961l133QwbNiz/93//l+nTp+eKK65Y6HNGjRqVGTNm1N+mTp36KU4MAAAAAJ9Prf4cjR/Wq1evfOELX8jkyZMXuk5NTU1qamo+xakAAAAAgFa9R+NHzZo1K88880z69+/f0qMAAAAAAB/SqkPjUUcdlQkTJmTKlCm59957s8suu6R9+/bZa6+9Wno0AAAAAOBDWvWh0y+88EL22muvvPHGG+nTp0+++tWv5v7770+fPn1aejQAAAAA4ENadWgcP358S48AAAAAACyGVn3oNAAAAADQNgiNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYkIjAAAAAFBMaAQAAAAAigmNAAAAAEAxoREAAAAAKCY0AgAAAADFhEYAAAAAoJjQCAAAAAAUExoBAAAAgGJCIwAAAABQTGgEAAAAAIoJjQAAAABAMaERAAAAACgmNAIAAAAAxYRGAAAAAKCY0AgAAAAAFBMaAQAAAIBiQiMAAAAAUExoBAAAAACKCY0AAAAAQDGhEQAAAAAoJjQCAAAAAMWERgAAAACgmNAIAAAAABQTGgEAAACAYm0iNJ599tkZOHBgOnfunI022igPPvhgS48EAAAAAHxIqw+Nl19+eY444ogcf/zxefjhh7Peeutl2LBhmTZtWkuPBgAAAAD8P60+NP7yl7/MAQcckO985zsZNGhQzjvvvHTt2jX/+7//29KjAQAAAAD/T4eWHmBR3nvvvUyaNCmjRo2qX9auXbsMHTo099133wKfU1tbm9ra2v
r7M2bMSJLMnDmzeYdtQXW1c1p6BACgCXyW/15h4fwtBwCfDZ/lv+XmvbdqtbrI9Vp1aHz99dczd+7c9O3bt8Hyvn375p///OcCnzN69OiceOKJ8y0fMGBAs8wIANBUeo5p6QkAAPikPg9/y7399tvp2bPnQh9v1aHxkxg1alSOOOKI+vt1dXV58803s9RSS6VSqbTgZACf3MyZMzNgwIBMnTo1PXr0aOlxAABYTP6OAz4LqtVq3n777Sy77LKLXK9Vh8all1467du3z6uvvtpg+auvvpp+/fot8Dk1NTWpqalpsKxXr17NNSLAp6pHjx7+QAUAaIP8HQe0dYvak3GeVn0xmE6dOmWDDTbI7bffXr+srq4ut99+ewYPHtyCkwEAAAAAH9aq92hMkiOOOCIjR47MhhtumC9/+csZM2ZMZs+ene985zstPRoAAAAA8P+0+tC455575rXXXstxxx2XV155JV/84hdz0003zXeBGIDPspqamhx//PHznRoCAIDWzd9xwOdJpfpx16UGAAAAAPgYrfocjQAAAABA2yA0AgAAAADFhEYAAAAAoJjQCNDGXHzxxenVq1dLjwEAAAANCI3A594+++yTSqWSSqWSTp06ZdVVV81Pf/rTfPDBBx/73Isvvrj+uQu7TZkypfnfxGK6+uqrs9lmm6Vnz57p1q1b1l133fz0pz/Nm2++maTh+2nXrl2WX375fOc738m0adOSJFOmTEmlUskjjzwy32tvttlmOeywwz7FdwMAsHBTp07Nvvvum2WXXTadOnXKiiuumEMPPTRvvPFGg/U222yzVCqVjB8/vsHyMWPGZODAgfX35/2dtM022zRYb/r06alUKrnrrruSJI8++mg6deqU66+/vsF6V199dTp37pwnnnhikXP7ew1oy4RGgCTbbLNNXn755Tz99NM58sgjc8IJJ+T000//2Oftueeeefnll+tvgwcPzgEHHNBg2YABAxZ7jvfee6/kbSzSsccemz333DNf+tKXcuONN+aJJ57IGWeckUcffTSXXnpp/Xo9evTIyy+/nBdeeCEXXnhhbrzxxnzrW99qtrkAAJrav//972y44YZ5+umn84c//CGTJ0/Oeeedl9tvvz2DBw+uj3bzdO7cOT/+8Y/z/vvvL/J1O3TokNtuuy133nnnQtdZb731ctxxx+XAAw+sj5rTpk3L9773vZx44olZe+21F/pcf68BbZ3QCJCkpqYm/fr1y4orrpjvf//7GTp0aK6//vrMnj07PXr0yFVXXdVg/euuuy5LLLFEPvjgg/Tr16/+1qlTp3Tt2rX+/nvvvZddd9013bp1S48ePbLHHnvk1VdfrX+dE044IV/84hdz0UUXZaWVVkrnzp2T/Of/jH/3u99N375907lz56y99tq54YYbGsxw8803Z80110y3bt3qQ+nCPPjggznllFNyxhln5PTTT8/GG2+cgQMHZquttsrVV1+dkSNH1q9bqVTSr1+/LLvsshk+fHh++MMf5rbbbss777zTFB81AECzO+igg9KpU6fccsst2XTTTbPCCitk+PDhue222/Liiy/m2GOPbbD+XnvtlenTp+fCCy9c5OsuscQS2XfffXPMMccscr1Ro0ZlhRVWyEEHHZQk+e53v5vVVlstRx111EKf4+814LNAaARYgC5duuS9997LEksskREjRmTs2LENHh87dmx23333dO/efaGvUVdXl5122ilvvvlmJkyYkFtvvTX//ve/s+eeezZYb/Lkybn66v+vvTsLifJt4zj+G61wy60kK0sYzVLJqDTPFEMTqSOlJJWiFAMpUkNDPEjFSkOCJKwgyj08SBCiHNQioyRbEAyFbDOjIqNM1MpI34NoyNTRsqh/7/dzNHPfz73MHF1c9/KcV21trdra2jQyMqKoqChdv35dlZWV6ujoUEFBgaytrc1thoaGVFRUpIqKCjU3N+vp06cWA9eqqio5ODgoJSVlwnpLdz7a2tpqZGRkWkfJAQAA/rQ3b97IZDIpJSVFtra2Y+rc3d0VHx+vmpoajY6OmssdHR2VnZ2tvLw8DQ4OWuw/JydH7e3t4xaiv2Vtba2ysjLV1dUpLi5OJpNJpaWlY+K57xGvAfgXkGgEgG+Mjo6qsbFRJpNJ69evlyQlJSXJZDKZdwy+evVKFy9e1M6dOy321dTUpPb2dlVXV2vt2rUKDg5WeXm5rl69qlu3bpmfGx4eVnl5uVavXq2AgAA1NjaqtbVVtbW1ioiIkNFo1KZNmxQVFWVu8+nTJ508eVKBgYFas2aNdu/eraampknn0tXVJaPRqNmzZ//Q/9HV1WUex1JSFQAA4G/R1dWl0dFR+fr6Tljv6+urt2/fqre3d0x5SkqKbGxsdPToUYv9L1q0SHv37lV2drbFxJ6vr69SU1N17tw55eTkyMfHZ8p5E68B+K8j0QgAki5cuCAHBwfZ2NgoKipKsbGxysnJkSStW7dO/v7+KisrkyRVVlbK09NTISEhFvvs7OzUkiVLxtzR6OfnJ2dnZ3V2dprLPD095ebmZv7e1tYmDw8Pi8GonZ2dvLy8zN8XLlxovgB8It+u2E/l3bt3cnBwkJ2dnZYvX64FCxaoqqpq2u0BAAD+Bj8S/0hfrtLJy8tTUVGRXr9+bfHZ/fv3q7e3V2fOnJn0mYGBAdXU1MjOzk7Xrl37pfMlXgPwtyLRCACSwsLC1NbWpq6uLr1//15lZWWyt7c31yclJam0tFTSl2PTO3bskMFg+CVjfzuOpHFHfCby/Uq3wWCwGJz6+Pjo0aNHU15wLklz585VW1ub7t27p8HBQTU3N5uTno6OjpK+BLff6+vrk5OT05T9AwAA/E7e3t4yGAxjFna/1dnZKRcXlzELvV8lJCTI09NT+fn5FsdwdnZWVlaWcnNzNTQ0NOEzGRkZsrGx0Y0bN9TY2Kjy8nKLfRKvAfgXkGgEAH1J9nl7e2vp0qWaNWvWuPqEhAR1d3eruLhYHR0dYy7jnoyvr696enrU09NjLuvo6FBfX5/8/PwmbRcQEKBnz57p/v37P/djJhAXF6eBgQGVlJRMWN/X12f+bGVlJW9vbxmNxnFJT1dXV82fP1937twZU97f368HDx5MeSQIAADgd5s3b54iIiJUUlIy7uUoL1++VFVVlWJjYydcNLaystLhw4d14sQJPXnyxOI4e/bskZWVlY4dOzaurqGhQadPn1ZZWZlWrVql/Px8paamWnx5H/EagH8BiUYAmAYXFxdFR0crIyNDGzZskIeHx5RtwsPDtXLlSsXHx+vu3btqbW3Vtm3bFBoaqsDAwEnbhYaGKiQkRDExMWpoaNDjx4916dIl1dfX//T8g4ODlZmZqX379ikzM1MtLS3q7u5WU1OTNm/ebD4WPh3p6ek6dOiQqqqq9PDhQ7W2tio+Pl5ubm6Kjo7+6TkCAAD8KsePH9fHjx8VGRmp5uZm9fT0qL6+XhEREVq8eLEOHjw4aduNGzcqODhYp06dsjiGjY2NcnNzVVxcPKa8v79fiYmJysjIUFBQkCQpLS1Nfn5+Sk5OnrQ/4
jUA/wISjQAwTYmJiRoeHp7yJTBfGQwG1dXVycXFRSEhIQoPD5fRaFRNTc2Ubc+fP6+goCBt3bpVfn5+yszM1OfPn2c0/8LCQlVXV+vmzZuKjIyUv7+/0tPTFRAQMK0dml9lZmbqwIEDKiwsVEBAgGJiYmRvb68rV65M69g3AADA77Zs2TLdvn1bRqNRW7ZskZeXl5KTkxUWFqaWlha5urpabF9YWKgPHz5MOc727dtlNBrHlKWmpsrJycl837f0ZQfi2bNndfnyZYtHqInXAPzXGUZ/9IZcAPg/VVFRobS0ND1//lxz5sz509MBAAAAAOCvMv4iMgDAGENDQ3rx4oUKCgq0a9cukowAAAAAAEyAo9MAMIUjR45oxYoVcnd3V1ZW1p+eDgAAAAAAfyWOTgMAAAAAAACYMXY0AgAAAAAAAJgxEo0AAAAAAAAAZoxEIwAAAAAAAIAZI9EIAAAAAAAAYMZINAIAAAAAAACYMRKNAAAAAAAAAGaMRCMAAAAAAACAGSPRCAAAAAAAAGDGSDQCAAAAAAAAmLH/AdVxUJyGIWHGAAAAAElFTkSuQmCC\n",
"text/plain": [
"<Figure size 1600x1200 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"%matplotlib inline\n",
"\n",
"import matplotlib\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"import os\n",
"\n",
"\n",
"# Compute average inference time + std\n",
"time_results = {k: np.mean(v.model_inference_time) * 1e3 for k, v in results.items()}\n",
"time_results_std = np.std([v.model_inference_time for v in results.values()]) * 1000\n",
"\n",
"plt.rcdefaults()\n",
"fig, ax = plt.subplots(figsize=(16, 12))\n",
"ax.set_ylabel(\"Avg Inference time (ms)\")\n",
"ax.set_title(\"Average inference time (ms) for each provider\")\n",
"ax.bar(time_results.keys(), time_results.values(), yerr=time_results_std)\n",
"plt.show()"
]
},
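{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you prefer the raw numbers to the chart, a quick printout over the same `results` dictionary (a small sketch reusing the variables defined in the cell above) gives the mean and standard deviation per provider:\n",
"\n",
"```python\n",
"for name, result in results.items():\n",
"    times_ms = np.array(result.model_inference_time) * 1e3\n",
"    print(f\"{name:40s} {times_ms.mean():8.2f} ms (+/- {times_ms.std():.2f} ms)\")\n",
"```"
]
},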
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Quantization support from transformers\n",
"\n",
"Quantization enables the use of integers (_instead of floatting point_) arithmetic to run neural networks models faster. From a high-level point of view, quantization works as mapping the float32 ranges of values as int8 with the less loss in the performances of the model.\n",
"\n",
"Hugging Face provides a conversion tool as part of the transformers repository to easily export quantized models to ONNX Runtime. For more information, please refer to the following: \n",
"\n",
"- [Hugging Face Documentation on ONNX Runtime quantization supports](https://huggingface.co/transformers/master/serialization.html#quantization)\n",
"- [Intel's Explanation of Quantization](https://nervanasystems.github.io/distiller/quantization.html)\n",
"\n",
"With this method, the accuracy of the model remains at the same level than the full-precision model. If you want to see benchmarks on model performances, we recommand reading the [ONNX Runtime notebook](https://github.com/microsoft/onnxruntime/blob/master/onnxruntime/python/tools/quantization/notebooks/Bert-GLUE_OnnxRuntime_quantization.ipynb) on the subject."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Benchmarking PyTorch quantized model"
]
},
{
"cell_type": "code",
"execution_count": 32,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"100%|██████████| 100/100 [00:01<00:00, 90.15it/s]\n"
]
}
],
"source": [
"import torch \n",
"\n",
"# Quantize\n",
"model_pt_quantized = torch.quantization.quantize_dynamic(\n",
" model_pt.to(\"cpu\"), {torch.nn.Linear}, dtype=torch.qint8\n",
")\n",
"\n",
"# Warm up \n",
"model_pt_quantized(**model_inputs)\n",
"\n",
"# Benchmark PyTorch quantized model\n",
"time_buffer = []\n",
"for _ in trange(100):\n",
" with track_infer_time(time_buffer):\n",
" model_pt_quantized(**model_inputs)\n",
" \n",
"results[\"PyTorch CPU Quantized\"] = OnnxInferenceResult(\n",
" time_buffer,\n",
" None\n",
")"
]
},
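{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you want to check which layers were actually affected, you can list the module classes of `model_pt_quantized` (a small sketch reusing the object created above); the `nn.Linear` layers should now come from the dynamically-quantized namespace:\n",
"\n",
"```python\n",
"from collections import Counter\n",
"\n",
"# Count module classes together with their defining package to make the\n",
"# nn.Linear -> dynamically-quantized Linear swap visible\n",
"layer_types = Counter(\n",
"    f\"{type(m).__module__}.{type(m).__name__}\" for m in model_pt_quantized.modules()\n",
")\n",
"for name, count in layer_types.most_common(10):\n",
"    print(f\"{count:4d} x {name}\")\n",
"```"
]
},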
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Benchmarking ONNX quantized model"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"As of onnxruntime 1.4.0, models larger than 2GB will fail to quantize due to protobuf constraint.\n",
"This limitation will be removed in the next release of onnxruntime.\n",
"Quantized model has been written at bert.onnx: ✔\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Tracking inference time on CPUExecutionProvider with quantized model: 100%|██████████| 100/100 [00:00<00:00, 237.49it/s]\n"
]
}
],
"source": [
"from transformers.convert_graph_to_onnx import quantize\n",
"\n",
"# Transformers allow you to easily convert float32 model to quantized int8 with ONNX Runtime\n",
"quantized_model_path = quantize(Path(\"bert.opt.onnx\"))\n",
"\n",
"# Then you just have to load through ONNX runtime as you would normally do\n",
"quantized_model = create_model_for_provider(quantized_model_path.as_posix(), \"CPUExecutionProvider\")\n",
"\n",
"# Warm up the overall model to have a fair comparaison\n",
"outputs = quantized_model.run(None, inputs_onnx)\n",
"\n",
"# Evaluate performances\n",
"time_buffer = []\n",
"for _ in trange(100, desc=f\"Tracking inference time on CPUExecutionProvider with quantized model\"):\n",
" with track_infer_time(time_buffer):\n",
" outputs = quantized_model.run(None, inputs_onnx)\n",
"\n",
"# Store the result\n",
"results[\"ONNX CPU Quantized\"] = OnnxInferenceResult(\n",
" time_buffer, \n",
" quantized_model_path\n",
") "
]
},
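{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check of the claim that quantization barely affects accuracy, you can compare the quantized session's outputs against the full-precision ONNX session on the same inputs. The snippet below is a sketch: `cpu_model` stands for the full-precision CPU session created earlier in this notebook, so adjust the name to whatever you used:\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"# cpu_model is assumed to be the full-precision CPUExecutionProvider session\n",
"ref_outputs = cpu_model.run(None, inputs_onnx)\n",
"quant_outputs = quantized_model.run(None, inputs_onnx)\n",
"\n",
"max_abs_diff = np.max(np.abs(ref_outputs[0] - quant_outputs[0]))\n",
"print(f\"Max absolute difference between float32 and int8 outputs: {max_abs_diff:.4f}\")\n",
"```"
]
},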
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Show the inference performance of each providers "
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 676
},
"colab_type": "code",
"id": "dj-rS8AcqRZQ",
"outputId": "b4bf07d1-a7b4-4eff-e6bd-d5d424fd17fb"
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAABRoAAAPeCAYAAABjjKazAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/d3fzzAAAACXBIWXMAAA9hAAAPYQGoP6dpAABnA0lEQVR4nOzdd3iV9fn48fuEEWaCDBmCshwgai22igNQUcSJooKruFvrVmqlDkSpOKrSWhVHi6MouKBoiwsRxVmx7lZBQVFQFIVA0KDk/P7ol/yMAUz4JCbR1+u6znVxnvOc89w5SY7y5hmZbDabDQAAAACABDnVPQAAAAAAUPsJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AQK3SsWPHOProo9f7+VdeeWV07tw56tSpEz/5yU8qba4fkieeeCIymUw88cQT1T1KKcuXL48NN9wwxo8f/71u99xzz43tt9++Qs/5of+crf4Zuffee6t7lEpz6623RiaTiXnz5n3nuqmfQwDwQyU0AkANdP3110cmk6lw3GDdHnnkkTjnnHNip512inHjxsWll15a3SNVq+uvvz5uvfXW6h6j3P74xz9G06ZNY8iQId/rds8444x45ZVXYsqUKeVa388ZAPBjVbe6BwAAyho/fnx07NgxXnjhhZgzZ0507dq1ukeqMd56663IyVm/fyt9/PHHIycnJ/7yl79E/fr1K3my2uf666+Pli1bltkzq3fv3vHFF1/UqPfoq6++ij/+8Y9x5plnRp06db7Xbbdp0yYOOOCA+MMf/hD777//d67v56x2Ouqoo2LIkCGRm5tb3aMAQK1lj0YAqGHmzp0bzzzzTFx99dXRqlWr7/0w0YiI4uLi+PLLL7/37ZZHbm5u1KtXb72eu2jRomjYsGGlxp8VK1ZU2mvVFDk5OdGgQYP1DrpV4cEHH4xPPvkkDj300GrZ/qGHHhozZ86Md9999zvXreyfs2w2G1988UWlvNYPQVV9PtWpUycaNGgQmUym0l97Xb7++utYuXLl97pNAKgqNef/HgGAiPjf3owbbLBB7LPPPnHwwQeXCo1fffVVNG/ePI455pgyzysoKIgGDRrEsGHDSpYVFRXFiBEjomvXrpGbmxsdOnSIc845J4qKiko9N5PJxCmnnBLjx4+PLbfcMnJzc+Ohhx6KiIg//OEPseOOO0aLFi2iYcOG0bNnzzWel+2LL76I0047LVq2bBlNmzaN/fffPz788MPIZDJx0UUXlVr3ww8/jGOPPTZat24dubm5seWWW8Zf//rXcr0/3z432urzqj399NNx1llnRatWraJx48Zx4IEHxieffFLqaxw3blwUFhZGJpOJTCZT6rDhv/3tb9GzZ89o2LBhNG/ePIYMGRLz588vte2+fftGjx49YtasWdG7d+9o1KhR/O53v1uv93ry5MnRo0ePkq9/9fv97ffpuOOOi3bt2kVubm506tQpTjrppFJRYsmSJXHGGWdEhw4dIjc3N7p27RqXX355FBcXf+f7+MYbb8SMGTNK3o++fftGxJrP0bj6a3/11VejT58+0ahRo+jatWvJz8KMGTNi++23j4YNG8bmm28ejz322Bq/nvX9vk+ePDk6duwYXbp0KbX86KOPjiZNmsT7778f++67bzRp0iQ22mijuO666yIi4rXXXovddtstGjduHJtssknceeedpZ7/1VdfxciRI2PTTTeNBg0aRIsWLWLnnXeORx99tNR6/fr1i4iIv//97+ucc10/Z19//XVccskl0aVLl8jNzY2OHTvG7373uzI/Ix07dox99903Hn744dhuu+2iYcOGceONN65zu88//3zstddekZ+fH40aNYo+ffrE008/XWqd9957L37961/H5ptvHg0bNowWLVrEIYccssZzEi5ZsiTOPPPM6NixY+Tm5kb79u3jF7/4RXz66ael1isuLo7f//730b59+2jQoEHsvvvuMWfOnHXOGhFx0UUXRSaTif/+979x6KGHRl5eXrRo0SJOP/30MhFxXZ9P//73v2PAgAGRl5cXTZo0id133z2ee+65kue++OKLkclk4rbbbiszw8MPPxyZTCYefPDBiFjzORqz2WyMGjUq2rdvH40aNYpdd9013njjjTV+TeX5XZw3b15kMpn4wx/+EGPGjCn5WXjzzTe/8z0DgNrAodMAUMOMHz8+DjrooKhfv34cdthhccMNN8S//vWv+NnPfhb16tWLAw88MO6///648cYbS+0xNXny5CgqKio5f11xcXHsv//+MXPmzDjxxBOjW7du8dprr8U111wTb7/9dkyePLnUdh9//PG4++6745RTTomWLVtGx44dI+J/58Xbf//944gjjoiVK1fGhAkT4pBDDokHH3ww9tlnn5LnH3300XH33XfHUUcdFTvssEPMmDGj1OOrffzxx7HDDjuUxINWrVrF1KlT47jjjouCgoI444wz1ut9O/XUU2ODDTaIESNGxLx582LMmDFxyimnxMSJEyMi4o477oibbropXnjhhbjlllsiImLHHXeMiIjf//73ccEFF8Shhx4axx9/fHzyySdx7bXXRu/evePf//53NGvWrGQ7ixcvjgEDBsSQIUPiyCOPjNatW1f4vZ45c2bcf//98etf/zqaNm0af/rTn2LQoEHx/vvvR4sWLSIiYsGCBfHzn/88lixZEieeeGJsscUW8eGHH8a9994bK1asiPr168eKFSuiT58+8eGHH8Yvf/nL2HjjjeOZZ56J4cOHx8KFC2PMmDFrfb/GjBkTp556ajRp0iTOO++8iIho3br1Ot/jzz//PPbdd98YMmRIHHLIIXHDDTfEkCFDYvz48XHGGWfEr371qzj88MPjyiuvjIMPPjjmz58fTZs2jYj07/szzzwTP/3pT9f42KpVq2LAgAHRu3fvuOKKK2L8+PFxyimnROPGjeO8886LI444Ig466KAYO3Zs/OIXv4hevXpFp06dIuJ/wWv06NFx/PHHx89//vMoKCiIF198MV566aXYY489SraRn58fXbp0iaeffjrOPPPMtc65rp+z448/Pm677bY4+OCD4+yzz47nn38+Ro8eHf/5z39i0qRJpV7nrbfeisMOOyx++ctfxgknnBCbb775Wrf5+OOPx4ABA6Jnz54xYsSIyMnJiXHjxsVuu+0WTz31VPz85z+PiIh//etf8cwzz8SQIUOiffv2MW/evLjhhhuib9++8eabb0ajRo0i4n8X3dlll13iP//5Txx77LHx05/+ND799NOYMmVKfPDBB9GyZcuSbV922WWRk5MTw4YNi6VLl8YVV1wRRxxxRDz//PNrnfebDj300OjYsWOMHj06nnvuufjTn/4Un3/+edx+++1lvsZvfz698cYbscsuu0ReXl6cc845Ua9evbjxxhujb9++JeF7u+22i86dO8fdd98dQ4cOLfWaEydOjA022CD69++/1vkuvPDCGDVqVOy9996x9957x0svv
RR77rlnmT0QK/q7OG7cuPjyyy/jxBNPjNzc3GjevHm53i8AqPGyAECN8eKLL2YjIvvoo49ms9lstri4ONu+ffvs6aefXrLOww8/nI2I7AMPPFDquXvvvXe2c+fOJffvuOOObE5OTvapp54qtd7YsWOzEZF9+umnS5ZFRDYnJyf7xhtvlJlpxYoVpe6vXLky26NHj+xuu+1WsmzWrFnZiMieccYZpdY9+uijsxGRHTFiRMmy4447Ltu2bdvsp59+WmrdIUOGZPPz88ts79s22WST7NChQ0vujxs3LhsR2X79+mWLi4tLlp955pnZOnXqZJcsWVKybOjQodnGjRuXer158+Zl69Spk/39739favlrr72WrVu3bqnlffr0yUZEduzYsaXWreh7Xb9+/eycOXNKlr3yyivZiMhee+21Jct+8YtfZHNycrL/+te/yrwHq7/OSy65JNu4cePs22+/Xerxc889N1unTp3s+++/X+a537Tllltm+/TpU2b59OnTsxGRnT59epmv/c477yxZ9t///rfkZ+e5554rWb76Z3TcuHEly1K+71999VU2k8lkzz777DKPDR06NBsR2UsvvbRk2eeff55t2LBhNpPJZCdMmFBm3m/+PG6zzTbZffbZZ63b/qY999wz261bt+9cb00/Zy+//HI2IrLHH398qeXDhg3LRkT28ccfL1m2ySabZCMi+9BDD33ntoqLi7Obbrpptn///qV+/lesWJHt1KlTdo899ii17NueffbZbERkb7/99pJlF154YTYisvfff/8at5fN/v+fkW7dumWLiopKHv/jH/+YjYjsa6+9ts65R4wYkY2I7P77719q+a9//etsRGRfeeWVkmVr+3waOHBgtn79+tl33nmnZNmCBQuyTZs2zfbu3btk2fDhw7P16tXLfvbZZyXLioqKss2aNcsee+yxJctWf5bMnTs3m81ms4sWLcrWr18/u88++5R6b3/3u99lI6LU51B5fxfnzp2bjYhsXl5edtGiRet8jwCgNnLoNADUIOPHj4/WrVvHrrvuGhH/O2Rw8ODBMWHChFi1alVEROy2227RsmXLkj31Iv63p9mjjz4agwcPLll2zz33RLdu3WKLLbaITz/9tOS22267RUTE9OnTS227T58+0b179zIzNWzYsNR2li5dGrvssku89NJLJctXH8b461//utRzTz311FL3s9ls3HfffbHffvtFNpstNVf//v1j6dKlpV63Ik488cRS51bbZZddYtWqVfHee++t83n3339/FBcXx6GHHlpqnjZt2sSmm25a5n3Kzc0tc+h6Rd/rfv36lToEeOutt468vLyS8/8VFxfH5MmTY7/99ovtttuuzMyrv8577rkndtlll9hggw1Kbbdfv36xatWqePLJJ7/rbauQJk2alLri8+abbx7NmjWLbt26lbpC+uo/r/56Ur/vn332WWSz2dhggw3Wus7xxx9f8udmzZrF5ptvHo0bNy51TsfV837zPIvNmjWLN954I2bPnv2dX//q93l9/POf/4yIiLPOOqvU8rPPPjsiIv7xj3+UWt6pU6d17mm32ssvvxyzZ8+Oww8/PBYvXlzyvhYWFsbuu+8eTz75ZMmhu9/8Xf7qq69i8eLF0bVr12jWrFmp9/++++6LbbbZJg488MAy2/v2+QuPOeaYUntW77LLLhER5TqXZUTEySefXOr+6s+M1e/Xat/+fFq1alU88sgjMXDgwOjcuXPJ8rZt28bhhx8eM2fOjIKCgoiIGDx4cHz11Vdx//33l6z3yCOPxJIlS0p9Zn7bY489FitXroxTTz211Ne9pr1vK/q7OGjQoGjVqtVatw0AtZVDpwGghli1alVMmDAhdt1115g7d27J8u233z6uuuqqmDZtWuy5555Rt27dGDRoUNx5551RVFQUubm5cf/998dXX31V6i/Ns2fPjv/85z9r/cvsokWLSt1ffSjptz344IMxatSoePnll0udS+6bf/F+7733Iicnp8xrfPtq2Z988kksWbIkbrrpprjpppvKNVd5bbzxxqXur45Sn3/++TqfN3v27Mhms7Hpppuu8fFvX3hmo402KnORj4q+19+edfW8q2f95JNPoqCgIHr06PGds7/66qvl3m6q9u3blwlN+fn50aFDhzLLIqLU11MZ3/dsNrvG5Q0aNCjzHuTn56913m/+TFx88cVxwAEHxGabbRY9evSIvfbaK4466qjYeuut17j99b1QyOrfkW//TrRp0yaaNWtWJoiv7ffx21YH0m8fFvxNS5cujQ022CC++OKLGD16dIwbNy4+/PDDUu/n0qVLS/78zjvvxKBBg8q1/fX9vVvt2793Xbp0iZycnDLnjfz2+/HJJ5/EihUr1nhIebdu3aK4uDjmz58fW265ZWyzzTaxxRZbxMSJE+O4446LiP8dNt2yZcuSfwxYk9Xfk2/P2KpVqzLRu6K/i+X9/gJAbSM0AkAN8fjjj8fChQtjwoQJMWHChDKPjx8/Pvbcc8+IiBgyZEjceOONMXXq1Bg4cGDcfffdscUWW8Q222xTsn5xcXFstdVWcfXVV69xe9+OQ9/c22m1p556Kvbff//o3bt3XH/99dG2bduoV69ejBs3rsxFNcpj9Z5VRx555FrDyJoCT3nUqVNnjcvXFqe+OVMmk4mpU6eu8TWaNGlS6v6a3qeKvtfrO+uatrvHHnvEOeecs8bHN9tsswq93ndZ29zf9fWkft+bN28emUxmrfFqfeeKiOjdu3e888478fe//z0eeeSRuOWWW+Kaa66JsWPHltpLMuJ/8eyb5ydcH+UNlWv6OVuT1e/tlVdeGT/5yU/WuM7qn+FTTz01xo0bF2eccUb06tUr8vPzI5PJxJAhQ77z4kFrU1k/y6ut7f0p7/uxNoMHD47f//738emnn0bTpk1jypQpcdhhh0XdupXz16GK/i6mfj0AUFMJjQBQQ4wfPz423HDDkqvlftP9998fkyZNirFjx0bDhg2jd+/e0bZt25g4cWLsvPPO8fjjj5dc0GO1Ll26xCuvvBK77777eu+Fdd9990WDBg3i4Ycfjtzc3JLl48aNK7XeJptsEsXFxTF37txSe/98++qzrVq1iqZNm8aqVatKruJb3bp06RLZbDY6deq03mGuMt7rb2rVqlXk5eXF66+//p3bXb58+Xq/l5Uxa3mkft/r1q0bXbp0KbWnb2VafSX3Y445JpYvXx69e/eOiy66qExonDt3bqmYXxGrf0dmz54d3bp1K1n+8ccfx5IlS2KTTTZZr9ddfQh+Xl7ed7639957bwwdOjSuuuqqkmVffvllLFmypMxrftfPXmWZPXt2qb375syZE8XFxSUXo1qbVq1aRaNGjeKtt94q89h///vfyMnJKRX4Bw8eHCNHjoz77rsvWrduHQUFBaVOA7Amq78ns2fPLnV49ieffFImeqf+LgLAD4VzNAJADfDFF1/E/fffH/vuu28cfPDBZW6nnHJKLFu2LKZMmRIRETk5OXHwwQfHAw88EHfccUd8/fXXZc41duihh8aHH34YN998
8xq3V1hY+J1z1alTJzKZTMn5ISMi5s2bV+YqyqvPJXf99deXWn7ttdeWeb1BgwbFfffdt8aQ8cknn3znTJXtoIMOijp16sTIkSPL7IWVzWZj8eLF3/kalfFef1NOTk4MHDgwHnjggXjxxRfLPL56zkMPPTSeffbZePjhh8uss2TJkvj666/XuZ3GjRuXiUxVoTK+77169Vrje5Hq29/fJk2aRNeuXUudJiDif4cWv/POOyVXkK6ovffeOyKizNWHV+8Fu6YrtJdHz549o0uXLvGHP/whli9fXubxb763derUKfMzfu2115b6/Y743/kDX3nllTJXwo5Y/z0V1+bb/7Cy+jNjwIAB63xenTp1Ys8994y///3vpQ6z/vjjj+POO++MnXfeOfLy8kqWd+vWLbbaaquYOHFiTJw4Mdq2bRu9e/de5zb69esX9erVi2uvvbbU172mq7mn/i4CwA+FPRoBoAaYMmVKLFu2LPbff/81Pr7DDjtEq1atYvz48SVBcfDgwXHttdfGiBEjYquttiq1l1RExFFHHRV33313/OpXv4rp06fHTjvtFKtWrYr//ve/cffdd8fDDz+8xguNfNM+++wTV199dey1115x+OGHx6JFi+K6666Lrl27xquvvlqyXs+ePWPQoEExZsyYWLx4ceywww4xY8aMePvttyOi9J5zl112WUyfPj223377OOGEE6J79+7x2WefxUsvvRSPPfZYfPbZZ+v1Hq6vLl26xKhRo2L48OExb968GDhwYDRt2jTmzp0bkyZNihNPPDGGDRu2zteojPf62y699NJ45JFHok+fPnHiiSdGt27dYuHChXHPPffEzJkzo1mzZvGb3/wmpkyZEvvuu28cffTR0bNnzygsLIzXXnst7r333pg3b946D/Xt2bNn3HDDDTFq1Kjo2rVrbLjhhus8Z12K1O/7AQccEHfccUe8/fbblXpIePfu3aNv377Rs2fPaN68ebz44otx7733ximnnFJqvcceeyyy2WwccMAB67WdbbbZJoYOHRo33XRTLFmyJPr06RMvvPBC3HbbbTFw4MCSC0BVVE5OTtxyyy0xYMCA2HLLLeOYY46JjTbaKD788MOYPn165OXlxQMPPBAREfvuu2/ccccdkZ+fH927d49nn302HnvssWjRokWp1/zNb34T9957bxxyyCFx7LHHRs+ePeOzzz6LKVOmxNixY9d7r841mTt3buy///6x1157xbPPPht/+9vf4vDDDy/XNkaNGhWPPvpo7LzzzvHrX/866tatGzfeeGMUFRXFFVdcUWb9wYMHx4UXXhgNGjSI4447LnJy1r3PRatWrWLYsGExevTo2HfffWPvvfeOf//73zF16tQyv1epv4sA8EMhNAJADTB+/Pho0KBB7LHHHmt8PCcnJ/bZZ58YP358LF68OFq0aBE77rhjdOjQIebPn7/GK6fm5OTE5MmT45prronbb789Jk2aFI0aNYrOnTvH6aefXq5Ys9tuu8Vf/vKXuOyyy+KMM86ITp06xeWXXx7z5s0rFRojIm6//fZo06ZN3HXXXTFp0qTo169fTJw4MTbffPNo0KBByXqtW7eOF154IS6++OK4//774/rrr48WLVrElltuGZdffnkF37nKce6558Zmm20W11xzTYwcOTIi/ndexT333HOt8febKuO9/raNNtoonn/++bjgggti/PjxUVBQEBtttFEMGDAgGjVqFBERjRo1ihkzZsSll14a99xzT9x+++2Rl5cXm222WYwcObLkoixrc+GFF8Z7770XV1xxRSxbtiz69OlTZaEx9fu+3377RcuWLePuu++O888/v9LmOu2002LKlCnxyCOPRFFRUWyyySYxatSo+M1vflNqvXvuuSd23nnnUlcLr6hbbrklOnfuHLfeemtMmjQp2rRpE8OHD48RI0YkfQ19+/aNZ599Ni655JL485//HMuXL482bdrE9ttvH7/85S9L1vvjH/8YderUifHjx8eXX34ZO+20Uzz22GNlrm7dpEmTeOqpp2LEiBExadKkuO2222LDDTeM3XffPdq3b58067dNnDgxLrzwwjj33HOjbt26ccopp8SVV15ZruduueWW8dRTT8Xw4cNj9OjRUVxcHNtvv3387W9/K3UV9NUGDx4c559/fqxYsWKdV5v+plGjRkWDBg1i7NixJaH8kUceKbMHaurvIgD8UGSylX38AwDA/3n55Zdj2223jb/97W9xxBFHVPc41HKXXHJJjBs3LmbPnr3Wi5BUhY8++ig6deoUEyZMWO89GintoosuipEjR8Ynn3xiTz8A+AFxjkYAoFJ88cUXZZaNGTMmcnJyvvNcaFAeZ555ZixfvnyNV2WvSmPGjImtttpKZAQA+A4OnQYAKsUVV1wRs2bNil133TXq1q0bU6dOjalTp8aJJ55Y6uqvsL6aNGkSixYt+t63e9lll33v2wQAqI2ERgCgUuy4447x6KOPxiWXXBLLly+PjTfeOC666KI477zzqns0AADge+AcjQAAAABAMudoBAAAAACSCY0AAAAAQLIf/Dkai4uLY8GCBdG0adPIZDLVPQ4AAAAA1CrZbDaWLVsW7dq1i5ycte+3+IMPjQsWLHClSwAAAABINH/+/Gjfvv1aH//Bh8amTZtGxP/eiLy8vGqeBgAAAABql4KCgujQoUNJZ1ubH3xoXH24dF5entAIAAAAAOvpu05L6GIwAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExohBqosLAwMplMZDKZKCwsrO5xAAAAAL6T0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgWbWGxhtuuCG23nrryMvLi7y8vOjVq1dMnTq15PEvv/wyTj755GjRokU0adIkBg0aFB9//HE1TgwAAAA
ArEm1hsb27dvHZZddFrNmzYoXX3wxdttttzjggAPijTfeiIiIM888Mx544IG45557YsaMGbFgwYI46KCDqnNkAAAAAGANMtlsNlvdQ3xT8+bN48orr4yDDz44WrVqFXfeeWccfPDBERHx3//+N7p16xbPPvts7LDDDuV6vYKCgsjPz4+lS5dGXl5eVY4OlaawsDCaNGkSERHLly+Pxo0bV/NEAAAAwI9VeftajTlH46pVq2LChAlRWFgYvXr1ilmzZsVXX30V/fr1K1lniy22iI033jieffbZapwUAAAAAPi2utU9wGuvvRa9evWKL7/8Mpo0aRKTJk2K7t27x8svvxz169ePZs2alVq/devW8dFHH6319YqKiqKoqKjkfkFBQVWNDgAAAAD8n2rfo3HzzTePl19+OZ5//vk46aSTYujQofHmm2+u9+uNHj068vPzS24dOnSoxGkBAAAAgDWp9tBYv3796Nq1a/Ts2TNGjx4d22yzTfzxj3+MNm3axMqVK2PJkiWl1v/444+jTZs2a3294cOHx9KlS0tu8+fPr+KvAAAAAACo9tD4bcXFxVFUVBQ9e/aMevXqxbRp00oee+utt+L999+PXr16rfX5ubm5kZeXV+oGAAAAAFStaj1H4/Dhw2PAgAGx8cYbx7Jly+LOO++MJ554Ih5++OHIz8+P4447Ls4666xo3rx55OXlxamnnhq9evUq9xWnAQAAAIDvR7WGxkWLFsUvfvGLWLhwYeTn58fWW28dDz/8cOyxxx4REXHNNddETk5ODBo0KIqKiqJ///5x/fXXV+fIAAAAAMAaZLLZbLa6h6hKBQUFkZ+fH0uXLnUYNbVGYWFhNGnSJCIili9fHo0bN67miQAAAIAfq/L2tRp3jkYAAAAAoPYRGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEhWraFx9OjR8bOf/SyaNm0aG264YQwcODDeeuutUuv07ds3MplMqduvfvWrapoYAAAAAFiTag2NM2bMiJNPPjmee+65ePTRR+Orr76KPffcMwoLC0utd8IJJ8TChQtLbldccUU1TQwAAAAArEnd6tz4Qw89VOr+rbfeGhtuuGHMmjUrevfuXbK8UaNG0aZNm+97PAAAAACgnGrUORqXLl0aERHNmzcvtXz8+PHRsmXL6NGjRwwfPjxWrFhRHeMBAAAAAGtRrXs0flNxcXGcccYZsdNOO0WPHj1Klh9++OGxySabRLt27eLVV1+N3/72t/HWW2/F/fffv8bXKSoqiqKiopL7BQUFVT47AAAAAPzY1ZjQePLJJ8frr78eM2fOLLX8xBNPLPnzVlttFW3bto3dd9893nnnnejSpUuZ1xk9enSMHDmyyucFAAAAAP6/GnHo9CmnnBIPPvhgTJ8+Pdq3b7/OdbfffvuIiJgzZ84aHx8+fHgsXbq05DZ//vxKnxcAAAAAKK1a92jMZrNx6qmnxqRJk+KJJ56ITp06fedzXn755YiIaNu27Rofz83Njdzc3MocEwAAAAD4DtUaGk8++eS488474+9//3s0bdo0Pvroo4iIyM/Pj4YNG8Y777wTd955Z+y9997RokWLePXVV+PMM8+M3r17x9Zbb12dowMAAAAA35DJZrPZatt4JrPG5ePGjYujjz465s+fH0ceeWS8/vrrUVhYGB06dIgDDzwwzj///MjLyyvXNgoKCiI/Pz+WLl1a7udAdSssLIwmTZpERMTy5cujcePG1TwRAAAA8GNV3r5W7YdOr0uHDh1ixowZ39M0AAAAAMD6qhEXgwEAAAAAajehEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkq1vdA5Cu47n/qO4RqGTFK78s+XO3Cx6KnPoNqnEaqsK8y/ap7hEAAACgUtmjEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAFAhhYWFkclkIpPJRGFhYXWPAwAA1BBCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaA
QAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAklVraBw9enT87Gc/i6ZNm8aGG24YAwcOjLfeeqvUOl9++WWcfPLJ0aJFi2jSpEkMGjQoPv7442qaGAAAAABYk2oNjTNmzIiTTz45nnvuuXj00Ufjq6++ij333DMKCwtL1jnzzDPjgQceiHvuuSdmzJgRCxYsiIMOOqgapwYAAAAAvq1udW78oYceKnX/1ltvjQ033DBmzZoVvXv3jqVLl8Zf/vKXuPPOO2O33XaLiIhx48ZFt27d4rnnnosddtihOsYGAAAAAL6lRp2jcenSpRER0bx584iImDVrVnz11VfRr1+/knW22GKL2HjjjePZZ5+tlhkBAAAAgLKqdY/GbyouLo4zzjgjdtppp+jRo0dERHz00UdRv379aNasWal1W7duHR999NEaX6eoqCiKiopK7hcUFFTZzAAAAADA/9SYPRpPPvnkeP3112PChAlJrzN69OjIz88vuXXo0KGSJgQAAAAA1qZGhMZTTjklHnzwwZg+fXq0b9++ZHmbNm1i5cqVsWTJklLrf/zxx9GmTZs1vtbw4cNj6dKlJbf58+dX5egAAAAAQFRzaMxms3HKKafEpEmT4vHHH49OnTqVerxnz55Rr169mDZtWsmyt956K95///3o1avXGl8zNzc38vLySt0AAAAAgKpVredoPPnkk+POO++Mv//979G0adOS8y7m5+dHw4YNIz8/P4477rg466yzonnz5pGXlxennnpq9OrVyxWnAQAAAKAGqdbQeMMNN0RERN++fUstHzduXBx99NEREXHNNddETk5ODBo0KIqKiqJ///5x/fXXf8+TAgAAAADrUq2hMZvNfuc6DRo0iOuuuy6uu+6672EiAAAAAGB91IiLwQAAAAAAtZvQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQCAH4TCwsLIZDKRyWSisLCwuscBgB8doREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMnqVmTlJUuWxKRJk+Kpp56K9957L1asWBGtWrWKbbfdNvr37x877rhjVc0JAAAAANRg5dqjccGCBXH88cdH27ZtY9SoUfHFF1/ET37yk9h9992jffv2MX369Nhjjz2ie/fuMXHixKqeGQAAAACoYcq1R+O2224bQ4cOjVmzZkX37t3XuM4XX3wRkydPjjFjxsT8+fNj2LBhlTooAAAAAFBzlSs0vvnmm9GiRYt1rtOwYcM47LDD4rDDDovFixdXynAAAAAAQO1QrkOnvysypq4PAAAAANRuFb7q9G233Rb/+Mc/Su6fc8450axZs9hxxx3jvffeq9ThAAAAAIDaocKh8dJLL42GDRtGRMSzzz4b1113XVxxxRXRsmXLOPPMMyt9QAAAAACg5ivXORq/af78+dG1a9eIiJg8eXIMGjQoTjzxxNhpp52ib9++lT0fAAAAAFALVHiPxiZNmpRc7OWRRx6JPfbYIyIiGjRoEF988UXlTgcAAAAA1AoV3qNxjz32iOOPPz623XbbePvtt2PvvfeOiIg33ngjOnbsWNnzAQAAAAC1QIX3aLzuuuuiV69e8cknn8R9991XcoXpWbNmxWGHHVbpAwIAAAAANV+F92hs1qxZ/PnPfy6zfOTIkZUyEAAAAABQ+1Q4NEZEfPnll/Hqq6/GokWLori4uGR5JpOJ/fbbr9KGAwAAAABqhwqHxoceeiiOOuqokgvCfFMmk4lVq1ZVymAAAAAAQO1R4XM0nnrqqXHooYfGwoULo7i4uNRNZAQAAACAH6cKh8aPP/44zjrrrGjdunVVzAMAAAAA1EIVPnT64IMPjieeeCK6dOlSFfMAEZFTv0Fs8tsHq3sMAAAAgHKrcGj885//HIccckg89dRTsdVWW0W9evVKPX7aaadV2nAAAAAAQO1Q4dB41113xSOPPBINGjSIJ554IjKZTMljmUxGaAQAAACAH6EKh8bzzjsvRo4cGeeee27k5FT4FI8AAAAAwA9QhUvhypUrY/DgwSIjAAAAAFCiwrVw6NChMXHixKqYBQAAAACopSp86PSqVaviiiuuiIcffji23nrrMheDufrqqyttOAAAAACgdqhwaHzttddi2223jYiI119/vdRj37wwDAAAAADw41Hh0Dh9+vSqmAMAAAAAqMVc0QUAAAAASFau0PirX/0qPvjgg3K94MSJE2P8+PFJQwEAAAAAtUu5Dp1u1apVbLnllrHTTjvFfvvtF9ttt120a9cuGjRoEJ9//nm8+eabMXPmzJgwYUK0a9cubrrppqqeGwAAAACoQcoVGi+55JI45ZRT4pZbbonrr78+3nzzzVKPN23aNPr16xc33XRT7LXXXlUyKAAAAABQc5X7YjCtW7eO8847L84777z4/PPP4/33348vvvgiWrZsGV26dHHFaQAAAAD4EavwVacjIjbYYIPYYIMNKnsWAAAAAKCWctVpAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkGy9QuPXX38djz32WNx4442xbNmyiIhYsGBBLF++vFKHAwAAAABqhwpfdfq9996LvfbaK95///0oKiqKPfbYI5o2bRqXX355FBUVxdixY6tiTgAAAACgBqvwHo2nn356bLfddvH55
59Hw4YNS5YfeOCBMW3atEodDgAAAACoHSq8R+NTTz0VzzzzTNSvX7/U8o4dO8aHH35YaYMBAAAAALVHhfdoLC4ujlWrVpVZ/sEHH0TTpk0rZSgAAAAAoHapcGjcc889Y8yYMSX3M5lMLF++PEaMGBF77713Zc4GAAAAANQSFT50+qqrror+/ftH9+7d48svv4zDDz88Zs+eHS1btoy77rqrKmYEAAAAAGq4CofG9u3bxyuvvBITJkyIV199NZYvXx7HHXdcHHHEEaUuDgMAAAAA/HhUODRGRNStWzeOPPLIyp4FAAAAAKil1is0LliwIGbOnBmLFi2K4uLiUo+ddtpplTIYAAAAAFB7VDg03nrrrfHLX/4y6tevHy1atIhMJlPyWCaTERoBAAAA4EeowqHxggsuiAsvvDCGDx8eOTkVvmg1AAAAAPADVOHQuGLFihgyZIjICEC5dDz3H9U9ApWseOWXJX/udsFDkVO/QTVOQ2Wbd9k+1T0CAAC1VIVr4XHHHRf33HNPVcwCAAAAANRSFd6jcfTo0bHvvvvGQw89FFtttVXUq1ev1ONXX311pQ0HAAAAANQO6xUaH3744dh8880jIspcDAYAAAAA+PGpcGi86qqr4q9//WscffTRVTAOAAAAAFAbVfgcjbm5ubHTTjtVxSwAAAAAQC1V4dB4+umnx7XXXlsVswAAAAAAtVSFD51+4YUX4vHHH48HH3wwttxyyzIXg7n//vsrbTgAAAAAoHaocGhs1qxZHHTQQVUxCwAAAABQS1U4NI4bN64q5gAAAAAAarEKn6OxMj355JOx3377Rbt27SKTycTkyZNLPX700UdHJpMpddtrr72qZ1gAAAAAYK3KtUfjT3/605g2bVpssMEGse2220Ymk1nrui+99FK5N15YWBjbbLNNHHvssWs9HHuvvfYqtRdlbm5uuV8fAAAAAPh+lCs0HnDAASWB74ADDlhnaKyIAQMGxIABA9a5Tm5ubrRp06ZStgcAAAAAVI1yhcYRI0aU/Pmiiy6qqlnW6IknnogNN9wwNthgg9htt91i1KhR0aJFi+91BgAAAABg3Sp8jsbOnTvH4sWLyyxfsmRJdO7cuVKGWm2vvfaK22+/PaZNmxaXX355zJgxIwYMGBCrVq1a63OKioqioKCg1A0AAAAAqFoVvur0vHnz1hj6ioqK4oMPPqiUoVYbMmRIyZ+32mqr2HrrraNLly7xxBNPxO67777G54wePTpGjhxZqXMAAAAAAOtW7tA4ZcqUkj8//PDDkZ+fX3J/1apVMW3atOjUqVPlTvctnTt3jpYtW8acOXPWGhqHDx8eZ511Vsn9goKC6NChQ5XOBQAAAAA/duUOjQMHDoyIiEwmE0OHDi31WL169aJjx45x1VVXVepw3/bBBx/E4sWLo23btmtdJzc315WpAQAAAOB7Vu7QWFxcHBERnTp1in/961/RsmXL5I0vX7485syZU3J/7ty58fLLL0fz5s2jefPmMXLkyBg0aFC0adMm3nnnnTjnnHOia9eu0b9//+RtAwAAAACVp8LnaJw7d26lbfzFF1+MXXfdteT+6kOehw4dGjfccEO8+uqrcdttt8WSJUuiXbt2seeee8Yll1xij0UAAAAAqGEqHBorU9++fSObza718Ycffvh7nAYAAAAAWF851T0AAAAAAFD7CY0AAAAAQDKhEQAAAABItl6h8Z133onzzz8/DjvssFi0aFFEREydOjXeeOONSh0OAAAAAKgdKhwaZ8yYEVtttVU8//zzcf/998fy5csjIuKVV16JESNGVPqAAAAAAEDNV+HQeO6558aoUaPi0Ucfjfr165cs32233eK5556r1OEAAAAAgNqhwqHxtddeiwMPPLDM8g033DA+/fTTShkKAAAAAKhdKhwamzVrFgsXLiyz/N///ndstNFGlTIUAAAAAFC7VDg0DhkyJH7729/GRx99FJlMJoqLi+Ppp5+OYcOGxS9+8YuqmBEAAAAAqOEqHBovvfTS2GKLLaJDhw6xfPny6N69e/Tu3Tt23HHHOP/886tiRgAAAACghqtb0SfUr18/br755rjwwgvjtddei+XLl8e2224bm266aVXMBwAAAADUAhUOjat16NAhOnToUJmzAAAAAAC1VIUPnR40aFBcfvnlZZZfccUVccghh1TKUAAAAABA7VLh0Pjkk0/G3nvvXWb5gAED4sknn6yUoQAAAACA2qXCoXH58uVRv379Msvr1asXBQUFlTIUAAAAAFC7VDg0brXVVjFx4sQyyydMmBDdu3evlKEAAAAAgNqlwheDueCCC+Kggw6Kd955J3bbbbeIiJg2bVrcddddcc8991T6gAAAAABAzVfh0LjffvvF5MmT49JLL4177703GjZsGFtvvXU89thj0adPn6qYEQAAAACo4SocGiMi9tlnn9hnn30qexYAAAAAoJZar9AYEbFy5cpYtGhRFBcXl1q+8cYbJw8FAAAAANQuFQ6Ns2fPjmOPPTaeeeaZUsuz2WxkMplYtWpVpQ0HAAAAANQOFQ6NRx99dNStWzcefPDBaNu2bWQymaqYCwAAAACoRSocGl9++eWYNWtWbLHFFlUxDwAAAABQC+VU9Andu3ePTz/9tCpmAQAAAABqqQqHxssvvzzOOeeceOKJJ2Lx4sVRUFBQ6gYAAAAA/PhU+NDpfv36RUTE7rvvXmq5i8EAAAAAwI9XhUPj9OnTq2IOAAAAAKAWq3Bo7NOnT1XMAQAAAADUYhU+R2NExFNPPRVHHnlk7LjjjvHhhx9GRMQdd9wRM2fOrNThAAAAAIDaocKh8b777ov+/ftHw4YN46WXXoqioqKIiFi6dGlceumllT4gAAAAAFDzVTg0jho1KsaOHRs333xz1KtXr2T5TjvtFC+99FKlDgcAAAAA1A4VDo1vvfVW9O7du8zy/Pz8WLJkSWXMBAAAAADUMhUOjW3atIk5c+aUWT5z5szo3LlzpQwFAAAAANQuFQ6NJ5xwQpx++unx/PPPRyaTiQULFsT48eNj2LBhcdJJJ1XFjAAAAABADVe3ok8499xzo7i4OHbfffdYsWJF9O7dO3Jzc2PYsGFx6qmnVsWMAAAAAEANV6HQuGrVqnj66afj5JNPjt/85jcxZ86cWL58eXTv3j2aNGlSVTMCAAAAADVchUJjnTp1Ys8994z//Oc/0axZs+jevXtVzQUAAAAA1CIVPkdjjx494t13362KWQAAAACAWqrCoXHUqFExbNiwePDBB2PhwoVRUFBQ6gYAAAAA/PhU+GIwe++9d0RE7L///pHJZEqWZ7PZyGQysWrVqsqbDgAAAACoFSocGqdPn14VcwAAAAAAtViFQ2OfPn2qYg4AAAAAoBar8DkaIyKeeuqpOPLII2PHHXeMDz/8MCIi7rjjjpg5c2alDgcAAAAA1A4VDo333Xdf9O/fPxo2bBgvvfRSFBUVRUTE0qVL49JLL630AQEAAACAmm+9rjo9duzYuPnmm6NevXoly3faaad46aWXKnU4AAAAAKB2qHBofOutt6J3795llufn
58eSJUsqYyYAAAAAoJapcGhs06ZNzJkzp8zymTNnRufOnStlKAAAAACgdqlwaDzhhBPi9NNPj+effz4ymUwsWLAgxo8fH8OGDYuTTjqpKmYEAAAAAGq4uhV9wrnnnhvFxcWx++67x4oVK6J3796Rm5sbw4YNi1NPPbUqZgQAAAAAarhyhcZXX301evToETk5OZHJZOK8886L3/zmNzFnzpxYvnx5dO/ePZo0aVLVswIAAAAANVS5Dp3edttt49NPP42IiM6dO8fixYujfv360b179/j5z38uMgIAAADAj1y5QmOzZs1i7ty5ERExb968KC4urtKhAAAAAIDapVyHTg8aNCj69OkTbdu2jUwmE9ttt13UqVNnjeu+++67lTogAAAAAFDzlSs03nTTTXHQQQfFnDlz4rTTTosTTjghmjZtWtWzAQAAAAC1RLmvOr3XXntFRMSsWbPi9NNPFxoBAAAAgBLlDo2rjRs3rirmAAAAAABqsQqHxsLCwrjsssti2rRpsWjRojIXhnGORgAAAAD48alwaDz++ONjxowZcdRRR5VcHAYAAAAA+HGrcGicOnVq/OMf/4iddtqpKuYBAAAAAGqhnIo+YYMNNojmzZtXxSwAAAAAQC1V4dB4ySWXxIUXXhgrVqyoinkAAAAAgFqowodOX3XVVfHOO+9E69ato2PHjlGvXr1Sj7/00kuVNhwAAAAAUDtUODQOHDiwCsYAAAAAAGqzCofGESNGVMUcAAAAAEAtVuFzNAIAAAAAfFu592jcYIMNIpPJfOd6n332WdJAAAAAAEDtU+7QOGbMmCocAwAAAACozcodGocOHVqVcwAAAAAAtZhzNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkKzcV51e7ayzzlrj8kwmEw0aNIiuXbvGAQccEM2bN08eDgAAAACoHSocGv/973/HSy+9FKtWrYrNN988IiLefvvtqFOnTmyxxRZx/fXXx9lnnx0zZ86M7t27V/rAAAAAAEDNU+FDpw844IDo169fLFiwIGbNmhWzZs2KDz74IPbYY4847LDD4sMPP4zevXvHmWeeWRXzAgAAAAA1UIVD45VXXhmXXHJJ5OXllSzLz8+Piy66KK644opo1KhRXHjhhTFr1qxKHRQAAAAAqLkqHBqXLl0aixYtKrP8k08+iYKCgoiIaNasWaxcuTJ9OgAAAACgVlivQ6ePPfbYmDRpUnzwwQfxwQcfxKRJk+K4446LgQMHRkTECy+8EJtttlllzwoAAABQKxUWFkYmk4lMJhOFhYXVPQ5UiQpfDObGG2+MM888M4YMGRJff/31/16kbt0YOnRoXHPNNRERscUWW8Qtt9xSuZMCAAAAADVWhUNjkyZN4uabb45rrrkm3n333YiI6Ny5czRp0qRknZ/85CeVNiAAAAAAUPNV+NDpv/3tb7FixYpo0qRJbL311rH11luXiowAAAAAwI9PhUPjmWeeGRtuuGEcfvjh8c9//jNWrVpVFXMBAAAAALVIhUPjwoULY8KECZHJZOLQQw+Ntm3bxsknnxzPPPNMVcwHAAAAANQCFQ6NdevWjX333TfGjx8fixYtimuuuSbmzZsXu+66a3Tp0qUqZgQAAAAAargKXwzmmxo1ahT9+/ePzz//PN577734z3/+U1lzAQAAAAC1SIX3aIyIWLFiRYwfPz723nvv2GijjWLMmDFx4IEHxhtvvFHZ8wEAAAAAtUCF92gcMmRIPPjgg9GoUaM49NBD44ILLohevXpVxWwAAAAAQC1R4dBYp06duPvuu6N///5Rp06dUo+9/vrr0aNHj0obDgAAAACoHSocGsePH1/q/rJly+Kuu+6KW265JWbNmhWrVq2qtOEAAAAAgNphvS8G8+STT8Zf/vKXuO+++6Jdu3Zx0EEHxXXXXVeZswEAQJXqeO4/qnsEKlHxyi9L/tztgocip36DapyGqjDvsn2qewQA1qFCofGjjz6KW2+9Nf7yl79EQUFBHHrooVFUVBSTJ0+O7t27V9WMAAAAAEANV+6rTu+3336x+eabx6uvvhpjxoyJBQsWxLXXXluVswEAAAAAtUS592icOnVqnHbaaXHSSSfFpptuWpUzAQAAAAC1TLn3aJw5c2YsW7YsevbsGdtvv338+c9/jk8//bQqZwMAAAAAaolyh8Yddtghbr755li4cGH88pe/jAkTJkS7du2iuLg4Hn300Vi2bFlVzgkAAAAA1GDlDo2rNW7cOI499tiYOXNmvPbaa3H22WfHZZddFhtuuGHsv//+VTEjAAAAAFDDVTg0ftPmm28eV1xxRXzwwQdx1113VdZMAAAAAEAtkxQaV6tTp04MHDgwpkyZUhkvBwAAAADUMpUSGgEAAACAHzehEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJqjU0Pvnkk7HffvtFu3btIpPJxOTJk0s9ns1m48ILL4y2bdtGw4YNo1+/fjF79uzqGRYAAAAAWKtqDY2FhYWxzTbbxHXXXbfGx6+44or405/+FGPHjo3nn38+GjduHP37948vv/zye54UAAAAAFiXutW58QEDBsSAAQPW+Fg2m40xY8bE+eefHwcccEBERNx+++3RunXrmDx5cgwZMuT7HBUAAAAAWIcae47GuXPnxkcffRT9+vUrWZafnx/bb799PPvss9U4GQAAAADwbdW6R+O6fPTRRxER0bp161LLW7duXfLYmhQVFUVRUVHJ/YKCgqoZEAAAAAAoUWP3aFxfo0ePjvz8/JJbhw4dqnskAAAAAPjBq7GhsU2bNhER8fHHH5da/vHHH5c8tibDhw+PpUuXltzmz59fpXMCAAAAADU4NHbq1CnatGkT06ZNK1lWUFAQzz//fPTq1Wutz8vNzY28vLxSNwAAAACgalXrORqXL18ec+bMKbk/d+7cePnll6N58+ax8cYbxxlnnBGjRo2KTTfdNDp16hQXXHBBtGvXLgYOHFh9QwMAAAAAZVRraHzxxRdj1113Lbl/1llnRUTE0KFD49Zbb41zzjknCgsL48QTT4wlS5bEzjvvHA899FA0aNCgukYGAAAAANagWkNj3759I5vNrvXxTCYTF198cVx88cXf41QAAAAAQEXV2HM0AgAAAAC1h9AIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExoBAAAAgGRCIwAAAACQTGgEAAAAAJIJjQAAAABAMqERAAAAAEgmNAIAAAAAyYRGAAAAACCZ0AgAAAAAJBMaAQAAAIBkQiMAAAAAkExoBAAAAACSCY0AAAAAQDKhEQAAAABIJjQCAAAAAMmERgAAAAAgmdAIAAAAACQTGgEAAACAZEIjAAAAAJBMaAQAAAAAkgmNAAAAAEAyoREAAAAASCY0AgAAAADJhEYAAAAAIJnQCAAAAAAkExo
<base64-encoded PNG data omitted: bar chart titled 'Average inference time (ms) for each provider'>\n",
"text/plain": [
"<Figure size 1600x1200 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"%matplotlib inline\n",
"\n",
"import matplotlib\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"import os\n",
"\n",
"\n",
"# Compute average inference time + std\n",
"time_results = {k: np.mean(v.model_inference_time) * 1e3 for k, v in results.items()}\n",
"time_results_std = np.std([v.model_inference_time for v in results.values()]) * 1000\n",
"\n",
"plt.rcdefaults()\n",
"fig, ax = plt.subplots(figsize=(16, 12))\n",
"ax.set_ylabel(\"Avg Inference time (ms)\")\n",
"ax.set_title(\"Average inference time (ms) for each provider\")\n",
"ax.bar(time_results.keys(), time_results.values(), yerr=time_results_std)\n",
"plt.show()"
]
}
],
"metadata": {
"accelerator": "GPU",
"colab": {
"collapsed_sections": [],
"name": "ONNX Overview",
"provenance": [],
"toc_visible": true
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.0"
}
},
"nbformat": 4,
"nbformat_minor": 1
}
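For reference, the plotting cell above expects a `results` dictionary that maps each ONNX Runtime execution provider to an object exposing a `model_inference_time` list of per-run latencies in seconds. Below is a minimal sketch of how such timings could be collected; the model path, the input names, and the `BenchmarkResult` container are illustrative assumptions rather than the notebook's actual code.

```python
import time
from dataclasses import dataclass, field

import numpy as np
from onnxruntime import InferenceSession, SessionOptions


@dataclass
class BenchmarkResult:
    # Mirrors the attribute read by the plotting cell: results[provider].model_inference_time
    model_inference_time: list = field(default_factory=list)


def benchmark_provider(model_path, provider, inputs, n_runs=100):
    """Time n_runs forward passes of an ONNX model on a single execution provider."""
    session = InferenceSession(model_path, SessionOptions(), providers=[provider])
    result = BenchmarkResult()

    session.run(None, inputs)  # warm-up run so lazy initialization does not skew the timings

    for _ in range(n_runs):
        start = time.perf_counter()
        session.run(None, inputs)
        result.model_inference_time.append(time.perf_counter() - start)
    return result


# Hypothetical usage: dummy BERT-style inputs and two common execution providers.
# The input names must match those of the exported ONNX graph.
dummy_inputs = {
    "input_ids": np.ones((1, 128), dtype=np.int64),
    "attention_mask": np.ones((1, 128), dtype=np.int64),
}
results = {
    provider: benchmark_provider("onnx/bert-base-cased.onnx", provider, dummy_inputs)
    for provider in ["CPUExecutionProvider", "CUDAExecutionProvider"]
}
```

With a `results` dictionary of this shape, the plotting cell shown above runs as-is.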
...@@ -25,12 +25,26 @@ Pull Request so it can be included under the Community notebooks.
## Hugging Face's notebooks 🤗
### Documentation notebooks
You can open any page of the documentation as a notebook in Colab (there is a button directly on those pages), but they are also listed here for convenience:
| Notebook | Description | |
|:----------|:-------------|------:|
| [Quicktour of the library](https://github.com/huggingface/notebooks/blob/master/transformers_doc/quicktour.ipynb) | A presentation of the various APIs in Transformers | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/transformers_doc/quicktour.ipynb) |
| [Summary of the tasks](https://github.com/huggingface/notebooks/blob/master/transformers_doc/task_summary.ipynb) | How to run the models of the Transformers library task by task | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/transformers_doc/task_summary.ipynb) |
| [Preprocessing data](https://github.com/huggingface/notebooks/blob/master/transformers_doc/preprocessing.ipynb) | How to use a tokenizer to preprocess your data (a short sketch follows this table) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/transformers_doc/preprocessing.ipynb) |
| [Fine-tuning a pretrained model](https://github.com/huggingface/notebooks/blob/master/transformers_doc/training.ipynb) | How to use the Trainer to fine-tune a pretrained model | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/transformers_doc/training.ipynb) |
| [Summary of the tokenizers](https://github.com/huggingface/notebooks/blob/master/transformers_doc/tokenizer_summary.ipynb) | The differences between the tokenizer algorithms | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/transformers_doc/tokenizer_summary.ipynb) |
| [Multilingual models](https://github.com/huggingface/notebooks/blob/master/transformers_doc/multilingual.ipynb) | How to use the multilingual models of the library | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/transformers_doc/multilingual.ipynb) |
| [Fine-tuning with custom datasets](https://github.com/huggingface/notebooks/blob/master/transformers_doc/custom_datasets.ipynb) | How to fine-tune a pretrained model on various tasks | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/transformers_doc/custom_datasets.ipynb) |
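To give a flavor of what the preprocessing entry above covers, here is a minimal sketch of tokenizing a batch of sentences with `AutoTokenizer`. The checkpoint name and the example sentences are arbitrary choices for illustration, not taken from the notebook itself.

```python
from transformers import AutoTokenizer

# Any checkpoint hosted on the Hub works here; "bert-base-uncased" is just an example.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

batch = tokenizer(
    ["Tokenization doesn't have to be slow!", "Subword units keep the vocabulary small."],
    padding=True,         # pad every sequence to the longest one in the batch
    truncation=True,      # cut sequences longer than the model's maximum length
    return_tensors="pt",  # return PyTorch tensors ready to feed a model
)
print(batch["input_ids"].shape)  # (batch_size, sequence_length)
```

The resulting `batch` dictionary can be passed straight to a PyTorch model with `model(**batch)`, which is the pattern the fine-tuning notebooks build on.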
### Examples
| Notebook | Description | |
|:----------|:-------------|------:|
| [Getting Started Tokenizers](https://github.com/huggingface/transformers/blob/master/notebooks/01-training-tokenizers.ipynb) | How to train and use your very own tokenizer |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/transformers/blob/master/notebooks/01-training-tokenizers.ipynb) |
| [Train your tokenizer](https://github.com/huggingface/notebooks/blob/master/examples/tokenizer_training.ipynb) | How to train and use your very own tokenizer |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/examples/tokenizer_training.ipynb) |
| [Getting Started Transformers](https://github.com/huggingface/transformers/blob/master/notebooks/02-transformers.ipynb) | How to easily start using transformers | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/transformers/blob/master/notebooks/02-transformers.ipynb) |
| [Train your language model](https://github.com/huggingface/notebooks/blob/master/examples/language_modeling_from_scratch.ipynb) | How to easily start using transformers | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/examples/language_modeling_from_scratch.ipynb) |
| [How to use Pipelines](https://github.com/huggingface/transformers/blob/master/notebooks/03-pipelines.ipynb) | Simple and efficient way to use State-of-the-Art models on downstream tasks through transformers | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/transformers/blob/master/notebooks/03-pipelines.ipynb) |
| [How to fine-tune a model on text classification](https://github.com/huggingface/notebooks/blob/master/examples/text_classification.ipynb) | Show how to preprocess the data and fine-tune a pretrained model on any GLUE task. | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/examples/text_classification.ipynb)|
| [How to fine-tune a model on language modeling](https://github.com/huggingface/notebooks/blob/master/examples/language_modeling.ipynb) | Show how to preprocess the data and fine-tune a pretrained model on a causal or masked LM task. | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/examples/language_modeling.ipynb)|
| [How to fine-tune a model on token classification](https://github.com/huggingface/notebooks/blob/master/examples/token_classification.ipynb) | Show how to preprocess the data and fine-tune a pretrained model on a token classification task (NER, PoS). | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/examples/token_classification.ipynb)|
...@@ -40,8 +54,8 @@ Pull Request so it can be included under the Community notebooks.
| [How to fine-tune a model on summarization](https://github.com/huggingface/notebooks/blob/master/examples/summarization.ipynb) | Show how to preprocess the data and fine-tune a pretrained model on XSUM. | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/examples/summarization.ipynb)|
| [How to train a language model from scratch](https://github.com/huggingface/blog/blob/master/notebooks/01_how_to_train.ipynb)| Highlight all the steps to effectively train Transformer model on custom data | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/blog/blob/master/notebooks/01_how_to_train.ipynb)|
| [How to generate text](https://github.com/huggingface/blog/blob/master/notebooks/02_how_to_generate.ipynb)| How to use different decoding methods for language generation with transformers | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/blog/blob/master/notebooks/02_how_to_generate.ipynb)|
| [How to export model to ONNX](https://github.com/huggingface/transformers/blob/master/notebooks/04-onnx-export.ipynb) | Highlight how to export and run inference workloads through ONNX |
| [How to export model to ONNX](https://github.com/huggingface/notebooks/blob/master/examples/onnx-export.ipynb) | Highlight how to export and run inference workloads through ONNX (a short export sketch follows this table) |
| [How to use Benchmarks](https://github.com/huggingface/transformers/blob/master/notebooks/05-benchmark.ipynb) | How to benchmark models with transformers | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/transformers/blob/master/notebooks/05-benchmark.ipynb)|
| [How to use Benchmarks](https://github.com/huggingface/notebooks/blob/master/examples/benchmark.ipynb) | How to benchmark models with transformers | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/master/examples/benchmark.ipynb)|
| [Reformer](https://github.com/huggingface/blog/blob/master/notebooks/03_reformer.ipynb) | How Reformer pushes the limits of language modeling | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/patrickvonplaten/blog/blob/master/notebooks/03_reformer.ipynb)|
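As the short export sketch promised in the ONNX row above, a PyTorch checkpoint can be exported with the `transformers.convert_graph_to_onnx` helper roughly as follows; the checkpoint name, output path, and opset value are assumptions for the example, and the full set of options is covered in the notebook itself.

```python
from pathlib import Path

from transformers.convert_graph_to_onnx import convert

# Export a PyTorch checkpoint to ONNX; the output path and opset are arbitrary choices.
convert(
    framework="pt",
    model="bert-base-cased",
    output=Path("onnx/bert-base-cased.onnx"),
    opset=11,
)
```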
...