"test/vscode:/vscode.git/clone" did not exist on "ec4837dcb86ac2eb795449381dddfebdc7f7be5b"
Commit 90351b4d authored by A. Unique TensorFlower

Internal change

PiperOrigin-RevId: 477832570
parent 7b129e6b
{"cells":[{"cell_type":"markdown","source":["# Conversion of COCO annotation JSON file to TFRecords"],"metadata":{"id":"SsIv6LYT84gm"}},{"cell_type":"markdown","source":["Given a COCO annotated JSON file, your goal is to convert it into a TFRecords file necessary to train with the Mask RCNN model.\n","\n","To accomplish this task, you will clone the TensorFlow Model Garden repo. The TensorFlow Model Garden is a repository with a number of different implementations of state-of-the-art (SOTA) models and modeling solutions for TensorFlow users.\n","\n","This notebook is an end to end example. When you run the notebook, it will take COCO annotated JSON train and test files as an input and will convert them into TFRecord files. You can also output sharded TFRecord files in case your training and validation data is huge. It makes it easier for the algorithm to read and access the data."],"metadata":{"id":"zl7o2xEW9IbX"}},{"cell_type":"markdown","source":["**Note** - In this example, we assume that all our data is saved on Google drive and we will also write our outputs to Google drive. We also assume that the script will be used as a Google Colab notebook. But this can be changed according to the needs of users. They can modify this in case they are working on their local workstation, remote server or any other database. 
This colab notebook can be changed to a regular jupyter notebook running on a local machine according to the need of the users."],"metadata":{"id":"g3OHfWQBpYVB"}},{"cell_type":"markdown","metadata":{"id":"CRwVTTPuED_1"},"source":["## Run the below command to connect to your google drive"]},{"cell_type":"code","execution_count":null,"metadata":{"colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"elapsed":4439,"status":"ok","timestamp":1650585781180,"user":{"displayName":"Umair Sabir","userId":"06940594206388957365"},"user_tz":420},"id":"hdRAEurMA3zi","outputId":"7212e558-af5d-4cb2-dd1f-6e634f5fca0a"},"outputs":[{"output_type":"stream","name":"stdout","text":["Collecting tensorflow-addons\n"," Downloading tensorflow_addons-0.16.1-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB)\n","\u001b[?25l\r\u001b[K |▎ | 10 kB 22.4 MB/s eta 0:00:01\r\u001b[K |▋ | 20 kB 8.9 MB/s eta 0:00:01\r\u001b[K |▉ | 30 kB 8.3 MB/s eta 0:00:01\r\u001b[K |█▏ | 40 kB 7.7 MB/s eta 0:00:01\r\u001b[K |█▌ | 51 kB 4.1 MB/s eta 0:00:01\r\u001b[K |█▊ | 61 kB 4.9 MB/s eta 0:00:01\r\u001b[K |██ | 71 kB 5.3 MB/s eta 0:00:01\r\u001b[K |██▍ | 81 kB 5.5 MB/s eta 0:00:01\r\u001b[K |██▋ | 92 kB 6.1 MB/s eta 0:00:01\r\u001b[K |███ | 102 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███▏ | 112 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███▌ | 122 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███▉ | 133 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████ | 143 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████▍ | 153 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████▊ | 163 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████ | 174 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████▎ | 184 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████▌ | 194 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████▉ | 204 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████▏ | 215 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████▍ | 225 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████▊ | 235 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████ | 245 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████▎ | 256 kB 5.1 MB/s eta 0:00:01\r\u001b[K 
|███████▋ | 266 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████▉ | 276 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████▏ | 286 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████▌ | 296 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████▊ | 307 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████ | 317 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████▍ | 327 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████▋ | 337 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████ | 348 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████▏ | 358 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████▌ | 368 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████▉ | 378 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████ | 389 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████▍ | 399 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████▊ | 409 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████ | 419 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████▎ | 430 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████▌ | 440 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████▉ | 450 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████▏ | 460 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████▍ | 471 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████▊ | 481 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████ | 491 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████▎ | 501 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████▋ | 512 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████▉ | 522 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████▏ | 532 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████▌ | 542 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████▊ | 552 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████ | 563 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████▍ | 573 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████▋ | 583 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████ | 593 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████▏ | 604 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████▌ | 614 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████▉ | 624 kB 5.1 MB/s eta 0:00:01\r\u001b[K 
|██████████████████ | 634 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████▍ | 645 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████▊ | 655 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████ | 665 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████▎ | 675 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████▌ | 686 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████▉ | 696 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████▏ | 706 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████▍ | 716 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████▊ | 727 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████ | 737 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████▎ | 747 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████▋ | 757 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████▉ | 768 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████████▏ | 778 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████████▌ | 788 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████████▊ | 798 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████████ | 808 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████████▍ | 819 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████████▋ | 829 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████████ | 839 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████████▏ | 849 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████████▌ | 860 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████████▉ | 870 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████████ | 880 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████████▍ | 890 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████████▊ | 901 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████████████ | 911 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████████████▎ | 921 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████████████▌ | 931 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████████████▉ | 942 kB 5.1 MB/s eta 
0:00:01\r\u001b[K |███████████████████████████▏ | 952 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████████████▍ | 962 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████████████▊ | 972 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████████████ | 983 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████████████▎ | 993 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████████████▋ | 1.0 MB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████████████▉ | 1.0 MB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████████████▏ | 1.0 MB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████████████▌ | 1.0 MB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████████████▊ | 1.0 MB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████████████████ | 1.1 MB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████████████████▍ | 1.1 MB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████████████████▋ | 1.1 MB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████████████████ | 1.1 MB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████████████████▏| 1.1 MB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████████████████▌| 1.1 MB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████████████████▉| 1.1 MB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████████████████| 1.1 MB 5.1 MB/s \n","\u001b[?25hRequirement already satisfied: typeguard>=2.7 in /usr/local/lib/python3.7/dist-packages (from tensorflow-addons) (2.7.1)\n","Installing collected packages: tensorflow-addons\n","Successfully installed tensorflow-addons-0.16.1\n"]}],"source":["!pip install tensorflow-addons"]},{"cell_type":"code","execution_count":null,"metadata":{"id":"bBN0CZWlD7zl"},"outputs":[],"source":["# import libraries\n","from google.colab import drive\n","import sys\n","from configparser import ConfigParser"]},{"cell_type":"code","source":["# \"opencv-python-headless\" version should be same of \"opencv-python\"\n","import pkg_resources\n","version_number = 
pkg_resources.get_distribution(\"opencv-python\").version\n","\n","!pip install opencv-python-headless==$version_number"],"metadata":{"id":"leap_jk5fq_v","colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"status":"ok","timestamp":1650585791722,"user_tz":420,"elapsed":5939,"user":{"displayName":"Umair Sabir","userId":"06940594206388957365"}},"outputId":"b5608bb5-24df-4fb1-9885-649ceca98a26"},"execution_count":null,"outputs":[{"output_type":"stream","name":"stdout","text":["Collecting opencv-python-headless==4.1.2.30\n"," Downloading opencv_python_headless-4.1.2.30-cp37-cp37m-manylinux1_x86_64.whl (21.8 MB)\n","\u001b[K |████████████████████████████████| 21.8 MB 62.9 MB/s \n","\u001b[?25hRequirement already satisfied: numpy>=1.14.5 in /usr/local/lib/python3.7/dist-packages (from opencv-python-headless==4.1.2.30) (1.21.6)\n","Installing collected packages: opencv-python-headless\n","Successfully installed opencv-python-headless-4.1.2.30\n"]}]},{"cell_type":"code","execution_count":null,"metadata":{"id":"i80tEP0pEJif","colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"status":"ok","timestamp":1650585818263,"user_tz":420,"elapsed":22456,"user":{"displayName":"Umair Sabir","userId":"06940594206388957365"}},"outputId":"cb0d8dde-8852-49eb-e6d7-33653722eee0"},"outputs":[{"output_type":"stream","name":"stdout","text":["Mounted at /content/gdrive\n","Successful\n"]}],"source":["# connect to google drive\n","drive.mount('/content/gdrive')\n","\n","# making an alias for the root path\n","try:\n"," !ln -s /content/gdrive/My\\ Drive/ /mydrive\n"," print('Successful')\n","except Exception as e:\n"," print(e)\n"," print('Not successful')"]},{"cell_type":"markdown","metadata":{"id":"w40-VpWXU-Hu"},"source":["## Clone TensorFlow Model Garden 
repository"]},{"cell_type":"code","execution_count":null,"metadata":{"id":"qjhCOR3ZYB0T","colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"status":"ok","timestamp":1650585823614,"user_tz":420,"elapsed":167,"user":{"displayName":"Umair Sabir","userId":"06940594206388957365"}},"outputId":"7dc2762f-bae2-4361-8e3f-d7f919ebfa1d"},"outputs":[{"output_type":"stream","name":"stdout","text":["/content/gdrive/My Drive/TFVision\n"]}],"source":["# move to the specified folder where you want to clone\n","%cd $tensorflow_model_folder"]},{"cell_type":"code","source":["# clone the Model Garden directory for Tensorflow where all the config files and scripts are located for this project. \n","# project folder name is - 'waste_identification_ml'\n","!git clone https://github.com/tensorflow/models.git "],"metadata":{"id":"Vh42KtozpqeT"},"execution_count":null,"outputs":[]},{"cell_type":"code","source":["# Go to the model folder\n","%cd models"],"metadata":{"id":"wm-k6-S4pr_B"},"execution_count":null,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"cbPxppeNWsD8"},"source":["## **MUST CHANGE** - Import the path and parameters"]},{"cell_type":"code","execution_count":null,"metadata":{"id":"6sMnfRIeWvKG","colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"status":"ok","timestamp":1650585820740,"user_tz":420,"elapsed":1207,"user":{"displayName":"Umair 
Sabir","userId":"06940594206388957365"}},"outputId":"f463824e-61a8-4512-b02a-a054f721e907"},"outputs":[{"output_type":"stream","name":"stdout","text":["/mydrive/TFVision/\n","/mydrive/gtech/MRFs/Recykal/Latest_sharing_by_sanket/Google_Recykal/Taxonomy_version_2/train/\n","/mydrive/gtech/MRFs/Recykal/Latest_sharing_by_sanket/Google_Recykal/Taxonomy_version_2/val/\n","/mydrive/gtech/MRFs/Recykal/Latest_sharing_by_sanket/Google_Recykal/Taxonomy_version_2/Total_images/\n","/mydrive/gtech/MRFs/Recykal/Latest_sharing_by_sanket/Google_Recykal/Taxonomy_version_2/_train.json\n","/mydrive/gtech/MRFs/Recykal/Latest_sharing_by_sanket/Google_Recykal/Taxonomy_version_2/Total_images/\n","/mydrive/gtech/MRFs/Recykal/Latest_sharing_by_sanket/Google_Recykal/Taxonomy_version_2/_val.json\n"]}],"source":["config = ConfigParser()\n","\n","# path to the config file defining parameters\n","# config.ini file is an important file where all the parameter variables are located\n","# config.ini file resides in the 'config' folder which is in the 'pre_processing' folder\n","config.read('official/projects/waste_identification_ml/pre_processing/config/config.ini')\n","\n","# path where you want to close tensorflow model directory\n","tensorflow_model_folder = config['tfrecord']['tensorflow_model_folder']\n","print(tensorflow_model_folder)\n","\n","# path where you want to put the traiing tfrecord file\n","training_data_folder = config['tfrecord']['training_data_folder']\n","print(training_data_folder)\n","\n","# path where you want to put the validation tfrecord file\n","validation_data_folder = config['tfrecord']['validation_data_folder']\n","print(validation_data_folder)\n","\n","# path where all training images are located\n","training_images_folder = config['tfrecord']['training_images_folder']\n","print(training_images_folder)\n","\n","# path of the training annotation file that needs to be converted\n","training_annotation_file = 
config['tfrecord']['training_annotation_file']\n","print(training_annotation_file)\n","\n","# path where all validation images are located\n","validation_images_folder = config['tfrecord']['validation_images_folder']\n","print(validation_images_folder)\n","\n","# path of the validation annotation file that needs tobe converted\n","validation_annotation_file = config['tfrecord']['validation_annotation_file']\n","print(validation_annotation_file)"]},{"cell_type":"markdown","metadata":{"id":"xNe2NuqjV4uW"},"source":["## Create TFRecord for training data"]},{"cell_type":"code","execution_count":null,"metadata":{"id":"J9Nz75g0oJkI"},"outputs":[],"source":["# create a folder for validation data\n","!mkdir -p $validation_data_folder\n","\n","# create a folder for training data\n","!mkdir -p $training_data_folder"]},{"cell_type":"code","execution_count":null,"metadata":{"id":"mjsai7PDAxgp","colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"status":"ok","timestamp":1650590090694,"user_tz":420,"elapsed":3284882,"user":{"displayName":"Umair Sabir","userId":"06940594206388957365"}},"outputId":"c78c7eaa-36e0-48e0-ba2c-3e674bdc5402"},"outputs":[{"output_type":"stream","name":"stdout","text":["I0422 00:06:23.072771 139705362556800 create_coco_tf_record.py:494] writing to output path: /mydrive/gtech/MRFs/Recykal/Latest_sharing_by_sanket/Google_Recykal/Taxonomy_version_2/train/\n","I0422 00:06:25.089654 139705362556800 create_coco_tf_record.py:366] Building bounding box index.\n","I0422 00:06:25.115955 139705362556800 create_coco_tf_record.py:377] 0 images are missing bboxes.\n","I0422 00:07:39.273266 139705362556800 tfrecord_lib.py:168] On image 0\n","I0422 00:09:03.214606 139705362556800 tfrecord_lib.py:168] On image 100\n","I0422 00:10:14.332473 139705362556800 tfrecord_lib.py:168] On image 200\n","I0422 00:11:11.556596 139705362556800 tfrecord_lib.py:168] On image 300\n","I0422 00:12:11.437826 139705362556800 tfrecord_lib.py:168] On image 400\n","I0422 
00:13:13.166231 139705362556800 tfrecord_lib.py:168] On image 500\n","I0422 00:14:21.695016 139705362556800 tfrecord_lib.py:168] On image 600\n","I0422 00:15:24.191824 139705362556800 tfrecord_lib.py:168] On image 700\n","I0422 00:16:48.620902 139705362556800 tfrecord_lib.py:168] On image 800\n","I0422 00:17:48.565592 139705362556800 tfrecord_lib.py:168] On image 900\n","I0422 00:18:41.091029 139705362556800 tfrecord_lib.py:168] On image 1000\n","I0422 00:19:39.844225 139705362556800 tfrecord_lib.py:168] On image 1100\n","I0422 00:20:45.108587 139705362556800 tfrecord_lib.py:168] On image 1200\n","I0422 00:22:13.738559 139705362556800 tfrecord_lib.py:168] On image 1300\n","I0422 00:23:13.147292 139705362556800 tfrecord_lib.py:168] On image 1400\n","I0422 00:24:06.315325 139705362556800 tfrecord_lib.py:168] On image 1500\n","I0422 00:24:59.421572 139705362556800 tfrecord_lib.py:168] On image 1600\n","I0422 00:25:45.958540 139705362556800 tfrecord_lib.py:168] On image 1700\n","I0422 00:26:35.475085 139705362556800 tfrecord_lib.py:168] On image 1800\n","I0422 00:27:38.255803 139705362556800 tfrecord_lib.py:168] On image 1900\n","I0422 00:28:37.250636 139705362556800 tfrecord_lib.py:168] On image 2000\n","I0422 00:29:38.937792 139705362556800 tfrecord_lib.py:168] On image 2100\n","I0422 00:30:24.683607 139705362556800 tfrecord_lib.py:168] On image 2200\n","I0422 00:31:13.964802 139705362556800 tfrecord_lib.py:168] On image 2300\n","I0422 00:32:06.411041 139705362556800 tfrecord_lib.py:168] On image 2400\n","I0422 00:33:06.038232 139705362556800 tfrecord_lib.py:168] On image 2500\n","I0422 00:34:15.721037 139705362556800 tfrecord_lib.py:168] On image 2600\n","I0422 00:35:19.886712 139705362556800 tfrecord_lib.py:168] On image 2700\n","I0422 00:36:32.834578 139705362556800 tfrecord_lib.py:168] On image 2800\n","I0422 00:38:00.137243 139705362556800 tfrecord_lib.py:168] On image 2900\n","I0422 00:39:24.083769 139705362556800 tfrecord_lib.py:168] On image 3000\n","I0422 
00:40:47.815561 139705362556800 tfrecord_lib.py:168] On image 3100\n","I0422 00:42:01.868806 139705362556800 tfrecord_lib.py:168] On image 3200\n","I0422 00:43:10.464518 139705362556800 tfrecord_lib.py:168] On image 3300\n","I0422 00:44:08.492330 139705362556800 tfrecord_lib.py:168] On image 3400\n","I0422 00:45:06.637591 139705362556800 tfrecord_lib.py:168] On image 3500\n","I0422 00:46:17.144057 139705362556800 tfrecord_lib.py:168] On image 3600\n","I0422 00:47:34.219212 139705362556800 tfrecord_lib.py:168] On image 3700\n","I0422 00:48:47.535176 139705362556800 tfrecord_lib.py:168] On image 3800\n","I0422 00:49:44.018001 139705362556800 tfrecord_lib.py:168] On image 3900\n","I0422 00:50:46.843277 139705362556800 tfrecord_lib.py:168] On image 4000\n","I0422 00:51:42.749161 139705362556800 tfrecord_lib.py:168] On image 4100\n","I0422 00:52:29.118489 139705362556800 tfrecord_lib.py:168] On image 4200\n","I0422 00:53:12.499863 139705362556800 tfrecord_lib.py:168] On image 4300\n","I0422 00:54:02.751904 139705362556800 tfrecord_lib.py:168] On image 4400\n","I0422 00:54:54.855237 139705362556800 tfrecord_lib.py:168] On image 4500\n","I0422 00:56:11.432259 139705362556800 tfrecord_lib.py:168] On image 4600\n","I0422 00:57:12.901312 139705362556800 tfrecord_lib.py:168] On image 4700\n","I0422 00:58:15.347571 139705362556800 tfrecord_lib.py:168] On image 4800\n","I0422 00:59:13.046698 139705362556800 tfrecord_lib.py:168] On image 4900\n","I0422 01:00:38.408758 139705362556800 tfrecord_lib.py:168] On image 5000\n","I0422 01:02:03.484946 139705362556800 tfrecord_lib.py:168] On image 5100\n","I0422 01:02:57.290261 139705362556800 tfrecord_lib.py:168] On image 5200\n","I0422 01:03:54.188467 139705362556800 tfrecord_lib.py:168] On image 5300\n","I0422 01:04:49.160263 139705362556800 tfrecord_lib.py:168] On image 5400\n","I0422 01:05:46.782065 139705362556800 tfrecord_lib.py:168] On image 5500\n","I0422 01:07:00.913060 139705362556800 tfrecord_lib.py:168] On image 
5600\n","I0422 01:08:05.558512 139705362556800 tfrecord_lib.py:168] On image 5700\n","I0422 01:09:09.658477 139705362556800 tfrecord_lib.py:168] On image 5800\n","I0422 01:10:10.147291 139705362556800 tfrecord_lib.py:168] On image 5900\n","I0422 01:11:11.286698 139705362556800 tfrecord_lib.py:168] On image 6000\n","I0422 01:12:08.696386 139705362556800 tfrecord_lib.py:168] On image 6100\n","I0422 01:13:02.225769 139705362556800 tfrecord_lib.py:168] On image 6200\n","I0422 01:13:55.910152 139705362556800 tfrecord_lib.py:168] On image 6300\n","I0422 01:14:47.861520 139705362556800 tfrecord_lib.py:181] Finished writing, skipped 8 annotations.\n","I0422 01:14:47.862285 139705362556800 create_coco_tf_record.py:529] Finished writing, skipped 8 annotations.\n"]}],"source":["# run the script to convert your json file to TFRecord file\n","# --num_shards (how many TFRecord sharded files you want)\n","!python3 -m official.vision.data.create_coco_tf_record --logtostderr \\\n"," --image_dir=$training_images_folder \\\n"," --object_annotations_file=$training_annotation_file \\\n"," --output_file_prefix=$training_data_folder \\\n"," --num_shards=100 \\\n"," --include_masks=True \\\n"," --num_processes=0"]},{"cell_type":"markdown","metadata":{"id":"zwazp89SojMA"},"source":["## Create TFRecord for validation data"]},{"cell_type":"code","execution_count":null,"metadata":{"id":"nWbKeLoVwXbi","colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"status":"ok","timestamp":1650576542070,"user_tz":420,"elapsed":1834225,"user":{"displayName":"Umair Sabir","userId":"06940594206388957365"}},"outputId":"63f4fc03-43b1-424e-dfb2-200f9bbdf1e5"},"outputs":[{"output_type":"stream","name":"stdout","text":["I0421 20:53:39.071351 140304098097024 create_coco_tf_record.py:494] writing to output path: /mydrive/gtech/MRFs/Recykal/Latest_sharing_by_sanket/Google_Recykal/Taxonomy_version_2/val/\n","I0421 20:53:40.622877 140304098097024 create_coco_tf_record.py:366] Building bounding box 
index.\n","I0421 20:53:40.627101 140304098097024 create_coco_tf_record.py:377] 0 images are missing bboxes.\n","I0421 20:54:41.275259 140304098097024 tfrecord_lib.py:168] On image 0\n","I0421 20:56:53.052898 140304098097024 tfrecord_lib.py:168] On image 100\n","I0421 20:59:01.886727 140304098097024 tfrecord_lib.py:168] On image 200\n","I0421 21:01:12.356394 140304098097024 tfrecord_lib.py:168] On image 300\n","I0421 21:03:03.635432 140304098097024 tfrecord_lib.py:168] On image 400\n","I0421 21:05:04.787051 140304098097024 tfrecord_lib.py:168] On image 500\n","I0421 21:06:52.991898 140304098097024 tfrecord_lib.py:168] On image 600\n","I0421 21:09:02.626780 140304098097024 tfrecord_lib.py:168] On image 700\n","I0421 21:11:39.070799 140304098097024 tfrecord_lib.py:168] On image 800\n","I0421 21:13:58.603258 140304098097024 tfrecord_lib.py:168] On image 900\n","I0421 21:16:23.214870 140304098097024 tfrecord_lib.py:168] On image 1000\n","I0421 21:18:25.072518 140304098097024 tfrecord_lib.py:168] On image 1100\n","I0421 21:20:29.223420 140304098097024 tfrecord_lib.py:168] On image 1200\n","I0421 21:22:34.431273 140304098097024 tfrecord_lib.py:168] On image 1300\n","I0421 21:24:29.066092 140304098097024 tfrecord_lib.py:168] On image 1400\n","I0421 21:26:33.851860 140304098097024 tfrecord_lib.py:168] On image 1500\n","I0421 21:28:25.426244 140304098097024 tfrecord_lib.py:168] On image 1600\n","I0421 21:28:59.923923 140304098097024 tfrecord_lib.py:181] Finished writing, skipped 2 annotations.\n","I0421 21:28:59.924295 140304098097024 create_coco_tf_record.py:529] Finished writing, skipped 2 annotations.\n"]}],"source":["# run the script to convert your json file to TFRecord file\n","# --num_shards (how many TFRecord sharded files you want)\n","!python3 -m official.vision.data.create_coco_tf_record --logtostderr \\\n"," --image_dir=$validation_images_folder \\\n"," --object_annotations_file=$validation_annotation_file \\\n"," --output_file_prefix=$validation_data_folder 
\\\n"," --num_shards=100 \\\n"," --include_masks=True \\\n"," --num_processes=0"]}],"metadata":{"accelerator":"GPU","colab":{"collapsed_sections":[],"machine_shape":"hm","name":"coco_to_tfrecord.ipynb","provenance":[],"authorship_tag":"ABX9TyOBsOCsWRUa6CQ6GhCAKlN0"},"kernelspec":{"display_name":"Python 3","name":"python3"},"language_info":{"name":"python"}},"nbformat":4,"nbformat_minor":0}
\ No newline at end of file
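The notebook below looks up every input and output path from a `config.ini` file through Python's `ConfigParser`, using a `[tfrecord]` section. A minimal, self-contained sketch of that lookup pattern — the section and key names mirror the notebook's, but the paths here are placeholders, not the actual Drive layout:

```python
# Sketch of the notebook's config.ini lookup via ConfigParser.
# The [tfrecord] section and key names follow the notebook; the path
# values are illustrative placeholders only.
from configparser import ConfigParser

# A small in-memory stand-in for the notebook's config.ini file.
sample = """
[tfrecord]
tensorflow_model_folder = /mydrive/TFVision/
training_data_folder = /mydrive/data/train/
validation_data_folder = /mydrive/data/val/
"""

config = ConfigParser()
config.read_string(sample)  # the notebook uses config.read(<path>) instead

# Each path is then fetched by section and key, exactly as the notebook does.
training_data_folder = config["tfrecord"]["training_data_folder"]
print(training_data_folder)
```

In the notebook itself, `config.read()` is pointed at the checked-in file under `official/projects/waste_identification_ml/pre_processing/config/config.ini`, and the resulting variables are passed to the conversion script as flags.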
{
"cells": [
{
"cell_type": "markdown",
"source": [
"# Conversion of COCO annotation JSON file to TFRecords"
],
"metadata": {
"id": "SsIv6LYT84gm"
}
},
{
"cell_type": "markdown",
"source": [
"Given a COCO annotated JSON file, your goal is to convert it into a TFRecords file necessary to train with the Mask RCNN model.\n",
"\n",
"To accomplish this task, you will clone the TensorFlow Model Garden repo. The TensorFlow Model Garden is a repository with a number of different implementations of state-of-the-art (SOTA) models and modeling solutions for TensorFlow users.\n",
"\n",
"This notebook is an end to end example. When you run the notebook, it will take COCO annotated JSON train and test files as an input and will convert them into TFRecord files. You can also output sharded TFRecord files in case your training and validation data is huge. It makes it easier for the algorithm to read and access the data."
],
"metadata": {
"id": "zl7o2xEW9IbX"
}
},
{
"cell_type": "markdown",
"source": [
"**Note** - In this example, we assume that all our data is saved on Google drive and we will also write our outputs to Google drive. We also assume that the script will be used as a Google Colab notebook. But this can be changed according to the needs of users. They can modify this in case they are working on their local workstation, remote server or any other database. This colab notebook can be changed to a regular jupyter notebook running on a local machine according to the need of the users."
],
"metadata": {
"id": "g3OHfWQBpYVB"
}
},
{
"cell_type": "markdown",
"metadata": {
"id": "CRwVTTPuED_1"
},
"source": [
"## Run the below command to connect to your google drive"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "hdRAEurMA3zi",
"outputId": "7212e558-af5d-4cb2-dd1f-6e634f5fca0a"
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Collecting tensorflow-addons\n",
" Downloading tensorflow_addons-0.16.1-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB)\n",
"\u001b[?25l\r\u001b[K |▎ | 10 kB 22.4 MB/s eta 0:00:01\r\u001b[K |▋ | 20 kB 8.9 MB/s eta 0:00:01\r\u001b[K |▉ | 30 kB 8.3 MB/s eta 0:00:01\r\u001b[K |█▏ | 40 kB 7.7 MB/s eta 0:00:01\r\u001b[K |█▌ | 51 kB 4.1 MB/s eta 0:00:01\r\u001b[K |█▊ | 61 kB 4.9 MB/s eta 0:00:01\r\u001b[K |██ | 71 kB 5.3 MB/s eta 0:00:01\r\u001b[K |██▍ | 81 kB 5.5 MB/s eta 0:00:01\r\u001b[K |██▋ | 92 kB 6.1 MB/s eta 0:00:01\r\u001b[K |███ | 102 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███▏ | 112 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███▌ | 122 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███▉ | 133 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████ | 143 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████▍ | 153 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████▊ | 163 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████ | 174 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████▎ | 184 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████▌ | 194 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████▉ | 204 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████▏ | 215 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████▍ | 225 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████▊ | 235 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████ | 245 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████▎ | 256 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████▋ | 266 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████▉ | 276 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████▏ | 286 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████▌ | 296 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████▊ | 307 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████ | 317 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████▍ | 327 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████▋ | 337 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████ | 348 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████▏ | 358 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████▌ | 368 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████▉ | 378 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████ | 389 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████▍ | 399 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████▊ | 409 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████ | 419 kB 5.1 MB/s eta 
0:00:01\r\u001b[K |████████████▎ | 430 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████▌ | 440 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████▉ | 450 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████▏ | 460 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████▍ | 471 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████▊ | 481 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████ | 491 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████▎ | 501 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████▋ | 512 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████▉ | 522 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████▏ | 532 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████▌ | 542 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████▊ | 552 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████ | 563 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████▍ | 573 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████▋ | 583 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████ | 593 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████▏ | 604 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████▌ | 614 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████▉ | 624 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████ | 634 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████▍ | 645 kB 5.1 MB/s eta 0:00:01\r\u001b[K |██████████████████▊ | 655 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████ | 665 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████▎ | 675 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████▌ | 686 kB 5.1 MB/s eta 0:00:01\r\u001b[K |███████████████████▉ | 696 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████▏ | 706 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████▍ | 716 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████████████████████▊ | 727 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████ | 737 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████▎ | 747 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████▋ | 757 kB 5.1 MB/s eta 0:00:01\r\u001b[K |█████████████████████▉ 
|████████████████████████████████| 1.1 MB 5.1 MB/s \n",
"\u001b[?25hRequirement already satisfied: typeguard>=2.7 in /usr/local/lib/python3.7/dist-packages (from tensorflow-addons) (2.7.1)\n",
"Installing collected packages: tensorflow-addons\n",
"Successfully installed tensorflow-addons-0.16.1\n"
]
}
],
"source": [
"!pip install tf-nightly\n",
"!pip install tensorflow-addons"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "bBN0CZWlD7zl"
},
"outputs": [],
"source": [
"# import libraries\n",
"from google.colab import drive\n",
"import sys"
]
},
{
"cell_type": "code",
"source": [
"# the \"opencv-python-headless\" version should be the same as \"opencv-python\"\n",
"import pkg_resources\n",
"version_number = pkg_resources.get_distribution(\"opencv-python\").version\n",
"\n",
"!pip install opencv-python-headless==$version_number"
],
"metadata": {
"id": "leap_jk5fq_v",
"colab": {
"base_uri": "https://localhost:8080/"
},
"outputId": "b5608bb5-24df-4fb1-9885-649ceca98a26"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Collecting opencv-python-headless==4.1.2.30\n",
" Downloading opencv_python_headless-4.1.2.30-cp37-cp37m-manylinux1_x86_64.whl (21.8 MB)\n",
"\u001b[K |████████████████████████████████| 21.8 MB 62.9 MB/s \n",
"\u001b[?25hRequirement already satisfied: numpy>=1.14.5 in /usr/local/lib/python3.7/dist-packages (from opencv-python-headless==4.1.2.30) (1.21.6)\n",
"Installing collected packages: opencv-python-headless\n",
"Successfully installed opencv-python-headless-4.1.2.30\n"
]
}
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "i80tEP0pEJif",
"colab": {
"base_uri": "https://localhost:8080/"
},
"outputId": "cb0d8dde-8852-49eb-e6d7-33653722eee0"
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Mounted at /content/gdrive\n",
"Successful\n"
]
}
],
"source": [
"# connect to Google Drive\n",
"drive.mount('/content/gdrive')\n",
"\n",
"# create an alias for the root path\n",
"try:\n",
" !ln -s /content/gdrive/My\\ Drive/ /mydrive\n",
" print('Successful')\n",
"except Exception as e:\n",
" print(e)\n",
" print('Not successful')"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "w40-VpWXU-Hu"
},
"source": [
"## Clone TensorFlow Model Garden repository"
]
},
{
"cell_type": "code",
"source": [
"# Clone the TensorFlow Model Garden repository, which contains the config\n",
"# files and scripts for this project. The project folder is 'waste_identification_ml'.\n",
"!git clone https://github.com/tensorflow/models.git "
],
"metadata": {
"id": "Vh42KtozpqeT"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
"# change into the cloned models directory\n",
"%cd models"
],
"metadata": {
"id": "wm-k6-S4pr_B"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "xNe2NuqjV4uW"
},
"source": [
"## Create TFRecord for training data"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "J9Nz75g0oJkI"
},
"outputs": [],
"source": [
"training_images_folder = '/mydrive/gtech/total_images/' #@param {type:\"string\"}\n",
"training_annotation_file = '/mydrive/gtech/_train.json' #@param {type:\"string\"}\n",
"output_folder = '/mydrive/gtech/train/' #@param {type:\"string\"}"
]
},
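{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optional: a quick sanity check of the annotation file before conversion. This is a minimal sketch assuming the standard COCO top-level keys (`images`, `annotations`, `categories`); adjust it if your file differs."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# optional sanity check (sketch): confirm the COCO annotation JSON\n",
"# has the standard top-level keys before running the conversion\n",
"import json\n",
"\n",
"with open(training_annotation_file) as f:\n",
"  coco = json.load(f)\n",
"\n",
"print('images:', len(coco.get('images', [])))\n",
"print('annotations:', len(coco.get('annotations', [])))\n",
"print('categories:', len(coco.get('categories', [])))"
]
},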
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "mjsai7PDAxgp",
"colab": {
"base_uri": "https://localhost:8080/"
},
"outputId": "c78c7eaa-36e0-48e0-ba2c-3e674bdc5402"
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"I0422 00:06:23.072771 139705362556800 create_coco_tf_record.py:494] writing to output path: /mydrive/gtech/MRFs/Recykal/Latest_sharing_by_sanket/Google_Recykal/Taxonomy_version_2/train/\n",
"I0422 00:06:25.089654 139705362556800 create_coco_tf_record.py:366] Building bounding box index.\n",
"I0422 00:06:25.115955 139705362556800 create_coco_tf_record.py:377] 0 images are missing bboxes.\n",
"I0422 00:07:39.273266 139705362556800 tfrecord_lib.py:168] On image 0\n",
"I0422 00:09:03.214606 139705362556800 tfrecord_lib.py:168] On image 100\n",
"I0422 00:10:14.332473 139705362556800 tfrecord_lib.py:168] On image 200\n",
"I0422 00:11:11.556596 139705362556800 tfrecord_lib.py:168] On image 300\n",
"I0422 00:12:11.437826 139705362556800 tfrecord_lib.py:168] On image 400\n",
"I0422 00:13:13.166231 139705362556800 tfrecord_lib.py:168] On image 500\n",
"I0422 00:14:21.695016 139705362556800 tfrecord_lib.py:168] On image 600\n",
"I0422 00:15:24.191824 139705362556800 tfrecord_lib.py:168] On image 700\n",
"I0422 00:16:48.620902 139705362556800 tfrecord_lib.py:168] On image 800\n",
"I0422 00:17:48.565592 139705362556800 tfrecord_lib.py:168] On image 900\n",
"I0422 00:18:41.091029 139705362556800 tfrecord_lib.py:168] On image 1000\n",
"I0422 00:19:39.844225 139705362556800 tfrecord_lib.py:168] On image 1100\n",
"I0422 00:20:45.108587 139705362556800 tfrecord_lib.py:168] On image 1200\n",
"I0422 00:22:13.738559 139705362556800 tfrecord_lib.py:168] On image 1300\n",
"I0422 00:23:13.147292 139705362556800 tfrecord_lib.py:168] On image 1400\n",
"I0422 00:24:06.315325 139705362556800 tfrecord_lib.py:168] On image 1500\n",
"I0422 00:24:59.421572 139705362556800 tfrecord_lib.py:168] On image 1600\n",
"I0422 00:25:45.958540 139705362556800 tfrecord_lib.py:168] On image 1700\n",
"I0422 00:26:35.475085 139705362556800 tfrecord_lib.py:168] On image 1800\n",
"I0422 00:27:38.255803 139705362556800 tfrecord_lib.py:168] On image 1900\n",
"I0422 00:28:37.250636 139705362556800 tfrecord_lib.py:168] On image 2000\n",
"I0422 00:29:38.937792 139705362556800 tfrecord_lib.py:168] On image 2100\n",
"I0422 00:30:24.683607 139705362556800 tfrecord_lib.py:168] On image 2200\n",
"I0422 00:31:13.964802 139705362556800 tfrecord_lib.py:168] On image 2300\n",
"I0422 00:32:06.411041 139705362556800 tfrecord_lib.py:168] On image 2400\n",
"I0422 00:33:06.038232 139705362556800 tfrecord_lib.py:168] On image 2500\n",
"I0422 00:34:15.721037 139705362556800 tfrecord_lib.py:168] On image 2600\n",
"I0422 00:35:19.886712 139705362556800 tfrecord_lib.py:168] On image 2700\n",
"I0422 00:36:32.834578 139705362556800 tfrecord_lib.py:168] On image 2800\n",
"I0422 00:38:00.137243 139705362556800 tfrecord_lib.py:168] On image 2900\n",
"I0422 00:39:24.083769 139705362556800 tfrecord_lib.py:168] On image 3000\n",
"I0422 00:40:47.815561 139705362556800 tfrecord_lib.py:168] On image 3100\n",
"I0422 00:42:01.868806 139705362556800 tfrecord_lib.py:168] On image 3200\n",
"I0422 00:43:10.464518 139705362556800 tfrecord_lib.py:168] On image 3300\n",
"I0422 00:44:08.492330 139705362556800 tfrecord_lib.py:168] On image 3400\n",
"I0422 00:45:06.637591 139705362556800 tfrecord_lib.py:168] On image 3500\n",
"I0422 00:46:17.144057 139705362556800 tfrecord_lib.py:168] On image 3600\n",
"I0422 00:47:34.219212 139705362556800 tfrecord_lib.py:168] On image 3700\n",
"I0422 00:48:47.535176 139705362556800 tfrecord_lib.py:168] On image 3800\n",
"I0422 00:49:44.018001 139705362556800 tfrecord_lib.py:168] On image 3900\n",
"I0422 00:50:46.843277 139705362556800 tfrecord_lib.py:168] On image 4000\n",
"I0422 00:51:42.749161 139705362556800 tfrecord_lib.py:168] On image 4100\n",
"I0422 00:52:29.118489 139705362556800 tfrecord_lib.py:168] On image 4200\n",
"I0422 00:53:12.499863 139705362556800 tfrecord_lib.py:168] On image 4300\n",
"I0422 00:54:02.751904 139705362556800 tfrecord_lib.py:168] On image 4400\n",
"I0422 00:54:54.855237 139705362556800 tfrecord_lib.py:168] On image 4500\n",
"I0422 00:56:11.432259 139705362556800 tfrecord_lib.py:168] On image 4600\n",
"I0422 00:57:12.901312 139705362556800 tfrecord_lib.py:168] On image 4700\n",
"I0422 00:58:15.347571 139705362556800 tfrecord_lib.py:168] On image 4800\n",
"I0422 00:59:13.046698 139705362556800 tfrecord_lib.py:168] On image 4900\n",
"I0422 01:00:38.408758 139705362556800 tfrecord_lib.py:168] On image 5000\n",
"I0422 01:02:03.484946 139705362556800 tfrecord_lib.py:168] On image 5100\n",
"I0422 01:02:57.290261 139705362556800 tfrecord_lib.py:168] On image 5200\n",
"I0422 01:03:54.188467 139705362556800 tfrecord_lib.py:168] On image 5300\n",
"I0422 01:04:49.160263 139705362556800 tfrecord_lib.py:168] On image 5400\n",
"I0422 01:05:46.782065 139705362556800 tfrecord_lib.py:168] On image 5500\n",
"I0422 01:07:00.913060 139705362556800 tfrecord_lib.py:168] On image 5600\n",
"I0422 01:08:05.558512 139705362556800 tfrecord_lib.py:168] On image 5700\n",
"I0422 01:09:09.658477 139705362556800 tfrecord_lib.py:168] On image 5800\n",
"I0422 01:10:10.147291 139705362556800 tfrecord_lib.py:168] On image 5900\n",
"I0422 01:11:11.286698 139705362556800 tfrecord_lib.py:168] On image 6000\n",
"I0422 01:12:08.696386 139705362556800 tfrecord_lib.py:168] On image 6100\n",
"I0422 01:13:02.225769 139705362556800 tfrecord_lib.py:168] On image 6200\n",
"I0422 01:13:55.910152 139705362556800 tfrecord_lib.py:168] On image 6300\n",
"I0422 01:14:47.861520 139705362556800 tfrecord_lib.py:181] Finished writing, skipped 8 annotations.\n",
"I0422 01:14:47.862285 139705362556800 create_coco_tf_record.py:529] Finished writing, skipped 8 annotations.\n"
]
}
],
"source": [
"# run the script to convert your COCO annotation JSON file to TFRecord files\n",
"# --num_shards sets how many sharded TFRecord files to write\n",
"!python3 -m official.vision.data.create_coco_tf_record --logtostderr \\\n",
" --image_dir=$training_images_folder \\\n",
" --object_annotations_file=$training_annotation_file \\\n",
" --output_file_prefix=$output_folder \\\n",
" --num_shards=100 \\\n",
" --include_masks=True \\\n",
" --num_processes=0"
]
},
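{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optional: read one example back from a generated shard to verify the output. This is a minimal sketch assuming the shards were written directly under `output_folder`; adjust the glob pattern if your layout differs."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# optional verification (sketch): parse one record from the first shard\n",
"import tensorflow as tf\n",
"\n",
"shards = tf.io.gfile.glob(output_folder + '*')\n",
"print('number of shard files:', len(shards))\n",
"\n",
"if shards:\n",
"  raw_dataset = tf.data.TFRecordDataset(shards[0])\n",
"  for raw_record in raw_dataset.take(1):\n",
"    example = tf.train.Example()\n",
"    example.ParseFromString(raw_record.numpy())\n",
"    print('feature keys:', sorted(example.features.feature.keys()))"
]
},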
{
"cell_type": "markdown",
"metadata": {
"id": "zwazp89SojMA"
},
"source": [
"## Create TFRecord for validation data"
]
},
{
"cell_type": "code",
"source": [
"validation_images_folder = '/mydrive/gtech/total_images/' #@param {type:\"string\"}\n",
"validation_annotation_file = '/mydrive/gtech/_val.json' #@param {type:\"string\"}\n",
"output_folder = '/mydrive/gtech/val/' #@param {type:\"string\"}"
],
"metadata": {
"id": "OVQn5DiFBUfv"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "nWbKeLoVwXbi",
"colab": {
"base_uri": "https://localhost:8080/"
},
"outputId": "63f4fc03-43b1-424e-dfb2-200f9bbdf1e5"
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"I0421 20:53:39.071351 140304098097024 create_coco_tf_record.py:494] writing to output path: /mydrive/gtech/MRFs/Recykal/Latest_sharing_by_sanket/Google_Recykal/Taxonomy_version_2/val/\n",
"I0421 20:53:40.622877 140304098097024 create_coco_tf_record.py:366] Building bounding box index.\n",
"I0421 20:53:40.627101 140304098097024 create_coco_tf_record.py:377] 0 images are missing bboxes.\n",
"I0421 20:54:41.275259 140304098097024 tfrecord_lib.py:168] On image 0\n",
"I0421 20:56:53.052898 140304098097024 tfrecord_lib.py:168] On image 100\n",
"I0421 20:59:01.886727 140304098097024 tfrecord_lib.py:168] On image 200\n",
"I0421 21:01:12.356394 140304098097024 tfrecord_lib.py:168] On image 300\n",
"I0421 21:03:03.635432 140304098097024 tfrecord_lib.py:168] On image 400\n",
"I0421 21:05:04.787051 140304098097024 tfrecord_lib.py:168] On image 500\n",
"I0421 21:06:52.991898 140304098097024 tfrecord_lib.py:168] On image 600\n",
"I0421 21:09:02.626780 140304098097024 tfrecord_lib.py:168] On image 700\n",
"I0421 21:11:39.070799 140304098097024 tfrecord_lib.py:168] On image 800\n",
"I0421 21:13:58.603258 140304098097024 tfrecord_lib.py:168] On image 900\n",
"I0421 21:16:23.214870 140304098097024 tfrecord_lib.py:168] On image 1000\n",
"I0421 21:18:25.072518 140304098097024 tfrecord_lib.py:168] On image 1100\n",
"I0421 21:20:29.223420 140304098097024 tfrecord_lib.py:168] On image 1200\n",
"I0421 21:22:34.431273 140304098097024 tfrecord_lib.py:168] On image 1300\n",
"I0421 21:24:29.066092 140304098097024 tfrecord_lib.py:168] On image 1400\n",
"I0421 21:26:33.851860 140304098097024 tfrecord_lib.py:168] On image 1500\n",
"I0421 21:28:25.426244 140304098097024 tfrecord_lib.py:168] On image 1600\n",
"I0421 21:28:59.923923 140304098097024 tfrecord_lib.py:181] Finished writing, skipped 2 annotations.\n",
"I0421 21:28:59.924295 140304098097024 create_coco_tf_record.py:529] Finished writing, skipped 2 annotations.\n"
]
}
],
"source": [
"# run the script to convert your COCO annotation JSON file to TFRecord files\n",
"# --num_shards sets how many sharded TFRecord files to write\n",
"!python3 -m official.vision.data.create_coco_tf_record --logtostderr \\\n",
" --image_dir=$validation_images_folder \\\n",
" --object_annotations_file=$validation_annotation_file \\\n",
" --output_file_prefix=$output_folder \\\n",
" --num_shards=100 \\\n",
" --include_masks=True \\\n",
" --num_processes=0"
]
}
],
"metadata": {
"accelerator": "GPU",
"colab": {
"collapsed_sections": [],
"machine_shape": "hm",
"provenance": []
},
"kernelspec": {
"display_name": "Python 3",
"name": "python3"
},
"language_info": {
"name": "python"
}
},
"nbformat": 4,
"nbformat_minor": 0
}