Lesson 7: Pre-trained Models and Transfer Learning


Pre-trained Models and Transfer Learning

Instead of building a CNN from scratch, a pre-trained network can be used to classify new image samples. Pre-trained deep neural networks are most useful when the new samples belong to the same classes as those the network was originally trained on.

Pre-trained models save computation time and resources because they have already been trained on vast amounts of data. These models can be fine-tuned for specific tasks, allowing them to generalize well to new data.

Examples of pre-trained models in Keras include:

  • Xception
  • VGG16
  • VGG19
  • ResNet50
  • ResNet101
  • And more

For a full list of pre-trained models, refer to the official Keras documentation: https://keras.io/api/applications/.

Steps for Using a Pre-trained Model

A pre-trained model can be used without additional tuning by following these steps (a compact end-to-end sketch follows the list):

  1. Load the data
  2. Preprocess the data
  3. Initialize the pre-trained model
  4. Make predictions with the model
  5. Find the object predicted with the highest probability
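
The sections below walk through each of these steps in detail with ResNet50. As a preview, a minimal end-to-end sketch, assuming an image file named orange.png in the working directory, might look like this:

Code
import numpy as np
import keras
from keras.preprocessing.image import load_img, img_to_array

# 1. Load the image, resized to the input shape the model expects
img = load_img("orange.png", target_size=(224, 224))

# 2. Preprocess: convert to an array, add a batch dimension, apply model-specific scaling
x = img_to_array(img)
x = np.expand_dims(x, axis=0)
x = keras.applications.resnet50.preprocess_input(x)

# 3. Initialize the pre-trained model (downloads ImageNet weights on first use)
model = keras.applications.ResNet50()

# 4. Make predictions
preds = model.predict(x, verbose=0)

# 5. Find the objects predicted with the highest probabilities
print(keras.applications.resnet50.decode_predictions(preds, top=3))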

The ResNet50 Pre-trained Model

Let’s use the ResNet50 pre-trained model to make predictions. ResNet-50 is a convolutional neural network that is 50 layers deep and trained on over a million images across 1000 categories, including objects like keyboards, mice, pencils, and various animals.

The data used for ResNet-50 has a shape of (224, 224, 3), so we need to ensure that the image sample we want to predict is resized to the same shape.

Loading the Image to be Predicted

Let's load and display the image that will be predicted with the pre-trained model.

Code
import matplotlib.pyplot as plt
from keras.preprocessing.image import load_img, img_to_array

# Load the image
orange = load_img("orange.png", color_mode="rgb", target_size=(224, 224))

# Convert to array
orange_array = img_to_array(orange)

# Display the image
plt.imshow(orange_array.astype('uint8')); 
plt.axis('off');  # Hide the axes

Prepare the Image to be Predicted

The input to ResNet50 must be 4-dimensional (a batch of images), so we need to make sure the image to be predicted is 4D. Let's first check the dimensions of the image.

Code
from keras.preprocessing.image import img_to_array

# Convert the PIL image to a NumPy array and check its shape
orange_arr = img_to_array(orange)
print("Image Shape", orange_arr.shape)
Image Shape (224, 224, 3)

Let's reshape the image from 3D to 4D by adding a batch dimension.

Code
orange_arr = orange_arr.reshape(1, 224, 224, 3)
orange_arr.shape
(1, 224, 224, 3)
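
Equivalently, np.expand_dims can add the batch dimension without hard-coding the image size. A minimal sketch, shown as an alternative to the reshape above (applied to the original 3D array):

Code
import numpy as np

# Alternative: add a batch dimension at axis 0, (224, 224, 3) -> (1, 224, 224, 3)
orange_arr = np.expand_dims(orange_arr, axis=0)
print(orange_arr.shape)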

Let's prepare the image with ResNet50's preprocess_input function, which applies the same pixel preprocessing used when the model was originally trained.

Code
import keras

orange_image = keras.applications.resnet50.preprocess_input(orange_arr)

Initialize the ResNet50 Pretrained Model

Code
# initialize the pretrained model
resnet_model = keras.applications.ResNet50()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/resnet/resnet50_weights_tf_dim_ordering_tf_kernels.h5

102967424/102967424 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
Code
#resnet_model.summary() 

Use the ResNet50 Pretrained Model for Prediction

Code
import tensorflow as tf

# Disable TensorFlow logging
tf.get_logger().setLevel('ERROR')

# make a prediction
y_pred = resnet_model.predict(orange_image, verbose=0);

# print the top 2 probabilities with the corresponding predicted images
results = keras.applications.resnet50.decode_predictions(y_pred, top=2)
Downloading data from https://storage.googleapis.com/download.tensorflow.org/data/imagenet_class_index.json

35363/35363 ━━━━━━━━━━━━━━━━━━━━ 0s 1us/step
Code
print(results)
[[('n07747607', 'orange', np.float32(0.94502753)), ('n07749582', 'lemon', np.float32(0.049757604))]]

The results show the top two predicted labels with their probabilities. "Orange" has the highest probability, so we can classify the image as an orange.
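
The nested tuples returned by decode_predictions can also be printed more readably with a small loop over the results variable:

Code
# Each entry of results[0] is a (class_id, label, probability) tuple
for class_id, label, prob in results[0]:
    print(f"{label}: {prob:.2%}")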

Building a Model with Images in a Directory

Let's first build a model from scratch using image data stored in a directory. Then we will compare this approach with fine-tuning a pre-trained model instead of building one entirely from scratch.

Preparing and Loading the Image Data

If the training and test sets are stored in a directory on your computer, we can load the data and use it for model training or transfer learning, as shown in this section. Create a training_set folder and a test_set folder inside a "data" folder. The data folder sits at the first level inside the project directory.

Training and Test Folders

Assuming the data is binary and consists of cars and flowers, you should create a car folder and a flower folder containing the respective car and flower images for both the training and test sets. For example, the diagram below shows the car and flower folders inside the training_set folder.

Training Set Folder

The diagram below shows the training flower images inside the flower folder.

Training Set Images

Read the training and test image datasets

Code
import os
base_dir = "../data"
train_dir = os.path.join(base_dir, "training_set")
test_dir = os.path.join(base_dir, "test_set")
# directory with training car images
train_car_dir = os.path.join(train_dir, "car")
# directory with test car images
test_car_dir = os.path.join(test_dir, "car")
# directory with training flower images
train_flower_dir = os.path.join(train_dir, "flower")
# directory with test flower images
test_flower_dir = os.path.join(test_dir, "flower")

print("Does the path exist? ", os.path.exists(train_flower_dir))
Does the path exist?  True
Code
# Create data generators for training and testing
train_datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    rescale=1./255,             # Normalize pixel values
    rotation_range=40,          # Random rotation
    width_shift_range=0.2,      # Random horizontal shift
    height_shift_range=0.2,     # Random vertical shift
    shear_range=0.2,            # Random shear
    zoom_range=0.2,             # Random zoom
    horizontal_flip=True        # Random horizontal flip
)

test_datagen = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1./255)  # Only rescale for test data

Image Data Preprocessing and Batching

For simplicity, we will redefine the generators with rescaling only (replacing the augmented ones above) and prepare the image data folders for training by applying the following steps:

  • Rescale pixel values to a range between 0 and 1.
  • Resize images to 64x64 pixels.
  • Load images in batches of 20 from the specified directory.

The data is then ready to be fed into a machine learning model for training.
Code
# Create data generators for rescaling
train_datagen = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1./255)
test_datagen = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1./255)

# Load training images in batches of 20 with resizing to 64x64 pixels
train_set = train_datagen.flow_from_directory(
    train_dir,
    target_size=(64, 64),
    batch_size=20,
    class_mode='binary'
)
Found 2000 images belonging to 2 classes.
Code
# Load test images in batches of 20 with resizing to 64x64 pixels
test_set = test_datagen.flow_from_directory(
    test_dir,
    target_size=(64, 64),
    batch_size=20,
    class_mode='binary'
)
Found 2000 images belonging to 2 classes.
Code
# Print both test and train image shapes in a single print statement 
print(f"Test image shape: {test_set.image_shape}\nTrain image shape: {train_set.image_shape}")
Test image shape: (64, 64, 3)
Train image shape: (64, 64, 3)
Note
  • After an image is processed by ImageDataGenerator using its flow_from_directory() method, train_set.image_shape will show the shape of one image in the dataset, not the entire batch.
  • ImageDataGenerator is lazy and loads data in batches during training.
  • Calling train_datagen.flow_from_directory() only scans the directory and indexes the image file paths; the actual image batches are read from disk on demand as the model requests them during training.
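
To see what the generator actually yields, one batch can be pulled manually. A quick sketch using the train_set generator defined above:

Code
# Fetch one batch from the generator: a tuple of (images, labels)
batch_images, batch_labels = next(train_set)
print("Batch images shape:", batch_images.shape)  # (20, 64, 64, 3)
print("Batch labels shape:", batch_labels.shape)  # (20,)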

Model Building

After preparing the image data and converting it into a suitable numerical tensor format, we can now initialize a deep learning model, define the architecture, compile the model, and fit it as follows.

Code
import tensorflow as tf
import keras
from keras import layers

import warnings

# Suppress all warnings globally
warnings.filterwarnings('ignore')

# Set random seed for reproducibility
tf.random.set_seed(1234)

# Initialize the Sequential model
model = keras.Sequential()

# Add the Input layer to specify the input shape
model.add(layers.Input(shape=(64, 64, 3)))  # Set input shape to 64x64 images with 3 color channels (RGB)

# Add the first convolutional layer with 28 filters and 3x3 kernel
model.add(layers.Conv2D(28, (3, 3), activation='relu', padding="same"))

# Add the first MaxPooling layer
model.add(layers.MaxPooling2D((2, 2)))

# Add the second convolutional layer with 64 filters and 3x3 kernel
model.add(layers.Conv2D(64, (3, 3), activation='relu'))

# Add the second MaxPooling layer
model.add(layers.MaxPooling2D((2, 2)))

# Flatten the output to connect to fully connected layers
model.add(layers.Flatten())

# Dropout layer to drop 50% of the neurons during training to prevent overfitting
model.add(layers.Dropout(0.5))

# Add a Dense fully connected layer with 512 units
model.add(layers.Dense(512, activation='relu'))

# Add the output layer with a sigmoid activation for binary classification
model.add(layers.Dense(1, activation='sigmoid'))

# Compile the model with Adam optimizer, binary cross-entropy loss, and accuracy metric
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Fit the model to the training data (train_set) with validation from the test data (test_set)
model.fit(
    train_set,
    steps_per_epoch=100,  # steps * batch_size = 2000
    epochs=5,
    validation_data=test_set,
    validation_steps=100,
    shuffle=False, 
    verbose=0
)
<keras.src.callbacks.history.History object at 0x321c0a4d0>
Code
model.summary();
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv2d (Conv2D)                 │ (None, 64, 64, 28)     │           784 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ max_pooling2d (MaxPooling2D)    │ (None, 32, 32, 28)     │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv2d_1 (Conv2D)               │ (None, 30, 30, 64)     │        16,192 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ max_pooling2d_1 (MaxPooling2D)  │ (None, 15, 15, 64)     │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ flatten (Flatten)               │ (None, 14400)          │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 14400)          │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 512)            │     7,373,312 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 1)              │           513 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 22,172,405 (84.58 MB)
 Trainable params: 7,390,801 (28.19 MB)
 Non-trainable params: 0 (0.00 B)
 Optimizer params: 14,781,604 (56.39 MB)
Note
  • Epoch: One full pass through the entire dataset. The number of epochs determines how many times the algorithm sees the dataset.
  • Batch Size: Defines how many examples are used to update parameters at a time.
  • Steps per Epoch: The number of iterations (or batches) required to process the entire dataset, calculated as the size of the dataset divided by the batch size.
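
For example, with the 2,000 training images and batch size of 20 used above, the steps per epoch work out as follows:

Code
# steps_per_epoch = dataset size / batch size
n_images = 2000
batch_size = 20
print(n_images // batch_size)  # 100 steps (batches) per epoch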

Evaluate the model

Code
# Evaluate the model on the test set
test_loss, test_accuracy = model.evaluate(test_set, steps=100, verbose=0)

# Print the evaluation results in a single line with newlines
print(f"Test Loss: {test_loss}\nTest Accuracy: {test_accuracy}")
Test Loss: 0.4155365526676178
Test Accuracy: 0.8144999742507935

Use the model for prediction

Code
# Access the class labels (flower and car)
class_labels = train_set.class_indices
print("Class Labels:", class_labels)
Class Labels: {'car': 0, 'flower': 1}
Code
sample_images, sample_labels = next(test_set)  # Get a batch of images and labels from the test_set

# Predict the label for the second sample
y_pred = model.predict(sample_images[1:2], verbose=0)
print("True Label: ", sample_labels[1])
True Label:  1.0
Code
print(y_pred)
[[0.5831039]]

The predicted probability is above 0.5, so the sample is assigned to class 1 (flower), which matches the true label.

Transfer Learning

Transfer learning involves using a pre-trained model and adapting it to a new dataset. A CNN model consists of two parts: the convolutional base (which captures generic features) and the classifier (ANN). We can retain the convolutional base from a pre-trained model and replace the classifier to suit the new task. This approach allows us to freeze the convolutional layers while modifying the classifier to predict specific categories, such as adapting an animal classifier to predict only cats and dogs.
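
As an illustration of this idea, a minimal sketch (not necessarily the exact approach used below) could load VGG16 without its classifier via include_top=False, freeze the convolutional base, and train only a new classifier on the 64x64 car/flower generators defined above:

Code
import keras
from keras import layers

# Load only the convolutional base (ImageNet weights, no original classifier)
conv_base = keras.applications.VGG16(include_top=False, input_shape=(64, 64, 3))
conv_base.trainable = False  # freeze the pre-trained feature extractor

# Attach a new classifier for the binary car/flower task
tl_model = keras.Sequential([
    conv_base,
    layers.Flatten(),
    layers.Dense(256, activation='relu'),
    layers.Dense(1, activation='sigmoid')
])

tl_model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
# tl_model.fit(train_set, epochs=5, validation_data=test_set)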

How to Fine-Tune a Pre-trained Model

Instead of training a model from scratch, we can leverage a pre-trained model to initialize the model with learned features. This speeds up convergence by reusing previously learned representations. We freeze the earlier layers to preserve the general features, while fine-tuning the later layers (e.g., the output layer) to adapt the model to the new task.

In this case, we will fine-tune a pre-trained VGG16 model to classify objects as either a car or a flower, modifying it from its original 1000 categories to just two.

Steps for Fine-tuning the VGG16 Model

  1. Initializing a Pre-trained Model
    We will begin by loading the VGG16 model with pre-trained weights, excluding the top (classification) layers, as we will replace them with a custom output layer.

  2. Initializing the Sequential Model
    A Sequential model will be used to stack layers in the desired order.

  3. Adding Layers of the Pre-trained Model
    We will add all layers of the pre-trained VGG16 model to the Sequential model except for the last layer, which is the classifier layer (since we will be modifying it).

  4. Freezing the Initial Layers
    The initial layers of the pre-trained model will be frozen, meaning their weights will not be updated during training. This allows us to retain the feature extraction capabilities learned from the large dataset the model was initially trained on.

  5. Adding an Output Layer
    A new output layer will be added to the frozen layers, adjusted to predict only two categories (car and flower).

  6. Compiling the Network
    The model will be compiled with an appropriate optimizer, loss function, and evaluation metrics.

  7. Fitting the Model with Additional Data
    The model will be trained (fine-tuned) with additional data, which could be augmented to improve generalization.

  8. Using the Model to Make Predictions
    Finally, we will use the trained model to make predictions on new examples, classifying them as either a car or a flower.

Code
# initialize the pre-trained vgg16 model
import tensorflow as tf
import keras

tf.random.set_seed(1234)
vgg16_model = keras.applications.VGG16()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels.h5

397377536/553467096 ━━━━━━━━━━━━━━━━━━━━ 14s 0us/step
397967360/553467096 ━━━━━━━━━━━━━━━━━━━━ 14s 0us/step
398721024/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
399343616/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
400031744/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
400670720/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
401326080/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
401932288/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
402653184/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
403193856/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
403947520/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
404389888/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
405061632/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
405602304/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
406323200/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
406880256/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
407519232/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
408125440/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
408846336/553467096 ━━━━━━━━━━━━━━━━━━━━ 13s 0us/step
409501696/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
410206208/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
410877952/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
411451392/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
412073984/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
412811264/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
413270016/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
414056448/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
414531584/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
415252480/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
415875072/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
416628736/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
417218560/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
417841152/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
418447360/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
419037184/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
419627008/553467096 ━━━━━━━━━━━━━━━━━━━━ 12s 0us/step
420216832/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
420839424/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
421396480/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
422002688/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
422690816/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
423231488/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
423886848/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
424345600/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
425115648/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
425803776/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
426524672/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
426754048/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
426967040/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
427245568/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
427458560/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
428015616/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
428572672/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
429211648/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
429654016/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
430292992/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
431046656/553467096 ━━━━━━━━━━━━━━━━━━━━ 11s 0us/step
431685632/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
432324608/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
432947200/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
433569792/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
434118656/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
434733056/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
435273728/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
435994624/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
436469760/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
437190656/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
437862400/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
438534144/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
439156736/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
439795712/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
440451072/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
441057280/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
441712640/553467096 ━━━━━━━━━━━━━━━━━━━━ 10s 0us/step
442466304/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step 
443039744/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
443596800/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
444284928/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
444940288/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
445530112/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
446251008/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
446906368/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
447627264/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
448135168/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
448806912/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
449576960/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
450117632/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
450871296/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
451493888/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
452116480/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
452755456/553467096 ━━━━━━━━━━━━━━━━━━━━ 9s 0us/step
453410816/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
454131712/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
454885376/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
455524352/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
456114176/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
456654848/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
457261056/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
457949184/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
458604544/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
459194368/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
459898880/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
460488704/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
461078528/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
461619200/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
462356480/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
462962688/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
463568896/553467096 ━━━━━━━━━━━━━━━━━━━━ 8s 0us/step
464125952/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
464863232/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
465567744/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
466190336/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
466763776/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
467419136/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
468074496/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
468811776/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
469450752/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
470155264/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
470630400/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
471384064/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
472039424/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
472809472/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
473350144/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
474021888/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
474595328/553467096 ━━━━━━━━━━━━━━━━━━━━ 7s 0us/step
475365376/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
476004352/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
476725248/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
477216768/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
477855744/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
478527488/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
479264768/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
479936512/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
480690176/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
481280000/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
481968128/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
482623488/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
483360768/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
484065280/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
484589568/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
485244928/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
485703680/553467096 ━━━━━━━━━━━━━━━━━━━━ 6s 0us/step
486309888/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
486973440/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
487522304/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
488243200/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
488923136/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
489537536/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
490160128/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
490569728/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
490766336/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
490995712/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
491159552/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
491503616/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
491896832/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
492421120/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
492470272/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
492699648/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
493322240/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
493813760/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
494370816/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
495026176/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
495583232/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
496189440/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
496369664/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
496697344/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
497319936/553467096 ━━━━━━━━━━━━━━━━━━━━ 5s 0us/step
497975296/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
498483200/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
499073024/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
499630080/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
500269056/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
500973568/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
501645312/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
502235136/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
502677504/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
503332864/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
503939072/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
504569856/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
505036800/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
505643008/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
506232832/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
506839040/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
507412480/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
508067840/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
508444672/553467096 ━━━━━━━━━━━━━━━━━━━━ 4s 0us/step
508985344/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
509591552/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
510140416/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
510705664/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
511377408/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
511918080/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
512540672/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
513097728/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
513671168/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
514211840/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
514899968/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
515407872/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
516161536/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
516751360/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
517406720/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
518029312/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
518537216/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
519110656/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
519798784/553467096 ━━━━━━━━━━━━━━━━━━━━ 3s 0us/step
520470528/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
520994816/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
521584640/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
522289152/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
522813440/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
523501568/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
524140544/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
524795904/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
525467648/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
525942784/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
526581760/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
527253504/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
527777792/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
528465920/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
529039360/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
529727488/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
530300928/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
530776064/553467096 ━━━━━━━━━━━━━━━━━━━━ 2s 0us/step
531447808/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
532217856/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
532758528/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
533446656/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
534118400/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
534708224/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
535445504/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
536133632/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
536805376/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
537427968/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
538001408/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
538722304/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
539361280/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
540049408/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
540786688/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
541343744/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
541966336/553467096 ━━━━━━━━━━━━━━━━━━━━ 1s 0us/step
542736384/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
543391744/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
544129024/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
544702464/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
545243136/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
545882112/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
546586624/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
547160064/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
547897344/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
548356096/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
548978688/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
549601280/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
550338560/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
550928384/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
551534592/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
552157184/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
552796160/553467096 ━━━━━━━━━━━━━━━━━━━━ 0s 0us/step
553467096/553467096 ━━━━━━━━━━━━━━━━━━━━ 49s 0us/step
Code
vgg16_model.summary() 
Model: "vgg16"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ input_layer_2 (InputLayer)      │ (None, 224, 224, 3)    │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block1_conv1 (Conv2D)           │ (None, 224, 224, 64)   │         1,792 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block1_conv2 (Conv2D)           │ (None, 224, 224, 64)   │        36,928 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block1_pool (MaxPooling2D)      │ (None, 112, 112, 64)   │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block2_conv1 (Conv2D)           │ (None, 112, 112, 128)  │        73,856 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block2_conv2 (Conv2D)           │ (None, 112, 112, 128)  │       147,584 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block2_pool (MaxPooling2D)      │ (None, 56, 56, 128)    │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_conv1 (Conv2D)           │ (None, 56, 56, 256)    │       295,168 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_conv2 (Conv2D)           │ (None, 56, 56, 256)    │       590,080 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_conv3 (Conv2D)           │ (None, 56, 56, 256)    │       590,080 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_pool (MaxPooling2D)      │ (None, 28, 28, 256)    │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_conv1 (Conv2D)           │ (None, 28, 28, 512)    │     1,180,160 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_conv2 (Conv2D)           │ (None, 28, 28, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_conv3 (Conv2D)           │ (None, 28, 28, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_pool (MaxPooling2D)      │ (None, 14, 14, 512)    │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_conv1 (Conv2D)           │ (None, 14, 14, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_conv2 (Conv2D)           │ (None, 14, 14, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_conv3 (Conv2D)           │ (None, 14, 14, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_pool (MaxPooling2D)      │ (None, 7, 7, 512)      │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ flatten (Flatten)               │ (None, 25088)          │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ fc1 (Dense)                     │ (None, 4096)           │   102,764,544 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ fc2 (Dense)                     │ (None, 4096)           │    16,781,312 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ predictions (Dense)             │ (None, 1000)           │     4,097,000 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 138,357,544 (527.79 MB)
 Trainable params: 138,357,544 (527.79 MB)
 Non-trainable params: 0 (0.00 B)

We aim to transfer all layers of the pre-trained model, except the output layer, to a new classifier.

Code
# Initialize the classifier model
model = tf.keras.Sequential()

# Copy every pre-trained layer except the final (output) layer
# into the new sequential model
for layer in vgg16_model.layers[:-1]:
    model.add(layer)

model.summary() 
Model: "sequential_1"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ block1_conv1 (Conv2D)           │ (None, 224, 224, 64)   │         1,792 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block1_conv2 (Conv2D)           │ (None, 224, 224, 64)   │        36,928 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block1_pool (MaxPooling2D)      │ (None, 112, 112, 64)   │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block2_conv1 (Conv2D)           │ (None, 112, 112, 128)  │        73,856 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block2_conv2 (Conv2D)           │ (None, 112, 112, 128)  │       147,584 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block2_pool (MaxPooling2D)      │ (None, 56, 56, 128)    │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_conv1 (Conv2D)           │ (None, 56, 56, 256)    │       295,168 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_conv2 (Conv2D)           │ (None, 56, 56, 256)    │       590,080 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_conv3 (Conv2D)           │ (None, 56, 56, 256)    │       590,080 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_pool (MaxPooling2D)      │ (None, 28, 28, 256)    │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_conv1 (Conv2D)           │ (None, 28, 28, 512)    │     1,180,160 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_conv2 (Conv2D)           │ (None, 28, 28, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_conv3 (Conv2D)           │ (None, 28, 28, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_pool (MaxPooling2D)      │ (None, 14, 14, 512)    │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_conv1 (Conv2D)           │ (None, 14, 14, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_conv2 (Conv2D)           │ (None, 14, 14, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_conv3 (Conv2D)           │ (None, 14, 14, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_pool (MaxPooling2D)      │ (None, 7, 7, 512)      │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ flatten (Flatten)               │ (None, 25088)          │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ fc1 (Dense)                     │ (None, 4096)           │   102,764,544 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ fc2 (Dense)                     │ (None, 4096)           │    16,781,312 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 134,260,544 (512.16 MB)
 Trainable params: 134,260,544 (512.16 MB)
 Non-trainable params: 0 (0.00 B)

Notice from the summary of the classifier model that the output layer, named predictions (Dense) in the pre-trained model summary, is not included in the new sequential model.

Now, let’s freeze the transferred layers of the new sequential model and add the output layer with a sigmoid activation function.

Code
# Freeze the layers transferred to the new sequential model
for layer in model.layers:
    layer.trainable = False

# Add the output layer with a sigmoid activation
model.add(layers.Dense(1, activation="sigmoid"))

# Optionally, display the model summary
model.summary()
Model: "sequential_1"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ block1_conv1 (Conv2D)           │ (None, 224, 224, 64)   │         1,792 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block1_conv2 (Conv2D)           │ (None, 224, 224, 64)   │        36,928 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block1_pool (MaxPooling2D)      │ (None, 112, 112, 64)   │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block2_conv1 (Conv2D)           │ (None, 112, 112, 128)  │        73,856 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block2_conv2 (Conv2D)           │ (None, 112, 112, 128)  │       147,584 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block2_pool (MaxPooling2D)      │ (None, 56, 56, 128)    │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_conv1 (Conv2D)           │ (None, 56, 56, 256)    │       295,168 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_conv2 (Conv2D)           │ (None, 56, 56, 256)    │       590,080 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_conv3 (Conv2D)           │ (None, 56, 56, 256)    │       590,080 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block3_pool (MaxPooling2D)      │ (None, 28, 28, 256)    │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_conv1 (Conv2D)           │ (None, 28, 28, 512)    │     1,180,160 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_conv2 (Conv2D)           │ (None, 28, 28, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_conv3 (Conv2D)           │ (None, 28, 28, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block4_pool (MaxPooling2D)      │ (None, 14, 14, 512)    │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_conv1 (Conv2D)           │ (None, 14, 14, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_conv2 (Conv2D)           │ (None, 14, 14, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_conv3 (Conv2D)           │ (None, 14, 14, 512)    │     2,359,808 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ block5_pool (MaxPooling2D)      │ (None, 7, 7, 512)      │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ flatten (Flatten)               │ (None, 25088)          │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ fc1 (Dense)                     │ (None, 4096)           │   102,764,544 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ fc2 (Dense)                     │ (None, 4096)           │    16,781,312 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 1)              │         4,097 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 134,264,641 (512.18 MB)
 Trainable params: 4,097 (16.00 KB)
 Non-trainable params: 134,260,544 (512.16 MB)

Notice that the new output (prediction) layer has been added, and that only its 4,097 parameters are trainable; the transferred layers remain frozen.
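
Copying layers one by one works, but the Keras applications API also makes it possible to load only the convolutional base and attach a fresh classification head. The cell below is a sketch of that alternative pattern (not the approach used in this lesson), assuming the same 224x224 RGB input; the head sizes are illustrative choices.

Code
import tensorflow as tf
import keras

# Load only the convolutional base (no fully connected head) and freeze it
conv_base = keras.applications.VGG16(include_top=False, input_shape=(224, 224, 3))
conv_base.trainable = False

# Stack a new classification head on top of the frozen base
alt_model = tf.keras.Sequential([
    conv_base,
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

alt_model.summary()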

Prepare the Data and Use it for Fine-tuning

Ensure the image shape matches the input shape expected by the pre-trained model, and re-scale the image data.

Code
# Create data generators for rescaling
train_datagen = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1./255)
test_datagen = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1./255)

# Load training images in batches of 20, resized to 224x224 pixels
train_set = train_datagen.flow_from_directory(
    train_dir,
    target_size=(224, 224),
    batch_size=20,
    class_mode='binary'
)
Found 2000 images belonging to 2 classes.
Code
# Load test images in batches of 20, resized to 224x224 pixels
test_set = test_datagen.flow_from_directory(
    test_dir,
    target_size=(224, 224),
    batch_size=20,
    class_mode='binary'
)
Found 2000 images belonging to 2 classes.
Code
# Print both test and train image shapes in a single print statement 
print(f"Test image shape: {test_set.image_shape}\nTrain image shape: {train_set.image_shape}")
Test image shape: (224, 224, 3)
Train image shape: (224, 224, 3)
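
Before fine-tuning, it can help to pull one batch from the generator to confirm the shapes the model will receive. A minimal sketch, assuming the train_set generator defined above; the generator loops indefinitely, so drawing one batch here does not exhaust it.

Code
# Pull a single batch from the training generator (20 images per batch)
x_batch, y_batch = next(train_set)

print(x_batch.shape)   # expected: (20, 224, 224, 3)
print(y_batch.shape)   # expected: (20,)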

Fine-tune the Pre-trained Model

Code
# Compile the model
model.compile(
    optimizer='rmsprop',
    loss='binary_crossentropy',
    metrics=['accuracy']
)

# Fit the model with the car-flower image data
model.fit(
    train_set,
    steps_per_epoch=20,  
    epochs=1,
    validation_data=test_set,
    validation_steps=20,
    shuffle=False,
    verbose=0
)
<keras.src.callbacks.history.History object at 0x323a3dd50>
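
Because verbose=0 suppresses the progress output, one way to monitor fine-tuning is to capture the History object returned by fit. A sketch of the same call, assuming the model, train_set, and test_set defined above:

Code
# Same fit call as above, this time keeping the returned History object
history = model.fit(
    train_set,
    steps_per_epoch=20,
    epochs=1,
    validation_data=test_set,
    validation_steps=20,
    verbose=1
)

# history.history maps each tracked metric to a list with one value per epoch
print(history.history.keys())
print(history.history["accuracy"], history.history["val_accuracy"])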

Load an Image to be Predicted

Code
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt  # Needed to display the image

# Load the new image
new_image = tf.keras.preprocessing.image.load_img(
    "car.png",
    color_mode="rgb",
    target_size=(224, 224)
)

# Convert the image to a numpy array
new_image_arr = tf.keras.preprocessing.image.img_to_array(new_image)

# Expand dimensions to match the model input
new_image_array = np.expand_dims(new_image_arr, axis=0)

# Display the image
plt.imshow(new_image_array[0].astype('uint8'))  # Use new_image_array[0] to get the image itself
plt.axis('off')  # Hide the axes
(np.float64(-0.5), np.float64(223.5), np.float64(223.5), np.float64(-0.5))
Code
plt.show()

Use the Fine-tuned Model for Prediction

Code
# Inspect the classes in the dataset
classes = train_set.class_indices
print("Classes in the dataset: ", classes)
Classes in the dataset:  {'car': 0, 'flower': 1}
Code
# make a prediction
y_pred = model.predict(new_image_array, verbose=0)[0][0]
print("Estimated Probability: ", y_pred)
Estimated Probability:  0.013323624
Code
# Classify the image based on the predicted probability
# (class indices: car = 0, flower = 1, so probabilities above 0.5 mean flower)
if y_pred > 0.5:
    print("Predicted Label: ", "Flower")
else:
    print("Predicted Label: ", "Car")
Predicted Label:  Car
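
Hard-coding the label names works for two classes, but the mapping can also be derived from the generator itself. A small sketch, assuming train_set and y_pred from the cells above:

Code
# Build an index-to-label mapping from the generator's class indices
index_to_label = {index: label for label, index in train_set.class_indices.items()}

# A sigmoid output above 0.5 corresponds to class index 1 ('flower'), otherwise 0 ('car')
predicted_index = int(y_pred > 0.5)
print("Predicted Label: ", index_to_label[predicted_index].capitalize())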

Use Pretrained Model for Prediction (No Fine-tuning)

Code
import tensorflow as tf
import keras

# initialize the pretrained model
VGG16_model = keras.applications.VGG16()

# Disable TensorFlow logging
tf.get_logger().setLevel('ERROR')

# make a prediction
y_pred = VGG16_model.predict(new_image_array, verbose=0)

# print the top 5 probabilities with the corresponding predicted labels
results = keras.applications.vgg16.decode_predictions(y_pred, top=5)
for i in results:
    print(i)
[('n03924679', 'photocopier', np.float32(0.48023683)), ('n04590129', 'window_shade', np.float32(0.10118661)), ('n04554684', 'washer', np.float32(0.07467653)), ('n04239074', 'sliding_door', np.float32(0.06227135)), ('n04005630', 'prison', np.float32(0.033811074))]

We can see that the raw pre-trained model misclassifies the image (here as a photocopier). It is a general 1,000-class ImageNet classifier, and in this cell it also receives raw pixel values rather than input passed through VGG16's own preprocess_input function, which the model expects. Fine-tuning the pre-trained model on the car-flower data, by contrast, adapts it to the task at hand: the fine-tuned model correctly identified the image as a car.
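
As a side note, each Keras application ships with its own preprocess_input function, and the raw VGG16 model is normally fed input transformed by it. The cell below is a minimal sketch of re-running the prediction with that preprocessing applied, assuming the VGG16_model and new_image_array defined above; the exact top-5 labels it returns will still depend on the image.

Code
import numpy as np
import keras

# Apply VGG16's own preprocessing to a copy of the raw pixel array
# (preprocess_input modifies values, so work on a copy)
vgg16_input = keras.applications.vgg16.preprocess_input(np.copy(new_image_array))

# Predict and decode the top 5 ImageNet classes
y_pred_raw = VGG16_model.predict(vgg16_input, verbose=0)
for entry in keras.applications.vgg16.decode_predictions(y_pred_raw, top=5)[0]:
    print(entry)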

Summary

This lesson explains how to use pre-trained models to perform tasks like image classification without building models from scratch. Pre-trained models, such as ResNet50 and VGG16, are beneficial because they are already trained on large datasets, saving time and computational resources. These models can be used directly or fine-tuned to adapt to specific tasks. The process involves loading data, preprocessing it, and using a pre-trained model to predict images, such as classifying objects like cars or flowers.

The lesson covers detailed steps for using a pre-trained model like ResNet50 to classify images, including image preprocessing, reshaping images to match the model’s input format, and making predictions. It also introduces transfer learning, where pre-trained models are adjusted for new tasks by adding custom output layers and freezing the convolutional base. Through an example using the VGG16 model, students learn how to replace the output layer for binary classification tasks like distinguishing between cars and flowers. By leveraging pre-trained networks, users can save resources and build efficient models for specific applications with minimal training. Fine-tuning allows users to freeze the initial layers of a pre-trained model, modify the output layer, and adapt the model to new data.