Introducing Keras
Hello nets!
You’re going to build a simple neural network to get a feel for how quickly this can be accomplished in Keras.
You will build a network that takes two numbers as input, passes them through a hidden layer of 10 neurons, and finally outputs a single non-constrained number.
A non-constrained output is obtained by not setting an activation function in the output layer. This is useful for problems like regression, where we want the output to be able to take any value.
[Figure: hello.nets.png — diagram of the network: two inputs, a 10-neuron hidden layer, and a single output]
# Import the Sequential model and Dense layer
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
# Create a Sequential model
model = Sequential()
# Add an input layer and a hidden layer with 10 neurons
model.add(Dense(10, input_shape=(2,), activation="relu"))
# Add a 1-neuron output layer
model.add(Dense(1))
# Summarise your model
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 10) 30
dense_1 (Dense) (None, 1) 11
=================================================================
Total params: 41 (164.00 Byte)
Trainable params: 41 (164.00 Byte)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
Counting parameters
You’ve just created a neural network. Now you’re going to create a new one, taking some time to think about the weights of each layer. The Keras Dense layer and the Sequential model are already loaded for you to use.
This is the network you will be creating: 3 inputs feeding a hidden layer of 5 neurons, followed by a single-neuron output layer.
# Instantiate a new Sequential model
model = Sequential()
# Add a Dense layer with five neurons and three inputs
model.add(Dense(5, input_shape=(3,), activation="relu"))
# Add a final Dense layer with one neuron and no activation
model.add(Dense(1))
# Summarize your model
model.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense_2 (Dense) (None, 5) 20
dense_3 (Dense) (None, 1) 6
=================================================================
Total params: 26 (104.00 Byte)
Trainable params: 26 (104.00 Byte)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
Question
Given the model you just built, which answer is correct regarding the number of weights (parameters) in the hidden layer?
There are 20 parameters, 15 from the connections of our inputs to our hidden layer and 5 from the bias weight of each neuron in the hidden layer.
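As a quick sanity check, the numbers in the summary can be recomputed from the layer shapes. The sketch below assumes the model built just above (3 inputs, a 5-neuron hidden layer, a 1-neuron output) is still in scope and uses Keras’ count_params() to confirm the hand calculation.
# Hand calculation: (inputs + 1 bias) * neurons for each layer
hidden_params = (3 + 1) * 5   # 20 parameters in the hidden layer
output_params = (5 + 1) * 1   # 6 parameters in the output layer
print(hidden_params, output_params)
# Compare with what Keras reports for each layer
print(model.layers[0].count_params(), model.layers[1].count_params())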
Build as shown!
You will take on a final challenge before moving on to the next lesson. Build the network shown below: two inputs, a hidden layer of 3 neurons, and a single-neuron output. Prove you’ve mastered Keras basics in no time!
# Instantiate a Sequential model
model = Sequential()
# Build the input and hidden layer
model.add(Dense(3, input_shape=(2,)))
# Add the output layer
model.add(Dense(1))
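For this network the hidden layer has (2 + 1) * 3 = 9 parameters and the output layer 3 + 1 = 4, for 13 in total. A quick check, assuming the model above is still in scope:
# Summarise the model and confirm the total parameter count
model.summary()
print(model.count_params())   # expected: 13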
Specifying a model
You will build a simple regression model to predict the orbit of the meteor!
Your training data consist of measurements taken at time steps from 10 minutes before the meteor reaches the impact region to 10 minutes after. Each time step can be viewed as an X coordinate on our graph, with an associated position Y for the meteor’s orbit at that time step.
Note that you can view this problem as approximating a quadratic function via the use of neural networks.
This data is stored in two numpy arrays: one called time_steps (the features) and another called y_positions (the labels). Go on and build your model! It should be able to predict the y positions of the meteor orbit at future time steps.
Keras Sequential model and Dense layers are available for you to use.
# Instantiate a Sequential model
model = Sequential()
# Add a Dense layer with 50 neurons and an input of 1 neuron
model.add(Dense(50, input_shape=(1,), activation='relu'))
# Add two Dense layers with 50 neurons and relu activation
model.add(Dense(50, activation='relu'))
model.add(Dense(50, activation='relu'))
# End your model with a Dense layer and no activation
model.add(Dense(1))
Training
You’re going to train your first model in this course, and for a good cause!
Remember that before training your Keras models you need to compile them. This can be done with the .compile() method. The .compile() method takes arguments such as the optimizer, used for weight updating, and the loss function, which is what we want to minimize. Training your model is as easy as calling the .fit() method, passing on the features, labels and a number of epochs to train for.
The regression model you built in the previous exercise is loaded for you to use, along with the time_steps and y_positions data. Train it and evaluate it on this very same data; let’s see if your model can learn the meteor’s trajectory.
# Load the time_steps and y_positions arrays by executing a helper script
time_steps_script = "time_steps.py"
with open(time_steps_script, 'r') as file:
    script_content = file.read()
exec(script_content)
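If the helper script is not available, a stand-in dataset of the same shape can be built by hand. This is only a hypothetical reconstruction, assuming the orbit is roughly quadratic in time; 2000 samples would also reproduce the 63 batches per epoch visible in the training log below (with the default batch size of 32).
import numpy as np
# Hypothetical stand-in for the course data: a noisy quadratic orbit
# sampled at 2000 time steps between -10 and +10 minutes
rng = np.random.default_rng(seed=0)
time_steps = np.linspace(-10, 10, 2000).reshape(-1, 1)
y_positions = time_steps ** 2 + rng.normal(scale=0.5, size=time_steps.shape)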
# Compile your model
model.compile(optimizer="adam", loss="mse")
print("Training started..., this can take a while:")
Training started..., this can take a while:
# Fit your model on your data for 30 epochs
model.fit(time_steps, y_positions, epochs=30)
Epoch 1/30
1/63 [..............................] - ETA: 31s - loss: 1898.9307
48/63 [=====================>........] - ETA: 0s - loss: 1782.0690
63/63 [==============================] - 1s 1ms/step - loss: 1570.2661
Epoch 2/30
1/63 [..............................] - ETA: 0s - loss: 596.8338
51/63 [=======================>......] - ETA: 0s - loss: 276.0764
63/63 [==============================] - 0s 1ms/step - loss: 250.9498
Epoch 3/30
1/63 [..............................] - ETA: 0s - loss: 143.9181
52/63 [=======================>......] - ETA: 0s - loss: 129.9817
63/63 [==============================] - 0s 1ms/step - loss: 129.2717
Epoch 4/30
1/63 [..............................] - ETA: 0s - loss: 119.4723
56/63 [=========================>....] - ETA: 0s - loss: 102.7880
63/63 [==============================] - 0s 940us/step - loss: 101.2285
Epoch 5/30
1/63 [..............................] - ETA: 0s - loss: 106.8071
51/63 [=======================>......] - ETA: 0s - loss: 71.2458
63/63 [==============================] - 0s 1ms/step - loss: 69.9295
Epoch 6/30
1/63 [..............................] - ETA: 0s - loss: 52.6778
56/63 [=========================>....] - ETA: 0s - loss: 48.3553
63/63 [==============================] - 0s 945us/step - loss: 46.5619
Epoch 7/30
1/63 [..............................] - ETA: 0s - loss: 28.7040
52/63 [=======================>......] - ETA: 0s - loss: 28.1992
63/63 [==============================] - 0s 1ms/step - loss: 27.2515
Epoch 8/30
1/63 [..............................] - ETA: 0s - loss: 15.2236
53/63 [========================>.....] - ETA: 0s - loss: 16.2146
63/63 [==============================] - 0s 983us/step - loss: 15.5511
Epoch 9/30
1/63 [..............................] - ETA: 0s - loss: 12.5567
53/63 [========================>.....] - ETA: 0s - loss: 9.7067
63/63 [==============================] - 0s 1ms/step - loss: 9.1497
Epoch 10/30
1/63 [..............................] - ETA: 0s - loss: 4.8695
48/63 [=====================>........] - ETA: 0s - loss: 6.2929
63/63 [==============================] - 0s 1ms/step - loss: 6.2438
Epoch 11/30
1/63 [..............................] - ETA: 0s - loss: 6.3005
49/63 [======================>.......] - ETA: 0s - loss: 3.9532
63/63 [==============================] - 0s 1ms/step - loss: 3.7343
Epoch 12/30
1/63 [..............................] - ETA: 0s - loss: 2.8742
48/63 [=====================>........] - ETA: 0s - loss: 3.3589
63/63 [==============================] - 0s 1ms/step - loss: 3.1419
Epoch 13/30
1/63 [..............................] - ETA: 0s - loss: 2.1622
55/63 [=========================>....] - ETA: 0s - loss: 1.9904
63/63 [==============================] - 0s 950us/step - loss: 1.9444
Epoch 14/30
1/63 [..............................] - ETA: 0s - loss: 1.9055
51/63 [=======================>......] - ETA: 0s - loss: 1.3589
63/63 [==============================] - 0s 1ms/step - loss: 1.4234
Epoch 15/30
1/63 [..............................] - ETA: 0s - loss: 2.3644
53/63 [========================>.....] - ETA: 0s - loss: 1.4362
63/63 [==============================] - 0s 963us/step - loss: 1.3774
Epoch 16/30
1/63 [..............................] - ETA: 0s - loss: 0.6225
57/63 [==========================>...] - ETA: 0s - loss: 0.8337
63/63 [==============================] - 0s 917us/step - loss: 0.8109
Epoch 17/30
1/63 [..............................] - ETA: 0s - loss: 0.5317
52/63 [=======================>......] - ETA: 0s - loss: 0.7082
63/63 [==============================] - 0s 992us/step - loss: 0.7266
Epoch 18/30
1/63 [..............................] - ETA: 0s - loss: 0.3751
57/63 [==========================>...] - ETA: 0s - loss: 0.5857
63/63 [==============================] - 0s 922us/step - loss: 0.5866
Epoch 19/30
1/63 [..............................] - ETA: 0s - loss: 0.6664
56/63 [=========================>....] - ETA: 0s - loss: 0.5301
63/63 [==============================] - 0s 929us/step - loss: 0.5284
Epoch 20/30
1/63 [..............................] - ETA: 0s - loss: 0.2963
54/63 [========================>.....] - ETA: 0s - loss: 0.5550
63/63 [==============================] - 0s 968us/step - loss: 0.5303
Epoch 21/30
1/63 [..............................] - ETA: 0s - loss: 0.4377
54/63 [========================>.....] - ETA: 0s - loss: 0.3165
63/63 [==============================] - 0s 967us/step - loss: 0.3362
Epoch 22/30
1/63 [..............................] - ETA: 0s - loss: 0.1393
51/63 [=======================>......] - ETA: 0s - loss: 0.2837
63/63 [==============================] - 0s 1ms/step - loss: 0.2669
Epoch 23/30
1/63 [..............................] - ETA: 0s - loss: 0.7585
52/63 [=======================>......] - ETA: 0s - loss: 0.3225
63/63 [==============================] - 0s 1ms/step - loss: 0.3281
Epoch 24/30
1/63 [..............................] - ETA: 0s - loss: 0.1837
52/63 [=======================>......] - ETA: 0s - loss: 0.2756
63/63 [==============================] - 0s 991us/step - loss: 0.2644
Epoch 25/30
1/63 [..............................] - ETA: 0s - loss: 0.3557
54/63 [========================>.....] - ETA: 0s - loss: 0.1744
63/63 [==============================] - 0s 961us/step - loss: 0.1793
Epoch 26/30
1/63 [..............................] - ETA: 0s - loss: 0.1272
51/63 [=======================>......] - ETA: 0s - loss: 0.1684
63/63 [==============================] - 0s 1ms/step - loss: 0.1770
Epoch 27/30
1/63 [..............................] - ETA: 0s - loss: 0.2307
54/63 [========================>.....] - ETA: 0s - loss: 0.1682
63/63 [==============================] - 0s 961us/step - loss: 0.1693
Epoch 28/30
1/63 [..............................] - ETA: 0s - loss: 0.4482
56/63 [=========================>....] - ETA: 0s - loss: 0.1901
63/63 [==============================] - 0s 943us/step - loss: 0.1874
Epoch 29/30
1/63 [..............................] - ETA: 0s - loss: 0.1806
55/63 [=========================>....] - ETA: 0s - loss: 0.1128
63/63 [==============================] - 0s 941us/step - loss: 0.1154
Epoch 30/30
1/63 [..............................] - ETA: 0s - loss: 0.0545
54/63 [========================>.....] - ETA: 0s - loss: 0.1041
63/63 [==============================] - 0s 957us/step - loss: 0.1077
<keras.src.callbacks.History object at 0x7f88e0344bb0>
# Evaluate your model
print("Final loss value:", model.evaluate(time_steps, y_positions))
1/63 [..............................] - ETA: 4s - loss: 0.0972
63/63 [==============================] - 0s 745us/step - loss: 0.2000
Final loss value: 0.20000852644443512
Predicting the orbit!
You’ve already trained a model that approximates the orbit of the meteor approaching Earth, and it’s loaded for you to use.
Since you trained your model on values between -10 and 10 minutes, it hasn’t yet seen any other time steps. You will now visualize how your model behaves on unseen data.
If you want to check the source code of plot_orbit, paste show_code(plot_orbit) into the console.
Hurry up, the Earth is running out of time!
Remember that np.arange(x, y) produces a range of values from x to y-1, that is, the [x, y) interval.
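A quick illustration of that behaviour:
import numpy as np
# np.arange stops one step before the upper bound, so this covers -10 ... 10 inclusive
print(np.arange(-10, 11))   # 21 values, one per minute of the 20-minute window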
import numpy as np
# Predict the twenty-minute orbit
twenty_min_orbit = model.predict(np.arange(-10, 11))
1/1 [==============================] - ETA: 0s
1/1 [==============================] - 0s 57ms/step
# Plot the twenty-minute orbit
plot_orbit(twenty_min_orbit)
# Predict the eighty-minute orbit
eighty_min_orbit = model.predict(np.arange(-40, 41))
1/3 [=========>....................] - ETA: 0s
3/3 [==============================] - 0s 1ms/step
# Plot the eighty-minute orbit
plot_orbit(eighty_min_orbit)
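The source of plot_orbit isn’t reproduced in these notes. A minimal sketch of what such a helper might look like, assuming it simply plots the model’s predictions next to the true quadratic orbit with matplotlib (not the course’s actual implementation):
import numpy as np
import matplotlib.pyplot as plt

def plot_orbit(model_preds):
    # Plot the known quadratic orbit and the model's predictions side by side
    axeslim = int(len(model_preds) / 2)
    minutes = np.arange(-axeslim, axeslim + 1)
    plt.plot(minutes, minutes ** 2, label="Scientist's orbit")
    plt.plot(minutes, model_preds, label="Your orbit")
    plt.legend()
    plt.show()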
Going Deeper
Exploring dollar bills
You will practice building classification models in Keras with the Banknote Authentication dataset.
Your goal is to distinguish between real and fake dollar bills. To do this, the dataset comes with 4 features: variance, skewness, kurtosis, and entropy. These features are calculated by applying mathematical operations over the dollar bill images. The labels are found in the dataframe’s class column.
import pandas as pd
banknotes = pd.read_csv("data/banknotes.csv")
# Import seaborn
import seaborn as sns
# Use pairplot and set the hue to be our class column
sns.pairplot(banknotes, hue="class")
# Describe the data
print('Dataset stats: \n', banknotes.describe())
Dataset stats:
variace skewness curtosis entropy class
count 1372.000000 1372.000000 1372.000000 1372.000000 1372.000000
mean 0.433735 1.922353 1.397627 -1.191657 0.444606
std 2.842763 5.869047 4.310030 2.101013 0.497103
min -7.042100 -13.773100 -5.286100 -8.548200 0.000000
25% -1.773000 -1.708200 -1.574975 -2.413450 0.000000
50% 0.496180 2.319650 0.616630 -0.586650 0.000000
75% 2.821475 6.814625 3.179250 0.394810 1.000000
max 6.824800 12.951600 17.927400 2.449500 1.000000
# Count the number of observations per class
print('Observations per class: \n', banknotes["class"].value_counts())
Observations per class:
class
0 762
1 610
Name: count, dtype: int64
A binary classification model
Now that you know what the Banknote Authentication dataset looks like, we’ll build a simple model to distinguish between real and fake bills.
You will perform binary classification by using a single neuron as an output. The input layer will have 4 neurons since we have 4 features in our dataset. The model’s output will be a value constrained between 0 and 1.
We will interpret this output number as the probability of our input variables coming from a fake dollar bill, with 1 meaning we are certain it’s a fake bill.
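The (0, 1) constraint comes from the sigmoid activation used below, which squashes any real number into that interval. A quick numpy illustration (not part of the exercise):
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

print(sigmoid(-10), sigmoid(0), sigmoid(10))   # ~0.000045, 0.5, ~0.999955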
# Import the sequential model and dense layer
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
# Create a sequential model
model = Sequential()
# Add a dense layer
model.add(Dense(1, input_shape=(4,), activation="sigmoid"))
# Compile your model
model.compile(loss='binary_crossentropy', optimizer="sgd", metrics=['accuracy'])
# Display a summary of your model
model.summary()
Model: "sequential_4"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense_10 (Dense) (None, 1) 5
=================================================================
Total params: 5 (20.00 Byte)
Trainable params: 5 (20.00 Byte)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
Is this dollar bill fake?
You are now ready to train your model and check how well it performs when classifying new bills! In the course environment the dataset is already partitioned into features (X_train and X_test) and labels (y_train and y_test); the split is reproduced below with train_test_split.
from sklearn.model_selection import train_test_split
# Separate features and labels
X = banknotes.drop('class', axis=1)  # Features
y = banknotes['class']               # Labels
# Split the dataset into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Train your model for 20 epochs
model.fit(X_train, y_train, epochs=20)
Epoch 1/20
1/35 [..............................] - ETA: 6s - loss: 4.3534 - accuracy: 0.3750
35/35 [==============================] - 0s 848us/step - loss: 2.3420 - accuracy: 0.3418
Epoch 2/20
1/35 [..............................] - ETA: 0s - loss: 1.0439 - accuracy: 0.5000
35/35 [==============================] - 0s 881us/step - loss: 0.6762 - accuracy: 0.7201
Epoch 3/20
1/35 [..............................] - ETA: 0s - loss: 0.4907 - accuracy: 0.8125
35/35 [==============================] - 0s 766us/step - loss: 0.4409 - accuracy: 0.8332
Epoch 4/20
1/35 [..............................] - ETA: 0s - loss: 0.2553 - accuracy: 0.9062
35/35 [==============================] - 0s 720us/step - loss: 0.3375 - accuracy: 0.8861
Epoch 5/20
1/35 [..............................] - ETA: 0s - loss: 0.2321 - accuracy: 0.9688
35/35 [==============================] - 0s 836us/step - loss: 0.2805 - accuracy: 0.9107
Epoch 6/20
1/35 [..............................] - ETA: 0s - loss: 0.3618 - accuracy: 0.8438
35/35 [==============================] - 0s 775us/step - loss: 0.2439 - accuracy: 0.9262
Epoch 7/20
1/35 [..............................] - ETA: 0s - loss: 0.3466 - accuracy: 0.8750
35/35 [==============================] - 0s 801us/step - loss: 0.2180 - accuracy: 0.9371
Epoch 8/20
1/35 [..............................] - ETA: 0s - loss: 0.2251 - accuracy: 0.9688
35/35 [==============================] - 0s 840us/step - loss: 0.1983 - accuracy: 0.9417
Epoch 9/20
1/35 [..............................] - ETA: 0s - loss: 0.1350 - accuracy: 1.0000
35/35 [==============================] - 0s 823us/step - loss: 0.1832 - accuracy: 0.9471
Epoch 10/20
1/35 [..............................] - ETA: 0s - loss: 0.1383 - accuracy: 1.0000
35/35 [==============================] - 0s 805us/step - loss: 0.1710 - accuracy: 0.9508
Epoch 11/20
1/35 [..............................] - ETA: 0s - loss: 0.1414 - accuracy: 1.0000
35/35 [==============================] - 0s 784us/step - loss: 0.1609 - accuracy: 0.9508
Epoch 12/20
1/35 [..............................] - ETA: 0s - loss: 0.2066 - accuracy: 0.9062
35/35 [==============================] - 0s 802us/step - loss: 0.1525 - accuracy: 0.9544
Epoch 13/20
1/35 [..............................] - ETA: 0s - loss: 0.1014 - accuracy: 0.9375
35/35 [==============================] - 0s 889us/step - loss: 0.1454 - accuracy: 0.9581
Epoch 14/20
1/35 [..............................] - ETA: 0s - loss: 0.1284 - accuracy: 0.9688
35/35 [==============================] - 0s 821us/step - loss: 0.1391 - accuracy: 0.9599
Epoch 15/20
1/35 [..............................] - ETA: 0s - loss: 0.1977 - accuracy: 0.9688
35/35 [==============================] - 0s 810us/step - loss: 0.1339 - accuracy: 0.9644
Epoch 16/20
1/35 [..............................] - ETA: 0s - loss: 0.0980 - accuracy: 1.0000
35/35 [==============================] - 0s 803us/step - loss: 0.1292 - accuracy: 0.9663
Epoch 17/20
1/35 [..............................] - ETA: 0s - loss: 0.1205 - accuracy: 0.9688
35/35 [==============================] - 0s 882us/step - loss: 0.1248 - accuracy: 0.9699
Epoch 18/20
1/35 [..............................] - ETA: 0s - loss: 0.1335 - accuracy: 0.9688
35/35 [==============================] - 0s 785us/step - loss: 0.1210 - accuracy: 0.9717
Epoch 19/20
1/35 [..............................] - ETA: 0s - loss: 0.1701 - accuracy: 0.9688
35/35 [==============================] - 0s 744us/step - loss: 0.1176 - accuracy: 0.9736
Epoch 20/20
1/35 [..............................] - ETA: 0s - loss: 0.0756 - accuracy: 1.0000
35/35 [==============================] - 0s 729us/step - loss: 0.1144 - accuracy: 0.9772
<keras.src.callbacks.History object at 0x7f883c4e5fc0>
# Evaluate your model accuracy on the test set
accuracy = model.evaluate(X_test, y_test)[1]
1/9 [==>...........................] - ETA: 0s - loss: 0.1538 - accuracy: 0.9375
9/9 [==============================] - 0s 996us/step - loss: 0.1473 - accuracy: 0.9527
# Print accuracy
print('Accuracy:', accuracy)
Accuracy: 0.9527272582054138
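To actually classify new bills, the predicted probabilities can be thresholded at 0.5. A small sketch, assuming the trained model and X_test are still in scope:
# Predict fake-bill probabilities and turn them into hard 0/1 labels
probabilities = model.predict(X_test)
predicted_classes = (probabilities > 0.5).astype(int).ravel()
print(predicted_classes[:10])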
A multi-class model
You’re going to build a model that predicts who threw which dart only based on where that dart landed! (That is the dart’s x and y coordinates on the board.)
This is a multi-class classification problem since each dart can only have been thrown by one of 4 competitors. The classes/labels are mutually exclusive, so we can build an output layer with as many neurons as competitors and use the softmax activation function so that the probabilities over all competitors sum to 1.
The Sequential model and Dense layers are already imported for you to use.
# Instantiate a sequential model
model = Sequential()
# Add 3 dense layers of 128, 64 and 32 neurons each
model.add(Dense(128, input_shape=(2,), activation='relu'))
model.add(Dense(64, activation='relu'))
model.add(Dense(32, activation='relu'))
# Add a dense layer with as many neurons as competitors
model.add(Dense(4, activation="softmax"))
# Compile your model using categorical_crossentropy loss
model.compile(loss="categorical_crossentropy",
              optimizer='adam',
              metrics=['accuracy'])
Prepare your dataset
In the console you can check that your labels, darts.competitor, are not yet in a format your network can understand: they contain the names of the competitors as strings. You will first turn these competitors into unique numbers, then use the to_categorical() function from keras.utils to turn those numbers into their one-hot encoded representation.
This is useful for multi-class classification problems, since there are as many output neurons as classes and for every observation in our dataset we just want one of the neurons to be activated.
The darts dataset is loaded as darts. Pandas is imported as pd. Let’s prepare this dataset!
darts = pd.read_csv("data/darts.csv")
# Transform into a categorical variable
darts.competitor = pd.Categorical(darts.competitor)
# Assign a number to each category (label encoding)
darts.competitor = darts.competitor.cat.codes
# Import to_categorical from keras utils module
from tensorflow.keras.utils import to_categorical
coordinates = darts.drop(['competitor'], axis=1)
# Use to_categorical on your labels
competitors = to_categorical(darts.competitor)
# Now print the one-hot encoded labels
print('One-hot encoded competitors: \n', competitors)
One-hot encoded competitors:
[[0. 0. 1. 0.]
[0. 0. 0. 1.]
[0. 1. 0. 0.]
...
[0. 1. 0. 0.]
[0. 1. 0. 0.]
[0. 0. 0. 1.]]
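Going the other way, a one-hot row (or, later, a softmax prediction) is decoded by taking the index of its largest entry. A small check on the array above:
import numpy as np
# Decode one-hot rows back to label codes; the first rows printed above
# correspond to codes 2, 3 and 1
label_codes = np.argmax(competitors, axis=1)
print(label_codes[:3])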
Training on dart throwers
Your model is now ready, and so is your dataset. It’s time to train!
The coordinates features and competitors labels you just transformed have been partitioned into coord_train, coord_test and competitors_train, competitors_test.
Your model is also loaded. Feel free to visualize your training data or call model.summary() in the console.
Let’s find out who threw which dart just by looking at the board!
# Now, split the datasets into training and testing sets
coord_train, coord_test, competitors_train, competitors_test = train_test_split(
    coordinates,      # features
    competitors,      # target
    test_size=0.2,    # proportion of the dataset to include in the test split
    random_state=42   # seed used by the random number generator for reproducibility
)
# Fit your model to the training data for 200 epochs
model.fit(coord_train, competitors_train, epochs=200)
Epoch 1/200
1/20 [>.............................] - ETA: 7s - loss: 1.3887 - accuracy: 0.1875
20/20 [==============================] - 0s 1ms/step - loss: 1.3731 - accuracy: 0.2703
Epoch 2/200
1/20 [>.............................] - ETA: 0s - loss: 1.3623 - accuracy: 0.3438
20/20 [==============================] - 0s 1ms/step - loss: 1.3363 - accuracy: 0.3016
Epoch 3/200
1/20 [>.............................] - ETA: 0s - loss: 1.3198 - accuracy: 0.2188
20/20 [==============================] - 0s 1ms/step - loss: 1.2857 - accuracy: 0.3766
Epoch 4/200
1/20 [>.............................] - ETA: 0s - loss: 1.2016 - accuracy: 0.4688
20/20 [==============================] - 0s 1ms/step - loss: 1.2186 - accuracy: 0.4703
Epoch 5/200
1/20 [>.............................] - ETA: 0s - loss: 1.2123 - accuracy: 0.5000
20/20 [==============================] - 0s 1ms/step - loss: 1.1451 - accuracy: 0.5063
Epoch 6/200
1/20 [>.............................] - ETA: 0s - loss: 1.0839 - accuracy: 0.5312
20/20 [==============================] - 0s 1ms/step - loss: 1.0646 - accuracy: 0.5531
Epoch 7/200
1/20 [>.............................] - ETA: 0s - loss: 1.0197 - accuracy: 0.5000
20/20 [==============================] - 0s 1ms/step - loss: 0.9788 - accuracy: 0.5750
Epoch 8/200
1/20 [>.............................] - ETA: 0s - loss: 0.8761 - accuracy: 0.6562
20/20 [==============================] - 0s 1ms/step - loss: 0.9109 - accuracy: 0.6016
Epoch 9/200
1/20 [>.............................] - ETA: 0s - loss: 0.8035 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.8641 - accuracy: 0.6187
Epoch 10/200
1/20 [>.............................] - ETA: 0s - loss: 0.7682 - accuracy: 0.6562
20/20 [==============================] - 0s 1ms/step - loss: 0.8344 - accuracy: 0.6500
Epoch 11/200
1/20 [>.............................] - ETA: 0s - loss: 1.0624 - accuracy: 0.5625
20/20 [==============================] - 0s 1ms/step - loss: 0.8174 - accuracy: 0.6734
Epoch 12/200
1/20 [>.............................] - ETA: 0s - loss: 0.7658 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.7999 - accuracy: 0.6781
Epoch 13/200
1/20 [>.............................] - ETA: 0s - loss: 0.6660 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.7782 - accuracy: 0.7031
Epoch 14/200
1/20 [>.............................] - ETA: 0s - loss: 0.6999 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.7581 - accuracy: 0.7156
Epoch 15/200
1/20 [>.............................] - ETA: 0s - loss: 0.6399 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.7526 - accuracy: 0.7188
Epoch 16/200
1/20 [>.............................] - ETA: 0s - loss: 0.9757 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.7294 - accuracy: 0.7281
Epoch 17/200
1/20 [>.............................] - ETA: 0s - loss: 0.7397 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.7276 - accuracy: 0.7234
Epoch 18/200
1/20 [>.............................] - ETA: 0s - loss: 0.6596 - accuracy: 0.6875
20/20 [==============================] - 0s 1ms/step - loss: 0.7212 - accuracy: 0.7328
Epoch 19/200
1/20 [>.............................] - ETA: 0s - loss: 0.8041 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.7182 - accuracy: 0.7281
Epoch 20/200
1/20 [>.............................] - ETA: 0s - loss: 0.7933 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.7085 - accuracy: 0.7375
Epoch 21/200
1/20 [>.............................] - ETA: 0s - loss: 0.6726 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.7138 - accuracy: 0.7375
Epoch 22/200
1/20 [>.............................] - ETA: 0s - loss: 0.6224 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.6952 - accuracy: 0.7406
Epoch 23/200
1/20 [>.............................] - ETA: 0s - loss: 0.6138 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.6904 - accuracy: 0.7484
Epoch 24/200
1/20 [>.............................] - ETA: 0s - loss: 0.9039 - accuracy: 0.6875
20/20 [==============================] - 0s 1ms/step - loss: 0.6837 - accuracy: 0.7484
Epoch 25/200
1/20 [>.............................] - ETA: 0s - loss: 0.6622 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.6686 - accuracy: 0.7609
Epoch 26/200
1/20 [>.............................] - ETA: 0s - loss: 0.8227 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.6645 - accuracy: 0.7547
Epoch 27/200
1/20 [>.............................] - ETA: 0s - loss: 0.7359 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.6712 - accuracy: 0.7641
Epoch 28/200
1/20 [>.............................] - ETA: 0s - loss: 0.4262 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.6664 - accuracy: 0.7516
Epoch 29/200
1/20 [>.............................] - ETA: 0s - loss: 0.6173 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.6579 - accuracy: 0.7656
Epoch 30/200
1/20 [>.............................] - ETA: 0s - loss: 0.5192 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.6447 - accuracy: 0.7750
Epoch 31/200
1/20 [>.............................] - ETA: 0s - loss: 0.5871 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.6472 - accuracy: 0.7750
Epoch 32/200
1/20 [>.............................] - ETA: 0s - loss: 0.6190 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.6546 - accuracy: 0.7500
Epoch 33/200
1/20 [>.............................] - ETA: 0s - loss: 0.6130 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.6424 - accuracy: 0.7625
Epoch 34/200
1/20 [>.............................] - ETA: 0s - loss: 0.5181 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.6417 - accuracy: 0.7641
Epoch 35/200
1/20 [>.............................] - ETA: 0s - loss: 0.5543 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.6417 - accuracy: 0.7531
Epoch 36/200
1/20 [>.............................] - ETA: 0s - loss: 0.8112 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.6417 - accuracy: 0.7641
Epoch 37/200
1/20 [>.............................] - ETA: 0s - loss: 0.4381 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.6352 - accuracy: 0.7734
Epoch 38/200
1/20 [>.............................] - ETA: 0s - loss: 0.5968 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.6172 - accuracy: 0.7812
Epoch 39/200
1/20 [>.............................] - ETA: 0s - loss: 0.7108 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.6203 - accuracy: 0.7844
Epoch 40/200
1/20 [>.............................] - ETA: 0s - loss: 0.7574 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.6168 - accuracy: 0.7766
Epoch 41/200
1/20 [>.............................] - ETA: 0s - loss: 0.6269 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.6067 - accuracy: 0.7844
Epoch 42/200
1/20 [>.............................] - ETA: 0s - loss: 0.8203 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.6199 - accuracy: 0.7781
Epoch 43/200
1/20 [>.............................] - ETA: 0s - loss: 0.4944 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5990 - accuracy: 0.7891
Epoch 44/200
1/20 [>.............................] - ETA: 0s - loss: 0.6158 - accuracy: 0.6875
20/20 [==============================] - 0s 1ms/step - loss: 0.5987 - accuracy: 0.7875
Epoch 45/200
1/20 [>.............................] - ETA: 0s - loss: 0.8731 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.6008 - accuracy: 0.7922
Epoch 46/200
1/20 [>.............................] - ETA: 0s - loss: 0.4606 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5970 - accuracy: 0.7766
Epoch 47/200
1/20 [>.............................] - ETA: 0s - loss: 0.5476 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.6019 - accuracy: 0.7828
Epoch 48/200
1/20 [>.............................] - ETA: 0s - loss: 0.7218 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.6111 - accuracy: 0.7781
Epoch 49/200
1/20 [>.............................] - ETA: 0s - loss: 0.5091 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5975 - accuracy: 0.7766
Epoch 50/200
1/20 [>.............................] - ETA: 0s - loss: 0.6196 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5858 - accuracy: 0.7969
Epoch 51/200
1/20 [>.............................] - ETA: 0s - loss: 0.4200 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5875 - accuracy: 0.7859
Epoch 52/200
1/20 [>.............................] - ETA: 0s - loss: 0.7136 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5842 - accuracy: 0.7922
Epoch 53/200
1/20 [>.............................] - ETA: 0s - loss: 0.6596 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.6028 - accuracy: 0.7812
Epoch 54/200
1/20 [>.............................] - ETA: 0s - loss: 0.6664 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5845 - accuracy: 0.7922
Epoch 55/200
1/20 [>.............................] - ETA: 0s - loss: 0.7533 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5884 - accuracy: 0.7906
Epoch 56/200
1/20 [>.............................] - ETA: 0s - loss: 0.6815 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5866 - accuracy: 0.7937
Epoch 57/200
1/20 [>.............................] - ETA: 0s - loss: 0.4917 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5932 - accuracy: 0.7734
Epoch 58/200
1/20 [>.............................] - ETA: 0s - loss: 0.7670 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.5700 - accuracy: 0.7953
Epoch 59/200
1/20 [>.............................] - ETA: 0s - loss: 0.6999 - accuracy: 0.6875
20/20 [==============================] - 0s 1ms/step - loss: 0.5752 - accuracy: 0.7953
Epoch 60/200
1/20 [>.............................] - ETA: 0s - loss: 0.5337 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5701 - accuracy: 0.7937
Epoch 61/200
1/20 [>.............................] - ETA: 0s - loss: 0.5693 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5699 - accuracy: 0.7953
Epoch 62/200
1/20 [>.............................] - ETA: 0s - loss: 0.4720 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5711 - accuracy: 0.8000
Epoch 63/200
1/20 [>.............................] - ETA: 0s - loss: 0.4689 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5612 - accuracy: 0.8000
Epoch 64/200
1/20 [>.............................] - ETA: 0s - loss: 0.7224 - accuracy: 0.6250
20/20 [==============================] - 0s 1ms/step - loss: 0.5651 - accuracy: 0.8031
Epoch 65/200
1/20 [>.............................] - ETA: 0s - loss: 0.5370 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5738 - accuracy: 0.7937
Epoch 66/200
1/20 [>.............................] - ETA: 0s - loss: 0.5172 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5869 - accuracy: 0.7875
Epoch 67/200
1/20 [>.............................] - ETA: 0s - loss: 0.5874 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5629 - accuracy: 0.7937
Epoch 68/200
1/20 [>.............................] - ETA: 0s - loss: 0.7175 - accuracy: 0.6875
20/20 [==============================] - 0s 1ms/step - loss: 0.5656 - accuracy: 0.7969
Epoch 69/200
1/20 [>.............................] - ETA: 0s - loss: 0.6009 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5631 - accuracy: 0.7969
Epoch 70/200
1/20 [>.............................] - ETA: 0s - loss: 0.6002 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5561 - accuracy: 0.8031
Epoch 71/200
1/20 [>.............................] - ETA: 0s - loss: 0.2910 - accuracy: 0.9375
20/20 [==============================] - 0s 1ms/step - loss: 0.5659 - accuracy: 0.7875
Epoch 72/200
1/20 [>.............................] - ETA: 0s - loss: 0.5454 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5602 - accuracy: 0.7969
Epoch 73/200
1/20 [>.............................] - ETA: 0s - loss: 0.4472 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5491 - accuracy: 0.8047
Epoch 74/200
1/20 [>.............................] - ETA: 0s - loss: 0.5135 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5648 - accuracy: 0.7891
Epoch 75/200
1/20 [>.............................] - ETA: 0s - loss: 0.5419 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.5781 - accuracy: 0.7937
Epoch 76/200
1/20 [>.............................] - ETA: 0s - loss: 0.6084 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.5747 - accuracy: 0.7891
Epoch 77/200
1/20 [>.............................] - ETA: 0s - loss: 0.8373 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5432 - accuracy: 0.8078
Epoch 78/200
1/20 [>.............................] - ETA: 0s - loss: 0.3986 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5562 - accuracy: 0.7984
Epoch 79/200
1/20 [>.............................] - ETA: 0s - loss: 0.6122 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.5561 - accuracy: 0.7969
Epoch 80/200
1/20 [>.............................] - ETA: 0s - loss: 0.3323 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5737 - accuracy: 0.7844
Epoch 81/200
1/20 [>.............................] - ETA: 0s - loss: 0.4852 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5898 - accuracy: 0.7828
Epoch 82/200
1/20 [>.............................] - ETA: 0s - loss: 0.8386 - accuracy: 0.6562
20/20 [==============================] - 0s 1ms/step - loss: 0.5898 - accuracy: 0.7844
Epoch 83/200
1/20 [>.............................] - ETA: 0s - loss: 0.7122 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.5519 - accuracy: 0.8062
Epoch 84/200
1/20 [>.............................] - ETA: 0s - loss: 0.5021 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5522 - accuracy: 0.8094
Epoch 85/200
1/20 [>.............................] - ETA: 0s - loss: 0.6566 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5450 - accuracy: 0.8062
Epoch 86/200
1/20 [>.............................] - ETA: 0s - loss: 0.4550 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5443 - accuracy: 0.8094
Epoch 87/200
1/20 [>.............................] - ETA: 0s - loss: 0.3599 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5462 - accuracy: 0.8047
Epoch 88/200
1/20 [>.............................] - ETA: 0s - loss: 0.7188 - accuracy: 0.6562
20/20 [==============================] - 0s 1ms/step - loss: 0.5492 - accuracy: 0.8000
Epoch 89/200
1/20 [>.............................] - ETA: 0s - loss: 0.3834 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5494 - accuracy: 0.8062
Epoch 90/200
1/20 [>.............................] - ETA: 0s - loss: 0.8364 - accuracy: 0.6562
20/20 [==============================] - 0s 1ms/step - loss: 0.5544 - accuracy: 0.7953
Epoch 91/200
1/20 [>.............................] - ETA: 0s - loss: 0.4847 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5423 - accuracy: 0.8047
Epoch 92/200
1/20 [>.............................] - ETA: 0s - loss: 0.6416 - accuracy: 0.7812
20/20 [==============================] - 0s 997us/step - loss: 0.5465 - accuracy: 0.8062
Epoch 93/200
1/20 [>.............................] - ETA: 0s - loss: 0.6741 - accuracy: 0.6875
20/20 [==============================] - 0s 1ms/step - loss: 0.5503 - accuracy: 0.7953
Epoch 94/200
1/20 [>.............................] - ETA: 0s - loss: 0.5128 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5344 - accuracy: 0.8016
Epoch 95/200
1/20 [>.............................] - ETA: 0s - loss: 0.8374 - accuracy: 0.6875
20/20 [==============================] - 0s 1ms/step - loss: 0.5418 - accuracy: 0.8016
Epoch 96/200
1/20 [>.............................] - ETA: 0s - loss: 0.3566 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5461 - accuracy: 0.8094
Epoch 97/200
1/20 [>.............................] - ETA: 0s - loss: 0.8082 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.5374 - accuracy: 0.8094
Epoch 98/200
1/20 [>.............................] - ETA: 0s - loss: 0.5388 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5344 - accuracy: 0.8062
Epoch 99/200
1/20 [>.............................] - ETA: 0s - loss: 0.3697 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5288 - accuracy: 0.8141
Epoch 100/200
1/20 [>.............................] - ETA: 0s - loss: 0.4192 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5349 - accuracy: 0.8062
Epoch 101/200
1/20 [>.............................] - ETA: 0s - loss: 0.5157 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5290 - accuracy: 0.8141
Epoch 102/200
1/20 [>.............................] - ETA: 0s - loss: 0.5718 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5433 - accuracy: 0.8031
Epoch 103/200
1/20 [>.............................] - ETA: 0s - loss: 0.5273 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5234 - accuracy: 0.8141
Epoch 104/200
1/20 [>.............................] - ETA: 0s - loss: 0.3854 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5353 - accuracy: 0.8125
Epoch 105/200
1/20 [>.............................] - ETA: 0s - loss: 0.6998 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5520 - accuracy: 0.7875
Epoch 106/200
1/20 [>.............................] - ETA: 0s - loss: 0.6896 - accuracy: 0.7500
20/20 [==============================] - 0s 998us/step - loss: 0.5338 - accuracy: 0.8031
Epoch 107/200
1/20 [>.............................] - ETA: 0s - loss: 0.4004 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5368 - accuracy: 0.8125
Epoch 108/200
1/20 [>.............................] - ETA: 0s - loss: 0.3915 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5248 - accuracy: 0.8156
Epoch 109/200
1/20 [>.............................] - ETA: 0s - loss: 0.4273 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5373 - accuracy: 0.8094
Epoch 110/200
1/20 [>.............................] - ETA: 0s - loss: 0.4189 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5440 - accuracy: 0.8016
Epoch 111/200
1/20 [>.............................] - ETA: 0s - loss: 0.8649 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5597 - accuracy: 0.7891
Epoch 112/200
1/20 [>.............................] - ETA: 0s - loss: 0.6948 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5254 - accuracy: 0.8125
Epoch 113/200
1/20 [>.............................] - ETA: 0s - loss: 0.5230 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5334 - accuracy: 0.8016
Epoch 114/200
1/20 [>.............................] - ETA: 0s - loss: 0.4781 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5278 - accuracy: 0.8125
Epoch 115/200
1/20 [>.............................] - ETA: 0s - loss: 0.5725 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5220 - accuracy: 0.8078
Epoch 116/200
1/20 [>.............................] - ETA: 0s - loss: 0.8243 - accuracy: 0.6875
20/20 [==============================] - 0s 1ms/step - loss: 0.5218 - accuracy: 0.8125
Epoch 117/200
1/20 [>.............................] - ETA: 0s - loss: 0.5143 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5223 - accuracy: 0.8078
Epoch 118/200
1/20 [>.............................] - ETA: 0s - loss: 0.5604 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5254 - accuracy: 0.8156
Epoch 119/200
1/20 [>.............................] - ETA: 0s - loss: 0.7334 - accuracy: 0.6562
20/20 [==============================] - 0s 1ms/step - loss: 0.5267 - accuracy: 0.8203
Epoch 120/200
1/20 [>.............................] - ETA: 0s - loss: 0.2974 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5262 - accuracy: 0.8016
Epoch 121/200
1/20 [>.............................] - ETA: 0s - loss: 0.3540 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5208 - accuracy: 0.8062
Epoch 122/200
1/20 [>.............................] - ETA: 0s - loss: 0.2857 - accuracy: 0.9375
20/20 [==============================] - 0s 992us/step - loss: 0.5176 - accuracy: 0.8125
Epoch 123/200
1/20 [>.............................] - ETA: 0s - loss: 0.3322 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5200 - accuracy: 0.8016
Epoch 124/200
1/20 [>.............................] - ETA: 0s - loss: 0.5295 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5186 - accuracy: 0.8188
Epoch 125/200
1/20 [>.............................] - ETA: 0s - loss: 0.5877 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5193 - accuracy: 0.8062
Epoch 126/200
1/20 [>.............................] - ETA: 0s - loss: 0.3700 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5272 - accuracy: 0.8000
Epoch 127/200
1/20 [>.............................] - ETA: 0s - loss: 0.4372 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5536 - accuracy: 0.7828
Epoch 128/200
1/20 [>.............................] - ETA: 0s - loss: 0.4878 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5228 - accuracy: 0.8016
Epoch 129/200
1/20 [>.............................] - ETA: 0s - loss: 0.6316 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5108 - accuracy: 0.8203
Epoch 130/200
1/20 [>.............................] - ETA: 0s - loss: 0.5723 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5208 - accuracy: 0.8094
Epoch 131/200
1/20 [>.............................] - ETA: 0s - loss: 0.5360 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5342 - accuracy: 0.8047
Epoch 132/200
1/20 [>.............................] - ETA: 0s - loss: 0.6305 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5198 - accuracy: 0.8094
Epoch 133/200
1/20 [>.............................] - ETA: 0s - loss: 0.3703 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5100 - accuracy: 0.8094
Epoch 134/200
1/20 [>.............................] - ETA: 0s - loss: 0.4945 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5175 - accuracy: 0.8156
Epoch 135/200
1/20 [>.............................] - ETA: 0s - loss: 0.3522 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5116 - accuracy: 0.8141
Epoch 136/200
1/20 [>.............................] - ETA: 0s - loss: 0.4448 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5384 - accuracy: 0.7937
Epoch 137/200
1/20 [>.............................] - ETA: 0s - loss: 0.5380 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5120 - accuracy: 0.8172
Epoch 138/200
1/20 [>.............................] - ETA: 0s - loss: 0.4022 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5185 - accuracy: 0.8047
Epoch 139/200
1/20 [>.............................] - ETA: 0s - loss: 0.4252 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5092 - accuracy: 0.8156
Epoch 140/200
1/20 [>.............................] - ETA: 0s - loss: 0.3870 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5038 - accuracy: 0.8125
Epoch 141/200
1/20 [>.............................] - ETA: 0s - loss: 0.3823 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5180 - accuracy: 0.8062
Epoch 142/200
1/20 [>.............................] - ETA: 0s - loss: 0.6634 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5138 - accuracy: 0.8156
Epoch 143/200
1/20 [>.............................] - ETA: 0s - loss: 0.6180 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5304 - accuracy: 0.7969
Epoch 144/200
1/20 [>.............................] - ETA: 0s - loss: 0.3263 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5139 - accuracy: 0.8078
Epoch 145/200
1/20 [>.............................] - ETA: 0s - loss: 0.4445 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5027 - accuracy: 0.8219
Epoch 146/200
1/20 [>.............................] - ETA: 0s - loss: 0.5968 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5055 - accuracy: 0.8156
Epoch 147/200
1/20 [>.............................] - ETA: 0s - loss: 0.6057 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.5302 - accuracy: 0.8000
Epoch 148/200
1/20 [>.............................] - ETA: 0s - loss: 0.5226 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5291 - accuracy: 0.8016
Epoch 149/200
1/20 [>.............................] - ETA: 0s - loss: 0.2951 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5242 - accuracy: 0.8016
Epoch 150/200
1/20 [>.............................] - ETA: 0s - loss: 0.4461 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5139 - accuracy: 0.8016
Epoch 151/200
1/20 [>.............................] - ETA: 0s - loss: 0.5858 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5217 - accuracy: 0.8094
Epoch 152/200
1/20 [>.............................] - ETA: 0s - loss: 0.5791 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.4968 - accuracy: 0.8219
Epoch 153/200
1/20 [>.............................] - ETA: 0s - loss: 0.5419 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5060 - accuracy: 0.8125
Epoch 154/200
1/20 [>.............................] - ETA: 0s - loss: 0.6230 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5008 - accuracy: 0.8141
Epoch 155/200
1/20 [>.............................] - ETA: 0s - loss: 0.5437 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5050 - accuracy: 0.8125
Epoch 156/200
1/20 [>.............................] - ETA: 0s - loss: 0.5406 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5189 - accuracy: 0.8062
Epoch 157/200
1/20 [>.............................] - ETA: 0s - loss: 0.3344 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5104 - accuracy: 0.8062
Epoch 158/200
1/20 [>.............................] - ETA: 0s - loss: 0.3993 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5175 - accuracy: 0.8016
Epoch 159/200
1/20 [>.............................] - ETA: 0s - loss: 0.4598 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5225 - accuracy: 0.8016
Epoch 160/200
1/20 [>.............................] - ETA: 0s - loss: 0.4467 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5055 - accuracy: 0.8172
Epoch 161/200
1/20 [>.............................] - ETA: 0s - loss: 0.3341 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5027 - accuracy: 0.8172
Epoch 162/200
1/20 [>.............................] - ETA: 0s - loss: 0.5071 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5082 - accuracy: 0.8109
Epoch 163/200
1/20 [>.............................] - ETA: 0s - loss: 0.4235 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5077 - accuracy: 0.8047
Epoch 164/200
1/20 [>.............................] - ETA: 0s - loss: 0.4239 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5009 - accuracy: 0.8031
Epoch 165/200
1/20 [>.............................] - ETA: 0s - loss: 0.4945 - accuracy: 0.8750
20/20 [==============================] - 0s 999us/step - loss: 0.5103 - accuracy: 0.8125
Epoch 166/200
1/20 [>.............................] - ETA: 0s - loss: 0.5649 - accuracy: 0.7188
20/20 [==============================] - 0s 1ms/step - loss: 0.5091 - accuracy: 0.8031
Epoch 167/200
1/20 [>.............................] - ETA: 0s - loss: 0.5773 - accuracy: 0.7500
20/20 [==============================] - 0s 1ms/step - loss: 0.5101 - accuracy: 0.8062
Epoch 168/200
1/20 [>.............................] - ETA: 0s - loss: 0.5202 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5255 - accuracy: 0.8062
Epoch 169/200
1/20 [>.............................] - ETA: 0s - loss: 0.4153 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5045 - accuracy: 0.8125
Epoch 170/200
1/20 [>.............................] - ETA: 0s - loss: 0.5902 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5019 - accuracy: 0.8156
Epoch 171/200
1/20 [>.............................] - ETA: 0s - loss: 0.4161 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.4895 - accuracy: 0.8188
Epoch 172/200
1/20 [>.............................] - ETA: 0s - loss: 0.4054 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5019 - accuracy: 0.8156
Epoch 173/200
1/20 [>.............................] - ETA: 0s - loss: 0.3076 - accuracy: 0.9375
20/20 [==============================] - 0s 1ms/step - loss: 0.4954 - accuracy: 0.8219
Epoch 174/200
1/20 [>.............................] - ETA: 0s - loss: 0.4779 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5091 - accuracy: 0.8125
Epoch 175/200
1/20 [>.............................] - ETA: 0s - loss: 0.5585 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.4979 - accuracy: 0.8094
Epoch 176/200
1/20 [>.............................] - ETA: 0s - loss: 0.4466 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5109 - accuracy: 0.8047
Epoch 177/200
1/20 [>.............................] - ETA: 0s - loss: 0.5662 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5011 - accuracy: 0.8094
Epoch 178/200
1/20 [>.............................] - ETA: 0s - loss: 0.2879 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.4931 - accuracy: 0.8109
Epoch 179/200
1/20 [>.............................] - ETA: 0s - loss: 0.3553 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.5173 - accuracy: 0.8062
Epoch 180/200
1/20 [>.............................] - ETA: 0s - loss: 0.4403 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5814 - accuracy: 0.7828
Epoch 181/200
1/20 [>.............................] - ETA: 0s - loss: 0.4111 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.5116 - accuracy: 0.8047
Epoch 182/200
1/20 [>.............................] - ETA: 0s - loss: 0.3189 - accuracy: 0.9375
20/20 [==============================] - 0s 1ms/step - loss: 0.4961 - accuracy: 0.8203
Epoch 183/200
1/20 [>.............................] - ETA: 0s - loss: 0.3608 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.4971 - accuracy: 0.8188
Epoch 184/200
1/20 [>.............................] - ETA: 0s - loss: 0.4652 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.4885 - accuracy: 0.8172
Epoch 185/200
1/20 [>.............................] - ETA: 0s - loss: 0.4142 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.4924 - accuracy: 0.8125
Epoch 186/200
1/20 [>.............................] - ETA: 0s - loss: 0.3625 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.4973 - accuracy: 0.8078
Epoch 187/200
1/20 [>.............................] - ETA: 0s - loss: 0.4860 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.4949 - accuracy: 0.8109
Epoch 188/200
1/20 [>.............................] - ETA: 0s - loss: 0.3846 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.4982 - accuracy: 0.8125
Epoch 189/200
1/20 [>.............................] - ETA: 0s - loss: 0.3095 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.4981 - accuracy: 0.8188
Epoch 190/200
1/20 [>.............................] - ETA: 0s - loss: 0.2752 - accuracy: 0.8750
20/20 [==============================] - 0s 1ms/step - loss: 0.4896 - accuracy: 0.8172
Epoch 191/200
1/20 [>.............................] - ETA: 0s - loss: 0.3192 - accuracy: 0.9375
20/20 [==============================] - 0s 1ms/step - loss: 0.4905 - accuracy: 0.8156
Epoch 192/200
1/20 [>.............................] - ETA: 0s - loss: 0.6260 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5072 - accuracy: 0.8078
Epoch 193/200
1/20 [>.............................] - ETA: 0s - loss: 0.5014 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5126 - accuracy: 0.8000
Epoch 194/200
1/20 [>.............................] - ETA: 0s - loss: 0.5277 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.5136 - accuracy: 0.7984
Epoch 195/200
1/20 [>.............................] - ETA: 0s - loss: 0.5471 - accuracy: 0.8125
20/20 [==============================] - 0s 1ms/step - loss: 0.5502 - accuracy: 0.7969
Epoch 196/200
1/20 [>.............................] - ETA: 0s - loss: 0.6676 - accuracy: 0.7812
20/20 [==============================] - 0s 1ms/step - loss: 0.5050 - accuracy: 0.8219
Epoch 197/200
1/20 [>.............................] - ETA: 0s - loss: 0.2430 - accuracy: 0.9062
20/20 [==============================] - 0s 1ms/step - loss: 0.4951 - accuracy: 0.8031
Epoch 198/200
1/20 [>.............................] - ETA: 0s - loss: 0.7437 - accuracy: 0.7188
20/20 [==============================] - 0s 977us/step - loss: 0.4862 - accuracy: 0.8188
Epoch 199/200
1/20 [>.............................] - ETA: 0s - loss: 0.3874 - accuracy: 0.8438
20/20 [==============================] - 0s 994us/step - loss: 0.4986 - accuracy: 0.8234
Epoch 200/200
1/20 [>.............................] - ETA: 0s - loss: 0.3672 - accuracy: 0.8438
20/20 [==============================] - 0s 1ms/step - loss: 0.4984 - accuracy: 0.8125
<keras.src.callbacks.History object at 0x7f883f569990>
# Evaluate your model accuracy on the test data
accuracy = model.evaluate(coord_test, competitors_test)[1]
1/5 [=====>........................] - ETA: 0s - loss: 0.9331 - accuracy: 0.6562
5/5 [==============================] - 0s 1ms/step - loss: 0.6476 - accuracy: 0.7875
# Print accuracy
print('Accuracy:', accuracy)
Accuracy: 0.7875000238418579
Softmax predictions
Your recently trained model is loaded for you. This model is generalizing well, which is why you got a high accuracy on the test set.
Since you used the softmax activation function, for every input of 2 coordinates provided to your model there’s an output vector of 4 numbers. Each of these numbers encodes the probability of a given dart being thrown by one of the 4 possible competitors.
When computing accuracy with the model’s .evaluate() method, your model takes the class with the highest probability as the prediction. np.argmax() can help you do this since it returns the index of the highest value in an array.
Use the collection of test throws stored in coords_small_test and np.argmax() to check this out!
import numpy as np
# Predict on coords_small_test
coords_small_test = coord_test.iloc[:5, :]
competitors_small_test = competitors_test[:5, :]
preds = model.predict(coords_small_test)
1/1 [==============================] - ETA: 0s
1/1 [==============================] - 0s 37ms/step
# Print preds vs true values
print (" {:45} | {} " .format ('Raw Model Predictions' ,'True labels' ))
Raw Model Predictions | True labels
for i, pred in enumerate(preds):
    print("{} | {}".format(pred, competitors_small_test[i]))
[0.23973112 0.02065724 0.7249623 0.0146494 ] | [0. 0. 1. 0.]
[6.1699899e-04 9.9796259e-01 1.3867869e-03 3.3657390e-05] | [0. 1. 0. 0.]
[0.6093697 0.02614117 0.35528654 0.00920259] | [1. 0. 0. 0.]
[0.09525175 0.01195013 0.8808766 0.01192153] | [1. 0. 0. 0.]
[0.2235388 0.01284886 0.7572938 0.00631859] | [0. 0. 0. 1.]
# Extract the position of highest probability from each pred vector
preds_chosen = [np.argmax(pred) for pred in preds]
# Print preds vs true values
print (" {:10} | {} " .format ('Rounded Model Predictions' ,'True labels' ))
Rounded Model Predictions | True labels
for i, pred in enumerate(preds_chosen):
    print("{:25} | {}".format(pred, competitors_small_test[i]))
2 | [0. 0. 1. 0.]
1 | [0. 1. 0. 0.]
0 | [1. 0. 0. 0.]
2 | [1. 0. 0. 0.]
2 | [0. 0. 0. 1.]
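As a quick sanity check, here is a minimal sketch (reusing preds_chosen, competitors_small_test and np from the code above) that computes the accuracy on these 5 throws by comparing the chosen class with the position of the 1 in each true one-hot label:
# Take the true class as the position of the 1 in each one-hot label
true_chosen = [np.argmax(label) for label in competitors_small_test]
# Fraction of throws where the predicted class matches the true class
small_test_accuracy = np.mean(np.array(preds_chosen) == np.array(true_chosen))
print('Accuracy on the 5 sample throws:', small_test_accuracy)
With the predictions shown above, 3 of the 5 throws match, giving 0.6 on this tiny sample.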
An irrigation machine
You’re going to automate the watering of farm parcels by making an intelligent irrigation machine. Multi-label classification problems differ from multi-class problems in that each observation can be labeled with zero or more classes. Since classes/labels are not mutually exclusive, you could water all, none, or any combination of farm parcels based on the inputs.
To account for this behavior, the output layer has as many neurons as there are classes, but this time, unlike in multi-class problems, each output neuron uses a sigmoid activation function. This lets each neuron in the output layer produce a number between 0 and 1 independently of the others.
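Unlike softmax outputs, these sigmoid outputs do not need to sum to 1. A tiny sketch makes the difference concrete (the logit values below are made up for illustration only):
import numpy as np
# Hypothetical raw outputs (logits) for the 3 parcels -- illustrative values only
logits = np.array([1.2, -0.4, 2.0])
sigmoid = 1 / (1 + np.exp(-logits))              # independent probabilities, one per parcel
softmax = np.exp(logits) / np.exp(logits).sum()  # mutually exclusive probabilities
print(sigmoid, sigmoid.sum())  # each value in (0, 1); the sum is not constrained to 1
print(softmax, softmax.sum())  # values sum to 1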
The Sequential() model and Dense() layers are ready to be used. It’s time to build an intelligent irrigation machine!
# Instantiate a Sequential model
model = Sequential()
# Add a hidden layer of 64 neurons taking an input of 20 features
model.add(Dense(64, input_shape=(20,), activation='relu'))
# Add an output layer of 3 neurons with sigmoid activation
model.add(Dense(3, activation='sigmoid'))
# Compile your model with binary crossentropy loss
model.compile(optimizer='adam',
              loss="binary_crossentropy",
              metrics=['accuracy'])
model.summary()
Model: "sequential_6"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense_15 (Dense) (None, 64) 1344
dense_16 (Dense) (None, 3) 195
=================================================================
Total params: 1539 (6.01 KB)
Trainable params: 1539 (6.01 KB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
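As a quick check on the summary above, the parameter counts follow the usual Dense layer rule of inputs times neurons plus one bias per neuron:
# Parameters of a Dense layer = inputs * neurons + biases (one per neuron)
hidden_params = 20 * 64 + 64   # 1344, matches the hidden layer above
output_params = 64 * 3 + 3     # 195, matches the output layer above
print(hidden_params + output_params)  # 1539 total (all trainable)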
Training with multiple labels
An output of your multi-label model could look like this: [0.76, 0.99, 0.66]. If we round probabilities higher than 0.5 up to 1, this observation will be classified as containing all 3 possible labels: [1, 1, 1]. For this particular problem, this would mean that watering all 3 parcels in your farm is the right thing to do, according to the network, given the input sensor measurements.
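That rounding step is easy to sketch with numpy, using the example values from the paragraph above:
import numpy as np
# Values above 0.5 round to 1, so this observation gets all three labels
print(np.round([0.76, 0.99, 0.66]))  # [1. 1. 1.]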
You will now train and predict with the model you just built. sensors_train, parcels_train, sensors_test and parcels_test are already loaded for you to use.
Let’s see how well your intelligent machine performs!
irrigation_df = pd.read_csv('data/irrigation_machine.csv', index_col=0)
# Define lists of column names for sensors and parcels
sensor_columns = [col for col in irrigation_df.columns if 'sensor_' in col]
parcel_columns = [col for col in irrigation_df.columns if 'parcel_' in col]
# Separate sensor readings and parcel indicators
sensors = irrigation_df[sensor_columns]  # Sensor readings columns
parcels = irrigation_df[parcel_columns]  # Parcel indicators columns
# Split the datasets into training and testing sets
sensors_train, sensors_test, parcels_train, parcels_test = train_test_split(
    sensors, parcels, test_size=0.2, random_state=42
)
# Checking the shapes of the resulting datasets to confirm successful split
sensors_train.shape, sensors_test.shape, parcels_train.shape, parcels_test.shape
((1600, 20), (400, 20), (1600, 3), (400, 3))
# Train for 100 epochs using a validation split of 0.2
model.fit(sensors_train, parcels_train, epochs=100, validation_split=0.2)
Epoch 1/100
1/40 [..............................] - ETA: 9s - loss: 1.0210 - accuracy: 0.1875
40/40 [==============================] - 0s 4ms/step - loss: 0.6166 - accuracy: 0.5453 - val_loss: 0.4724 - val_accuracy: 0.5031
Epoch 2/100
1/40 [..............................] - ETA: 0s - loss: 0.4952 - accuracy: 0.5312
40/40 [==============================] - 0s 1ms/step - loss: 0.4354 - accuracy: 0.6359 - val_loss: 0.3659 - val_accuracy: 0.6187
Epoch 3/100
1/40 [..............................] - ETA: 0s - loss: 0.4524 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.3614 - accuracy: 0.6625 - val_loss: 0.3289 - val_accuracy: 0.6000
Epoch 4/100
1/40 [..............................] - ETA: 0s - loss: 0.3064 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.3270 - accuracy: 0.6555 - val_loss: 0.3005 - val_accuracy: 0.5813
Epoch 5/100
1/40 [..............................] - ETA: 0s - loss: 0.2707 - accuracy: 0.6875
40/40 [==============================] - 0s 1ms/step - loss: 0.3071 - accuracy: 0.6625 - val_loss: 0.2874 - val_accuracy: 0.5906
Epoch 6/100
1/40 [..............................] - ETA: 0s - loss: 0.2405 - accuracy: 0.7188
40/40 [==============================] - 0s 1ms/step - loss: 0.2938 - accuracy: 0.6383 - val_loss: 0.2795 - val_accuracy: 0.6406
Epoch 7/100
1/40 [..............................] - ETA: 0s - loss: 0.3016 - accuracy: 0.7188
40/40 [==============================] - 0s 1ms/step - loss: 0.2836 - accuracy: 0.6539 - val_loss: 0.2667 - val_accuracy: 0.6094
Epoch 8/100
1/40 [..............................] - ETA: 0s - loss: 0.3170 - accuracy: 0.6562
40/40 [==============================] - 0s 2ms/step - loss: 0.2753 - accuracy: 0.6430 - val_loss: 0.2601 - val_accuracy: 0.6219
Epoch 9/100
1/40 [..............................] - ETA: 0s - loss: 0.2538 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.2687 - accuracy: 0.6453 - val_loss: 0.2579 - val_accuracy: 0.5719
Epoch 10/100
1/40 [..............................] - ETA: 0s - loss: 0.2941 - accuracy: 0.5312
40/40 [==============================] - 0s 1ms/step - loss: 0.2607 - accuracy: 0.6391 - val_loss: 0.2499 - val_accuracy: 0.5969
Epoch 11/100
1/40 [..............................] - ETA: 0s - loss: 0.1631 - accuracy: 0.7188
40/40 [==============================] - 0s 1ms/step - loss: 0.2570 - accuracy: 0.6359 - val_loss: 0.2430 - val_accuracy: 0.6000
Epoch 12/100
1/40 [..............................] - ETA: 0s - loss: 0.2997 - accuracy: 0.5312
40/40 [==============================] - 0s 1ms/step - loss: 0.2514 - accuracy: 0.6406 - val_loss: 0.2481 - val_accuracy: 0.5969
Epoch 13/100
1/40 [..............................] - ETA: 0s - loss: 0.3267 - accuracy: 0.6875
40/40 [==============================] - 0s 1ms/step - loss: 0.2451 - accuracy: 0.6367 - val_loss: 0.2336 - val_accuracy: 0.6344
Epoch 14/100
1/40 [..............................] - ETA: 0s - loss: 0.2336 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.2396 - accuracy: 0.6375 - val_loss: 0.2291 - val_accuracy: 0.6250
Epoch 15/100
1/40 [..............................] - ETA: 0s - loss: 0.2252 - accuracy: 0.5312
40/40 [==============================] - 0s 1ms/step - loss: 0.2353 - accuracy: 0.6383 - val_loss: 0.2270 - val_accuracy: 0.5969
Epoch 16/100
1/40 [..............................] - ETA: 0s - loss: 0.2371 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.2316 - accuracy: 0.6305 - val_loss: 0.2219 - val_accuracy: 0.6000
Epoch 17/100
1/40 [..............................] - ETA: 0s - loss: 0.2420 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.2283 - accuracy: 0.6328 - val_loss: 0.2188 - val_accuracy: 0.5844
Epoch 18/100
1/40 [..............................] - ETA: 0s - loss: 0.2861 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.2243 - accuracy: 0.6289 - val_loss: 0.2241 - val_accuracy: 0.6250
Epoch 19/100
1/40 [..............................] - ETA: 0s - loss: 0.2370 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.2254 - accuracy: 0.6211 - val_loss: 0.2131 - val_accuracy: 0.5844
Epoch 20/100
1/40 [..............................] - ETA: 0s - loss: 0.1997 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.2186 - accuracy: 0.6352 - val_loss: 0.2129 - val_accuracy: 0.6031
Epoch 21/100
1/40 [..............................] - ETA: 0s - loss: 0.2706 - accuracy: 0.7188
40/40 [==============================] - 0s 1ms/step - loss: 0.2164 - accuracy: 0.6344 - val_loss: 0.2083 - val_accuracy: 0.6281
Epoch 22/100
1/40 [..............................] - ETA: 0s - loss: 0.1379 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.2124 - accuracy: 0.6203 - val_loss: 0.2091 - val_accuracy: 0.6000
Epoch 23/100
1/40 [..............................] - ETA: 0s - loss: 0.1925 - accuracy: 0.6875
40/40 [==============================] - 0s 1ms/step - loss: 0.2109 - accuracy: 0.6367 - val_loss: 0.2061 - val_accuracy: 0.5906
Epoch 24/100
1/40 [..............................] - ETA: 0s - loss: 0.1991 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.2096 - accuracy: 0.6164 - val_loss: 0.2069 - val_accuracy: 0.5875
Epoch 25/100
1/40 [..............................] - ETA: 0s - loss: 0.1747 - accuracy: 0.5312
40/40 [==============================] - 0s 1ms/step - loss: 0.2067 - accuracy: 0.6367 - val_loss: 0.2080 - val_accuracy: 0.5875
Epoch 26/100
1/40 [..............................] - ETA: 0s - loss: 0.2787 - accuracy: 0.6562
40/40 [==============================] - 0s 1ms/step - loss: 0.2048 - accuracy: 0.6203 - val_loss: 0.2048 - val_accuracy: 0.5875
Epoch 27/100
1/40 [..............................] - ETA: 0s - loss: 0.1258 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.2041 - accuracy: 0.6211 - val_loss: 0.2041 - val_accuracy: 0.6031
Epoch 28/100
1/40 [..............................] - ETA: 0s - loss: 0.2780 - accuracy: 0.5625
40/40 [==============================] - 0s 1ms/step - loss: 0.2026 - accuracy: 0.6109 - val_loss: 0.2032 - val_accuracy: 0.5469
Epoch 29/100
1/40 [..............................] - ETA: 0s - loss: 0.2479 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.2014 - accuracy: 0.6320 - val_loss: 0.1995 - val_accuracy: 0.6031
Epoch 30/100
1/40 [..............................] - ETA: 0s - loss: 0.1830 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.1988 - accuracy: 0.6164 - val_loss: 0.2014 - val_accuracy: 0.5938
Epoch 31/100
1/40 [..............................] - ETA: 0s - loss: 0.2314 - accuracy: 0.7188
40/40 [==============================] - 0s 1ms/step - loss: 0.1989 - accuracy: 0.6102 - val_loss: 0.2044 - val_accuracy: 0.5719
Epoch 32/100
1/40 [..............................] - ETA: 0s - loss: 0.2450 - accuracy: 0.6875
40/40 [==============================] - 0s 1ms/step - loss: 0.1971 - accuracy: 0.6383 - val_loss: 0.1996 - val_accuracy: 0.5625
Epoch 33/100
1/40 [..............................] - ETA: 0s - loss: 0.2699 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.1968 - accuracy: 0.6070 - val_loss: 0.1983 - val_accuracy: 0.6125
Epoch 34/100
1/40 [..............................] - ETA: 0s - loss: 0.1943 - accuracy: 0.6875
40/40 [==============================] - 0s 1ms/step - loss: 0.1974 - accuracy: 0.6328 - val_loss: 0.1986 - val_accuracy: 0.6156
Epoch 35/100
1/40 [..............................] - ETA: 0s - loss: 0.1560 - accuracy: 0.5625
40/40 [==============================] - 0s 1ms/step - loss: 0.1941 - accuracy: 0.6250 - val_loss: 0.2000 - val_accuracy: 0.6031
Epoch 36/100
1/40 [..............................] - ETA: 0s - loss: 0.1346 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.1924 - accuracy: 0.6039 - val_loss: 0.2004 - val_accuracy: 0.6438
Epoch 37/100
1/40 [..............................] - ETA: 0s - loss: 0.4154 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.1915 - accuracy: 0.6297 - val_loss: 0.1974 - val_accuracy: 0.6125
Epoch 38/100
1/40 [..............................] - ETA: 0s - loss: 0.1666 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.1913 - accuracy: 0.6242 - val_loss: 0.1975 - val_accuracy: 0.5969
Epoch 39/100
1/40 [..............................] - ETA: 0s - loss: 0.1826 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.1902 - accuracy: 0.6336 - val_loss: 0.1994 - val_accuracy: 0.5562
Epoch 40/100
1/40 [..............................] - ETA: 0s - loss: 0.1786 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.1907 - accuracy: 0.6039 - val_loss: 0.1969 - val_accuracy: 0.6344
Epoch 41/100
1/40 [..............................] - ETA: 0s - loss: 0.2510 - accuracy: 0.6875
40/40 [==============================] - 0s 1ms/step - loss: 0.1877 - accuracy: 0.6258 - val_loss: 0.2023 - val_accuracy: 0.5531
Epoch 42/100
1/40 [..............................] - ETA: 0s - loss: 0.1810 - accuracy: 0.6562
40/40 [==============================] - 0s 1ms/step - loss: 0.1913 - accuracy: 0.6141 - val_loss: 0.2008 - val_accuracy: 0.5781
Epoch 43/100
1/40 [..............................] - ETA: 0s - loss: 0.1925 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.1857 - accuracy: 0.6211 - val_loss: 0.1972 - val_accuracy: 0.6219
Epoch 44/100
1/40 [..............................] - ETA: 0s - loss: 0.1920 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.1830 - accuracy: 0.6133 - val_loss: 0.1957 - val_accuracy: 0.6219
Epoch 45/100
1/40 [..............................] - ETA: 0s - loss: 0.2360 - accuracy: 0.6562
40/40 [==============================] - 0s 1ms/step - loss: 0.1825 - accuracy: 0.6242 - val_loss: 0.1973 - val_accuracy: 0.5406
Epoch 46/100
1/40 [..............................] - ETA: 0s - loss: 0.1993 - accuracy: 0.5000
40/40 [==============================] - 0s 1ms/step - loss: 0.1810 - accuracy: 0.6117 - val_loss: 0.2024 - val_accuracy: 0.6469
Epoch 47/100
1/40 [..............................] - ETA: 0s - loss: 0.1295 - accuracy: 0.6875
40/40 [==============================] - 0s 1ms/step - loss: 0.1833 - accuracy: 0.6195 - val_loss: 0.1958 - val_accuracy: 0.5875
Epoch 48/100
1/40 [..............................] - ETA: 0s - loss: 0.1159 - accuracy: 0.5000
40/40 [==============================] - 0s 1ms/step - loss: 0.1796 - accuracy: 0.6219 - val_loss: 0.1974 - val_accuracy: 0.5813
Epoch 49/100
1/40 [..............................] - ETA: 0s - loss: 0.1178 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.1807 - accuracy: 0.6148 - val_loss: 0.1983 - val_accuracy: 0.6250
Epoch 50/100
1/40 [..............................] - ETA: 0s - loss: 0.1903 - accuracy: 0.6562
40/40 [==============================] - 0s 1ms/step - loss: 0.1788 - accuracy: 0.6234 - val_loss: 0.1985 - val_accuracy: 0.5594
Epoch 51/100
1/40 [..............................] - ETA: 0s - loss: 0.1265 - accuracy: 0.5000
40/40 [==============================] - 0s 1ms/step - loss: 0.1794 - accuracy: 0.6078 - val_loss: 0.1992 - val_accuracy: 0.6281
Epoch 52/100
1/40 [..............................] - ETA: 0s - loss: 0.2288 - accuracy: 0.7188
40/40 [==============================] - 0s 1ms/step - loss: 0.1790 - accuracy: 0.6172 - val_loss: 0.1970 - val_accuracy: 0.6281
Epoch 53/100
1/40 [..............................] - ETA: 0s - loss: 0.1011 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.1772 - accuracy: 0.6219 - val_loss: 0.1955 - val_accuracy: 0.5719
Epoch 54/100
1/40 [..............................] - ETA: 0s - loss: 0.1309 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.1737 - accuracy: 0.6328 - val_loss: 0.2026 - val_accuracy: 0.5906
Epoch 55/100
1/40 [..............................] - ETA: 0s - loss: 0.2361 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.1729 - accuracy: 0.6117 - val_loss: 0.1969 - val_accuracy: 0.5500
Epoch 56/100
1/40 [..............................] - ETA: 0s - loss: 0.0671 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.1756 - accuracy: 0.6250 - val_loss: 0.1975 - val_accuracy: 0.5094
Epoch 57/100
1/40 [..............................] - ETA: 0s - loss: 0.2057 - accuracy: 0.3750
40/40 [==============================] - 0s 1ms/step - loss: 0.1722 - accuracy: 0.6094 - val_loss: 0.1976 - val_accuracy: 0.5781
Epoch 58/100
1/40 [..............................] - ETA: 0s - loss: 0.1633 - accuracy: 0.6562
40/40 [==============================] - 0s 1ms/step - loss: 0.1707 - accuracy: 0.6289 - val_loss: 0.1975 - val_accuracy: 0.5906
Epoch 59/100
1/40 [..............................] - ETA: 0s - loss: 0.1115 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.1713 - accuracy: 0.6039 - val_loss: 0.2010 - val_accuracy: 0.5969
Epoch 60/100
1/40 [..............................] - ETA: 0s - loss: 0.1610 - accuracy: 0.6875
40/40 [==============================] - 0s 1ms/step - loss: 0.1690 - accuracy: 0.6141 - val_loss: 0.1997 - val_accuracy: 0.5875
Epoch 61/100
1/40 [..............................] - ETA: 0s - loss: 0.1141 - accuracy: 0.7812
40/40 [==============================] - 0s 1ms/step - loss: 0.1698 - accuracy: 0.6195 - val_loss: 0.1975 - val_accuracy: 0.5625
Epoch 62/100
1/40 [..............................] - ETA: 0s - loss: 0.2327 - accuracy: 0.6562
40/40 [==============================] - 0s 1ms/step - loss: 0.1687 - accuracy: 0.6078 - val_loss: 0.1989 - val_accuracy: 0.6344
Epoch 63/100
1/40 [..............................] - ETA: 0s - loss: 0.1522 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.1715 - accuracy: 0.6328 - val_loss: 0.2056 - val_accuracy: 0.5562
Epoch 64/100
1/40 [..............................] - ETA: 0s - loss: 0.2274 - accuracy: 0.5625
40/40 [==============================] - 0s 1ms/step - loss: 0.1663 - accuracy: 0.5984 - val_loss: 0.2015 - val_accuracy: 0.5969
Epoch 65/100
1/40 [..............................] - ETA: 0s - loss: 0.1729 - accuracy: 0.5312
40/40 [==============================] - 0s 1ms/step - loss: 0.1665 - accuracy: 0.6242 - val_loss: 0.2032 - val_accuracy: 0.5281
Epoch 66/100
1/40 [..............................] - ETA: 0s - loss: 0.1149 - accuracy: 0.5312
40/40 [==============================] - 0s 1ms/step - loss: 0.1707 - accuracy: 0.6211 - val_loss: 0.2007 - val_accuracy: 0.5344
Epoch 67/100
1/40 [..............................] - ETA: 0s - loss: 0.2176 - accuracy: 0.6250
40/40 [==============================] - 0s 2ms/step - loss: 0.1669 - accuracy: 0.6438 - val_loss: 0.2012 - val_accuracy: 0.5063
Epoch 68/100
1/40 [..............................] - ETA: 0s - loss: 0.2165 - accuracy: 0.4062
40/40 [==============================] - 0s 2ms/step - loss: 0.1643 - accuracy: 0.5945 - val_loss: 0.1956 - val_accuracy: 0.5844
Epoch 69/100
1/40 [..............................] - ETA: 0s - loss: 0.1128 - accuracy: 0.5938
40/40 [==============================] - 0s 2ms/step - loss: 0.1617 - accuracy: 0.6180 - val_loss: 0.1972 - val_accuracy: 0.6156
Epoch 70/100
1/40 [..............................] - ETA: 0s - loss: 0.1535 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.1642 - accuracy: 0.6281 - val_loss: 0.1973 - val_accuracy: 0.5938
Epoch 71/100
1/40 [..............................] - ETA: 0s - loss: 0.1645 - accuracy: 0.6250
40/40 [==============================] - 0s 1ms/step - loss: 0.1620 - accuracy: 0.6070 - val_loss: 0.2003 - val_accuracy: 0.5875
Epoch 72/100
1/40 [..............................] - ETA: 0s - loss: 0.1162 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.1598 - accuracy: 0.5961 - val_loss: 0.2029 - val_accuracy: 0.6406
Epoch 73/100
1/40 [..............................] - ETA: 0s - loss: 0.2259 - accuracy: 0.7812
40/40 [==============================] - 0s 1ms/step - loss: 0.1605 - accuracy: 0.6195 - val_loss: 0.1959 - val_accuracy: 0.5813
Epoch 74/100
1/40 [..............................] - ETA: 0s - loss: 0.0943 - accuracy: 0.5625
40/40 [==============================] - 0s 1ms/step - loss: 0.1582 - accuracy: 0.6187 - val_loss: 0.2016 - val_accuracy: 0.6219
Epoch 75/100
1/40 [..............................] - ETA: 0s - loss: 0.1929 - accuracy: 0.8438
40/40 [==============================] - 0s 2ms/step - loss: 0.1577 - accuracy: 0.6172 - val_loss: 0.1971 - val_accuracy: 0.5906
Epoch 76/100
1/40 [..............................] - ETA: 0s - loss: 0.1708 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.1566 - accuracy: 0.5992 - val_loss: 0.1997 - val_accuracy: 0.5531
Epoch 77/100
1/40 [..............................] - ETA: 0s - loss: 0.1765 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.1553 - accuracy: 0.6062 - val_loss: 0.1964 - val_accuracy: 0.5844
Epoch 78/100
1/40 [..............................] - ETA: 0s - loss: 0.1911 - accuracy: 0.5000
40/40 [==============================] - 0s 1ms/step - loss: 0.1552 - accuracy: 0.6195 - val_loss: 0.1980 - val_accuracy: 0.5625
Epoch 79/100
1/40 [..............................] - ETA: 0s - loss: 0.0991 - accuracy: 0.4375
40/40 [==============================] - 0s 1ms/step - loss: 0.1549 - accuracy: 0.6266 - val_loss: 0.2016 - val_accuracy: 0.5344
Epoch 80/100
1/40 [..............................] - ETA: 0s - loss: 0.1141 - accuracy: 0.5312
40/40 [==============================] - 0s 1ms/step - loss: 0.1536 - accuracy: 0.6016 - val_loss: 0.1986 - val_accuracy: 0.5625
Epoch 81/100
1/40 [..............................] - ETA: 0s - loss: 0.1390 - accuracy: 0.7188
40/40 [==============================] - 0s 1ms/step - loss: 0.1521 - accuracy: 0.6078 - val_loss: 0.1979 - val_accuracy: 0.5469
Epoch 82/100
1/40 [..............................] - ETA: 0s - loss: 0.1211 - accuracy: 0.5000
40/40 [==============================] - 0s 1ms/step - loss: 0.1518 - accuracy: 0.6031 - val_loss: 0.1973 - val_accuracy: 0.5781
Epoch 83/100
1/40 [..............................] - ETA: 0s - loss: 0.1348 - accuracy: 0.5000
40/40 [==============================] - 0s 1ms/step - loss: 0.1512 - accuracy: 0.5953 - val_loss: 0.2016 - val_accuracy: 0.5375
Epoch 84/100
1/40 [..............................] - ETA: 0s - loss: 0.1569 - accuracy: 0.3438
40/40 [==============================] - 0s 1ms/step - loss: 0.1520 - accuracy: 0.6344 - val_loss: 0.2029 - val_accuracy: 0.5875
Epoch 85/100
1/40 [..............................] - ETA: 0s - loss: 0.1344 - accuracy: 0.6875
40/40 [==============================] - 0s 1ms/step - loss: 0.1503 - accuracy: 0.6039 - val_loss: 0.2040 - val_accuracy: 0.5156
Epoch 86/100
1/40 [..............................] - ETA: 0s - loss: 0.1431 - accuracy: 0.5000
40/40 [==============================] - 0s 1ms/step - loss: 0.1494 - accuracy: 0.6031 - val_loss: 0.1983 - val_accuracy: 0.5750
Epoch 87/100
1/40 [..............................] - ETA: 0s - loss: 0.0581 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.1476 - accuracy: 0.6305 - val_loss: 0.1986 - val_accuracy: 0.5750
Epoch 88/100
1/40 [..............................] - ETA: 0s - loss: 0.0602 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.1466 - accuracy: 0.6016 - val_loss: 0.2017 - val_accuracy: 0.5813
Epoch 89/100
1/40 [..............................] - ETA: 0s - loss: 0.0902 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.1458 - accuracy: 0.6133 - val_loss: 0.2025 - val_accuracy: 0.5250
Epoch 90/100
1/40 [..............................] - ETA: 0s - loss: 0.0771 - accuracy: 0.5625
40/40 [==============================] - 0s 1ms/step - loss: 0.1452 - accuracy: 0.6117 - val_loss: 0.1997 - val_accuracy: 0.5688
Epoch 91/100
1/40 [..............................] - ETA: 0s - loss: 0.1767 - accuracy: 0.3750
40/40 [==============================] - 0s 1ms/step - loss: 0.1441 - accuracy: 0.6008 - val_loss: 0.2038 - val_accuracy: 0.5562
Epoch 92/100
1/40 [..............................] - ETA: 0s - loss: 0.1240 - accuracy: 0.4688
40/40 [==============================] - 0s 1ms/step - loss: 0.1448 - accuracy: 0.6055 - val_loss: 0.1993 - val_accuracy: 0.5344
Epoch 93/100
1/40 [..............................] - ETA: 0s - loss: 0.0880 - accuracy: 0.6875
40/40 [==============================] - 0s 1ms/step - loss: 0.1442 - accuracy: 0.6016 - val_loss: 0.2073 - val_accuracy: 0.5531
Epoch 94/100
1/40 [..............................] - ETA: 0s - loss: 0.0965 - accuracy: 0.7188
40/40 [==============================] - 0s 1ms/step - loss: 0.1447 - accuracy: 0.5938 - val_loss: 0.2029 - val_accuracy: 0.5562
Epoch 95/100
1/40 [..............................] - ETA: 0s - loss: 0.1983 - accuracy: 0.5312
40/40 [==============================] - 0s 1ms/step - loss: 0.1418 - accuracy: 0.6133 - val_loss: 0.2041 - val_accuracy: 0.5562
Epoch 96/100
1/40 [..............................] - ETA: 0s - loss: 0.1344 - accuracy: 0.5625
40/40 [==============================] - 0s 1ms/step - loss: 0.1426 - accuracy: 0.5930 - val_loss: 0.1997 - val_accuracy: 0.5969
Epoch 97/100
1/40 [..............................] - ETA: 0s - loss: 0.1168 - accuracy: 0.7500
40/40 [==============================] - 0s 1ms/step - loss: 0.1394 - accuracy: 0.6234 - val_loss: 0.2016 - val_accuracy: 0.5594
Epoch 98/100
1/40 [..............................] - ETA: 0s - loss: 0.1116 - accuracy: 0.5625
40/40 [==============================] - 0s 1ms/step - loss: 0.1442 - accuracy: 0.5984 - val_loss: 0.2012 - val_accuracy: 0.5688
Epoch 99/100
1/40 [..............................] - ETA: 0s - loss: 0.0872 - accuracy: 0.5625
40/40 [==============================] - 0s 1ms/step - loss: 0.1398 - accuracy: 0.6102 - val_loss: 0.1995 - val_accuracy: 0.6094
Epoch 100/100
1/40 [..............................] - ETA: 0s - loss: 0.1313 - accuracy: 0.5938
40/40 [==============================] - 0s 1ms/step - loss: 0.1389 - accuracy: 0.5914 - val_loss: 0.2013 - val_accuracy: 0.5562
<keras.src.callbacks.History object at 0x7f883e7a5d50>
# Predict on sensors_test and round up the predictions
preds = model.predict(sensors_test)
1/13 [=>............................] - ETA: 0s
13/13 [==============================] - 0s 837us/step
preds_rounded = np.round(preds)
# Print rounded preds
print('Rounded Predictions:\n', preds_rounded)
Rounded Predictions:
[[1. 1. 0.]
[0. 0. 0.]
[1. 1. 0.]
...
[0. 0. 0.]
[1. 1. 0.]
[0. 1. 0.]]
# Evaluate your model's accuracy on the test data
accuracy = model.evaluate(sensors_test, parcels_test)[1]
1/13 [=>............................] - ETA: 0s - loss: 0.3071 - accuracy: 0.5938
13/13 [==============================] - 0s 961us/step - loss: 0.2477 - accuracy: 0.6025
# Print accuracy
print('Accuracy:', accuracy)
Accuracy: 0.6025000214576721
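If you want a stricter view than this single number, here is a minimal numpy sketch (reusing preds_rounded, parcels_test and np from above) that contrasts per-parcel accuracy with the exact-match ratio, where all three labels must be right at once:
parcels_true = np.asarray(parcels_test)
# Fraction of individual parcel decisions that are correct, per parcel
per_parcel_acc = np.mean(preds_rounded == parcels_true, axis=0)
# Fraction of observations where all 3 parcels are predicted correctly at once
exact_match = np.mean(np.all(preds_rounded == parcels_true, axis=1))
print('Per-parcel accuracy:', per_parcel_acc)
print('Exact-match ratio:', exact_match)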
The history callback
The history callback is returned by default every time you train a model with the .fit() method. To access these metrics, you can look them up in the history dictionary attribute of the returned h_callback object using the corresponding keys.
The binary classifier you built to detect fake dollar bills is rebuilt and trained below, with its features and labels loaded as X_train, y_train, X_test and y_test. This time you will store the model’s history callback and use the validation_data parameter as it trains.
You will plot the results stored in history with plot_accuracy() and plot_loss(), two simple matplotlib functions. You can check their code in the console by pasting show_code(plot_loss).
Let’s take a look behind the scenes of our training!
# Create a sequential model
model = Sequential()
# Add a dense layer
model.add(Dense(1, input_shape=(4,), activation="sigmoid"))
# Compile your model
model.compile(loss='binary_crossentropy', optimizer="sgd", metrics=['accuracy'])
# Train your model and save its history
h_callback = model.fit(X_train, y_train, epochs=100,
                       validation_data=(X_test, y_test))
Epoch 1/100
1/35 [..............................] - ETA: 6s - loss: 3.4288 - accuracy: 0.2812
35/35 [==============================] - 0s 4ms/step - loss: 2.0041 - accuracy: 0.4321 - val_loss: 0.9324 - val_accuracy: 0.5745
Epoch 2/100
1/35 [..............................] - ETA: 0s - loss: 1.3360 - accuracy: 0.4375
35/35 [==============================] - 0s 2ms/step - loss: 0.6665 - accuracy: 0.7010 - val_loss: 0.5263 - val_accuracy: 0.7818
Epoch 3/100
1/35 [..............................] - ETA: 0s - loss: 0.4071 - accuracy: 0.8438
35/35 [==============================] - 0s 1ms/step - loss: 0.3838 - accuracy: 0.8396 - val_loss: 0.3436 - val_accuracy: 0.8509
Epoch 4/100
1/35 [..............................] - ETA: 0s - loss: 0.3864 - accuracy: 0.8125
35/35 [==============================] - 0s 1ms/step - loss: 0.2685 - accuracy: 0.9189 - val_loss: 0.2707 - val_accuracy: 0.9164
Epoch 5/100
1/35 [..............................] - ETA: 0s - loss: 0.2137 - accuracy: 0.9688
35/35 [==============================] - 0s 1ms/step - loss: 0.2201 - accuracy: 0.9526 - val_loss: 0.2376 - val_accuracy: 0.9236
Epoch 6/100
1/35 [..............................] - ETA: 0s - loss: 0.2107 - accuracy: 0.9375
35/35 [==============================] - 0s 1ms/step - loss: 0.1952 - accuracy: 0.9608 - val_loss: 0.2184 - val_accuracy: 0.9309
Epoch 7/100
1/35 [..............................] - ETA: 0s - loss: 0.1799 - accuracy: 0.9688
35/35 [==============================] - 0s 1ms/step - loss: 0.1788 - accuracy: 0.9635 - val_loss: 0.2044 - val_accuracy: 0.9418
Epoch 8/100
1/35 [..............................] - ETA: 0s - loss: 0.1636 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1669 - accuracy: 0.9644 - val_loss: 0.1934 - val_accuracy: 0.9455
Epoch 9/100
1/35 [..............................] - ETA: 0s - loss: 0.1363 - accuracy: 0.9688
35/35 [==============================] - 0s 1ms/step - loss: 0.1577 - accuracy: 0.9699 - val_loss: 0.1857 - val_accuracy: 0.9455
Epoch 10/100
1/35 [..............................] - ETA: 0s - loss: 0.1145 - accuracy: 0.9688
35/35 [==============================] - 0s 1ms/step - loss: 0.1501 - accuracy: 0.9699 - val_loss: 0.1788 - val_accuracy: 0.9382
Epoch 11/100
1/35 [..............................] - ETA: 0s - loss: 0.1542 - accuracy: 0.9688
35/35 [==============================] - 0s 1ms/step - loss: 0.1436 - accuracy: 0.9681 - val_loss: 0.1718 - val_accuracy: 0.9418
Epoch 12/100
1/35 [..............................] - ETA: 0s - loss: 0.1498 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1379 - accuracy: 0.9699 - val_loss: 0.1662 - val_accuracy: 0.9418
Epoch 13/100
1/35 [..............................] - ETA: 0s - loss: 0.1190 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1330 - accuracy: 0.9699 - val_loss: 0.1614 - val_accuracy: 0.9491
Epoch 14/100
1/35 [..............................] - ETA: 0s - loss: 0.1470 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.1287 - accuracy: 0.9727 - val_loss: 0.1573 - val_accuracy: 0.9491
Epoch 15/100
1/35 [..............................] - ETA: 0s - loss: 0.0714 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.1248 - accuracy: 0.9717 - val_loss: 0.1533 - val_accuracy: 0.9527
Epoch 16/100
1/35 [..............................] - ETA: 0s - loss: 0.1212 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1213 - accuracy: 0.9717 - val_loss: 0.1496 - val_accuracy: 0.9527
Epoch 17/100
1/35 [..............................] - ETA: 0s - loss: 0.1403 - accuracy: 0.9688
35/35 [==============================] - 0s 1ms/step - loss: 0.1181 - accuracy: 0.9727 - val_loss: 0.1460 - val_accuracy: 0.9564
Epoch 18/100
1/35 [..............................] - ETA: 0s - loss: 0.0814 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.1152 - accuracy: 0.9736 - val_loss: 0.1430 - val_accuracy: 0.9564
Epoch 19/100
1/35 [..............................] - ETA: 0s - loss: 0.1306 - accuracy: 0.9062
35/35 [==============================] - 0s 1ms/step - loss: 0.1125 - accuracy: 0.9745 - val_loss: 0.1405 - val_accuracy: 0.9564
Epoch 20/100
1/35 [..............................] - ETA: 0s - loss: 0.0973 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.1100 - accuracy: 0.9745 - val_loss: 0.1379 - val_accuracy: 0.9564
Epoch 21/100
1/35 [..............................] - ETA: 0s - loss: 0.1061 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1077 - accuracy: 0.9745 - val_loss: 0.1352 - val_accuracy: 0.9564
Epoch 22/100
1/35 [..............................] - ETA: 0s - loss: 0.0707 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1055 - accuracy: 0.9745 - val_loss: 0.1331 - val_accuracy: 0.9600
Epoch 23/100
1/35 [..............................] - ETA: 0s - loss: 0.1150 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1035 - accuracy: 0.9754 - val_loss: 0.1308 - val_accuracy: 0.9600
Epoch 24/100
1/35 [..............................] - ETA: 0s - loss: 0.0813 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.1016 - accuracy: 0.9754 - val_loss: 0.1289 - val_accuracy: 0.9636
Epoch 25/100
1/35 [..............................] - ETA: 0s - loss: 0.1400 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0998 - accuracy: 0.9754 - val_loss: 0.1268 - val_accuracy: 0.9636
Epoch 26/100
1/35 [..............................] - ETA: 0s - loss: 0.1344 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.0981 - accuracy: 0.9763 - val_loss: 0.1249 - val_accuracy: 0.9636
Epoch 27/100
1/35 [..............................] - ETA: 0s - loss: 0.0492 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.0965 - accuracy: 0.9763 - val_loss: 0.1232 - val_accuracy: 0.9636
Epoch 28/100
1/35 [..............................] - ETA: 0s - loss: 0.0479 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.0950 - accuracy: 0.9763 - val_loss: 0.1215 - val_accuracy: 0.9636
Epoch 29/100
1/35 [..............................] - ETA: 0s - loss: 0.0591 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.0936 - accuracy: 0.9763 - val_loss: 0.1199 - val_accuracy: 0.9673
Epoch 30/100
1/35 [..............................] - ETA: 0s - loss: 0.0382 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.0922 - accuracy: 0.9772 - val_loss: 0.1185 - val_accuracy: 0.9673
Epoch 31/100
1/35 [..............................] - ETA: 0s - loss: 0.0521 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.0909 - accuracy: 0.9772 - val_loss: 0.1170 - val_accuracy: 0.9673
Epoch 32/100
1/35 [..............................] - ETA: 0s - loss: 0.0437 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0896 - accuracy: 0.9781 - val_loss: 0.1154 - val_accuracy: 0.9673
Epoch 33/100
1/35 [..............................] - ETA: 0s - loss: 0.0855 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0885 - accuracy: 0.9781 - val_loss: 0.1141 - val_accuracy: 0.9673
Epoch 34/100
1/35 [..............................] - ETA: 0s - loss: 0.1364 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.0873 - accuracy: 0.9781 - val_loss: 0.1128 - val_accuracy: 0.9673
Epoch 35/100
1/35 [..............................] - ETA: 0s - loss: 0.0608 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0862 - accuracy: 0.9781 - val_loss: 0.1116 - val_accuracy: 0.9673
Epoch 36/100
1/35 [..............................] - ETA: 0s - loss: 0.0618 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0851 - accuracy: 0.9781 - val_loss: 0.1104 - val_accuracy: 0.9673
Epoch 37/100
1/35 [..............................] - ETA: 0s - loss: 0.0681 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0841 - accuracy: 0.9781 - val_loss: 0.1089 - val_accuracy: 0.9673
Epoch 38/100
1/35 [..............................] - ETA: 0s - loss: 0.1146 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0831 - accuracy: 0.9790 - val_loss: 0.1079 - val_accuracy: 0.9673
Epoch 39/100
1/35 [..............................] - ETA: 0s - loss: 0.0611 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0822 - accuracy: 0.9790 - val_loss: 0.1067 - val_accuracy: 0.9673
Epoch 40/100
1/35 [..............................] - ETA: 0s - loss: 0.0890 - accuracy: 0.9375
35/35 [==============================] - 0s 3ms/step - loss: 0.0812 - accuracy: 0.9799 - val_loss: 0.1058 - val_accuracy: 0.9673
Epoch 41/100
1/35 [..............................] - ETA: 0s - loss: 0.1269 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.0803 - accuracy: 0.9799 - val_loss: 0.1048 - val_accuracy: 0.9673
Epoch 42/100
1/35 [..............................] - ETA: 0s - loss: 0.0425 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0795 - accuracy: 0.9818 - val_loss: 0.1037 - val_accuracy: 0.9673
Epoch 43/100
1/35 [..............................] - ETA: 0s - loss: 0.1413 - accuracy: 0.9375
26/35 [=====================>........] - ETA: 0s - loss: 0.0796 - accuracy: 0.9808
35/35 [==============================] - 0s 3ms/step - loss: 0.0786 - accuracy: 0.9818 - val_loss: 0.1029 - val_accuracy: 0.9673
Epoch 44/100
1/35 [..............................] - ETA: 0s - loss: 0.1204 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0779 - accuracy: 0.9818 - val_loss: 0.1017 - val_accuracy: 0.9673
Epoch 45/100
1/35 [..............................] - ETA: 0s - loss: 0.0468 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0771 - accuracy: 0.9818 - val_loss: 0.1009 - val_accuracy: 0.9673
Epoch 46/100
1/35 [..............................] - ETA: 0s - loss: 0.0631 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0763 - accuracy: 0.9818 - val_loss: 0.0999 - val_accuracy: 0.9673
Epoch 47/100
1/35 [..............................] - ETA: 0s - loss: 0.0428 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0756 - accuracy: 0.9818 - val_loss: 0.0991 - val_accuracy: 0.9673
Epoch 48/100
1/35 [..............................] - ETA: 0s - loss: 0.0437 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0749 - accuracy: 0.9827 - val_loss: 0.0984 - val_accuracy: 0.9673
Epoch 49/100
1/35 [..............................] - ETA: 0s - loss: 0.0350 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0742 - accuracy: 0.9818 - val_loss: 0.0976 - val_accuracy: 0.9673
Epoch 50/100
1/35 [..............................] - ETA: 0s - loss: 0.0511 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0735 - accuracy: 0.9827 - val_loss: 0.0969 - val_accuracy: 0.9673
Epoch 51/100
1/35 [..............................] - ETA: 0s - loss: 0.0384 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0729 - accuracy: 0.9827 - val_loss: 0.0958 - val_accuracy: 0.9673
Epoch 52/100
1/35 [..............................] - ETA: 0s - loss: 0.0588 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0722 - accuracy: 0.9836 - val_loss: 0.0953 - val_accuracy: 0.9673
Epoch 53/100
1/35 [..............................] - ETA: 0s - loss: 0.0570 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0716 - accuracy: 0.9836 - val_loss: 0.0941 - val_accuracy: 0.9709
Epoch 54/100
1/35 [..............................] - ETA: 0s - loss: 0.0943 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0710 - accuracy: 0.9836 - val_loss: 0.0936 - val_accuracy: 0.9673
Epoch 55/100
1/35 [..............................] - ETA: 0s - loss: 0.0937 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0704 - accuracy: 0.9836 - val_loss: 0.0928 - val_accuracy: 0.9709
Epoch 56/100
1/35 [..............................] - ETA: 0s - loss: 0.1195 - accuracy: 0.9062
35/35 [==============================] - 0s 2ms/step - loss: 0.0698 - accuracy: 0.9836 - val_loss: 0.0923 - val_accuracy: 0.9673
Epoch 57/100
1/35 [..............................] - ETA: 0s - loss: 0.0448 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0693 - accuracy: 0.9836 - val_loss: 0.0917 - val_accuracy: 0.9673
Epoch 58/100
1/35 [..............................] - ETA: 0s - loss: 0.1057 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0687 - accuracy: 0.9836 - val_loss: 0.0910 - val_accuracy: 0.9673
Epoch 59/100
1/35 [..............................] - ETA: 0s - loss: 0.0655 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0682 - accuracy: 0.9836 - val_loss: 0.0903 - val_accuracy: 0.9709
Epoch 60/100
1/35 [..............................] - ETA: 0s - loss: 0.0541 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0677 - accuracy: 0.9836 - val_loss: 0.0897 - val_accuracy: 0.9709
Epoch 61/100
1/35 [..............................] - ETA: 0s - loss: 0.0463 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0672 - accuracy: 0.9836 - val_loss: 0.0890 - val_accuracy: 0.9709
Epoch 62/100
1/35 [..............................] - ETA: 0s - loss: 0.0571 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0667 - accuracy: 0.9836 - val_loss: 0.0884 - val_accuracy: 0.9709
Epoch 63/100
1/35 [..............................] - ETA: 0s - loss: 0.0864 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0662 - accuracy: 0.9836 - val_loss: 0.0878 - val_accuracy: 0.9709
Epoch 64/100
1/35 [..............................] - ETA: 0s - loss: 0.0491 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0658 - accuracy: 0.9836 - val_loss: 0.0871 - val_accuracy: 0.9709
Epoch 65/100
1/35 [..............................] - ETA: 0s - loss: 0.0318 - accuracy: 1.0000
12/35 [=========>....................] - ETA: 0s - loss: 0.0739 - accuracy: 0.9818
35/35 [==============================] - 0s 4ms/step - loss: 0.0653 - accuracy: 0.9845 - val_loss: 0.0866 - val_accuracy: 0.9709
Epoch 66/100
1/35 [..............................] - ETA: 0s - loss: 0.0550 - accuracy: 1.0000
32/35 [==========================>...] - ETA: 0s - loss: 0.0663 - accuracy: 0.9834
35/35 [==============================] - 0s 3ms/step - loss: 0.0648 - accuracy: 0.9845 - val_loss: 0.0862 - val_accuracy: 0.9709
Epoch 67/100
1/35 [..............................] - ETA: 0s - loss: 0.0556 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0644 - accuracy: 0.9836 - val_loss: 0.0857 - val_accuracy: 0.9709
Epoch 68/100
1/35 [..............................] - ETA: 0s - loss: 0.0677 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0640 - accuracy: 0.9836 - val_loss: 0.0851 - val_accuracy: 0.9709
Epoch 69/100
1/35 [..............................] - ETA: 0s - loss: 0.0250 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0636 - accuracy: 0.9836 - val_loss: 0.0847 - val_accuracy: 0.9709
Epoch 70/100
1/35 [..............................] - ETA: 0s - loss: 0.0899 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0631 - accuracy: 0.9836 - val_loss: 0.0841 - val_accuracy: 0.9709
Epoch 71/100
1/35 [..............................] - ETA: 0s - loss: 0.0296 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0627 - accuracy: 0.9845 - val_loss: 0.0836 - val_accuracy: 0.9709
Epoch 72/100
1/35 [..............................] - ETA: 0s - loss: 0.0294 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0623 - accuracy: 0.9845 - val_loss: 0.0832 - val_accuracy: 0.9709
Epoch 73/100
1/35 [..............................] - ETA: 0s - loss: 0.0640 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0619 - accuracy: 0.9845 - val_loss: 0.0826 - val_accuracy: 0.9709
Epoch 74/100
1/35 [..............................] - ETA: 0s - loss: 0.0237 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.0616 - accuracy: 0.9845 - val_loss: 0.0823 - val_accuracy: 0.9709
Epoch 75/100
1/35 [..............................] - ETA: 0s - loss: 0.0550 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0612 - accuracy: 0.9845 - val_loss: 0.0817 - val_accuracy: 0.9709
Epoch 76/100
1/35 [..............................] - ETA: 0s - loss: 0.0291 - accuracy: 1.0000
35/35 [==============================] - 0s 1ms/step - loss: 0.0608 - accuracy: 0.9845 - val_loss: 0.0812 - val_accuracy: 0.9709
Epoch 77/100
1/35 [..............................] - ETA: 0s - loss: 0.0800 - accuracy: 0.9375
35/35 [==============================] - 0s 1ms/step - loss: 0.0605 - accuracy: 0.9845 - val_loss: 0.0807 - val_accuracy: 0.9745
Epoch 78/100
1/35 [..............................] - ETA: 0s - loss: 0.0567 - accuracy: 1.0000
35/35 [==============================] - ETA: 0s - loss: 0.0601 - accuracy: 0.9845
35/35 [==============================] - 0s 3ms/step - loss: 0.0601 - accuracy: 0.9845 - val_loss: 0.0803 - val_accuracy: 0.9745
Epoch 79/100
1/35 [..............................] - ETA: 0s - loss: 0.0601 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0597 - accuracy: 0.9845 - val_loss: 0.0799 - val_accuracy: 0.9745
Epoch 80/100
1/35 [..............................] - ETA: 0s - loss: 0.0918 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0594 - accuracy: 0.9845 - val_loss: 0.0795 - val_accuracy: 0.9782
Epoch 81/100
1/35 [..............................] - ETA: 0s - loss: 0.1071 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.0591 - accuracy: 0.9854 - val_loss: 0.0791 - val_accuracy: 0.9782
Epoch 82/100
1/35 [..............................] - ETA: 0s - loss: 0.0656 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0587 - accuracy: 0.9845 - val_loss: 0.0787 - val_accuracy: 0.9782
Epoch 83/100
1/35 [..............................] - ETA: 0s - loss: 0.0328 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0584 - accuracy: 0.9854 - val_loss: 0.0783 - val_accuracy: 0.9782
Epoch 84/100
1/35 [..............................] - ETA: 0s - loss: 0.0824 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0581 - accuracy: 0.9854 - val_loss: 0.0780 - val_accuracy: 0.9782
Epoch 85/100
1/35 [..............................] - ETA: 0s - loss: 0.1104 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.0578 - accuracy: 0.9854 - val_loss: 0.0776 - val_accuracy: 0.9782
Epoch 86/100
1/35 [..............................] - ETA: 0s - loss: 0.0960 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0574 - accuracy: 0.9854 - val_loss: 0.0771 - val_accuracy: 0.9782
Epoch 87/100
1/35 [..............................] - ETA: 0s - loss: 0.0820 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0572 - accuracy: 0.9854 - val_loss: 0.0767 - val_accuracy: 0.9782
Epoch 88/100
1/35 [..............................] - ETA: 0s - loss: 0.0328 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0569 - accuracy: 0.9863 - val_loss: 0.0763 - val_accuracy: 0.9782
Epoch 89/100
1/35 [..............................] - ETA: 0s - loss: 0.0087 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0566 - accuracy: 0.9863 - val_loss: 0.0761 - val_accuracy: 0.9782
Epoch 90/100
1/35 [..............................] - ETA: 0s - loss: 0.0393 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0563 - accuracy: 0.9863 - val_loss: 0.0758 - val_accuracy: 0.9782
Epoch 91/100
1/35 [..............................] - ETA: 0s - loss: 0.0510 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0560 - accuracy: 0.9863 - val_loss: 0.0755 - val_accuracy: 0.9782
Epoch 92/100
1/35 [..............................] - ETA: 0s - loss: 0.0714 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0557 - accuracy: 0.9854 - val_loss: 0.0750 - val_accuracy: 0.9782
Epoch 93/100
1/35 [..............................] - ETA: 0s - loss: 0.0544 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0555 - accuracy: 0.9863 - val_loss: 0.0747 - val_accuracy: 0.9818
Epoch 94/100
1/35 [..............................] - ETA: 0s - loss: 0.0313 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0552 - accuracy: 0.9863 - val_loss: 0.0743 - val_accuracy: 0.9818
Epoch 95/100
1/35 [..............................] - ETA: 0s - loss: 0.0564 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0549 - accuracy: 0.9863 - val_loss: 0.0739 - val_accuracy: 0.9818
Epoch 96/100
1/35 [..............................] - ETA: 0s - loss: 0.0782 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0547 - accuracy: 0.9863 - val_loss: 0.0736 - val_accuracy: 0.9818
Epoch 97/100
1/35 [..............................] - ETA: 0s - loss: 0.0697 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0544 - accuracy: 0.9863 - val_loss: 0.0733 - val_accuracy: 0.9818
Epoch 98/100
1/35 [..............................] - ETA: 0s - loss: 0.0767 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0542 - accuracy: 0.9863 - val_loss: 0.0729 - val_accuracy: 0.9818
Epoch 99/100
1/35 [..............................] - ETA: 0s - loss: 0.0256 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0539 - accuracy: 0.9863 - val_loss: 0.0726 - val_accuracy: 0.9818
Epoch 100/100
1/35 [..............................] - ETA: 0s - loss: 0.0323 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0537 - accuracy: 0.9863 - val_loss: 0.0724 - val_accuracy: 0.9818
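Before plotting, it can help to check which metrics the History object actually recorded; a minimal sketch (with this compile/fit setup the keys are typically loss, accuracy, val_loss and val_accuracy):
# List the metric names stored by the history callback
print(h_callback.history.keys())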
# Plot train vs test loss during training
plot_loss(h_callback.history['loss'], h_callback.history['val_loss'])
# Plot train vs test accuracy during training
plot_accuracy(h_callback.history['accuracy'], h_callback.history['val_accuracy'])
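plot_loss() and plot_accuracy() are small course helpers; here is a minimal matplotlib sketch of what such a helper could look like, without assuming anything about the course’s exact implementation:
import matplotlib.pyplot as plt

def plot_loss(loss, val_loss):
    # Plot training vs validation loss per epoch
    plt.figure()
    plt.plot(loss)
    plt.plot(val_loss)
    plt.title('Model loss')
    plt.ylabel('Loss')
    plt.xlabel('Epoch')
    plt.legend(['Train', 'Test'], loc='upper right')
    plt.show()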
Early stopping your model
The early stopping callback is useful since it allows you to stop the model training if it no longer improves after a given number of epochs. To make use of this functionality, you need to pass the callback inside a list to the callbacks parameter of the model’s .fit() method.
The model you built to detect fake dollar bills is loaded for you to train, this time with early stopping. X_train, y_train, X_test and y_test are also available for your use.
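Beyond patience, tf.keras EarlyStopping can also roll the model back to the best weights seen during training; a minimal sketch of that variant (not part of the exercise below, where only patience is used):
from tensorflow.keras.callbacks import EarlyStopping

# Stop if val_accuracy has not improved for 5 epochs and restore the best epoch's weights
early_stop_best = EarlyStopping(monitor='val_accuracy', patience=5,
                                restore_best_weights=True)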
# Create a sequential model
model = Sequential()
# Add a dense layer
model.add(Dense(1, input_shape=(4,), activation="sigmoid"))
# Compile your model
model.compile(loss='binary_crossentropy', optimizer="sgd", metrics=['accuracy'])
# Import the early stopping callback
from tensorflow.keras.callbacks import EarlyStopping
# Define a callback to monitor val_accuracy
monitor_val_acc = EarlyStopping(monitor='val_accuracy', patience=5)
# Train your model using the early stopping callback
model.fit(X_train, y_train, epochs=100,
          validation_data=(X_test, y_test),
          callbacks=[monitor_val_acc])
Epoch 1/100
1/35 [..............................] - ETA: 6s - loss: 2.2410 - accuracy: 0.6562
35/35 [==============================] - 0s 4ms/step - loss: 0.7436 - accuracy: 0.7876 - val_loss: 0.2294 - val_accuracy: 0.8836
Epoch 2/100
1/35 [..............................] - ETA: 0s - loss: 0.1580 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1622 - accuracy: 0.9417 - val_loss: 0.1961 - val_accuracy: 0.9273
Epoch 3/100
1/35 [..............................] - ETA: 0s - loss: 0.0632 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1454 - accuracy: 0.9517 - val_loss: 0.1903 - val_accuracy: 0.9273
Epoch 4/100
1/35 [..............................] - ETA: 0s - loss: 0.1264 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1398 - accuracy: 0.9535 - val_loss: 0.1850 - val_accuracy: 0.9309
Epoch 5/100
1/35 [..............................] - ETA: 0s - loss: 0.1326 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1353 - accuracy: 0.9535 - val_loss: 0.1798 - val_accuracy: 0.9345
Epoch 6/100
1/35 [..............................] - ETA: 0s - loss: 0.1202 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1313 - accuracy: 0.9544 - val_loss: 0.1748 - val_accuracy: 0.9345
Epoch 7/100
1/35 [..............................] - ETA: 0s - loss: 0.1520 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.1276 - accuracy: 0.9599 - val_loss: 0.1698 - val_accuracy: 0.9382
Epoch 8/100
1/35 [..............................] - ETA: 0s - loss: 0.1151 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1244 - accuracy: 0.9626 - val_loss: 0.1665 - val_accuracy: 0.9382
Epoch 9/100
1/35 [..............................] - ETA: 0s - loss: 0.1081 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1215 - accuracy: 0.9626 - val_loss: 0.1625 - val_accuracy: 0.9418
Epoch 10/100
1/35 [..............................] - ETA: 0s - loss: 0.1374 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1186 - accuracy: 0.9663 - val_loss: 0.1591 - val_accuracy: 0.9418
Epoch 11/100
1/35 [..............................] - ETA: 0s - loss: 0.0810 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1161 - accuracy: 0.9663 - val_loss: 0.1557 - val_accuracy: 0.9455
Epoch 12/100
1/35 [..............................] - ETA: 0s - loss: 0.0831 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1136 - accuracy: 0.9681 - val_loss: 0.1521 - val_accuracy: 0.9455
Epoch 13/100
1/35 [..............................] - ETA: 0s - loss: 0.0690 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1112 - accuracy: 0.9681 - val_loss: 0.1494 - val_accuracy: 0.9455
Epoch 14/100
1/35 [..............................] - ETA: 0s - loss: 0.1025 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1091 - accuracy: 0.9690 - val_loss: 0.1468 - val_accuracy: 0.9455
Epoch 15/100
1/35 [..............................] - ETA: 0s - loss: 0.0836 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1071 - accuracy: 0.9690 - val_loss: 0.1439 - val_accuracy: 0.9491
Epoch 16/100
1/35 [..............................] - ETA: 0s - loss: 0.1301 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.1051 - accuracy: 0.9717 - val_loss: 0.1412 - val_accuracy: 0.9491
Epoch 17/100
1/35 [..............................] - ETA: 0s - loss: 0.0585 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1032 - accuracy: 0.9736 - val_loss: 0.1390 - val_accuracy: 0.9527
Epoch 18/100
1/35 [..............................] - ETA: 0s - loss: 0.0518 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1015 - accuracy: 0.9745 - val_loss: 0.1368 - val_accuracy: 0.9564
Epoch 19/100
1/35 [..............................] - ETA: 0s - loss: 0.0359 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0999 - accuracy: 0.9745 - val_loss: 0.1350 - val_accuracy: 0.9564
Epoch 20/100
1/35 [..............................] - ETA: 0s - loss: 0.1084 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0984 - accuracy: 0.9745 - val_loss: 0.1330 - val_accuracy: 0.9564
Epoch 21/100
1/35 [..............................] - ETA: 0s - loss: 0.0908 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0969 - accuracy: 0.9754 - val_loss: 0.1312 - val_accuracy: 0.9564
Epoch 22/100
1/35 [..............................] - ETA: 0s - loss: 0.1075 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0955 - accuracy: 0.9763 - val_loss: 0.1295 - val_accuracy: 0.9600
Epoch 23/100
1/35 [..............................] - ETA: 0s - loss: 0.0362 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0942 - accuracy: 0.9772 - val_loss: 0.1272 - val_accuracy: 0.9600
Epoch 24/100
1/35 [..............................] - ETA: 0s - loss: 0.1457 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.0928 - accuracy: 0.9790 - val_loss: 0.1255 - val_accuracy: 0.9600
Epoch 25/100
1/35 [..............................] - ETA: 0s - loss: 0.1306 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.0916 - accuracy: 0.9790 - val_loss: 0.1239 - val_accuracy: 0.9636
Epoch 26/100
1/35 [..............................] - ETA: 0s - loss: 0.0852 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0904 - accuracy: 0.9799 - val_loss: 0.1223 - val_accuracy: 0.9636
Epoch 27/100
1/35 [..............................] - ETA: 0s - loss: 0.1736 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.0893 - accuracy: 0.9809 - val_loss: 0.1208 - val_accuracy: 0.9636
Epoch 28/100
1/35 [..............................] - ETA: 0s - loss: 0.1117 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.0881 - accuracy: 0.9809 - val_loss: 0.1192 - val_accuracy: 0.9709
Epoch 29/100
1/35 [..............................] - ETA: 0s - loss: 0.0716 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0871 - accuracy: 0.9818 - val_loss: 0.1178 - val_accuracy: 0.9709
Epoch 30/100
1/35 [..............................] - ETA: 0s - loss: 0.0400 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0861 - accuracy: 0.9827 - val_loss: 0.1165 - val_accuracy: 0.9709
Epoch 31/100
1/35 [..............................] - ETA: 0s - loss: 0.1079 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0851 - accuracy: 0.9818 - val_loss: 0.1153 - val_accuracy: 0.9709
Epoch 32/100
1/35 [..............................] - ETA: 0s - loss: 0.0598 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0842 - accuracy: 0.9799 - val_loss: 0.1140 - val_accuracy: 0.9709
Epoch 33/100
1/35 [..............................] - ETA: 0s - loss: 0.0835 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.0833 - accuracy: 0.9818 - val_loss: 0.1127 - val_accuracy: 0.9709
<keras.src.callbacks.History object at 0x7f883e15fdc0>
A combination of callbacks
# Import the EarlyStopping and ModelCheckpoint callbacks
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint
# Early stop on validation accuracy
monitor_val_acc = EarlyStopping(monitor='val_accuracy', patience=3)
# Save the best model as best_banknote_model.hdf5
model_checkpoint = ModelCheckpoint("best_banknote_model.hdf5", save_best_only=True)
# Create a sequential model
model = Sequential()
# Add a dense layer
model.add(Dense(1, input_shape=(4,), activation="sigmoid"))
# Compile your model
model.compile(loss='binary_crossentropy', optimizer="sgd", metrics=['accuracy'])
# Fit your model for an absurdly large number of epochs
h_callback = model.fit(X_train, y_train,
                       epochs=1000000000000,
                       callbacks=[monitor_val_acc, model_checkpoint],
                       validation_data=(X_test, y_test))
Epoch 1/1000000000000
1/35 [..............................] - ETA: 7s - loss: 0.8762 - accuracy: 0.7500
35/35 [==============================] - 0s 5ms/step - loss: 0.6347 - accuracy: 0.7748 - val_loss: 0.5276 - val_accuracy: 0.7891
Epoch 2/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.4862 - accuracy: 0.8438
35/35 [==============================] - 0s 2ms/step - loss: 0.3271 - accuracy: 0.8532 - val_loss: 0.3075 - val_accuracy: 0.8364
Epoch 3/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.1890 - accuracy: 0.8750
35/35 [==============================] - 0s 2ms/step - loss: 0.2060 - accuracy: 0.9107 - val_loss: 0.2262 - val_accuracy: 0.8945
Epoch 4/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.1107 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1648 - accuracy: 0.9398 - val_loss: 0.1963 - val_accuracy: 0.9164
Epoch 5/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.1454 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1489 - accuracy: 0.9389 - val_loss: 0.1821 - val_accuracy: 0.9127
Epoch 6/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.0986 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1402 - accuracy: 0.9435 - val_loss: 0.1743 - val_accuracy: 0.9236
Epoch 7/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.1028 - accuracy: 0.9688
35/35 [==============================] - 0s 2ms/step - loss: 0.1347 - accuracy: 0.9444 - val_loss: 0.1685 - val_accuracy: 0.9309
Epoch 8/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.1564 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.1304 - accuracy: 0.9471 - val_loss: 0.1642 - val_accuracy: 0.9309
Epoch 9/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.1473 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.1268 - accuracy: 0.9471 - val_loss: 0.1605 - val_accuracy: 0.9382
Epoch 10/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.1529 - accuracy: 0.9062
35/35 [==============================] - 0s 2ms/step - loss: 0.1238 - accuracy: 0.9499 - val_loss: 0.1575 - val_accuracy: 0.9455
Epoch 11/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.1326 - accuracy: 0.9375
35/35 [==============================] - 0s 2ms/step - loss: 0.1210 - accuracy: 0.9517 - val_loss: 0.1550 - val_accuracy: 0.9455
Epoch 12/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.0715 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1185 - accuracy: 0.9535 - val_loss: 0.1520 - val_accuracy: 0.9455
Epoch 13/1000000000000
1/35 [..............................] - ETA: 0s - loss: 0.0677 - accuracy: 1.0000
35/35 [==============================] - 0s 2ms/step - loss: 0.1161 - accuracy: 0.9572 - val_loss: 0.1493 - val_accuracy: 0.9455
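Because the ModelCheckpoint callback wrote the best-performing weights to best_banknote_model.hdf5, you can reload that model later without retraining. A minimal sketch, assuming the file produced above and the same X_test and y_test:
# Load the best checkpointed model and evaluate it on the test set
from tensorflow.keras.models import load_model
best_model = load_model("best_banknote_model.hdf5")
test_loss, test_acc = best_model.evaluate(X_test, y_test, verbose=0)
print("Best checkpoint - loss: {:.4f}, accuracy: {:.4f}".format(test_loss, test_acc))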
Improving Your Model Performance
Is the model overfitting?
Let’s train the model you just built and plot its learning curve to check whether it’s overfitting! You can use the preloaded plot_loss() function to plot training loss against validation loss; both can be extracted from the history callback.
# Train your model for 60 epochs, using X_test and y_test as validation data
h_callback = model.fit(X_train, y_train, epochs = 60 , validation_data = (X_test, y_test), verbose= 0 )
# Extract from the h_callback object loss and val_loss to plot the learning curve
plot_loss(h_callback.history['loss' ], h_callback.history['val_loss' ])
This graph doesn’t show overfitting but convergence. It looks like your model has learned all it could from the data and it no longer improves. The test loss, although higher than the training loss, is not getting worse, so we aren’t overfitting to the training data. If you want to inspect the plot_loss() function code, paste this in the console: show_code(plot_loss)
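plot_loss() is preloaded by the exercise environment, but a minimal sketch of what such a helper might look like (the exact implementation may differ) is:
# A possible plot_loss helper: training vs validation loss per epoch
import matplotlib.pyplot as plt
def plot_loss(loss, val_loss):
    plt.figure()
    plt.plot(loss)
    plt.plot(val_loss)
    plt.title('Model loss')
    plt.ylabel('Loss')
    plt.xlabel('Epoch')
    plt.legend(['Train', 'Test'], loc='upper right')
    plt.show()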
Do we need more data?
It’s time to check whether the digits dataset model you built benefits from more training examples!
In order to keep code to a minimum, various things are already initialized and ready to use:
The model you just built.
X_train,y_train,X_test, and y_test.
The initial_weights of your model, saved after using model.get_weights().
A pre-defined list of training sizes: training_sizes.
A pre-defined early stopping callback monitoring loss: early_stop.
Two empty lists to store the evaluation results: train_accs and test_accs.
Train your model on the different training sizes and evaluate the results on X_test. End by plotting the results with plot_results().
The full code for this exercise can be found on the slides!
# Store initial model weights
initial_weights = model.get_weights()
# Create the list of training sizes
training_sizes = np.array([125, 502, 879, 1255])
# Empty lists for the results and an early stopping callback monitoring the training loss
train_accs = []
test_accs = []
early_stop = EarlyStopping(monitor="loss", patience=3)
for size in training_sizes:
    # Get a fraction of the training data (we only care about the training data)
    X_train_frac, y_train_frac = X_train[:size], y_train[:size]
    # Reset the model to the initial weights and train it on the new training data fraction
    model.set_weights(initial_weights)
    model.fit(X_train_frac, y_train_frac, epochs=50, callbacks=[early_stop])
    # Evaluate and store the accuracy on the training fraction and on the complete test set
    train_accs.append(model.evaluate(X_train_frac, y_train_frac)[1])
    test_accs.append(model.evaluate(X_test, y_test)[1])
Epoch 1/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0251 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0335 - accuracy: 1.0000
Epoch 2/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0309 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0291 - accuracy: 1.0000
Epoch 3/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0426 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0232 - accuracy: 1.0000
Epoch 4/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0160 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0192 - accuracy: 1.0000
Epoch 5/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0442 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0171 - accuracy: 1.0000
Epoch 6/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0237 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0153 - accuracy: 1.0000
Epoch 7/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0077 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0138 - accuracy: 1.0000
Epoch 8/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0202 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0129 - accuracy: 1.0000
Epoch 9/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0182 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0120 - accuracy: 1.0000
Epoch 10/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0245 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0113 - accuracy: 1.0000
Epoch 11/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0167 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0106 - accuracy: 1.0000
Epoch 12/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0115 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0102 - accuracy: 1.0000
Epoch 13/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0036 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0096 - accuracy: 1.0000
Epoch 14/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0061 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0093 - accuracy: 1.0000
Epoch 15/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0092 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0090 - accuracy: 1.0000
Epoch 16/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0065 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0086 - accuracy: 1.0000
Epoch 17/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0140 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0084 - accuracy: 1.0000
Epoch 18/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0101 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0081 - accuracy: 1.0000
Epoch 19/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0076 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0080 - accuracy: 1.0000
Epoch 20/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0057 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0077 - accuracy: 1.0000
Epoch 21/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0127 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0076 - accuracy: 1.0000
Epoch 22/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0067 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0073 - accuracy: 1.0000
Epoch 23/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0054 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0072 - accuracy: 1.0000
Epoch 24/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0069 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0070 - accuracy: 1.0000
Epoch 25/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0072 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0069 - accuracy: 1.0000
Epoch 26/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0075 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0068 - accuracy: 1.0000
Epoch 27/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0060 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0066 - accuracy: 1.0000
Epoch 28/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0046 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0065 - accuracy: 1.0000
Epoch 29/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0112 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0064 - accuracy: 1.0000
Epoch 30/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0095 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0062 - accuracy: 1.0000
Epoch 31/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0061 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0061 - accuracy: 1.0000
Epoch 32/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0025 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0060 - accuracy: 1.0000
Epoch 33/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0017 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0059 - accuracy: 1.0000
Epoch 34/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0105 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0058 - accuracy: 1.0000
Epoch 35/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0066 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0057 - accuracy: 1.0000
Epoch 36/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0066 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0056 - accuracy: 1.0000
Epoch 37/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0057 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0055 - accuracy: 1.0000
Epoch 38/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0049 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0055 - accuracy: 1.0000
Epoch 39/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0052 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0054 - accuracy: 1.0000
Epoch 40/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0048 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0053 - accuracy: 1.0000
Epoch 41/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0041 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0052 - accuracy: 1.0000
Epoch 42/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0040 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0052 - accuracy: 1.0000
Epoch 43/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0023 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0051 - accuracy: 1.0000
Epoch 44/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0046 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0050 - accuracy: 1.0000
Epoch 45/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0063 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0050 - accuracy: 1.0000
Epoch 46/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0030 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0049 - accuracy: 1.0000
Epoch 47/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0061 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0048 - accuracy: 1.0000
Epoch 48/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0075 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0048 - accuracy: 1.0000
Epoch 49/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0056 - accuracy: 1.0000
4/4 [==============================] - 0s 2ms/step - loss: 0.0047 - accuracy: 1.0000
Epoch 50/50
1/4 [======>.......................] - ETA: 0s - loss: 0.0075 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0047 - accuracy: 1.0000
<keras.src.callbacks.History object at 0x7f8837fbe710>
1/4 [======>.......................] - ETA: 0s - loss: 0.0032 - accuracy: 1.0000
4/4 [==============================] - 0s 1ms/step - loss: 0.0046 - accuracy: 1.0000
1/15 [=>............................] - ETA: 0s - loss: 0.2047 - accuracy: 0.9375
15/15 [==============================] - 0s 1ms/step - loss: 0.2011 - accuracy: 0.9556
Epoch 1/50
1/16 [>.............................] - ETA: 0s - loss: 0.0173 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0269 - accuracy: 0.9960
Epoch 2/50
1/16 [>.............................] - ETA: 0s - loss: 0.0051 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0241 - accuracy: 0.9940
Epoch 3/50
1/16 [>.............................] - ETA: 0s - loss: 0.0451 - accuracy: 0.9688
16/16 [==============================] - 0s 1ms/step - loss: 0.0228 - accuracy: 0.9960
Epoch 4/50
1/16 [>.............................] - ETA: 0s - loss: 0.0195 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0216 - accuracy: 0.9960
Epoch 5/50
1/16 [>.............................] - ETA: 0s - loss: 0.0118 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0198 - accuracy: 0.9980
Epoch 6/50
1/16 [>.............................] - ETA: 0s - loss: 0.0135 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0186 - accuracy: 0.9980
Epoch 7/50
1/16 [>.............................] - ETA: 0s - loss: 0.0207 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0182 - accuracy: 0.9980
Epoch 8/50
1/16 [>.............................] - ETA: 0s - loss: 0.0183 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0175 - accuracy: 0.9980
Epoch 9/50
1/16 [>.............................] - ETA: 0s - loss: 0.0257 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0166 - accuracy: 1.0000
Epoch 10/50
1/16 [>.............................] - ETA: 0s - loss: 0.0212 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0157 - accuracy: 1.0000
Epoch 11/50
1/16 [>.............................] - ETA: 0s - loss: 0.0302 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0154 - accuracy: 1.0000
Epoch 12/50
1/16 [>.............................] - ETA: 0s - loss: 0.0210 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0147 - accuracy: 1.0000
Epoch 13/50
1/16 [>.............................] - ETA: 0s - loss: 0.0119 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0142 - accuracy: 1.0000
Epoch 14/50
1/16 [>.............................] - ETA: 0s - loss: 0.0057 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0141 - accuracy: 1.0000
Epoch 15/50
1/16 [>.............................] - ETA: 0s - loss: 0.0166 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0132 - accuracy: 1.0000
Epoch 16/50
1/16 [>.............................] - ETA: 0s - loss: 0.0078 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0126 - accuracy: 1.0000
Epoch 17/50
1/16 [>.............................] - ETA: 0s - loss: 0.0078 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0122 - accuracy: 1.0000
Epoch 18/50
1/16 [>.............................] - ETA: 0s - loss: 0.0036 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0122 - accuracy: 1.0000
Epoch 19/50
1/16 [>.............................] - ETA: 0s - loss: 0.0241 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0119 - accuracy: 1.0000
Epoch 20/50
1/16 [>.............................] - ETA: 0s - loss: 0.0086 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0113 - accuracy: 1.0000
Epoch 21/50
1/16 [>.............................] - ETA: 0s - loss: 0.0194 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0109 - accuracy: 1.0000
Epoch 22/50
1/16 [>.............................] - ETA: 0s - loss: 0.0075 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0106 - accuracy: 1.0000
Epoch 23/50
1/16 [>.............................] - ETA: 0s - loss: 0.0096 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0106 - accuracy: 1.0000
Epoch 24/50
1/16 [>.............................] - ETA: 0s - loss: 0.0065 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0098 - accuracy: 1.0000
Epoch 25/50
1/16 [>.............................] - ETA: 0s - loss: 0.0097 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0098 - accuracy: 1.0000
Epoch 26/50
1/16 [>.............................] - ETA: 0s - loss: 0.0053 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0095 - accuracy: 1.0000
Epoch 27/50
1/16 [>.............................] - ETA: 0s - loss: 0.0072 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0091 - accuracy: 1.0000
Epoch 28/50
1/16 [>.............................] - ETA: 0s - loss: 0.0058 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0089 - accuracy: 1.0000
Epoch 29/50
1/16 [>.............................] - ETA: 0s - loss: 0.0043 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0087 - accuracy: 1.0000
Epoch 30/50
1/16 [>.............................] - ETA: 0s - loss: 0.0129 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0085 - accuracy: 1.0000
Epoch 31/50
1/16 [>.............................] - ETA: 0s - loss: 0.0024 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0083 - accuracy: 1.0000
Epoch 32/50
1/16 [>.............................] - ETA: 0s - loss: 0.0089 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0080 - accuracy: 1.0000
Epoch 33/50
1/16 [>.............................] - ETA: 0s - loss: 0.0080 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0079 - accuracy: 1.0000
Epoch 34/50
1/16 [>.............................] - ETA: 0s - loss: 0.0024 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0078 - accuracy: 1.0000
Epoch 35/50
1/16 [>.............................] - ETA: 0s - loss: 0.0032 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0076 - accuracy: 1.0000
Epoch 36/50
1/16 [>.............................] - ETA: 0s - loss: 0.0048 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0074 - accuracy: 1.0000
Epoch 37/50
1/16 [>.............................] - ETA: 0s - loss: 0.0164 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0072 - accuracy: 1.0000
Epoch 38/50
1/16 [>.............................] - ETA: 0s - loss: 0.0120 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0071 - accuracy: 1.0000
Epoch 39/50
1/16 [>.............................] - ETA: 0s - loss: 0.0122 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0071 - accuracy: 1.0000
Epoch 40/50
1/16 [>.............................] - ETA: 0s - loss: 0.0076 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0068 - accuracy: 1.0000
Epoch 41/50
1/16 [>.............................] - ETA: 0s - loss: 0.0077 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0068 - accuracy: 1.0000
Epoch 42/50
1/16 [>.............................] - ETA: 0s - loss: 0.0028 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0066 - accuracy: 1.0000
Epoch 43/50
1/16 [>.............................] - ETA: 0s - loss: 0.0023 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0063 - accuracy: 1.0000
Epoch 44/50
1/16 [>.............................] - ETA: 0s - loss: 0.0052 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0062 - accuracy: 1.0000
Epoch 45/50
1/16 [>.............................] - ETA: 0s - loss: 0.0052 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0061 - accuracy: 1.0000
Epoch 46/50
1/16 [>.............................] - ETA: 0s - loss: 0.0042 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0061 - accuracy: 1.0000
Epoch 47/50
1/16 [>.............................] - ETA: 0s - loss: 0.0051 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0058 - accuracy: 1.0000
Epoch 48/50
1/16 [>.............................] - ETA: 0s - loss: 0.0039 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0058 - accuracy: 1.0000
Epoch 49/50
1/16 [>.............................] - ETA: 0s - loss: 0.0093 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0056 - accuracy: 1.0000
Epoch 50/50
1/16 [>.............................] - ETA: 0s - loss: 0.0062 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0055 - accuracy: 1.0000
<keras.src.callbacks.History object at 0x7f8837fdf550>
1/16 [>.............................] - ETA: 0s - loss: 0.0035 - accuracy: 1.0000
16/16 [==============================] - 0s 1ms/step - loss: 0.0053 - accuracy: 1.0000
1/15 [=>............................] - ETA: 0s - loss: 0.1633 - accuracy: 0.9375
15/15 [==============================] - 0s 1ms/step - loss: 0.1864 - accuracy: 0.9600
Epoch 1/50
1/28 [>.............................] - ETA: 0s - loss: 0.0244 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0312 - accuracy: 0.9920
Epoch 2/50
1/28 [>.............................] - ETA: 0s - loss: 0.0227 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0299 - accuracy: 0.9954
Epoch 3/50
1/28 [>.............................] - ETA: 0s - loss: 0.0189 - accuracy: 1.0000
28/28 [==============================] - 0s 978us/step - loss: 0.0306 - accuracy: 0.9920
Epoch 4/50
1/28 [>.............................] - ETA: 0s - loss: 0.0175 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0241 - accuracy: 0.9954
Epoch 5/50
1/28 [>.............................] - ETA: 0s - loss: 0.0273 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0228 - accuracy: 0.9943
Epoch 6/50
1/28 [>.............................] - ETA: 0s - loss: 0.0213 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0210 - accuracy: 0.9966
Epoch 7/50
1/28 [>.............................] - ETA: 0s - loss: 0.0537 - accuracy: 0.9688
28/28 [==============================] - 0s 1ms/step - loss: 0.0196 - accuracy: 0.9966
Epoch 8/50
1/28 [>.............................] - ETA: 0s - loss: 0.0149 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0193 - accuracy: 0.9966
Epoch 9/50
1/28 [>.............................] - ETA: 0s - loss: 0.0179 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0188 - accuracy: 0.9966
Epoch 10/50
1/28 [>.............................] - ETA: 0s - loss: 0.0023 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0181 - accuracy: 0.9966
Epoch 11/50
1/28 [>.............................] - ETA: 0s - loss: 0.0112 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0161 - accuracy: 0.9977
Epoch 12/50
1/28 [>.............................] - ETA: 0s - loss: 0.0153 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0150 - accuracy: 0.9977
Epoch 13/50
1/28 [>.............................] - ETA: 0s - loss: 0.0143 - accuracy: 1.0000
28/28 [==============================] - 0s 995us/step - loss: 0.0163 - accuracy: 0.9977
Epoch 14/50
1/28 [>.............................] - ETA: 0s - loss: 0.0203 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0127 - accuracy: 0.9989
Epoch 15/50
1/28 [>.............................] - ETA: 0s - loss: 0.0070 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0124 - accuracy: 1.0000
Epoch 16/50
1/28 [>.............................] - ETA: 0s - loss: 0.0155 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0116 - accuracy: 1.0000
Epoch 17/50
1/28 [>.............................] - ETA: 0s - loss: 0.0060 - accuracy: 1.0000
28/28 [==============================] - 0s 996us/step - loss: 0.0117 - accuracy: 0.9989
Epoch 18/50
1/28 [>.............................] - ETA: 0s - loss: 0.0096 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0107 - accuracy: 1.0000
Epoch 19/50
1/28 [>.............................] - ETA: 0s - loss: 0.0034 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0099 - accuracy: 1.0000
Epoch 20/50
1/28 [>.............................] - ETA: 0s - loss: 0.0068 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0093 - accuracy: 1.0000
Epoch 21/50
1/28 [>.............................] - ETA: 0s - loss: 0.0077 - accuracy: 1.0000
28/28 [==============================] - 0s 946us/step - loss: 0.0090 - accuracy: 1.0000
Epoch 22/50
1/28 [>.............................] - ETA: 0s - loss: 0.0042 - accuracy: 1.0000
28/28 [==============================] - 0s 920us/step - loss: 0.0084 - accuracy: 1.0000
Epoch 23/50
1/28 [>.............................] - ETA: 0s - loss: 0.0108 - accuracy: 1.0000
28/28 [==============================] - 0s 932us/step - loss: 0.0086 - accuracy: 1.0000
Epoch 24/50
1/28 [>.............................] - ETA: 0s - loss: 0.0041 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0080 - accuracy: 1.0000
Epoch 25/50
1/28 [>.............................] - ETA: 0s - loss: 0.0044 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0075 - accuracy: 1.0000
Epoch 26/50
1/28 [>.............................] - ETA: 0s - loss: 0.0053 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0072 - accuracy: 1.0000
Epoch 27/50
1/28 [>.............................] - ETA: 0s - loss: 0.0024 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0069 - accuracy: 1.0000
Epoch 28/50
1/28 [>.............................] - ETA: 0s - loss: 0.0068 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0065 - accuracy: 1.0000
Epoch 29/50
1/28 [>.............................] - ETA: 0s - loss: 0.0036 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0064 - accuracy: 1.0000
Epoch 30/50
1/28 [>.............................] - ETA: 0s - loss: 0.0094 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0062 - accuracy: 1.0000
Epoch 31/50
1/28 [>.............................] - ETA: 0s - loss: 0.0088 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0060 - accuracy: 1.0000
Epoch 32/50
1/28 [>.............................] - ETA: 0s - loss: 0.0068 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0056 - accuracy: 1.0000
Epoch 33/50
1/28 [>.............................] - ETA: 0s - loss: 0.0032 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0054 - accuracy: 1.0000
Epoch 34/50
1/28 [>.............................] - ETA: 0s - loss: 0.0084 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0052 - accuracy: 1.0000
Epoch 35/50
1/28 [>.............................] - ETA: 0s - loss: 0.0041 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0051 - accuracy: 1.0000
Epoch 36/50
1/28 [>.............................] - ETA: 0s - loss: 0.0016 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0048 - accuracy: 1.0000
Epoch 37/50
1/28 [>.............................] - ETA: 0s - loss: 0.0073 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0047 - accuracy: 1.0000
Epoch 38/50
1/28 [>.............................] - ETA: 0s - loss: 0.0033 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0047 - accuracy: 1.0000
Epoch 39/50
1/28 [>.............................] - ETA: 0s - loss: 0.0048 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0044 - accuracy: 1.0000
Epoch 40/50
1/28 [>.............................] - ETA: 0s - loss: 0.0017 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0042 - accuracy: 1.0000
Epoch 41/50
1/28 [>.............................] - ETA: 0s - loss: 0.0065 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0041 - accuracy: 1.0000
Epoch 42/50
1/28 [>.............................] - ETA: 0s - loss: 0.0050 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0040 - accuracy: 1.0000
Epoch 43/50
1/28 [>.............................] - ETA: 0s - loss: 0.0089 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0039 - accuracy: 1.0000
Epoch 44/50
1/28 [>.............................] - ETA: 0s - loss: 0.0039 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0037 - accuracy: 1.0000
Epoch 45/50
1/28 [>.............................] - ETA: 0s - loss: 6.9002e-04 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0038 - accuracy: 1.0000
Epoch 46/50
1/28 [>.............................] - ETA: 0s - loss: 0.0036 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0036 - accuracy: 1.0000
Epoch 47/50
1/28 [>.............................] - ETA: 0s - loss: 7.1521e-04 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0035 - accuracy: 1.0000
Epoch 48/50
1/28 [>.............................] - ETA: 0s - loss: 0.0029 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0033 - accuracy: 1.0000
Epoch 49/50
1/28 [>.............................] - ETA: 0s - loss: 0.0053 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0032 - accuracy: 1.0000
Epoch 50/50
1/28 [>.............................] - ETA: 0s - loss: 0.0027 - accuracy: 1.0000
28/28 [==============================] - 0s 1ms/step - loss: 0.0031 - accuracy: 1.0000
<keras.src.callbacks.History object at 0x7f883e20a980>
1/28 [>.............................] - ETA: 0s - loss: 0.0052 - accuracy: 1.0000
28/28 [==============================] - 0s 924us/step - loss: 0.0029 - accuracy: 1.0000
1/15 [=>............................] - ETA: 0s - loss: 0.1491 - accuracy: 0.9375
15/15 [==============================] - 0s 1ms/step - loss: 0.1811 - accuracy: 0.9600
Epoch 1/50
1/40 [..............................] - ETA: 0s - loss: 0.0341 - accuracy: 0.9688
40/40 [==============================] - 0s 1ms/step - loss: 0.0310 - accuracy: 0.9944
Epoch 2/50
1/40 [..............................] - ETA: 0s - loss: 0.0292 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0305 - accuracy: 0.9952
Epoch 3/50
1/40 [..............................] - ETA: 0s - loss: 0.0187 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0344 - accuracy: 0.9928
Epoch 4/50
1/40 [..............................] - ETA: 0s - loss: 0.0144 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0244 - accuracy: 0.9960
Epoch 5/50
1/40 [..............................] - ETA: 0s - loss: 0.0036 - accuracy: 1.0000
40/40 [==============================] - 0s 985us/step - loss: 0.0201 - accuracy: 0.9968
Epoch 6/50
1/40 [..............................] - ETA: 0s - loss: 0.0227 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0188 - accuracy: 0.9944
Epoch 7/50
1/40 [..............................] - ETA: 0s - loss: 0.0180 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0196 - accuracy: 0.9952
Epoch 8/50
1/40 [..............................] - ETA: 0s - loss: 0.0036 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0161 - accuracy: 0.9976
Epoch 9/50
1/40 [..............................] - ETA: 0s - loss: 0.0062 - accuracy: 1.0000
40/40 [==============================] - 0s 993us/step - loss: 0.0160 - accuracy: 0.9968
Epoch 10/50
1/40 [..............................] - ETA: 0s - loss: 0.0095 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0134 - accuracy: 0.9984
Epoch 11/50
1/40 [..............................] - ETA: 0s - loss: 0.0094 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0117 - accuracy: 0.9992
Epoch 12/50
1/40 [..............................] - ETA: 0s - loss: 0.0156 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0104 - accuracy: 1.0000
Epoch 13/50
1/40 [..............................] - ETA: 0s - loss: 0.0098 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0100 - accuracy: 1.0000
Epoch 14/50
1/40 [..............................] - ETA: 0s - loss: 0.0082 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0094 - accuracy: 1.0000
Epoch 15/50
1/40 [..............................] - ETA: 0s - loss: 0.0045 - accuracy: 1.0000
40/40 [==============================] - 0s 988us/step - loss: 0.0095 - accuracy: 0.9992
Epoch 16/50
1/40 [..............................] - ETA: 0s - loss: 0.0088 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0080 - accuracy: 1.0000
Epoch 17/50
1/40 [..............................] - ETA: 0s - loss: 0.0075 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0076 - accuracy: 1.0000
Epoch 18/50
1/40 [..............................] - ETA: 0s - loss: 0.0027 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0069 - accuracy: 1.0000
Epoch 19/50
1/40 [..............................] - ETA: 0s - loss: 0.0045 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0066 - accuracy: 1.0000
Epoch 20/50
1/40 [..............................] - ETA: 0s - loss: 0.0047 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0063 - accuracy: 1.0000
Epoch 21/50
1/40 [..............................] - ETA: 0s - loss: 0.0054 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0059 - accuracy: 1.0000
Epoch 22/50
1/40 [..............................] - ETA: 0s - loss: 0.0045 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0058 - accuracy: 1.0000
Epoch 23/50
1/40 [..............................] - ETA: 0s - loss: 0.0084 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0053 - accuracy: 1.0000
Epoch 24/50
1/40 [..............................] - ETA: 0s - loss: 0.0055 - accuracy: 1.0000
40/40 [==============================] - 0s 981us/step - loss: 0.0050 - accuracy: 1.0000
Epoch 25/50
1/40 [..............................] - ETA: 0s - loss: 0.0055 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0048 - accuracy: 1.0000
Epoch 26/50
1/40 [..............................] - ETA: 0s - loss: 0.0031 - accuracy: 1.0000
40/40 [==============================] - 0s 988us/step - loss: 0.0047 - accuracy: 1.0000
Epoch 27/50
1/40 [..............................] - ETA: 0s - loss: 0.0042 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0044 - accuracy: 1.0000
Epoch 28/50
1/40 [..............................] - ETA: 0s - loss: 0.0025 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0043 - accuracy: 1.0000
Epoch 29/50
1/40 [..............................] - ETA: 0s - loss: 0.0024 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0041 - accuracy: 1.0000
Epoch 30/50
1/40 [..............................] - ETA: 0s - loss: 0.0037 - accuracy: 1.0000
40/40 [==============================] - 0s 991us/step - loss: 0.0039 - accuracy: 1.0000
Epoch 31/50
1/40 [..............................] - ETA: 0s - loss: 0.0042 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0039 - accuracy: 1.0000
Epoch 32/50
1/40 [..............................] - ETA: 0s - loss: 0.0037 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0038 - accuracy: 1.0000
Epoch 33/50
1/40 [..............................] - ETA: 0s - loss: 7.6103e-04 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0036 - accuracy: 1.0000
Epoch 34/50
1/40 [..............................] - ETA: 0s - loss: 0.0049 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0034 - accuracy: 1.0000
Epoch 35/50
1/40 [..............................] - ETA: 0s - loss: 7.1987e-04 - accuracy: 1.0000
40/40 [==============================] - 0s 989us/step - loss: 0.0032 - accuracy: 1.0000
Epoch 36/50
1/40 [..............................] - ETA: 0s - loss: 0.0064 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0033 - accuracy: 1.0000
Epoch 37/50
1/40 [..............................] - ETA: 0s - loss: 0.0034 - accuracy: 1.0000
40/40 [==============================] - 0s 998us/step - loss: 0.0030 - accuracy: 1.0000
Epoch 38/50
1/40 [..............................] - ETA: 0s - loss: 0.0014 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0036 - accuracy: 1.0000
Epoch 39/50
1/40 [..............................] - ETA: 0s - loss: 0.0017 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0029 - accuracy: 1.0000
Epoch 40/50
1/40 [..............................] - ETA: 0s - loss: 0.0052 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0027 - accuracy: 1.0000
Epoch 41/50
1/40 [..............................] - ETA: 0s - loss: 0.0015 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0027 - accuracy: 1.0000
Epoch 42/50
1/40 [..............................] - ETA: 0s - loss: 0.0030 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0025 - accuracy: 1.0000
Epoch 43/50
1/40 [..............................] - ETA: 0s - loss: 5.0883e-04 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0025 - accuracy: 1.0000
Epoch 44/50
1/40 [..............................] - ETA: 0s - loss: 0.0037 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0023 - accuracy: 1.0000
Epoch 45/50
1/40 [..............................] - ETA: 0s - loss: 0.0045 - accuracy: 1.0000
40/40 [==============================] - 0s 997us/step - loss: 0.0023 - accuracy: 1.0000
Epoch 46/50
1/40 [..............................] - ETA: 0s - loss: 0.0010 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0022 - accuracy: 1.0000
Epoch 47/50
1/40 [..............................] - ETA: 0s - loss: 0.0017 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0021 - accuracy: 1.0000
Epoch 48/50
1/40 [..............................] - ETA: 0s - loss: 8.2644e-04 - accuracy: 1.0000
40/40 [==============================] - 0s 987us/step - loss: 0.0020 - accuracy: 1.0000
Epoch 49/50
1/40 [..............................] - ETA: 0s - loss: 3.9066e-04 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0020 - accuracy: 1.0000
Epoch 50/50
1/40 [..............................] - ETA: 0s - loss: 0.0012 - accuracy: 1.0000
40/40 [==============================] - 0s 1ms/step - loss: 0.0020 - accuracy: 1.0000
<keras.src.callbacks.History object at 0x7f883e226440>
1/40 [..............................] - ETA: 0s - loss: 0.0032 - accuracy: 1.0000
40/40 [==============================] - 0s 946us/step - loss: 0.0018 - accuracy: 1.0000
1/15 [=>............................] - ETA: 0s - loss: 0.1063 - accuracy: 0.9688
15/15 [==============================] - 0s 1ms/step - loss: 0.1758 - accuracy: 0.9622
# Plot train vs test accuracies
plot_results(train_accs, test_accs)
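plot_results() is also preloaded; a minimal sketch under the assumption that it plots both accuracy lists against the preloaded training_sizes array:
# A possible plot_results helper: accuracy vs number of training examples
import matplotlib.pyplot as plt
def plot_results(train_accs, test_accs):
    plt.figure()
    plt.plot(training_sizes, train_accs, 'o-', label='Training accuracy')
    plt.plot(training_sizes, test_accs, 'o-', label='Test accuracy')
    plt.title('Accuracy vs number of training samples')
    plt.xlabel('# of training samples')
    plt.ylabel('Accuracy')
    plt.legend()
    plt.show()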
Different activation functions
The sigmoid(), tanh(), ReLU(), and leaky_ReLU() functions have been defined and are ready for you to use. Each function receives an input number X and returns its corresponding Y value.
Which of the statements below is false?
The false statement among the ones provided is:
“The sigmoid() and tanh() both take values close to -1 for big negative numbers.”
Here’s why the other statements are true and why this one is false:
True Statement: “The sigmoid() takes a value of 0.5 when X = 0 whilst tanh() takes a value of 0.”
Explanation: The sigmoid function, defined as \(f(x) = \frac{1}{1 + e^{-x}}\) , indeed takes a value of 0.5 when \(x = 0\) . The hyperbolic tangent function (tanh), defined as \(f(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}\) , takes a value of 0 when \(x = 0\) .
True Statement: “The leaky_ReLU() takes a value of -0.01 when X = -1 whilst ReLU() takes a value of 0.”
Explanation: The Leaky ReLU (Rectified Linear Unit) activation function is defined as \(f(x) = x\) for \(x > 0\) and \(f(x) = \alpha x\) for \(x \leq 0\) , where \(\alpha\) is a small coefficient (e.g., 0.01). Thus, for \(x = -1\) , leaky ReLU returns \(-0.01\) . The standard ReLU function is defined as \(f(x) = \max(0, x)\) , and it returns 0 for any \(x \leq 0\) , including \(x = -1\) .
False Statement: “The sigmoid() and tanh() both take values close to -1 for big negative numbers.”
Explanation: For large negative \(x\) values, the sigmoid function approaches 0, not -1. The sigmoid function’s output range is between 0 and 1. On the other hand, the tanh function, which outputs values in the range of -1 to 1, does approach -1 for large negative \(x\) values. This discrepancy makes the statement false.
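These claims are easy to verify numerically. A small sketch that defines the four activations by hand (rather than relying on the preloaded versions) and checks the values mentioned above:
# Define the four activation functions and check the key values
import numpy as np
def sigmoid(x):
    return 1 / (1 + np.exp(-x))
def tanh(x):
    return np.tanh(x)
def relu(x):
    return np.maximum(0, x)
def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)
print(sigmoid(0), tanh(0))        # 0.5 and 0.0
print(leaky_relu(-1), relu(-1))   # -0.01 and 0
print(sigmoid(-100), tanh(-100))  # ~0 (not -1) and ~-1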
Comparing activation functions
Comparing activation functions involves a bit of coding, but nothing you can’t do!
You will try out different activation functions on the multi-label model you built for your farm irrigation machine in chapter 2. The function get_model('relu') returns a copy of this model with the 'relu' activation function applied to its hidden layer.
You will loop through several activation functions, generating a new model for each and training it. By storing the history callback in a dictionary, you will be able to visualize which activation function performed best in the next exercise!
X_train, y_train, X_test, y_test are ready for you to use when training your models.
activations = ['relu', 'leaky_relu', 'tanh', 'sigmoid']
activation_results = {}
for act in activations:
    # Get a new model with the current activation using the get_model function
    model = get_model(act)
    # Fit the model and store the history callback
    h_callback = model.fit(X_train, y_train, epochs=100, validation_data=(X_test, y_test), verbose=0)
    activation_results[act] = h_callback
    print(act + " model ready")
Finishing with relu ...
relu model ready
Finishing with leaky_relu ...
leaky_relu model ready
Finishing with tanh ...
tanh model ready
Finishing with sigmoid ...
sigmoid model ready
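get_model() is provided by the exercise, but a rough sketch of what it might do is shown below. The layer sizes and compilation settings here are assumptions, not the course's actual helper, and on older Keras versions the 'leaky_relu' string may need to be replaced by a LeakyReLU layer:
# A possible get_model helper: rebuild the irrigation model with a given hidden activation
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
def get_model(act_function):
    model = Sequential()
    # Hidden layer with the requested activation (64 units and 20 sensor inputs are assumptions)
    model.add(Dense(64, input_shape=(20,), activation=act_function))
    # Multi-label output: one sigmoid per parcel (3 parcels assumed)
    model.add(Dense(3, activation='sigmoid'))
    model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
    return model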
Comparing activation functions II
What you coded in the previous exercise has been executed to obtain the activation_results variable; this time 100 epochs were used instead of 20. This way you will have more epochs to compare how training evolves for each activation function.
For every h_callback of each activation function in activation_results:
The h_callback.history['val_loss'] has been extracted.
The h_callback.history['val_accuracy'] has been extracted.
Both are saved into two dictionaries: val_loss_per_function and val_acc_per_function.
Pandas is also loaded as pd for you to use. Let’s plot some quick validation loss and accuracy charts!
val_loss_per_function = {k: v.history['val_loss'] for k, v in activation_results.items()}
val_acc_per_function = {k: v.history['val_accuracy'] for k, v in activation_results.items()}
# Create a dataframe from val_loss_per_function
val_loss = pd.DataFrame(val_loss_per_function)
# Call plot on the dataframe
val_loss.plot(title='Loss per Activation function')
plt.show()
# Create a dataframe from val_acc_per_function
val_acc = pd.DataFrame(val_acc_per_function)
# Call plot on the dataframe
val_acc.plot(title='Acc per Activation function')
plt.show()
Changing batch sizes
You’ve seen models are usually trained in batches of a fixed size. The smaller a batch size, the more weight updates per epoch, but at a cost of a more unstable gradient descent. Specially if the batch size is too small and it’s not representative of the entire training set.
Let’s see how different batch sizes affect the accuracy of a simple binary classification model that separates red from blue dots.
You’ll use a batch size of one, updating the weights once per sample in your training set for each epoch. Then you will use the entire dataset, updating the weights only once per epoch.
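To make the trade-off concrete, a small back-of-the-envelope calculation (the sample count matches the training split in the output below; everything else is just illustration):
import math

n_samples = 1097  # size of X_train in this exercise's split
for batch_size in (1, 32, n_samples):
    updates_per_epoch = math.ceil(n_samples / batch_size)
    print(f"batch_size={batch_size}: {updates_per_epoch} weight updates per epoch")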
# Import train_test_split to create the training and testing sets
from sklearn.model_selection import train_test_split
# Split the dataset into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Get a fresh new model with bank_note_model
model = bank_note_model()
# Train your model for 5 epochs with a batch size of 1
model.fit(X_train, y_train, epochs=5, batch_size=1)
Epoch 1/5
1097/1097 [==============================] - 1s 785us/step - loss: 0.1829 - accuracy: 0.9444
Epoch 2/5
1097/1097 [==============================] - 1s 778us/step - loss: 0.0817 - accuracy: 0.9754
Epoch 3/5
1097/1097 [==============================] - 1s 771us/step - loss: 0.0640 - accuracy: 0.9827
Epoch 4/5
1097/1097 [==============================] - 1s 772us/step - loss: 0.0539 - accuracy: 0.9881
Epoch 5/5
1097/1097 [==============================] - 1s 831us/step - loss: 0.0487 - accuracy: 0.9854
<keras.src.callbacks.History object at 0x7f8854b80e50>
print("\nThe accuracy when using a batch of size 1 is: ",
      model.evaluate(X_test, y_test)[1])
9/9 [==============================] - 0s 1ms/step - loss: 0.0613 - accuracy: 0.9818
The accuracy when using a batch of size 1 is:  0.9818181991577148
model = bank_note_model()
# Fit your model for 5 epochs with a batch size equal to the training set size
model.fit(X_train, y_train, epochs=5, batch_size=X_train.shape[0])
Epoch 1/5
1/1 [==============================] - 0s 171ms/step - loss: 0.4405 - accuracy: 0.7912
Epoch 2/5
1/1 [==============================] - 0s 2ms/step - loss: 0.4350 - accuracy: 0.7949
Epoch 3/5
1/1 [==============================] - 0s 2ms/step - loss: 0.4296 - accuracy: 0.7995
Epoch 4/5
1/1 [==============================] - 0s 2ms/step - loss: 0.4246 - accuracy: 0.8058
Epoch 5/5
1/1 [==============================] - 0s 2ms/step - loss: 0.4197 - accuracy: 0.8113
<keras.src.callbacks.History object at 0x7f88549f2620>
print("\nThe accuracy when using the whole training set as batch-size was: ",
      model.evaluate(X_test, y_test)[1])
9/9 [==============================] - 0s 1ms/step - loss: 0.4751 - accuracy: 0.7891
The accuracy when using the whole training set as batch-size was:  0.7890909314155579
Batch normalizing a familiar model
Remember the digits dataset you trained in the first exercise of this chapter?
A multi-class classification problem that you solved using softmax and 10 neurons in your output layer.
You will now build a new, deeper model consisting of 3 hidden layers of 50 neurons each, using batch normalization in between layers. The kernel_initializer parameter is used so that the weights of every layer are initialized in the same way.
# Import batch normalization from keras layers
from tensorflow.keras.layers import BatchNormalization
# Build your deep network
batchnorm_model = Sequential()
batchnorm_model.add(Dense(50, input_shape=(64,), activation='relu', kernel_initializer='normal'))
batchnorm_model.add(BatchNormalization())
batchnorm_model.add(Dense(50, activation='relu', kernel_initializer='normal'))
batchnorm_model.add(BatchNormalization())
batchnorm_model.add(Dense(50, activation='relu', kernel_initializer='normal'))
batchnorm_model.add(BatchNormalization())
batchnorm_model.add(Dense(10, activation='softmax', kernel_initializer='normal'))
# Compile your model with sgd
batchnorm_model.compile(optimizer='sgd', loss='categorical_crossentropy', metrics=['accuracy'])
Batch normalization effects
Batch normalization tends to increase the learning speed of our models and make their learning curves more stable. Let’s see how two identical models with and without batch normalization compare.
The model you just built, batchnorm_model, is loaded for you to use. An exact copy of it without batch normalization, standard_model, is available as well. You can check their summary() in the console. X_train_digits, y_train_digits, X_test_digits, and y_test_digits are also loaded so that you can train both models.
You will compare the accuracy learning curves for both models by plotting them with compare_histories_acc().
You can check the function by pasting show_code(compare_histories_acc) in the console.
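compare_histories_acc() is supplied by the exercise; a plausible sketch of such a helper, assuming it simply overlays the accuracy curves of both history callbacks with matplotlib (the labels and title are assumptions):
import matplotlib.pyplot as plt

def compare_histories_acc(h1, h2):
    # Overlay train/validation accuracy for the standard and batch-normalized models
    plt.plot(h1.history['accuracy'])
    plt.plot(h1.history['val_accuracy'])
    plt.plot(h2.history['accuracy'])
    plt.plot(h2.history['val_accuracy'])
    plt.title('Batch Normalization Effects')
    plt.xlabel('Epoch')
    plt.ylabel('Accuracy')
    plt.legend(['Train', 'Test', 'Train with Batch Normalization', 'Test with Batch Normalization'])
    plt.show()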
X_train_digits, X_test_digits, y_train_digits, y_test_digits = train_test_split(digits_pixels, digits_target_binary, test_size=0.25, random_state=42)
standard_model = digits_model()
# Train your standard model, storing its history callback
h1_callback = standard_model.fit(X_train_digits, y_train_digits, validation_data=(X_test_digits, y_test_digits), epochs=10, verbose=0)
# Train the batch normalized model you recently built, storing its history callback
h2_callback = batchnorm_model.fit(X_train_digits, y_train_digits, validation_data=(X_test_digits, y_test_digits), epochs=10, verbose=0)
# Call compare_histories_acc passing in both model histories
compare_histories_acc(h1_callback, h2_callback)
Preparing a model for tuning
Let’s tune the hyperparameters of a binary classification model that does well classifying the breast cancer dataset.
You’ve seen that the first step to turn a model into a sklearn estimator is to build a function that creates it. The definition of this function is important since hyperparameter tuning is carried out by varying the arguments your function receives.
Build a simple create_model() function that receives both a learning rate and an activation function as arguments. The Adam optimizer has been imported as an object from tensorflow.keras.optimizers so that you can also change its learning rate parameter.
# Creates a model given an activation and learning rate
from tensorflow.keras.optimizers import Adam

def create_model(learning_rate, activation):
    # Create an Adam optimizer with the given learning rate
    opt = Adam(learning_rate=learning_rate)
    # Create your binary classification model
    model = Sequential()
    model.add(Dense(128, input_shape=(30,), activation=activation))
    model.add(Dense(256, activation=activation))
    model.add(Dense(1, activation='sigmoid'))
    # Compile your model with your optimizer, loss, and metrics
    model.compile(optimizer=opt, loss='binary_crossentropy', metrics=['accuracy'])
    return model
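As a quick sanity check (not part of the exercise), you can build one instance with arbitrary hyperparameters and inspect its architecture:
# Build one model and print its layers and parameter counts
model_check = create_model(learning_rate=0.01, activation='relu')
model_check.summary()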
Tuning the model parameters
It’s time to try out different parameters on your model and see how well it performs!
The create_model() function you built in the previous exercise is ready for you to use.
Since fitting the RandomizedSearchCV object would take too long, the results you'd get are printed by the show_results() function. You could try random_search.fit(X, y) in the console yourself once you have built everything else to check that it works, but you will probably time out the exercise (so copy your code first if you try this, or you could lose your progress!).
You don't need to use the optional epochs and batch_size parameters when building your KerasClassifier object, since you are passing them as parameters to the random search instead.
# Import KerasClassifier from the SciKeras wrappers
from scikeras.wrappers import KerasClassifier
from sklearn.model_selection import KFold, RandomizedSearchCV
# Create a KerasClassifier
model = KerasClassifier(build_fn=create_model)
# Define the parameters to try out
params = {'activation': ['relu', 'tanh'], 'batch_size': [32, 128, 256],
          'epochs': [50, 100, 200], 'learning_rate': [0.1, 0.01, 0.001]}
# Create a RandomizedSearchCV object, passing in the parameters to try
random_search = RandomizedSearchCV(model, param_distributions=params, cv=KFold(3))
# Running random_search.fit(X, y) would start the search, but it takes too long!
show_results()
Best:
0.975395 using {learning_rate: 0.001, epochs: 50, batch_size: 128, activation: relu}
Other:
0.956063 (0.013236) with: {learning_rate: 0.1, epochs: 200, batch_size: 32, activation: tanh}
0.970123 (0.019838) with: {learning_rate: 0.1, epochs: 50, batch_size: 256, activation: tanh}
0.971880 (0.006524) with: {learning_rate: 0.01, epochs: 100, batch_size: 128, activation: tanh}
0.724077 (0.072993) with: {learning_rate: 0.1, epochs: 50, batch_size: 32, activation: relu}
0.588752 (0.281793) with: {learning_rate: 0.1, epochs: 100, batch_size: 256, activation: relu}
0.966608 (0.004892) with: {learning_rate: 0.001, epochs: 100, batch_size: 128, activation: tanh}
0.952548 (0.019734) with: {learning_rate: 0.1, epochs: 50, batch_size: 256, activation: relu}
0.971880 (0.006524) with: {learning_rate: 0.001, epochs: 200, batch_size: 128, activation: relu}
0.968366 (0.004239) with: {learning_rate: 0.01, epochs: 100, batch_size: 32, activation: relu}
0.910369 (0.055824) with: {learning_rate: 0.1, epochs: 100, batch_size: 128, activation: relu}
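show_results() is provided by the exercise. If you did run the search yourself, an equivalent summary could be pulled from the fitted object's standard scikit-learn attributes; a sketch (not the course's helper):
# Assuming random_search has been fitted with random_search.fit(X, y)
print("Best: %f using %s" % (random_search.best_score_, random_search.best_params_))
means = random_search.cv_results_['mean_test_score']
stds = random_search.cv_results_['std_test_score']
params = random_search.cv_results_['params']
for mean, std, param in zip(means, stds, params):
    print("%f (%f) with: %r" % (mean, std, param))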
Training with cross-validation
Time to train your model with the best parameters found: 0.001 for the learning rate, 50 epochs, a batch_size of 128, and relu activations.
The create_model() function from the previous exercise is ready for you to use. X and y are loaded as features and labels.
Use the best values found when creating your KerasClassifier object so that they are used when performing cross-validation.
End this chapter by training an awesome tuned model on the breast cancer dataset!
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
# Load the breast cancer dataset
breast_cancer = load_breast_cancer()
X = breast_cancer.data
y = breast_cancer.target
# Create a KerasClassifier with the best parameters found
model = KerasClassifier(build_fn=create_model(learning_rate=0.001, activation='relu'), epochs=50,
                        batch_size=128, verbose=0)
# Calculate the accuracy score for each fold
kfolds = cross_val_score(model, X, y, cv=3)
# Print the mean accuracy
print('The mean accuracy was:', kfolds.mean())
The mean accuracy was: 0.9718834066666666
# Print the accuracy standard deviation
print('With a standard deviation of:', kfolds.std())
With a standard deviation of: 0.002448915612216046