Answers for "batch normalization and dropout together example"


import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Conv2D, MaxPooling2D, Flatten
from tensorflow.keras.layers import BatchNormalization, Dropout

# Load MNIST and add the channel dimension the Conv2D layers expect
(X_train, y_train), (X_test, y_test) = tf.keras.datasets.mnist.load_data()
X_train = X_train.reshape(-1, 28, 28, 1) / 255.0
X_test = X_test.reshape(-1, 28, 28, 1) / 255.0

model = Sequential()
model.add(Conv2D(64, (3, 3), input_shape=X_train[0].shape, activation='relu'))
model.add(Conv2D(32, (3, 3), activation='relu'))
model.add(Conv2D(32, (3, 3), activation='relu'))
model.add(BatchNormalization())   # normalize activations of the conv block
model.add(MaxPooling2D())
model.add(Flatten())              # flatten feature maps before the dense layers
model.add(Dense(units=128, activation='relu'))
model.add(Dropout(0.25))          # randomly drop 25% of units during training
model.add(Dense(units=64, activation='relu'))
model.add(Dense(units=32, activation='relu'))
model.add(Dense(units=10, activation='softmax'))
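To actually train the network, here is a minimal compile-and-fit sketch; it assumes the integer MNIST labels loaded above (hence sparse_categorical_crossentropy) and the epoch count is just an illustrative choice:

# Compile with integer-label cross-entropy and train for a few epochs
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(X_train, y_train, epochs=5, validation_data=(X_test, y_test))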
Posted by: Guest on February-20-2021
