Answers for "Dense(units = 128, activation = 'Leakyrelu')"

Keras has no string activation named 'Leakyrelu', so `Dense(units=128, activation='Leakyrelu')` raises an error. Add LeakyReLU as its own layer after the Dense layer instead:

from keras.models import Sequential
from keras.layers import Dense, LeakyReLU

model = Sequential()

# Leave the activation argument off the Dense layer...
model.add(Dense(90))

# ...and add LeakyReLU explicitly as its own layer:
model.add(LeakyReLU(alpha=0.05))
Posted by: Guest on June-21-2020
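For intuition, here is a minimal pure-Python sketch (not Keras code) of the function that a `LeakyReLU(alpha=0.05)` layer applies elementwise to its inputs:

```python
def leaky_relu(x, alpha=0.05):
    # Positive inputs pass through unchanged;
    # negative inputs are scaled by alpha instead of being zeroed (as plain ReLU would).
    return x if x > 0 else alpha * x

print(leaky_relu(3.0))   # -> 3.0
print(leaky_relu(-2.0))  # -> -0.1
```

The small negative slope `alpha` keeps a nonzero gradient for negative inputs, which is why LeakyReLU is often preferred over plain ReLU when "dying ReLU" units are a concern.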
