Keras Model with Random Activation Function Layers


Code introduction


This code defines a function, random_activation, that builds a Keras model with the functional API. The model accepts an input of the given shape and produces a single prediction by passing the data through a series of layers: Flatten, a Dense layer with ReLU activation, Dropout, Batch Normalization, LeakyReLU, and a final Dense layer with a sigmoid activation. Despite the function's name, the layer stack in this example is fixed rather than randomly selected.


Technology Stack : Keras, Input, Dense, Flatten, Dropout, BatchNormalization, LeakyReLU, Activation, Model

Code Type : Model-building function

Code Difficulty : Intermediate


def random_activation(input_shape):
    # Import the Keras layers and the functional Model API locally,
    # so the function is self-contained.
    from keras.layers import Input, Dense, Flatten, Dropout, BatchNormalization, LeakyReLU, Activation
    from keras.models import Model

    # Accept an input tensor of the given shape and flatten it to a vector.
    input_layer = Input(shape=input_shape)
    x = Flatten()(input_layer)

    # Hidden block: fully connected layer with ReLU, followed by Dropout
    # for regularization, Batch Normalization, and a LeakyReLU activation.
    x = Dense(64, activation='relu')(x)
    x = Dropout(0.5)(x)
    x = BatchNormalization()(x)
    x = LeakyReLU(alpha=0.1)(x)

    # Single-unit sigmoid output for a binary prediction.
    output_layer = Dense(1, activation='sigmoid')(x)

    model = Model(inputs=input_layer, outputs=output_layer)
    return model
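
A minimal usage sketch follows. The input shape (28, 28) and the compile settings are illustrative assumptions and are not part of the original code.

# Build the model for an assumed (28, 28) input and inspect its layers.
model = random_activation((28, 28))
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
model.summary()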