Changing the dropout rate on the fly (at train time and test time) in Keras

I was running an experiment that involved Monte Carlo (MC) dropout: I needed to change the dropout rate of a trained model and keep dropout active at test time.

I was able to keep dropout active at test time by building a backend function that takes the learning-phase flag as an extra input:

# This code keeps dropout active at test time by forcing
# the learning phase to "training" (1) when evaluating the model.
from keras import backend as K

newLayer = K.function([model.layers[0].input, K.learning_phase()], [model.output])
layer_output = newLayer([x, 1])[0]  # 1 = training phase, so dropout is applied
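Keras aside, the MC-dropout loop this enables can be sketched in plain NumPy. The `forward` function, shapes, and sample count below are made up for illustration; the point is that each stochastic pass applies a fresh inverted-dropout mask, and the spread across passes gives an uncertainty estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_predict(forward, x, rate, n_samples=100):
    """Run n_samples stochastic forward passes with inverted dropout
    on the inputs; return the mean and std of the outputs."""
    keep = 1.0 - rate
    outs = []
    for _ in range(n_samples):
        # Fresh Bernoulli mask each pass, scaled by 1/keep (inverted dropout)
        mask = rng.binomial(1, keep, size=x.shape) / keep
        outs.append(forward(x * mask))
    outs = np.stack(outs)
    return outs.mean(axis=0), outs.std(axis=0)

# Toy "model": a fixed linear map standing in for model.output.
w = rng.normal(size=(5, 1))
forward = lambda h: h @ w

x = rng.normal(size=(3, 5))
mean, std = mc_dropout_predict(forward, x, rate=0.5, n_samples=200)
```

With `rate=0` every mask is all ones, so the passes agree and the std collapses to zero; larger rates widen the spread.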

To change the dropout rate of an already-trained model, I did the following:

# This code changes the dropout rate of a trained model.
# (model itself was loaded from .json beforehand)
model.load_weights(filenameToModelWeights)  # load trained weights
model.layers[-2].rate = 0.04  # layers[-2] is my dropout layer; rate is its dropout attribute
model = keras.models.clone_model(model)  # without cloning, the new rate is never used; cloning re-initializes the weights
model.load_weights(filenameToModelWeights)  # so load the weights again
model.predict(x)
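The clone step is needed because mutating `rate` on a built layer does not touch the graph that was already compiled; cloning rebuilds each layer from its config, which is re-read from the layer's current attributes. A toy sketch of that mechanism (these are stand-in classes, not Keras's actual ones) looks like this:

```python
# Toy stand-ins showing why cloning picks up the new rate:
# the clone is rebuilt from get_config(), which reads the
# layer's *current* attributes, mutated or not.
class ToyDropout:
    def __init__(self, rate):
        self.rate = rate

    def get_config(self):
        return {"rate": self.rate}

    @classmethod
    def from_config(cls, config):
        return cls(**config)

layer = ToyDropout(0.5)
layer.rate = 0.04  # mutate the attribute, as in the snippet above
clone = ToyDropout.from_config(layer.get_config())  # clone sees 0.04
```

This mirrors what `keras.models.clone_model` does for every layer, which is also why the weights come back freshly initialized and must be reloaded afterwards.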