Keep getting NaN values for scoring when tuning KerasRegressor

Thread starter: Chuen Yik Kang (Guest)
I am trying to tune hyperparameters on the KerasRegressor.

However, I only get NaN results, as shown below. What could be causing the issue?

Everything works fine when I compile my model, but the scoring for the best parameters always shows NaNs. The metric I used is RMSE (screenshot of the NaN scores: https://i.sstatic.net/LGvFn.png).

The code snippet is below:

Code:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, GRU, Dense, Flatten

def create_model(optimizer, activation, lstm_unit_1, lstm_unit_2, lstm_unit_3, init='glorot_uniform'):
    model = Sequential()
    # Conv1D front end over the (timesteps, features) input
    model.add(Conv1D(lstm_unit_1, kernel_size=1, activation=activation, input_shape=(trainX.shape[1], trainX.shape[2])))
    # Two stacked GRU layers, both returning the full sequence
    model.add(GRU(lstm_unit_2, activation=activation, return_sequences=True, input_shape=(trainX.shape[1], trainX.shape[2])))
    model.add(GRU(lstm_unit_3, activation=activation, return_sequences=True, input_shape=(trainX.shape[1], trainX.shape[2])))
    # Dense(1) is applied per timestep, then flattened
    model.add(Dense(units=1))
    model.add(Flatten())
    model.compile(optimizer=optimizer, loss='mse', metrics=['mean_squared_error'])
    return model
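
For reference, a minimal shape check on this architecture: because both GRU layers keep return_sequences=True, the Dense(1) + Flatten head emits one value per timestep rather than a single value per sample, so the model output can be compared against trainY. This is only a sketch with arbitrary parameter choices ('Adam', 'relu', 128 units), assuming trainX and trainY are defined as above.

Code:
# Hypothetical shape check with arbitrary parameter choices
sample_model = create_model(optimizer='Adam', activation='relu',
                            lstm_unit_1=128, lstm_unit_2=128, lstm_unit_3=128)
print(sample_model.output_shape)  # (None, trainX.shape[1]): one prediction per timestep
print(trainY.shape)               # should be compatible with the shape above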

Code:
import warnings
import tensorflow as tf
from sklearn.model_selection import RandomizedSearchCV

model = tf.keras.wrappers.scikit_learn.KerasRegressor(build_fn=create_model,
                                                      epochs=150,
                                                      verbose=False)

# Hyperparameter search space
batch_size = [16, 32, 64, 128]
lstm_unit_1 = [128, 256, 512]
lstm_unit_2 = lstm_unit_1.copy()
lstm_unit_3 = lstm_unit_1.copy()
optimizer = ['SGD', 'Adam', 'Adamax', 'RMSprop']
activation = ['relu', 'linear', 'sigmoid']
param_grid = dict(lstm_unit_1=lstm_unit_1,
                  lstm_unit_2=lstm_unit_2,
                  lstm_unit_3=lstm_unit_3,
                  optimizer=optimizer,
                  activation=activation,
                  batch_size=batch_size)

warnings.filterwarnings("ignore")
random = RandomizedSearchCV(estimator=model, param_distributions=param_grid,
                            n_jobs=-1, scoring='neg_mean_squared_error')
random_result = random.fit(trainX, trainY)

print(random_result.best_score_)
print(random_result.best_params_)
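
One possibly relevant detail: RandomizedSearchCV records NaN for any candidate whose fit or scoring raises an exception, because its error_score parameter defaults to np.nan. A minimal sketch for surfacing the underlying error, reusing the model and param_grid defined above and drawing only a single candidate:

Code:
# Re-run one candidate with error_score='raise' so the real exception is shown
# instead of being silently recorded as a NaN score.
debug_search = RandomizedSearchCV(estimator=model,
                                  param_distributions=param_grid,
                                  n_iter=1,
                                  error_score='raise',
                                  scoring='neg_mean_squared_error')
debug_search.fit(trainX, trainY)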