What's the standard way to predict a single sample with keras.model.predict?

Thread starter: dirac- (Guest)
I have a trained model and load it as follows. Since I use a custom loss, I have to pass it via custom_objects:

Code:
import time

import numpy as np
import tensorflow as tf

Nx = 512
Ny = 512
Ntot = Nx * Ny
totsize = 500
batchsize = 5

# Custom loss used during training; it must be supplied when loading the model.
def dice_coef_loss(y_true, y_pred):
    y_true_f = tf.reshape(y_true, [-1])
    y_pred_f = tf.reshape(y_pred, [-1])
    return tf.reduce_sum(tf.abs(y_true_f - y_pred_f)) / (Ntot * batchsize)

model = tf.keras.models.load_model('path_to_saved_model.keras',
                                   custom_objects={'dice_coef_loss': dice_coef_loss})
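
As a side note, if the model is only needed for inference, the custom loss does not have to be provided at all; load_model accepts compile=False, which skips restoring the optimizer and loss. A minimal sketch of that alternative (same saved file path as above):

Code:
# Sketch: load for inference only. With compile=False the custom loss is not
# required just to call predict on the restored model.
import tensorflow as tf

inference_model = tf.keras.models.load_model('path_to_saved_model.keras', compile=False)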

In the training process, my dataset is built as follows.

Code:
def decode(x,y):
    image = tf.io.read_file(x)
    image = tf.io.decode_raw(image, out_type=tf.float32)
    image = tf.transpose(tf.reshape(image,[Ny,Nx]),[1,0])
    image = tf.expand_dims(image,2)

    label = tf.io.read_file(y)
    label = tf.io.decode_raw(label, out_type=tf.float32)
    label = tf.transpose(tf.reshape(label,(Ny,Nx)),[1,0])
    label = label/np.pi
    label = tf.expand_dims(label,2)

    return image,label

dataset= tf.data.Dataset.from_tensor_slices((files_in_train,files_in_label))
dataset=dataset.map(decode,num_parallel_calls=tf.data.experimental.AUTOTUNE)
dataset=dataset.batch(batchsize)

When I want to time predict with the trained model, I first run:

Code:
start=time.perf_counter()
ypre=model.predict(dataset.take(1))
end=time.perf_counter()
print('predict time: %s Seconds'%(end-start))

ypre.shape is (5, 512, 512, 1) and

predict time: 2.670394100015983 Seconds

So in principle each sample should take about 2.6/5 seconds? But then I define another dataset2 with batch size 1:

Code:
def decode2(x):
    image = tf.io.read_file(x)
    image = tf.io.decode_raw(image, out_type=tf.float32)
    image = tf.transpose(tf.reshape(image,[Ny,Nx]),[1,0])
    image = tf.expand_dims(image,2)

    return image

dataset2= tf.data.Dataset.from_tensor_slices((files_in_train))
dataset2=dataset2.map(decode2,num_parallel_calls=tf.data.experimental.AUTOTUNE)
dataset2=dataset2.batch(1)

and time predict in the same way:

Code:
start=time.perf_counter()
ypre=model.predict(dataset2.take(1))
end=time.perf_counter()
print('predict time: %s Seconds'%(end-start))

I get

predict time: 34.888951200060546 Seconds

I don't understand why predicting one sample is so much slower than predicting five. I also tested other batch sizes for dataset2, and it seems that only when the batch size matches the training batch size is the time as expected.

So what's the correct way to predict just one sample?
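
For reference, a minimal sketch of one way to time a single-sample prediction, assuming the decode2 function and files_in_train list from above: add the batch dimension explicitly and make one untimed warm-up call first, since the first call on a model triggers graph tracing, and that one-time cost may account for much of the overhead measured above.

Code:
# Sketch: predict one sample by adding a batch dimension and calling the model.
# The first call is treated as a warm-up (graph tracing) and is not timed.
import time
import tensorflow as tf

single_image = decode2(files_in_train[0])       # shape (512, 512, 1)
single_batch = tf.expand_dims(single_image, 0)  # shape (1, 512, 512, 1)

_ = model(single_batch, training=False)         # warm-up, excluded from timing

start = time.perf_counter()
ypre = model(single_batch, training=False)      # or model.predict(single_batch)
end = time.perf_counter()
print('predict time: %s Seconds' % (end - start))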