Import custom prediction routine to Vertex AI pipeline

Thread starter: bloukanov (Guest)
I have created a custom prediction routine (https://cloud.google.com/vertex-ai/docs/predictions/custom-prediction-routines) on Vertex AI, uploaded the model, and am able to generate predictions with it through the UI. Now I would like to incorporate this into a Vertex AI Pipeline, to run batch predictions after a data generation step. I am using the Kubeflow Pipelines SDK.
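
For context, the upload step for a CPR-backed model looks roughly like the sketch below (project, region, and image names are all placeholders). Note that there is no artifact URI: the prediction code and model are baked into the serving container image.

Code:
from google.cloud import aiplatform

# Sketch of the upload step (placeholder names). With a custom prediction
# routine the serving logic lives inside the container image, so no
# artifact_uri pointing at GCS is passed.
aiplatform.init(project="my-project", location="us-central1")
model = aiplatform.Model.upload(
    display_name="my-cpr-model",
    serving_container_image_uri="us-docker.pkg.dev/my-project/my-repo/cpr-image:latest",
)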

To do so, I am thinking of using the ModelBatchPredictOp prebuilt component. For this to work, I need to import the model into the pipeline, for example with an importer component (https://www.kubeflow.org/docs/components/pipelines/v2/components/importer-component/). However, the importer component requires an artifact URI, which my model does not have because it uses a custom container: the model is baked into the container image, not sitting in GCS.
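
For reference, the standard importer usage looks roughly like this (a sketch with a placeholder URI), and it is exactly the artifact_uri requirement that does not fit here:

Code:
from kfp import dsl

# Sketch of the standard importer (placeholder URI). It assumes the model
# files sit at some URI, which is not the case for this CPR model.
importer_task = dsl.importer(
    artifact_uri="gs://my-bucket/model/",
    artifact_class=dsl.Model,
)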

So I tried writing a quick custom importer component that returns the model object (though I did not really expect it to work), and I get a type mismatch. See the sample code below:

Code:
from helper import data_component
from kfp.dsl import component, pipeline, Output, Model
from google_cloud_pipeline_components.v1.batch_predict_job import ModelBatchPredictOp


@component(packages_to_install=["google-cloud-aiplatform"])
def custom_importer(model: Output[Model]):
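    # NOTE: model is declared as a generic KFP artifact (system.Model); the
    # function returns an aiplatform.Model SDK object rather than populating
    # that output artifact.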
    from google.cloud import aiplatform
    return aiplatform.Model(model_name="model-id")


@pipeline(name="prediction-pipeline")
def pipeline():
    data_task = data_component()
    
    importer_task = custom_importer()

    batch_predict_op = ModelBatchPredictOp(
        job_display_name="batch_predict_job",
        model=importer_task.output,
        gcs_source_uris=data_task.outputs["dataset"],
        gcs_destination_output_uri_prefix="bucket",
        instances_format="csv",
        predictions_format="jsonl",
        starting_replica_count=1,
        max_replica_count=1,
    )

The ModelBatchPredictOp doesn't like the input type for the model arg: InconsistentTypeException: Incompatible argument passed to the input 'model' of component 'model-batch-predict': Argument type 'system.Model@0.0.1' is incompatible with the input type 'google.VertexModel@0.0.1'
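
Presumably the fix involves handing the op a google.VertexModel artifact instead of KFP's generic system.Model. One way that might look (a sketch only; project, region, and model ID are placeholders) is to point dsl.importer at the model's Model Registry resource rather than at files in GCS:

Code:
from kfp import dsl
from google_cloud_pipeline_components.types import artifact_types

# Sketch with placeholder project/region/model ID: import the registered
# model as a google.VertexModel artifact. The resourceName metadata
# identifies the model in the Model Registry, so no model files in GCS
# are needed.
importer_task = dsl.importer(
    artifact_uri=(
        "https://us-central1-aiplatform.googleapis.com/v1/"
        "projects/my-project/locations/us-central1/models/1234567890"
    ),
    artifact_class=artifact_types.VertexModel,
    metadata={
        "resourceName": "projects/my-project/locations/us-central1/models/1234567890"
    },
)

I am not sure whether that is the intended approach, though.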

How can I incorporate batch prediction from a custom prediction routine into a Vertex AI Pipeline?
 
