Cleaning up staging buckets for a GCP Dataflow job run via a flex template

user1068378

I am creating GCP Dataflow jobs via flex templates, using Cloud Build to generate the templates. This results in brand-new buckets being created every single time. For example, I have a bucket named dataflow-staging-us-central1-bcc13063024968bf8d7e6420b45af926 with plenty of directories in it, and I have at least 3-4 others like that.

What are best practices for cleaning up these buckets?

I have a Cloud Build trigger that fires every time there is a commit to my repo; the trigger builds flex templates for all the GCP jobs I have. Perhaps what I am doing is not best practice?

Thanks and regards,
Marco
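
(A minimal cleanup sketch, not from the thread: one common way to keep these auto-created staging buckets from growing indefinitely is a GCS lifecycle rule that deletes objects past a certain age. The snippet below assumes the google-cloud-storage Python client, default application credentials, and that the dataflow-staging- name prefix reliably identifies the buckets Dataflow created; PROJECT_ID and the 30-day cutoff are placeholders.)

```python
# Sketch: attach an age-based lifecycle delete rule to every
# auto-created Dataflow staging bucket, so stale staging artifacts
# are cleaned up automatically instead of by hand.
#
# Assumptions (not from the original post): google-cloud-storage is
# installed, default credentials are configured, and the
# "dataflow-staging-" name prefix identifies the relevant buckets.
from google.cloud import storage

PROJECT_ID = "my-project"   # placeholder project ID
MAX_AGE_DAYS = 30           # delete staging objects older than this

client = storage.Client(project=PROJECT_ID)

for bucket in client.list_buckets():
    if not bucket.name.startswith("dataflow-staging-"):
        continue
    # Append a Delete action with an age condition to the bucket's
    # lifecycle configuration, then persist it.
    bucket.add_lifecycle_delete_rule(age=MAX_AGE_DAYS)
    bucket.patch()
    print(f"Set {MAX_AGE_DAYS}-day delete rule on gs://{bucket.name}")
```

Alternatively, passing an explicit staging/temp location when launching jobs (for example via the --staging-location and --temp-location flags on gcloud dataflow flex-template run) should make Dataflow reuse a single known bucket rather than auto-creating new ones, which keeps the cleanup surface to one bucket.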