Cheaper way to send data to another system from Cosmos DB?

Thread starter: Vishal Verma (Guest)
Currently my system uses Cosmos DB and a SQL database. We perform our business logic, store the outcome (a huge JSON file) in blob storage, and then pass this data, based on filter criteria, to a downstream system. The blob keeps growing, and the same setup has been in use for a long time. Also, since the downstream system uses pagination, I cannot archive the blob data as it might impact them. I am looking for a new, more cloud-focused Azure approach and wondering whether a data mesh would be a better way to send data downstream. I need to modernize the current architecture.

I am thinking that instead of adding a blob layer, we could send data directly from Cosmos DB and SQL on demand, but I am worried about the Cosmos DB RU cost. I understand this is a subjective question and I am happy to delete it if it does not meet the criteria. I am just wondering what the best architecture is to serve large amounts of data to downstream applications.

Option 1: Use the Cosmos DB change feed to automatically send data to blob storage via Azure Functions. Question: will this be cheaper, since we would not have to run SELECT queries against Cosmos DB?
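A rough sketch of what Option 1 could look like, assuming an Azure Functions change-feed trigger. The function names, container names, and blob path layout below are hypothetical; only the pure helpers are shown runnable, with the Azure wiring left as comments:

```python
import json
from datetime import datetime

# Sketch of Option 1 (change feed -> blob), not a definitive implementation.
# Date-partitioned paths are one way to let old data be archived per day
# without disturbing whatever paths the downstream pagination relies on.

def blob_path_for_batch(partition_key: str, ts: datetime) -> str:
    """Build a date-partitioned blob path for one change-feed batch."""
    return f"outcomes/{partition_key}/{ts:%Y/%m/%d}/{ts:%H%M%S%f}.json"

def serialize_changes(documents: list[dict]) -> bytes:
    """Newline-delimited JSON: each changed document stays individually
    readable, and batches can be appended without rewriting the blob."""
    return "\n".join(
        json.dumps(d, separators=(",", ":")) for d in documents
    ).encode("utf-8")

# In a real Function app the wiring would look roughly like this
# (hypothetical names, Python v2 programming model):
#
# @app.cosmos_db_trigger(arg_name="docs", database_name="mydb",
#                        container_name="outcomes",
#                        connection="CosmosConnection")
# def on_change(docs):
#     payload = serialize_changes([d.to_dict() for d in docs])
#     path = blob_path_for_batch("tenant-a", datetime.utcnow())
#     blob_container_client.upload_blob(path, payload)
```

On the cost question: reading the change feed still consumes RUs, but it is an incremental read of only what changed, which is typically much cheaper than repeatedly running broad SELECT queries over the whole container.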

Option 2: Use an event-driven architecture to publish events from Cosmos DB to the other system.
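Because the payloads here are huge, Option 2 is often implemented as a claim-check pattern: publish a small event carrying only a document id and a URI, and let the downstream fetch the large payload on demand. A minimal sketch, with a hypothetical event schema loosely modeled on CloudEvents:

```python
import uuid
from datetime import datetime, timezone

# Sketch of Option 2 as a claim-check event. The field names and event
# type below are assumptions for illustration, not a fixed schema.

def make_claim_check_event(doc_id: str, payload_uri: str,
                           source: str = "/cosmos/outcomes") -> dict:
    """Build a small event pointing at the data instead of embedding it."""
    return {
        "specversion": "1.0",
        "id": str(uuid.uuid4()),
        "type": "outcome.updated",
        "source": source,
        "time": datetime.now(timezone.utc).isoformat(),
        "data": {"documentId": doc_id, "payloadUri": payload_uri},
    }

# A real publisher (hypothetical wiring) would hand this dict to
# Event Grid or Service Bus, e.g.:
#   sender.send_messages(ServiceBusMessage(json.dumps(event)))
```

This keeps the messaging layer cheap and fast regardless of payload size, and the downstream only pays to fetch the documents it actually paginates through.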

Option 3: Any other recommendations?