OiO.lk Community platform!


Is there any way to upload files to S3 bucket using Azure Data Factory?

Thread starter: code_hr (Guest)
I am trying to set up an ETL pipeline where:

  1. The source is a SQL Server table column stored as a binary stream
  2. The destination (sink) is an S3 bucket

My requirements are:

  1. Read the binary stream column from the SQL Server table
  2. Process the binary stream data row by row
  3. Upload a file to an S3 bucket for each binary stream row

I have tried Data Flow, Copy, and the AWS connectors in Azure Data Factory, but there is no option to set an S3 bucket as the destination (sink).

Is there any other approach available in Azure Data Factory that meets these requirements?
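Since Data Factory's Amazon S3 connector works as a copy source only, one common workaround is to have the pipeline invoke a small external script (for example via an Azure Function activity or a Custom Activity on Azure Batch) that reads the binary column and uploads each row itself. Below is a minimal Python sketch of that idea; the table and column names (`docs`, `file_id`, `file_data`), the bucket name, and the connection string are illustrative assumptions, and `pyodbc`/`boto3` are third-party packages that would need to be installed:

```python
from typing import Callable, Iterable, Tuple

def object_key(file_id: int, prefix: str = "exports") -> str:
    """Build an S3 object key for one row (naming scheme is an assumption)."""
    return f"{prefix}/{file_id}.bin"

def export_rows(rows: Iterable[Tuple[int, bytes]],
                upload: Callable[[str, bytes], None]) -> int:
    """Process rows one at a time, uploading each binary payload as its own object."""
    count = 0
    for file_id, blob in rows:
        upload(object_key(file_id), blob)
        count += 1
    return count

if __name__ == "__main__":
    # Hypothetical wiring: read the binary column with pyodbc and push
    # each row to S3 with boto3. Adjust names and credentials to taste.
    import pyodbc   # third-party: SQL Server ODBC driver binding
    import boto3    # third-party: AWS SDK for Python

    s3 = boto3.client("s3")
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;DATABASE=mydb;"
        "Trusted_Connection=yes;"
    )
    cur = conn.cursor()
    cur.execute("SELECT file_id, file_data FROM docs")
    n = export_rows(cur, lambda key, body: s3.put_object(
        Bucket="my-bucket", Key=key, Body=body))
    print(f"uploaded {n} objects")
```

Separating the row loop (`export_rows`) from the uploader makes the approach easy to test and lets the same logic target Blob Storage instead, should you later copy to S3 out-of-band (e.g. with AWS DataSync).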