Why does my Lambda hang when attempting to copy more than 21 files in S3?

Thread starter: Nikolaisyl (Guest)
It works fine with up to 21 files and takes a couple of seconds. But if I feed it more than 21, it hangs and the Lambda eventually times out (> 30 seconds). From what I've seen online, I'm very far from hitting the S3 concurrent request limits (which should be in the thousands), so I'm really not sure what could be happening.

Here's the relevant part of the code; it hangs specifically on the await s3_client.copy line:

Code:
import asyncio

import aioboto3

# `variables` is our internal config module (region, bucket names, etc.).


async def _copy_image(
    s3_client, source_key: str, destination_key: str, bucket_name: str
) -> None:
    # Managed copy of one object to a new key in the same bucket.
    copy_source = {
        "Bucket": bucket_name,
        "Key": source_key,
    }

    await s3_client.copy(
        CopySource=copy_source,
        Bucket=bucket_name,
        Key=destination_key,
    )


async def copy_images_for_new_guide(
    keys: list[str],
    new_guide_id: int,
    org_id: int,
) -> None:
    session = aioboto3.Session()

    async with session.client(
        "s3",
        region_name=variables.secrets.AWS_REGION_NAME,
    ) as s3_client:
        # Schedule one copy coroutine per key, then run them all at once.
        tasks = []
        for key in keys:
            new_key = (
                f"orgs/org_{org_id}/guide_{new_guide_id}/{key.split('/')[-1]}"
            )
            tasks.append(
                _copy_image(
                    s3_client, key, new_key, variables.settings.IMAGES_BUCKET
                )
            )

        await asyncio.gather(*tasks)

What could be causing this behavior?
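
One plausible cause worth ruling out (an assumption, not a confirmed diagnosis): aiobotocore shares botocore's HTTP connection pool, whose default max_pool_connections is 10, and a managed copy can hold more than one connection per object when it performs a multipart transfer, so asyncio.gather firing every copy at once may starve the pool and stall. A minimal sketch of that mitigation, bounding concurrency with a semaphore and widening the pool, with placeholder region and bucket constants standing in for the variables module:

Code:
import asyncio

import aioboto3
from botocore.config import Config

# Hypothetical stand-ins for the `variables` config module above.
AWS_REGION_NAME = "us-east-1"
IMAGES_BUCKET = "my-images-bucket"

MAX_CONCURRENT_COPIES = 10  # stay at or below the connection pool size


async def copy_images_bounded(
    keys: list[str], new_guide_id: int, org_id: int
) -> None:
    session = aioboto3.Session()
    semaphore = asyncio.Semaphore(MAX_CONCURRENT_COPIES)

    async with session.client(
        "s3",
        region_name=AWS_REGION_NAME,
        # Optionally widen the pool instead of (or as well as) throttling.
        config=Config(max_pool_connections=50),
    ) as s3_client:

        async def copy_one(key: str) -> None:
            new_key = (
                f"orgs/org_{org_id}/guide_{new_guide_id}/{key.split('/')[-1]}"
            )
            async with semaphore:  # at most MAX_CONCURRENT_COPIES in flight
                await s3_client.copy(
                    CopySource={"Bucket": IMAGES_BUCKET, "Key": key},
                    Bucket=IMAGES_BUCKET,
                    Key=new_key,
                )

        await asyncio.gather(*(copy_one(key) for key in keys))

If the copies complete once concurrency is bounded, the hang was pool exhaustion rather than an S3-side limit; if they still stall, the semaphore at least narrows the search to the copy call itself.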