OiO.lk Community platform!


AWS Greengrass is not downloading S3 Artifact to custom location

  • Thread starter: Zain Ul Abidin (Guest)
I have created an AWS Greengrass component that pulls a docker-compose file and a Python script that runs docker-compose. I define my artifacts as below, but while the Python script is downloaded correctly to the component's root artifact location, the docker-compose.yml file is not downloaded to the custom location. As a result, the Python script throws an error, the component is marked as broken, and the deployment fails.

Code:
{
    "RecipeFormatVersion": "2020-01-25",
    "ComponentName": "Backend_Services",
    "ComponentVersion": "2.3.3",
    "ComponentType": "aws.greengrass.generic",
    "ComponentDescription": "A component that updates BE, FE, DB and BACnet container from ECR and S3 compose file",
    "ComponentPublisher": "Amazon",
    "ComponentDependencies": {
        "aws.greengrass.TokenExchangeService": {
            "VersionRequirement": ">=2.0.0 <2.1.0",
            "DependencyType": "HARD"
        }
    },
    "Manifests": [
        {
            "Lifecycle": {
                "run": "python3 {artifacts:path}/auth-helper.py {artifacts:path}/containers"
            },
            "Artifacts": [
                {
                    "Uri": "s3://beartifacts/components/backend_services/docker-compose.yml",
                    "Digest": "dZ9DMTqHpULnDN8xIBHspspVFEUq/1EjmHcoVfPFXlI=",
                    "Algorithm": "SHA-256",
                    "Unarchive": "NONE",
                    "Permission": {
                        "Read": "OWNER",
                        "Execute": "NONE"
                    },
                    "Path": "containers/docker-compose.yml"
                },
                {
                    "Uri": "s3://beartifacts/components/backend_services/auth-helper.py",
                    "Digest": "vwkCkcBeORXfxSwiah9yjKan2GlkajzOuhSZTCaRbHQ=",
                    "Algorithm": "SHA-256",
                    "Unarchive": "NONE",
                    "Permission": {
                        "Read": "OWNER",
                        "Execute": "NONE"
                    }
                }
            ]
        }
    ],
    "Lifecycle": {}
}

The important point to note is that if I remove the Path field from the docker-compose artifact, it also gets downloaded, and the containers are running after the Python script executes. However, maintaining the directory structure is crucial for me, and I have not been able to achieve that.

Any hint/suggestion is appreciated in advance.
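One workaround I am considering, in case the Path field is simply not honored for S3 artifacts (that is an assumption on my part, not something I have confirmed in the docs): recreate the directory layout in an install lifecycle step before run. Since the artifacts directory is read-only to components, the copy would go into the writable work directory instead. An untested sketch of the manifest's Lifecycle section:

```json
"Lifecycle": {
    "install": "mkdir -p {work:path}/containers && cp {artifacts:path}/docker-compose.yml {work:path}/containers/docker-compose.yml",
    "run": "python3 {artifacts:path}/auth-helper.py {work:path}/containers"
}
```

With this, the docker-compose artifact would be declared without the Path field (so it downloads flat like the Python script does), and the install step rebuilds the `containers/` subdirectory the script expects.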