AWS assumed-role account mismatch between GitHub Actions runner and .NET app

Thread starter: MC LinkTimeError (Guest)
My .NET 8 (C#) app downloads data from, and uploads data to, S3 buckets. I'm running a GitHub Actions workflow that:

  1. runs on a self-hosted Linux runner
  2. does the usual setup (checkout, .NET install, etc.)
  3. assumes a role in a specific account (the QA environment) and sets the needed env vars
  4. runs the project with those env vars

When I list S3 buckets during the GitHub runner session, I can see the expected buckets. When I list S3 buckets in my .NET app, I see buckets from another account, not the expected one.

I'm really not sure what's going on and would appreciate your help.

Some validations I made:

  1. compared AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_SESSION_TOKEN after assuming the role and again during app runtime; they exist and they are identical
  2. verified that the S3 bucket and path exist
  3. verified that the S3 bucket's policy permits my assumed role to act on it
  4. verified that my role exists and has permission to work with S3 in the expected environment
  5. ran the code on another instance that assumed the role at instance creation and validated that it works (so this is probably not a direct bug in the code)
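
A quicker check than comparing the raw credential values is to resolve each set of credentials to the account that owns it. A sketch, assuming the AWS CLI is available wherever the credentials are visible (both commands are standard STS subcommands):

```shell
# Who am I right now? Prints UserId, Account, and Arn for the active credentials.
aws sts get-caller-identity

# Which account issued this access key? Works for temporary keys too.
aws sts get-access-key-info --access-key-id "$AWS_ACCESS_KEY_ID"
```

If the Account fields differ between the runner step and the app's environment, the credentials are not actually the same at the point of use.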

My code:

  1. My GH workflow:

Code:
name: my workflow name
on: [push]
jobs:
  run_update_tests:
    runs-on: some-linux-self-hosted-runner
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          ref: ${{ github.ref }}

      - name: Setup .NET
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: 8.0.105

      - name: Build project
        working-directory: path/to/project
        run: |
          echo "Building project"
          dotnet build

      - name: Copy test configuration file - Linux
        working-directory: some-config-path
        run: |
          echo "Copying test configuration file"
          mkdir -p /tmp/path
          cp some/json/file/path some/path
          sed -i 's/SOME_SECRET_PLACEHOLDER/${{ secrets.SOME_SECRET }}/' some/path

      - name: Assume role
        uses: ./.github/actions/assume-role

      - name: Check env vars
        run: |
          echo "AWS_ACCESS_KEY_ID is set: ${{ env.AWS_ACCESS_KEY_ID != '' }}"
          echo "AWS_SECRET_ACCESS_KEY is set: ${{ env.AWS_SECRET_ACCESS_KEY != '' }}"
          echo "AWS_SESSION_TOKEN is set: ${{ env.AWS_SESSION_TOKEN != '' }}"

      - name: Run project
        working-directory: some/path
        run: |
            echo "Running project"
            sudo -E dotnet run
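
Since the app is launched under `sudo -E`, it may also be worth confirming the caller identity both in the step's own shell and under sudo, because sudo can still reset or filter parts of the environment depending on the sudoers env_keep/secure_path configuration. A sketch of an extra debug step (assuming the AWS CLI is already installed on the runner and reachable from root's PATH):

```yaml
      - name: Compare caller identity with and without sudo
        run: |
          aws sts get-caller-identity
          sudo -E aws sts get-caller-identity
```

If the two Arn values differ, the sudo'd process is resolving credentials from a different source than the step's shell.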
  2. The assume-role action:

Code:
name: 'Assume Role'
description: 'Assuming role in AWS'

runs:
  using: "composite"

  steps:

    - name: Install AWS cli
      shell: bash
      run: |
        curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
        unzip awscliv2.zip
        sudo ./aws/install --update

    - name: Assume role
      shell: bash
      run: |
        ASSUME_ROLE_OUTPUT=$(aws sts assume-role --role-arn "some:arn:some:role" --role-session-name "AWSCLI-lab")
        AWS_ACCESS_KEY_ID=$(echo $ASSUME_ROLE_OUTPUT | jq -r '.Credentials.AccessKeyId')
        AWS_SECRET_ACCESS_KEY=$(echo $ASSUME_ROLE_OUTPUT | jq -r '.Credentials.SecretAccessKey')
        AWS_SESSION_TOKEN=$(echo $ASSUME_ROLE_OUTPUT | jq -r '.Credentials.SessionToken')
        echo "AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID"
        echo "AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY"
        echo "AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN"
        echo "AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID" >> $GITHUB_ENV
        echo "AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY" >> $GITHUB_ENV
        echo "AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN" >> $GITHUB_ENV

    # validate assumed role
    - name: Validate assumed role success
      shell: bash
      run: |
        aws ec2 describe-vpcs | grep VpcId
        aws s3 ls

After listing S3 buckets here, I can see the expected buckets (meaning, the expected account).

  3. My code that downloads from S3 and lists the S3 buckets:

Code:
public class RemoteObjectsHandler: IRemoteObjectsHandler
{
    private readonly ITransferUtility _transferUtility;
    private readonly ILogger _logger;
    private readonly AmazonS3Client _s3Client;

    public RemoteObjectsHandler(ITransferUtility transferUtility, ILogger logger)
    {
        _transferUtility = transferUtility;
        _logger = logger;
        
        var awsAccessKeyId = Environment.GetEnvironmentVariable("AWS_ACCESS_KEY_ID");
        var awsSecretAccessKey = Environment.GetEnvironmentVariable("AWS_SECRET_ACCESS_KEY");
        var awsSessionToken = Environment.GetEnvironmentVariable("AWS_SESSION_TOKEN");
        Console.WriteLine($"awsAccessKeyId = {awsAccessKeyId}");
        Console.WriteLine($"awsSecretAccessKey = {awsSecretAccessKey}");
        Console.WriteLine($"awsSessionToken = {awsSessionToken}");

        var credentials = new Amazon.Runtime.SessionAWSCredentials(awsAccessKeyId, awsSecretAccessKey, awsSessionToken);
        _s3Client = new AmazonS3Client(credentials, Amazon.RegionEndpoint.EUCentral1);
    }
    
    public async Task DownloadObject(string bucketName, string objectName, string destinationPath)
    {
        try
        {
            // this is for test purposes - not logic
            await ListBucketsAsync();
            
            _logger.SafeLog($"Downloading object from bucket {bucketName} in {objectName} to {destinationPath}");

            var request = new TransferUtilityDownloadRequest
            {
                BucketName = bucketName,
                Key = objectName,
                FilePath = destinationPath
            };
            
            var s3Cts = new CancellationTokenSource(Consts.S3_ACTION_TIME_LIMIT_MS);
            await _transferUtility.DownloadAsync(request, s3Cts.Token);
        }
        catch (Exception e)
        {
            _logger.SafeLogException($"DownloadObject failed. bucketName = {bucketName}. objectName = {objectName}. destinationPath = {destinationPath}", e);
            throw;
        }
    }

    public async Task UploadObject(string objectPath, string bucketName, string uploadPath)
    {
        try
        {
            _logger.SafeLog($"Uploading object from {objectPath} to bucket {bucketName} in {uploadPath}");
            var s3Cts = new CancellationTokenSource(Consts.S3_ACTION_TIME_LIMIT_MS);
            await _transferUtility.UploadAsync(objectPath, bucketName, uploadPath, s3Cts.Token);
        }
        catch (Exception e)
        {
            _logger.SafeLogException($"UploadObject failed. objectPath = {objectPath}. bucketName = {bucketName}. uploadPath = {uploadPath}", e);
            throw;
        }
    }
    
    public async Task ListBucketsAsync()
    {
        try
        {
            _logger.SafeLog("Listing S3 buckets");

            var response = await _s3Client.ListBucketsAsync();

            foreach (var bucket in response.Buckets)
            {
                Console.WriteLine(bucket.BucketName);
            }
        }
        catch (AmazonS3Exception e)
        {
            _logger.SafeLogException("Error occurred while listing buckets.", e);
            throw;
        }
    }
}

Here, when listing buckets with a client that uses the same creds as the GH runner, I get S3 buckets from another account, one in which the role doesn't even exist.
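
To see which principal the SDK actually resolved inside the app, the same credentials object can be handed to an STS client. A sketch, assuming the AWSSDK.SecurityToken NuGet package is referenced (`sts` and `identity` are illustrative names):

```csharp
using Amazon.SecurityToken;
using Amazon.SecurityToken.Model;

// Build the STS client from the exact credentials object the S3 client uses.
var sts = new AmazonSecurityTokenServiceClient(credentials, Amazon.RegionEndpoint.EUCentral1);
var identity = await sts.GetCallerIdentityAsync(new GetCallerIdentityRequest());
Console.WriteLine($"Account = {identity.Account}, Arn = {identity.Arn}");
```

If the printed Arn is not the assumed role's ARN, the client is not using the credentials you think it is.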

NOTE: if anything is missing that would help you understand my situation better, please let me know and I'll do my best to add it.

Again, I'd appreciate your help.



EDIT 1:
When printing the assumed role in code, I can see that the role is entirely different from the one described in the action, and actually belongs to the other account; that's why it lists a different set of buckets. When I assumed the role the same way in the code, I got the correct role and listed the correct buckets. What I don't get is why assuming the role gives me creds for one role, but using them in code yields a different role.
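
For reference, the in-code assume-role that this edit describes can be sketched as follows (the role ARN and session name are placeholders, and the AWSSDK.SecurityToken package is assumed to be referenced):

```csharp
using Amazon.S3;
using Amazon.SecurityToken;
using Amazon.SecurityToken.Model;

// Assume the role explicitly; the STS client itself resolves base credentials
// from the default chain (env vars, profile, instance metadata, ...).
var sts = new AmazonSecurityTokenServiceClient();
var response = await sts.AssumeRoleAsync(new AssumeRoleRequest
{
    RoleArn = "arn:aws:iam::<account-id>:role/<role-name>", // placeholder
    RoleSessionName = "dotnet-app-session"                  // placeholder
});

// The returned Credentials type derives from AWSCredentials, so it can be
// passed straight to the S3 client.
var s3 = new AmazonS3Client(response.Credentials, Amazon.RegionEndpoint.EUCentral1);
```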