
AWS

A list of open-source tools for AWS security: defensive, offensive, auditing, DFIR, etc.
Amazon Web Services — a practical guide

Tools:
- Cloud Security Suite — auditing AWS
- An AWS exploitation framework designed for testing the security of Amazon Web Services environments
- An S3 bucket enumerator that greps for files and downloads interesting ones, if you're not afraid of quickly filling up your hard drive


AWS Testing Scripts

Pacu — AWS testing framework
Finding the AWS account ID of any S3 bucket: https://tracebit.com/blog/2024/02/finding-aws-account-id-of-any-s3-bucket/

ScoutSuite

Download ScoutSuite

Test AWS (when aws has been configured):

python Scout.py --provider aws

GCP:

Test GCP (when GCP has been configured):

python Scout.py --provider gcp --user-account

python Scout.py --provider gcp --service-account --key-file </PATH/TO/KEY_FILE.JSON>

Scout2

Download Scout2

Note

Scout2 has been merged into the main ScoutSuite repo: https://github.com/nccgroup/ScoutSuite

CloudMapper

CloudMapper helps you analyze your Amazon Web Services (AWS) environments.

Configuring Access Keys

Example Key IDs:
- AKIAJWTKNE6KHHJSERSA — IAM user access key (AKIA = long-term credential)
- ASIAJ6JADUC2OKZH32VQ — role session key (ASIA = temporary credential issued by STS)
- A3T36DS33RDBS9ESBQOU — root access key
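Incidentally, an access key ID is not fully opaque: the owning AWS account ID is encoded inside it, a decode technique documented publicly by Aidan Steele among others. A minimal stdlib-only sketch (the function name is illustrative):

```python
import base64

def account_id_from_access_key(access_key_id):
    """Decode the AWS account ID embedded in an access key ID
    (works for AKIA/ASIA-style keys)."""
    # Drop the 4-character prefix (AKIA, ASIA, ...) and base32-decode the rest
    tail = base64.b32decode(access_key_id[4:])
    # The account ID lives in bits 7..46 of the first 6 decoded bytes
    z = int.from_bytes(tail[:6], byteorder="big")
    return (z & 0x7FFFFFFFFF80) >> 7

# e.g. account_id_from_access_key("AKIAJWTKNE6KHHJSERSA") -> 667016832916
```

Handy for attributing a leaked key to an account without making any API calls.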

Example Secret Key:
WjH53wpBrKV83vSoyWmnYOJ8KTJvVkz/kv3dZkPS

Example Session Token:
FQoDYXdzEBoaDNDNnH55fGtOR1ssXCL9AbjgQBu3KOTuDkzoSw
uOmyk3yipMlwT9j2cmvCE2nJ0UTTQn3QKW7FE1BsAs+ZUEYQNB
DUpqD64CbmXueScpMhaL1HIkaww7VRzDvKYRoAtn2a88BlUECZ
FTaDJgM0uNCJEFSI4SgoiL8f89zzgNpQqj4YE9AiPVv4ObBfAH
m6YGOQ8m31fjlU3iukqzI0sXKUaAn/m4zLseIx4HuiB5DX9wI/
tLvnriCP4HtPYqkR0DMFODV0MF3aGCkm13LeXz/=

Configure AWS creds:

aws configure
AWS Access Key ID [****************BVPA]:
AWS Secret Access Key [****************ycBd]:
Default region name [None]: us-east-2
Default output format [None]:

Configure AWS creds for different profile:

aws configure --profile <PROFILE_NAME>

Stored AWS creds:

cat ~/.aws/config
[default]
region = us-east-1
[profile another-profile]
output = json
region = us-east-1
cat ~/.aws/credentials
[default]
aws_access_key_id = AKIA...
aws_secret_access_key = dpor...
[another-profile]
aws_access_key_id = ASIA...
aws_secret_access_key = lhIO...
aws_session_token = FQoGZX...
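The shared credentials file above is plain INI, so when looting a box you can enumerate profiles with the stdlib alone. A sketch (function name is illustrative; it deliberately returns only key IDs, not secrets):

```python
import configparser
import os

def load_aws_profiles(path=None):
    """Parse an AWS shared credentials file and return
    {profile_name: aws_access_key_id}."""
    path = path or os.path.expanduser("~/.aws/credentials")
    cp = configparser.ConfigParser()
    cp.read(path)
    return {section: cp[section].get("aws_access_key_id")
            for section in cp.sections()}
```

The AKIA/ASIA prefixes of the returned key IDs immediately tell you which profiles hold long-term vs temporary credentials.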

whoami for AWS:

aws sts get-caller-identity --profile <PROFILE_NAME>
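get-caller-identity returns a small JSON document with UserId, Account, and Arn. For scripting, the ARN is easy to split into account and principal (helper name is illustrative):

```python
def parse_caller_arn(arn):
    """Split a caller ARN of the form
    arn:partition:service::account-id:principal
    into (account_id, principal)."""
    parts = arn.split(":", 5)
    return parts[4], parts[5]

# parse_caller_arn("arn:aws:iam::123456789012:user/alice")
#   -> ("123456789012", "user/alice")
```

Works for both IAM user ARNs and assumed-role session ARNs returned by STS.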

Secrets via Config Files on Disk

AWS Credentials:
- ~/.aws/credentials
- ~/.s3cfg
- ~/.aws/config
- s3cmd.ini
- ~/.elasticbeanstalk/aws_credential_file

Boto configuration:
- ~/.boto
- /etc/boto.cfg
- *.boto (likely contains AWS creds)

Fog configuration:
- .fog

Private keys and certificates:
- *.pem
- *.key
- *.cert
- ~/.ssh/id_rsa
- id_dsa
- id_ed25519
- id_ecdsa
- *.pkcs12
- *.pfx
- *.p12
- *.asc

macOS:
- *.keychain
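The file patterns above can be swept for in one pass with a stdlib walk. A sketch (the pattern sets below are assumptions drawn from the lists above, not exhaustive):

```python
import os

# Exact filenames and extensions taken from the lists above
EXACT_NAMES = {"credentials", ".s3cfg", "s3cmd.ini", ".boto", ".fog",
               "id_rsa", "id_dsa", "id_ed25519", "id_ecdsa"}
EXTENSIONS = (".pem", ".key", ".cert", ".pkcs12", ".pfx", ".p12",
              ".asc", ".boto", ".keychain")

def find_secret_files(root):
    """Walk `root` and yield paths whose names match the patterns above."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name in EXACT_NAMES or name.lower().endswith(EXTENSIONS):
                yield os.path.join(dirpath, name)
```

Point it at a home directory (or a mounted disk image during DFIR) and triage the hits manually.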

Secrets via Control Plane Interface

AWS EC2 instances can query the metadata service to obtain additional information from the control plane; it is used mainly to bootstrap instances and keep them operational.

The metadata service is accessible via the "169.254.169.254" IP address from an EC2 instance.

  • http://169.254.169.254/latest/meta-data/
  • http://169.254.169.254/latest/meta-data/instance-id/
  • http://169.254.169.254/latest/meta-data/iam/
  • http://169.254.169.254/latest/meta-data/iam/info/
  • http://169.254.169.254/latest/meta-data/iam/security-credentials/
  • http://169.254.169.254/latest/meta-data/iam/security-credentials/<ROLE_NAME>
  • http://169.254.169.254/latest/meta-data/public-ipv4
  • http://169.254.169.254/latest/user-data

Other Endpoints:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-instance-metadata.html
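Modern instances often enforce IMDSv2, which requires a session token: a PUT to /latest/api/token first, then the token in an X-aws-ec2-metadata-token header on each request. A stdlib sketch (the base_url parameter is an illustrative addition so the function can be exercised off-instance):

```python
import urllib.request

def imds_get(path, base_url="http://169.254.169.254"):
    """Query the EC2 instance metadata service using IMDSv2:
    obtain a session token via PUT /latest/api/token, then send it
    in the X-aws-ec2-metadata-token header."""
    token_req = urllib.request.Request(
        base_url + "/latest/api/token",
        method="PUT",
        headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"},
    )
    token = urllib.request.urlopen(token_req, timeout=2).read().decode()
    data_req = urllib.request.Request(
        base_url + path,
        headers={"X-aws-ec2-metadata-token": token},
    )
    return urllib.request.urlopen(data_req, timeout=2).read().decode()

# e.g. imds_get("/latest/meta-data/iam/security-credentials/") on an instance
```

If the token PUT fails, the instance may still allow plain IMDSv1 GETs to the endpoints listed above.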

Boto3

https://github.com/boto/boto3

We can write our own Boto3 scripts in Python to interface directly with AWS's control-plane APIs.
When creating Python scripts that leverage Boto3 to interact with the AWS APIs, the first thing we do is import the library:

import boto3

#Create a boto3 session using a profile located within the ~/.aws/credentials file:
session = boto3.Session(profile_name='<PROFILE_NAME>')

#Next, let's create a client representing EC2:
ec2 = session.client("ec2")

#Now we can use that client to describe the regions that are available within EC2 and assign the result to a variable called "regions":
regions = ec2.describe_regions()

We can then break the result up into a Python list, with each element containing the name of a region.

For example, we can query the Systems Manager (SSM) service in each region by first creating an SSM client, then describing the SSM parameters via that client, and finally looping through each parameter to extract its name and value.

regions = [r['RegionName'] for r in regions['Regions']]
for region in regions:
    ssm_client = session.client("ssm", region_name=region)
    # NB: describe_parameters returns at most 50 results per call;
    # use a paginator to enumerate everything.
    params = ssm_client.describe_parameters()
    for p in params['Parameters']:
        p_name = p['Name']
        response = ssm_client.get_parameter(Name=p_name)
        val = response['Parameter']['Value']
        print("%s - %s" % (p_name, val))

For further information:
- https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ssm.html
- https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ssm.html#SSM.Client.describe_parameters

Exploitation Tools

weirdAAL (AWS Attack Library)

python3 weirdAAL.py -m recon_all -t MyTarget
python3 weirdAAL.py -m list_services_by_key -t MyTarget
python3 weirdAAL.py -m ec2_describe_instances_basic -t MyTarget
python3 weirdAAL.py -m cloudwatch_list_metrics -t MyTarget

https://github.com/Lifka/hacking-resources/blob/main/cloud-hacking-cheat-sheets.md