# Powertools for AWS Lambda (Python)

> Powertools for AWS Lambda (Python) is a developer toolkit to implement Serverless best practices and increase developer velocity. It provides a suite of utilities for AWS Lambda functions that makes it easier to adopt tracing with AWS X-Ray, structured logging, and asynchronous custom metrics.

# Project Overview

- **Features**

  Adopt one, a few, or all industry practices. **Progressively**. [All features](#features)

- **Support this project**

  Become a public reference customer, share your work, contribute, use Lambda Layers, etc. [Support](#support-powertools-for-aws-lambda-python)

- **Available languages**

  Powertools for AWS Lambda is also available in other languages: [Java](https://docs.powertools.aws.dev/lambda/java/), [TypeScript](https://docs.powertools.aws.dev/lambda/typescript/latest/), and [.NET](https://docs.powertools.aws.dev/lambda/dotnet/)

## Install

You can install Powertools for AWS Lambda (Python) using your favorite dependency manager, or via Lambda Layers. Most features rely only on the Python standard library and the AWS SDK *(boto3)*, both of which are available in the AWS Lambda runtime.

- **pip**: **`pip install "aws-lambda-powertools"`**
- **poetry**: **`poetry add "aws-lambda-powertools"`**
- **pdm**: **`pdm add "aws-lambda-powertools"`**

### Extra dependencies

However, you will need additional dependencies if you are using any of the features below:

| Feature | Install | Default dependency |
| --- | --- | --- |
| **[Tracer](core/tracer/#install)** | **`pip install "aws-lambda-powertools[tracer]"`** | `aws-xray-sdk` |
| **[Validation](utilities/validation/#install)** | **`pip install "aws-lambda-powertools[validation]"`** | `fastjsonschema` |
| **[Parser](utilities/parser/#install)** | **`pip install "aws-lambda-powertools[parser]"`** | `pydantic` *(v2)* |
| **[Data Masking](utilities/data_masking/#install)** | **`pip install "aws-lambda-powertools[datamasking]"`** | `aws-encryption-sdk`, `jsonpath-ng` |
| **All extra dependencies at once** | **`pip install "aws-lambda-powertools[all]"`** | |
| **Two or more extra dependencies only, not all** | **`pip install "aws-lambda-powertools[tracer,parser,datamasking]"`** | |
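Once installed, the core utilities can be combined in a single handler. Below is a minimal sketch (not taken from the official examples) showing Logger, Tracer, and Metrics together; it assumes the `tracer` extra (or the Lambda Layer) is available and that the `POWERTOOLS_SERVICE_NAME` and `POWERTOOLS_METRICS_NAMESPACE` environment variables are set.

```
from aws_lambda_powertools import Logger, Metrics, Tracer
from aws_lambda_powertools.metrics import MetricUnit

# Service name and metrics namespace are read from POWERTOOLS_SERVICE_NAME
# and POWERTOOLS_METRICS_NAMESPACE environment variables (assumed to be set).
logger = Logger()
tracer = Tracer()
metrics = Metrics()


@logger.inject_lambda_context   # adds Lambda context fields to every log line
@tracer.capture_lambda_handler  # creates an X-Ray subsegment for the handler
@metrics.log_metrics            # flushes metrics via CloudWatch EMF at the end of the invocation
def lambda_handler(event: dict, context) -> dict:
    logger.info("processing event", extra={"keys": list(event)})
    metrics.add_metric(name="SuccessfulInvocations", unit=MetricUnit.Count, value=1)
    return {"statusCode": 200}
```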
[Lambda Layer](https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html) is a .zip file archive that can contain additional code, pre-packaged dependencies, data, or configuration files. We compile and optimize [all dependencies](#install), and remove duplicate dependencies [already available in the Lambda runtime](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/layer_v3/docker/Dockerfile#L34) to keep the Layer as small as possible. For the ARNs below, make sure to replace `{region}` with your AWS region, e.g., `eu-west-1`, and `{python_version}` with your Python version without the period (.), e.g., `python313` for `Python 3.13`.

| Architecture | Layer ARN |
| --- | --- |
| x86_64 | **arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-{python_version}-x86_64:7** |
| ARM | **arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-{python_version}-arm64:7** |

You can add our layer using the [AWS Lambda Console *(direct link)*](https://console.aws.amazon.com/lambda/home#/add/layer):

- Under Layers, choose `AWS layers` or `Specify an ARN`
- Click to copy the [correct ARN](#lambda-layer) value based on your AWS Lambda function architecture and region

We also offer Parameter Store aliases for releases, allowing you to pin a specific version or pick up the latest version on every deploy. To use them, add one of the snippets below to your AWS CloudFormation or Terraform project.

**CloudFormation**

Sample placeholders:

- `{arch}` is either `arm64` (Graviton-based functions) or `x86_64`
- `{python_version}` is the Python runtime version, e.g., `python3.13` for `Python 3.13`
- `{version}` is the semantic version number (e.g., 3.1.0) for a release, or `latest`

```
MyFunction:
  Type: "AWS::Lambda::Function"
  Properties:
    ...
    Layers:
      - {{resolve:ssm:/aws/service/powertools/python/{arch}/{python_version}/{version}}}
```

**Terraform**

Using the [`aws_ssm_parameter`](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/data-sources/ssm_parameter) data source from the AWS Terraform provider allows you to look up the parameter value and use it later in your project.

```
data "aws_ssm_parameter" "powertools_version" {
  name = "/aws/service/powertools/python/{arch}/{python_version}/{version}"
}

resource "aws_lambda_function" "test_lambda" {
  ...

  runtime = "python3.13"

  layers = [data.aws_ssm_parameter.powertools_version.value]
}
```
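If you resolve the layer ARN from your own scripts or tooling rather than CloudFormation or Terraform, the same Parameter Store alias can be read with `boto3`. A minimal sketch, assuming AWS credentials are configured; the architecture, runtime version, and `latest` alias below are placeholder values:

```
import boto3

# Hypothetical placeholder path; adjust architecture, runtime, and version for your function.
PARAMETER_NAME = "/aws/service/powertools/python/arm64/python3.13/latest"

ssm = boto3.client("ssm", region_name="eu-west-1")

# The parameter value is the full Lambda Layer ARN for this region.
layer_arn = ssm.get_parameter(Name=PARAMETER_NAME)["Parameter"]["Value"]
print(layer_arn)
```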
> Are we missing a framework? Please create [a documentation request](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=documentation%2Ctriage&projects=&template=documentation_improvements.yml&title=Docs%3A+TITLE).

Thanks to the community, we've covered the most popular frameworks on how to add a Lambda Layer to an existing function.

```
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  MyLambdaFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: python3.12
      Handler: app.lambda_handler
      Layers:
        - !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15
```

```
service: powertools-lambda

provider:
  name: aws
  runtime: python3.12
  region: us-east-1

functions:
  powertools:
    handler: lambda_function.lambda_handler
    architecture: x86_64
    layers:
      - arn:aws:lambda:${aws:region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15
```

```
from aws_cdk import Aws, Stack, aws_lambda
from constructs import Construct


class SampleApp(Stack):

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        powertools_layer = aws_lambda.LayerVersion.from_layer_version_arn(
            self,
            id="lambda-powertools",
            layer_version_arn=f"arn:aws:lambda:{Aws.REGION}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15",
        )
        aws_lambda.Function(
            self,
            "sample-app-lambda",
            runtime=aws_lambda.Runtime.PYTHON_3_12,
            layers=[powertools_layer],
            code=aws_lambda.Code.from_asset("lambda"),
            handler="hello.handler",
        )
```

```
terraform {
  required_version = "~> 1.0.5"
  required_providers {
    aws = "~> 3.50.0"
  }
}

provider "aws" {
  region = "{region}"
}

resource "aws_iam_role" "iam_for_lambda" {
  name = "iam_for_lambda"

  # Standard trust policy so AWS Lambda can assume this execution role
  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow"
    }
  ]
}
EOF
}

resource "aws_lambda_function" "test_lambda" {
  # Illustrative values; point these at your own deployment package and handler
  filename      = "lambda_function_payload.zip"
  function_name = "powertools_lambda"
  role          = aws_iam_role.iam_for_lambda.arn
  handler       = "lambda_function.lambda_handler"
  runtime       = "python3.12"
  layers        = ["arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15"]

  source_code_hash = filebase64sha256("lambda_function_payload.zip")
}
```

```
# Create a new function and add the layer
❯ amplify add function
? Choose the runtime that you want to use: Python
? Do you want to configure advanced settings? Yes
...
? Do you want to enable Lambda layers for this function? Yes
? Enter up to 5 existing Lambda layer ARNs (comma-separated): arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15
❯ amplify push -y

# Update an existing function and add the layer
❯ amplify update function
? Select the Lambda function you want to update test2
General information
- Name:
? Which setting do you want to update? Lambda layers configuration
? Do you want to enable Lambda layers for this function? Yes
? Enter up to 5 existing Lambda layer ARNs (comma-separated): arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15
? Do you want to edit the local lambda function now? No
```
```
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  MyLambdaFunction:
    Type: AWS::Serverless::Function
    Properties:
      Architectures: [arm64]
      Runtime: python3.12
      Handler: app.lambda_handler
      Layers:
        - !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15
```

```
service: powertools-lambda

provider:
  name: aws
  runtime: python3.12
  region: us-east-1

functions:
  powertools:
    handler: lambda_function.lambda_handler
    architecture: arm64
    layers:
      - arn:aws:lambda:${aws:region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15
```

```
from aws_cdk import Aws, Stack, aws_lambda
from constructs import Construct


class SampleApp(Stack):

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        powertools_layer = aws_lambda.LayerVersion.from_layer_version_arn(
            self,
            id="lambda-powertools",
            layer_version_arn=f"arn:aws:lambda:{Aws.REGION}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15",
        )
        aws_lambda.Function(
            self,
            "sample-app-lambda",
            runtime=aws_lambda.Runtime.PYTHON_3_12,
            layers=[powertools_layer],
            architecture=aws_lambda.Architecture.ARM_64,
            code=aws_lambda.Code.from_asset("lambda"),
            handler="hello.handler",
        )
```

```
terraform {
  required_version = "~> 1.0.5"
  required_providers {
    aws = "~> 3.50.0"
  }
}

provider "aws" {
  region = "{region}"
}

resource "aws_iam_role" "iam_for_lambda" {
  name = "iam_for_lambda"

  # Standard trust policy so AWS Lambda can assume this execution role
  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow"
    }
  ]
}
EOF
}

resource "aws_lambda_function" "test_lambda" {
  # Illustrative values; point these at your own deployment package and handler
  filename      = "lambda_function_payload.zip"
  function_name = "powertools_lambda"
  role          = aws_iam_role.iam_for_lambda.arn
  handler       = "lambda_function.lambda_handler"
  runtime       = "python3.12"
  architectures = ["arm64"]
  layers        = ["arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15"]

  source_code_hash = filebase64sha256("lambda_function_payload.zip")
}
```

```
# Create a new function and add the layer
❯ amplify add function
? Choose the runtime that you want to use: Python
? Do you want to configure advanced settings? Yes
...
? Do you want to enable Lambda layers for this function? Yes
? Enter up to 5 existing Lambda layer ARNs (comma-separated): arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15
❯ amplify push -y

# Update an existing function and add the layer
❯ amplify update function
? Select the Lambda function you want to update test2
General information
- Name:
? Which setting do you want to update? Lambda layers configuration
? Do you want to enable Lambda layers for this function? Yes
? Enter up to 5 existing Lambda layer ARNs (comma-separated): arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15
? Do you want to edit the local lambda function now? No
```

You can use the AWS CLI to generate a pre-signed URL to download the contents of our Lambda Layer.

```
aws lambda get-layer-version-by-arn --arn arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15 --region eu-west-1
```

You'll find the pre-signed URL under the `Location` key in the CLI command output.

[Lambda Layer](https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html) is a .zip file archive that can contain additional code, pre-packaged dependencies, data, or configuration files. We compile and optimize [all dependencies](#install), and remove duplicate dependencies [already available in the Lambda runtime](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/layer_v3/docker/Dockerfile#L34) to keep the Layer as small as possible. In the ARNs below, make sure to replace `{python_version}` with your Python version without the period (.), e.g., `python313` for `Python 3.13`.
**AWS GovCloud (us-gov-east-1)** | Architecture | Layer ARN | | --- | --- | | x86_64 | **arn:aws-us-gov:lambda:us-gov-east-1:165087284144:layer:AWSLambdaPowertoolsPythonV3-{python_version}-x86_64:7** | | ARM | **arn:aws-us-gov:lambda:us-gov-east-1:165087284144:layer:AWSLambdaPowertoolsPythonV3-{python_version}-arm64:7** | **AWS GovCloud (us-gov-west-1)** | Architecture | Layer ARN | | --- | --- | | x86_64 | **arn:aws-us-gov:lambda:us-gov-west-1:165093116878:layer:AWSLambdaPowertoolsPythonV3-{python_version}-x86_64:7** | | ARM | **arn:aws-us-gov:lambda:us-gov-west-1:165093116878:layer:AWSLambdaPowertoolsPythonV3-{python_version}-arm64:7** | We provide a SAR App that deploys a CloudFormation stack with a copy of our Lambda Layer in your AWS account and region. Compared with the [public Layer ARN](#lambda-layer) option, the advantage is being able to use a semantic version. Make sure to replace `{python_version}` without the period (.), e.g., `python313` for `Python 3.13`. | App | ARN | Architecture | | --- | --- | --- | | aws-lambda-powertools-python-layer-v3-{python_version}-x86-64 | arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-{python_version}-x86-64 | X86_64 | | aws-lambda-powertools-python-layer-v3-{python_version}-arm64 | arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-{python_version}-arm64 | ARM64 | Don't have enough permissions? Expand for a least-privilege IAM policy example Credits to [mwarkentin](https://github.com/mwarkentin) for providing the scoped down IAM permissions. ``` AWSTemplateFormatVersion: "2010-09-09" Resources: PowertoolsLayerIamRole: Type: "AWS::IAM::Role" Properties: AssumeRolePolicyDocument: Version: "2012-10-17" Statement: - Effect: "Allow" Principal: Service: - "cloudformation.amazonaws.com" Action: - "sts:AssumeRole" Path: "/" PowertoolsLayerIamPolicy: Type: "AWS::IAM::Policy" Properties: PolicyName: PowertoolsLambdaLayerPolicy PolicyDocument: Version: "2012-10-17" Statement: - Sid: CloudFormationTransform Effect: Allow Action: cloudformation:CreateChangeSet Resource: - arn:aws:cloudformation:us-east-1:aws:transform/Serverless-2016-10-31 - Sid: GetCfnTemplate Effect: Allow Action: - serverlessrepo:CreateCloudFormationTemplate - serverlessrepo:GetCloudFormationTemplate Resource: # this is arn of the Powertools for AWS Lambda (Python) SAR app - arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64 - Sid: S3AccessLayer Effect: Allow Action: - s3:GetObject Resource: # AWS publishes to an external S3 bucket locked down to your account ID # The below example is us publishing Powertools for AWS Lambda (Python) # Bucket: awsserverlessrepo-changesets-plntc6bfnfj # Key: *****/arn:aws:serverlessrepo:eu-west-1:057560766410:applications-aws-lambda-powertools-python-layer-v3-python313-x86-64-3.0.9/aeeccf50-****-****-****-********* - arn:aws:s3:::awsserverlessrepo-changesets-*/* - Sid: GetLayerVersion Effect: Allow Action: - lambda:PublishLayerVersion - lambda:GetLayerVersion Resource: - !Sub arn:aws:lambda:${AWS::Region}:${AWS::AccountId}:layer:aws-lambda-powertools-python-layer-v3* Roles: - Ref: "PowertoolsLayerIamRole" ``` If you're using Infrastructure as Code, here are some excerpts on how to use SAR: ``` AWSTemplateFormatVersion: '2010-09-09' Transform: AWS::Serverless-2016-10-31 Resources: AwsLambdaPowertoolsPythonLayer: Type: AWS::Serverless::Application Properties: Location: ApplicationId: 
arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64 SemanticVersion: 3.0.9 # change to latest semantic version available in SAR MyLambdaFunction: Type: AWS::Serverless::Function Properties: Runtime: python3.13 Handler: app.lambda_handler Layers: # fetch Layer ARN from SAR App stack output - !GetAtt AwsLambdaPowertoolsPythonLayer.Outputs.LayerVersionArn ``` ``` service: powertools-lambda provider: name: aws runtime: python3.13 region: us-east-1 functions: powertools: handler: lambda_function.lambda_handler layers: - !GetAtt AwsLambdaPowertoolsPythonLayer.Outputs.LayerVersionArn resources: - AwsLambdaPowertoolsPythonLayer: Type: AWS::Serverless::Application Properties: Location: ApplicationId: arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64 SemanticVersion: 3.0.9 ``` ``` from aws_cdk import Stack, aws_lambda, aws_sam from constructs import Construct POWERTOOLS_BASE_NAME = "AWSLambdaPowertools" # Find latest from github.com/aws-powertools/powertools-lambda-python/releases POWERTOOLS_VER = "3.0.9" POWERTOOLS_ARN = ( "arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64" ) class SampleApp(Stack): def __init__(self, scope: Construct, id_: str) -> None: super().__init__(scope, id_) # Launches SAR App as CloudFormation nested stack and return Lambda Layer powertools_app = aws_sam.CfnApplication( self, f"{POWERTOOLS_BASE_NAME}Application", location={"applicationId": POWERTOOLS_ARN, "semanticVersion": POWERTOOLS_VER}, ) powertools_layer_arn = powertools_app.get_att("Outputs.LayerVersionArn").to_string() powertools_layer_version = aws_lambda.LayerVersion.from_layer_version_arn( self, f"{POWERTOOLS_BASE_NAME}", powertools_layer_arn, ) aws_lambda.Function( self, "sample-app-lambda", runtime=aws_lambda.Runtime.PYTHON_3_13, function_name="sample-lambda", code=aws_lambda.Code.from_asset("lambda"), handler="hello.handler", layers=[powertools_layer_version], ) ``` > Credits to [Dani Comnea](https://github.com/DanyC97) for providing the Terraform equivalent. 
```
terraform {
  required_version = "~> 0.13"
  required_providers {
    aws = "~> 3.50.0"
  }
}

provider "aws" {
  region = "us-east-1"
}

resource "aws_serverlessapplicationrepository_cloudformation_stack" "deploy_sar_stack" {
  name = "aws-lambda-powertools-python-layer"

  application_id   = data.aws_serverlessapplicationrepository_application.sar_app.application_id
  semantic_version = data.aws_serverlessapplicationrepository_application.sar_app.semantic_version
  capabilities = [
    "CAPABILITY_IAM",
    "CAPABILITY_NAMED_IAM"
  ]
}

data "aws_serverlessapplicationrepository_application" "sar_app" {
  application_id   = "arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64"
  semantic_version = var.aws_powertools_version
}

variable "aws_powertools_version" {
  type        = string
  default     = "3.0.9"
  description = "The Powertools for AWS Lambda (Python) release version"
}

output "deployed_powertools_sar_version" {
  value = data.aws_serverlessapplicationrepository_application.sar_app.semantic_version
}

# Fetch Powertools for AWS Lambda (Python) Layer ARN from deployed SAR App
output "aws_lambda_powertools_layer_arn" {
  value = aws_serverlessapplicationrepository_cloudformation_stack.deploy_sar_stack.outputs.LayerVersionArn
}
```

Every morning during business days *(~8am UTC)*, we publish a `prerelease` to PyPI to accelerate customer feedback on **unstable** releases / bug fixes until they become production ready. Here's how you can use them:

- **pip**: **`pip install --pre "aws-lambda-powertools"`**
- **poetry**: **`poetry add --allow-prereleases "aws-lambda-powertools" --group dev`**
- **pdm**: **`pdm add -dG --prerelease "aws-lambda-powertools"`**

### Local development

Using the Lambda Layer? Simply add **`"aws-lambda-powertools[all]"`** as a development dependency.

Powertools for AWS Lambda (Python) relies on the [AWS SDK bundled in the Lambda runtime](https://docs.aws.amazon.com/lambda/latest/dg/lambda-python.html). This helps us achieve an optimal package size and initialization. However, when developing locally, you need to install the AWS SDK as a development dependency to support IDE auto-completion and to run your tests locally:

- **pip**: **`pip install "aws-lambda-powertools[aws-sdk]"`**
- **poetry**: **`poetry add "aws-lambda-powertools[aws-sdk]" --group dev`**
- **pdm**: **`pdm add -dG "aws-lambda-powertools[aws-sdk]"`**

**A word about dependency resolution**

In this context, `[aws-sdk]` is an alias for the `boto3` package. Due to dependency resolution, it'll install either:

- **(A)** the SDK version available in the [Lambda runtime](https://docs.aws.amazon.com/lambda/latest/dg/lambda-python.html)
- **(B)** a more up-to-date version if another package you use also depends on `boto3`, for example [Powertools for AWS Lambda (Python) Tracer](core/tracer/)

### Lambda Layer

[Lambda Layer](https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html) is a .zip file archive that can contain additional code, pre-packaged dependencies, data, or configuration files. We compile and optimize [all dependencies](#install) for Python versions **3.9 to 3.13**, as well as for both **arm64 and x86_64** architectures, to ensure compatibility. We also remove duplicate dependencies [already available in the Lambda runtime](https://github.com/aws-powertools/powertools-lambda-layer-cdk/blob/d24716744f7d1f37617b4998c992c4c067e19e64/layer/Python/Dockerfile#L36) to keep the Layer as small as possible.
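If you want to verify what a published layer version bundles, including which Powertools for AWS Lambda (Python) release it contains, the AWS CLI lookup shown earlier on this page can also be sketched with `boto3`. The region and layer ARN below are assumptions; replace them with the ones you use:

```
import boto3

# Example layer ARN for eu-west-1 (x86_64, Python 3.12); adjust region, runtime, and architecture.
LAYER_ARN = "arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15"

lambda_client = boto3.client("lambda", region_name="eu-west-1")

response = lambda_client.get_layer_version_by_arn(Arn=LAYER_ARN)

# `Description` states the bundled Powertools for AWS Lambda version;
# `Content.Location` is a pre-signed URL to download the layer .zip archive.
print(response["Description"])
print(response["Content"]["Location"])
```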
Click to expand and copy any regional Lambda Layer ARN | Region | Layer ARN | | --- | --- | | **`af-south-1`** | **arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-east-1`** | **arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-northeast-1`** | **arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-northeast-2`** | **arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-northeast-3`** | **arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-south-1`** | **arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-south-2`** | **arn:aws:lambda:ap-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-southeast-1`** | **arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-southeast-2`** | **arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-southeast-3`** | **arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-southeast-4`** | **arn:aws:lambda:ap-southeast-4:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-southeast-5`** | **arn:aws:lambda:ap-southeast-5:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ap-southeast-7`** | **arn:aws:lambda:ap-southeast-7:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ca-central-1`** | **arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`ca-west-1`** | **arn:aws:lambda:ca-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`eu-central-1`** | **arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`eu-central-2`** | **arn:aws:lambda:eu-central-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`eu-north-1`** | **arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`eu-south-1`** | **arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`eu-south-2`** | **arn:aws:lambda:eu-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`eu-west-1`** | **arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`eu-west-2`** | **arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`eu-west-3`** | **arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`il-central-1`** | **arn:aws:lambda:il-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`me-central-1`** | **arn:aws:lambda:me-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`me-south-1`** | **arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`mx-central-1`** | **arn:aws:lambda:mx-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`sa-east-1`** | **arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`us-east-1`** | 
**arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`us-east-2`** | **arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`us-west-1`** | **arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | **`us-west-2`** | **arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:15** | | Region | Layer ARN | | --- | --- | | **`af-south-1`** | **arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-east-1`** | **arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-northeast-1`** | **arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-northeast-2`** | **arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-northeast-3`** | **arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-south-1`** | **arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-south-2`** | **arn:aws:lambda:ap-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-southeast-1`** | **arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-southeast-2`** | **arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-southeast-3`** | **arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-southeast-4`** | **arn:aws:lambda:ap-southeast-4:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-southeast-5`** | **arn:aws:lambda:ap-southeast-5:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ap-southeast-7`** | **arn:aws:lambda:ap-southeast-7:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ca-central-1`** | **arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`ca-west-1`** | **arn:aws:lambda:ca-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`eu-central-1`** | **arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`eu-central-2`** | **arn:aws:lambda:eu-central-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`eu-north-1`** | **arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`eu-south-1`** | **arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`eu-south-2`** | **arn:aws:lambda:eu-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`eu-west-1`** | **arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`eu-west-2`** | **arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`eu-west-3`** | **arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`il-central-1`** | **arn:aws:lambda:il-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`me-central-1`** | 
**arn:aws:lambda:me-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`me-south-1`** | **arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`mx-central-1`** | **arn:aws:lambda:mx-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`sa-east-1`** | **arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`us-east-1`** | **arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`us-east-2`** | **arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`us-west-1`** | **arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | **`us-west-2`** | **arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:15** | | Region | Layer ARN | | --- | --- | | **`af-south-1`** | **arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-east-1`** | **arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-northeast-1`** | **arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-northeast-2`** | **arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-northeast-3`** | **arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-south-1`** | **arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-south-2`** | **arn:aws:lambda:ap-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-southeast-1`** | **arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-southeast-2`** | **arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-southeast-3`** | **arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-southeast-4`** | **arn:aws:lambda:ap-southeast-4:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-southeast-5`** | **arn:aws:lambda:ap-southeast-5:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ap-southeast-7`** | **arn:aws:lambda:ap-southeast-7:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ca-central-1`** | **arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`ca-west-1`** | **arn:aws:lambda:ca-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`eu-central-1`** | **arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`eu-central-2`** | **arn:aws:lambda:eu-central-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`eu-north-1`** | **arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`eu-south-1`** | **arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`eu-south-2`** | **arn:aws:lambda:eu-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`eu-west-1`** | 
**arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`eu-west-2`** | **arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`eu-west-3`** | **arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`il-central-1`** | **arn:aws:lambda:il-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`me-central-1`** | **arn:aws:lambda:me-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`me-south-1`** | **arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`mx-central-1`** | **arn:aws:lambda:mx-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`sa-east-1`** | **arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`us-east-1`** | **arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`us-east-2`** | **arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`us-west-1`** | **arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | **`us-west-2`** | **arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:15** | | Region | Layer ARN | | --- | --- | | **`af-south-1`** | **arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-east-1`** | **arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-northeast-1`** | **arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-northeast-2`** | **arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-northeast-3`** | **arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-south-1`** | **arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-south-2`** | **arn:aws:lambda:ap-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-southeast-1`** | **arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-southeast-2`** | **arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-southeast-3`** | **arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-southeast-4`** | **arn:aws:lambda:ap-southeast-4:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-southeast-5`** | **arn:aws:lambda:ap-southeast-5:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ap-southeast-7`** | **arn:aws:lambda:ap-southeast-7:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ca-central-1`** | **arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`ca-west-1`** | **arn:aws:lambda:ca-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`eu-central-1`** | **arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`eu-central-2`** | 
**arn:aws:lambda:eu-central-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`eu-north-1`** | **arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`eu-south-1`** | **arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`eu-south-2`** | **arn:aws:lambda:eu-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`eu-west-1`** | **arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`eu-west-2`** | **arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`eu-west-3`** | **arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`il-central-1`** | **arn:aws:lambda:il-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`me-central-1`** | **arn:aws:lambda:me-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`me-south-1`** | **arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`mx-central-1`** | **arn:aws:lambda:mx-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`sa-east-1`** | **arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`us-east-1`** | **arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`us-east-2`** | **arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`us-west-1`** | **arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | **`us-west-2`** | **arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15** | | Region | Layer ARN | | --- | --- | | **`af-south-1`** | **arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-east-1`** | **arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-northeast-1`** | **arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-northeast-2`** | **arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-northeast-3`** | **arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-south-1`** | **arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-south-2`** | **arn:aws:lambda:ap-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-southeast-1`** | **arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-southeast-2`** | **arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-southeast-3`** | **arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-southeast-4`** | **arn:aws:lambda:ap-southeast-4:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-southeast-5`** | **arn:aws:lambda:ap-southeast-5:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ap-southeast-7`** | 
**arn:aws:lambda:ap-southeast-7:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ca-central-1`** | **arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`ca-west-1`** | **arn:aws:lambda:ca-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`eu-central-1`** | **arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`eu-central-2`** | **arn:aws:lambda:eu-central-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`eu-north-1`** | **arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`eu-south-1`** | **arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`eu-south-2`** | **arn:aws:lambda:eu-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`eu-west-1`** | **arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`eu-west-2`** | **arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`eu-west-3`** | **arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`il-central-1`** | **arn:aws:lambda:il-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`me-central-1`** | **arn:aws:lambda:me-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`me-south-1`** | **arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`mx-central-1`** | **arn:aws:lambda:mx-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`sa-east-1`** | **arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`us-east-1`** | **arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`us-east-2`** | **arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`us-west-1`** | **arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | | **`us-west-2`** | **arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:15** | Click to expand and copy any regional Lambda Layer ARN | Region | Layer ARN | | --- | --- | | **`af-south-1`** | **arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-east-1`** | **arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-northeast-1`** | **arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-northeast-2`** | **arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-northeast-3`** | **arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-south-1`** | **arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-south-2`** | **arn:aws:lambda:ap-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-southeast-1`** | **arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-southeast-2`** | 
**arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-southeast-3`** | **arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-southeast-4`** | **arn:aws:lambda:ap-southeast-4:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-southeast-5`** | **arn:aws:lambda:ap-southeast-5:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ap-southeast-7`** | **arn:aws:lambda:ap-southeast-7:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`ca-central-1`** | **arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`eu-central-1`** | **arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`eu-central-2`** | **arn:aws:lambda:eu-central-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`eu-north-1`** | **arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`eu-south-1`** | **arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`eu-south-2`** | **arn:aws:lambda:eu-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`eu-west-1`** | **arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`eu-west-2`** | **arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`eu-west-3`** | **arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`il-central-1`** | **arn:aws:lambda:il-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`me-central-1`** | **arn:aws:lambda:me-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`me-south-1`** | **arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`mx-central-1`** | **arn:aws:lambda:mx-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`sa-east-1`** | **arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`us-east-1`** | **arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`us-east-2`** | **arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`us-west-1`** | **arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | **`us-west-2`** | **arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:15** | | Region | Layer ARN | | --- | --- | | **`af-south-1`** | **arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-east-1`** | **arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-northeast-1`** | **arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-northeast-2`** | **arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-northeast-3`** | **arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-south-1`** | **arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-south-2`** | 
**arn:aws:lambda:ap-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-southeast-1`** | **arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-southeast-2`** | **arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-southeast-3`** | **arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-southeast-4`** | **arn:aws:lambda:ap-southeast-4:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-southeast-5`** | **arn:aws:lambda:ap-southeast-5:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ap-southeast-7`** | **arn:aws:lambda:ap-southeast-7:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`ca-central-1`** | **arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`eu-central-1`** | **arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`eu-central-2`** | **arn:aws:lambda:eu-central-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`eu-north-1`** | **arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`eu-south-1`** | **arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`eu-south-2`** | **arn:aws:lambda:eu-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`eu-west-1`** | **arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`eu-west-2`** | **arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`eu-west-3`** | **arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`il-central-1`** | **arn:aws:lambda:il-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`me-central-1`** | **arn:aws:lambda:me-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`me-south-1`** | **arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`mx-central-1`** | **arn:aws:lambda:mx-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`sa-east-1`** | **arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`us-east-1`** | **arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`us-east-2`** | **arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`us-west-1`** | **arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | **`us-west-2`** | **arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:15** | | Region | Layer ARN | | --- | --- | | **`af-south-1`** | **arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-east-1`** | **arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-northeast-1`** | **arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-northeast-2`** | **arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | 
**`ap-northeast-3`** | **arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-south-1`** | **arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-south-2`** | **arn:aws:lambda:ap-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-southeast-1`** | **arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-southeast-2`** | **arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-southeast-3`** | **arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-southeast-4`** | **arn:aws:lambda:ap-southeast-4:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-southeast-5`** | **arn:aws:lambda:ap-southeast-5:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ap-southeast-7`** | **arn:aws:lambda:ap-southeast-7:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`ca-central-1`** | **arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`eu-central-1`** | **arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`eu-central-2`** | **arn:aws:lambda:eu-central-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`eu-north-1`** | **arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`eu-south-1`** | **arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`eu-south-2`** | **arn:aws:lambda:eu-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`eu-west-1`** | **arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`eu-west-2`** | **arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`eu-west-3`** | **arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`il-central-1`** | **arn:aws:lambda:il-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`me-central-1`** | **arn:aws:lambda:me-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`me-south-1`** | **arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`mx-central-1`** | **arn:aws:lambda:mx-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`sa-east-1`** | **arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`us-east-1`** | **arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`us-east-2`** | **arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`us-west-1`** | **arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | **`us-west-2`** | **arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:15** | | Region | Layer ARN | | --- | --- | | **`af-south-1`** | **arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-east-1`** | **arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | 
**`ap-northeast-1`** | **arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-northeast-2`** | **arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-northeast-3`** | **arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-south-1`** | **arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-south-2`** | **arn:aws:lambda:ap-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-southeast-1`** | **arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-southeast-2`** | **arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-southeast-3`** | **arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-southeast-4`** | **arn:aws:lambda:ap-southeast-4:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-southeast-5`** | **arn:aws:lambda:ap-southeast-5:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ap-southeast-7`** | **arn:aws:lambda:ap-southeast-7:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`ca-central-1`** | **arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`eu-central-1`** | **arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`eu-central-2`** | **arn:aws:lambda:eu-central-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`eu-north-1`** | **arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`eu-south-1`** | **arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`eu-south-2`** | **arn:aws:lambda:eu-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`eu-west-1`** | **arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`eu-west-2`** | **arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`eu-west-3`** | **arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`il-central-1`** | **arn:aws:lambda:il-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`me-central-1`** | **arn:aws:lambda:me-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`me-south-1`** | **arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`mx-central-1`** | **arn:aws:lambda:mx-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`sa-east-1`** | **arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`us-east-1`** | **arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`us-east-2`** | **arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`us-west-1`** | **arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | **`us-west-2`** | **arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:15** | | Region | Layer ARN | | 
--- | --- | | **`af-south-1`** | **arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-east-1`** | **arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-northeast-1`** | **arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-northeast-2`** | **arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-northeast-3`** | **arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-south-1`** | **arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-south-2`** | **arn:aws:lambda:ap-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-southeast-1`** | **arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-southeast-2`** | **arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-southeast-3`** | **arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-southeast-4`** | **arn:aws:lambda:ap-southeast-4:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-southeast-5`** | **arn:aws:lambda:ap-southeast-5:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ap-southeast-7`** | **arn:aws:lambda:ap-southeast-7:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`ca-central-1`** | **arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`eu-central-1`** | **arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`eu-central-2`** | **arn:aws:lambda:eu-central-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`eu-north-1`** | **arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`eu-south-1`** | **arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`eu-south-2`** | **arn:aws:lambda:eu-south-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`eu-west-1`** | **arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`eu-west-2`** | **arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`eu-west-3`** | **arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`il-central-1`** | **arn:aws:lambda:il-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`me-central-1`** | **arn:aws:lambda:me-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`me-south-1`** | **arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`mx-central-1`** | **arn:aws:lambda:mx-central-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`sa-east-1`** | **arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`us-east-1`** | **arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | **`us-east-2`** | **arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** | | 
| **`us-west-1`** | **arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** |
| **`us-west-2`** | **arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:15** |

**Want to inspect the contents of the Layer?** The pre-signed URL to download this Lambda Layer is within the `Location` key of the CLI output. The output also shows which Powertools for AWS Lambda version the Layer contains.

```
aws lambda get-layer-version-by-arn --arn arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15 --region eu-west-1
```
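If you prefer the AWS SDK, the same lookup can be sketched with boto3; this is a minimal sketch, reusing the example ARN from the CLI command above:

```
import boto3

# Same lookup as the CLI command above, using boto3's Lambda client
lambda_client = boto3.client("lambda", region_name="eu-west-1")

layer = lambda_client.get_layer_version_by_arn(
    Arn="arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15",
)

# Pre-signed URL to download and inspect the Layer archive
print(layer["Content"]["Location"])
```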
#### SAR

The Serverless Application Repository (SAR) App deploys a CloudFormation stack with a copy of our Lambda Layer in your AWS account and region. Compared with the [public Layer ARN](#lambda-layer) option, SAR allows you to choose a semantic version and deploys the Layer in your target account.

| App | ARN | Python version | Architecture |
| --- | --- | --- | --- |
| [aws-lambda-powertools-python-layer-v3-python39-x86-64](https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer-v3-python39-x86-64) | [arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python39-x86-64](#) | Python 3.9 | X86_64 |
| [aws-lambda-powertools-python-layer-v3-python310-x86-64](https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer-v3-python310-x86-64) | [arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python310-x86-64](#) | Python 3.10 | X86_64 |
| [aws-lambda-powertools-python-layer-v3-python311-x86-64](https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer-v3-python311-x86-64) | [arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python311-x86-64](#) | Python 3.11 | X86_64 |
| [aws-lambda-powertools-python-layer-v3-python312-x86-64](https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer-v3-python312-x86-64) | [arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python312-x86-64](#) | Python 3.12 | X86_64 |
| [aws-lambda-powertools-python-layer-v3-python313-x86-64](https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer-v3-python313-x86-64) | [arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64](#) | Python 3.13 | X86_64 |
| [aws-lambda-powertools-python-layer-v3-python39-arm64](https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer-v3-python39-arm64) | [arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python39-arm64](#) | Python 3.9 | ARM64 |
| [aws-lambda-powertools-python-layer-v3-python310-arm64](https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer-v3-python310-arm64) | [arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python310-arm64](#) | Python 3.10 | ARM64 |
| [aws-lambda-powertools-python-layer-v3-python311-arm64](https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer-v3-python311-arm64) | [arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python311-arm64](#) | Python 3.11 | ARM64 |
| [aws-lambda-powertools-python-layer-v3-python312-arm64](https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer-v3-python312-arm64) | [arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python312-arm64](#) | Python 3.12 | ARM64 |
| [aws-lambda-powertools-python-layer-v3-python313-arm64](https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer-v3-python313-arm64) | [arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-arm64](#) | Python 3.13 | ARM64 |

Below are SAR code snippets for popular frameworks you can copy and adapt.

**SAM**

You can create a shared Lambda Layers stack and manage it alongside other account-level layer stacks.

```
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  AwsLambdaPowertoolsPythonLayer:
    Type: AWS::Serverless::Application
    Properties:
      Location:
        ApplicationId: arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64
        SemanticVersion: 3.0.9 # change to latest semantic version available in SAR

  MyLambdaFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: python3.13
      Handler: app.lambda_handler
      Layers:
        # fetch Layer ARN from SAR App stack output
        - !GetAtt AwsLambdaPowertoolsPythonLayer.Outputs.LayerVersionArn
```

**Serverless framework**

```
service: powertools-lambda

provider:
  name: aws
  runtime: python3.13
  region: us-east-1

functions:
  powertools:
    handler: lambda_function.lambda_handler
    layers:
      - !GetAtt AwsLambdaPowertoolsPythonLayer.Outputs.LayerVersionArn

resources:
  - AwsLambdaPowertoolsPythonLayer:
      Type: AWS::Serverless::Application
      Properties:
        Location:
          ApplicationId: arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64
          SemanticVersion: 3.0.9
```

**CDK**

```
from aws_cdk import Stack, aws_lambda, aws_sam
from constructs import Construct

POWERTOOLS_BASE_NAME = "AWSLambdaPowertools"
# Find latest from github.com/aws-powertools/powertools-lambda-python/releases
POWERTOOLS_VER = "3.0.9"
POWERTOOLS_ARN = (
    "arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64"
)


class SampleApp(Stack):
    def __init__(self, scope: Construct, id_: str) -> None:
        super().__init__(scope, id_)

        # Launches SAR App as a CloudFormation nested stack and returns the Lambda Layer
        powertools_app = aws_sam.CfnApplication(
            self,
            f"{POWERTOOLS_BASE_NAME}Application",
            location={"applicationId": POWERTOOLS_ARN, "semanticVersion": POWERTOOLS_VER},
        )

        powertools_layer_arn = powertools_app.get_att("Outputs.LayerVersionArn").to_string()
        powertools_layer_version = aws_lambda.LayerVersion.from_layer_version_arn(
            self,
            f"{POWERTOOLS_BASE_NAME}",
            powertools_layer_arn,
        )

        aws_lambda.Function(
            self,
            "sample-app-lambda",
            runtime=aws_lambda.Runtime.PYTHON_3_13,
            function_name="sample-lambda",
            code=aws_lambda.Code.from_asset("lambda"),
            handler="hello.handler",
            layers=[powertools_layer_version],
        )
```

**Terraform**

> Credits to [Dani Comnea](https://github.com/DanyC97) for providing the Terraform equivalent.
```
terraform {
  required_version = "~> 0.13"
  required_providers {
    aws = "~> 3.50.0"
  }
}

provider "aws" {
  region = "us-east-1"
}

resource "aws_serverlessapplicationrepository_cloudformation_stack" "deploy_sar_stack" {
  name = "aws-lambda-powertools-python-layer"

  application_id   = data.aws_serverlessapplicationrepository_application.sar_app.application_id
  semantic_version = data.aws_serverlessapplicationrepository_application.sar_app.semantic_version
  capabilities = [
    "CAPABILITY_IAM",
    "CAPABILITY_NAMED_IAM"
  ]
}

data "aws_serverlessapplicationrepository_application" "sar_app" {
  application_id   = "arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64"
  semantic_version = var.aws_powertools_version
}

variable "aws_powertools_version" {
  type        = string
  default     = "3.0.9"
  description = "The Powertools for AWS Lambda (Python) release version"
}

output "deployed_powertools_sar_version" {
  value = data.aws_serverlessapplicationrepository_application.sar_app.semantic_version
}

# Fetch Powertools for AWS Lambda (Python) Layer ARN from deployed SAR App
output "aws_lambda_powertools_layer_arn" {
  value = aws_serverlessapplicationrepository_cloudformation_stack.deploy_sar_stack.outputs.LayerVersionArn
}
```

Credits to [mwarkentin](https://github.com/mwarkentin) for providing the scoped-down IAM permissions below.

```
AWSTemplateFormatVersion: "2010-09-09"
Resources:
  PowertoolsLayerIamRole:
    Type: "AWS::IAM::Role"
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: "Allow"
            Principal:
              Service:
                - "cloudformation.amazonaws.com"
            Action:
              - "sts:AssumeRole"
      Path: "/"
  PowertoolsLayerIamPolicy:
    Type: "AWS::IAM::Policy"
    Properties:
      PolicyName: PowertoolsLambdaLayerPolicy
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Sid: CloudFormationTransform
            Effect: Allow
            Action: cloudformation:CreateChangeSet
            Resource:
              - arn:aws:cloudformation:us-east-1:aws:transform/Serverless-2016-10-31
          - Sid: GetCfnTemplate
            Effect: Allow
            Action:
              - serverlessrepo:CreateCloudFormationTemplate
              - serverlessrepo:GetCloudFormationTemplate
            Resource:
              # this is the ARN of the Powertools for AWS Lambda (Python) SAR app
              - arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer-v3-python313-x86-64
          - Sid: S3AccessLayer
            Effect: Allow
            Action:
              - s3:GetObject
            Resource:
              # AWS publishes to an external S3 bucket locked down to your account ID
              # The below example is us publishing Powertools for AWS Lambda (Python)
              # Bucket: awsserverlessrepo-changesets-plntc6bfnfj
              # Key: *****/arn:aws:serverlessrepo:eu-west-1:057560766410:applications-aws-lambda-powertools-python-layer-v3-python313-x86-64-3.0.9/aeeccf50-****-****-****-*********
              - arn:aws:s3:::awsserverlessrepo-changesets-*/*
          - Sid: GetLayerVersion
            Effect: Allow
            Action:
              - lambda:PublishLayerVersion
              - lambda:GetLayerVersion
            Resource:
              - !Sub arn:aws:lambda:${AWS::Region}:${AWS::AccountId}:layer:aws-lambda-powertools-python-layer-v3*
      Roles:
        - Ref: "PowertoolsLayerIamRole"
```

## Quick getting started

```
sam init --app-template hello-world-powertools-python --name sam-app --package-type Zip --runtime python3.11 --no-tracing
```

## Features

Core utilities such as Tracing, Logging, Metrics, and Event Handler will be available across all Powertools for AWS Lambda languages. Additional utilities depend on each language ecosystem and customer demand.
| Utility | Description |
| --- | --- |
| [**Tracing**](core/tracer/) | Decorators and utilities to trace Lambda function handlers, and both synchronous and asynchronous functions |
| [**Logger**](core/logger/) | Structured logging made easier, and a decorator to enrich structured logging with key Lambda context details |
| [**Metrics**](core/metrics/) | Custom Metrics created asynchronously via CloudWatch Embedded Metric Format (EMF) |
| [**Event handler: AppSync**](core/event_handler/appsync/) | AppSync event handler for Lambda Direct Resolver and Amplify GraphQL Transformer function |
| [**Event handler: API Gateway, ALB and Lambda Function URL**](https://docs.powertools.aws.dev/lambda/python/latest/core/event_handler/api_gateway/) | Amazon API Gateway REST/HTTP API and ALB event handler for Lambda functions invoked using Proxy integration, and Lambda Function URL |
| [**Middleware factory**](utilities/middleware_factory/) | Decorator factory to create your own middleware to run logic before, and after each Lambda invocation |
| [**Parameters**](utilities/parameters/) | Retrieve parameter values from AWS Systems Manager Parameter Store, AWS Secrets Manager, or Amazon DynamoDB, and cache them for a specific amount of time |
| [**Batch processing**](utilities/batch/) | Handle partial failures for AWS SQS batch processing |
| [**Typing**](utilities/typing/) | Static typing classes to speed up development in your IDE |
| [**Validation**](utilities/validation/) | JSON Schema validator for inbound events and responses |
| [**Event source data classes**](utilities/data_classes/) | Data classes describing the schema of common Lambda event triggers |
| [**Parser**](utilities/parser/) | Data parsing and deep validation using Pydantic |
| [**Idempotency**](utilities/idempotency/) | Idempotent Lambda handler |
| [**Data Masking**](utilities/data_masking/) | Protect confidential data with easy removal or encryption |
| [**Feature Flags**](utilities/feature_flags/) | A simple rule engine to evaluate when one or multiple features should be enabled depending on the input |
| [**Streaming**](utilities/streaming/) | Streams datasets larger than the available memory as streaming data |
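As a quick taste of how the core utilities compose, here is a minimal handler sketch combining Logger, Tracer, and Metrics; names such as `MyApp` and `SuccessfulInvocation` are illustrative, and each feature's page covers the full API.

```
from aws_lambda_powertools import Logger, Metrics, Tracer
from aws_lambda_powertools.metrics import MetricUnit
from aws_lambda_powertools.utilities.typing import LambdaContext

logger = Logger()
tracer = Tracer()
metrics = Metrics(namespace="MyApp")


@logger.inject_lambda_context(log_event=True)  # structured logs enriched with Lambda context
@tracer.capture_lambda_handler  # X-Ray subsegment around the handler
@metrics.log_metrics(capture_cold_start_metric=True)  # flush EMF metrics at the end of the invocation
def lambda_handler(event: dict, context: LambdaContext) -> dict:
    metrics.add_metric(name="SuccessfulInvocation", unit=MetricUnit.Count, value=1)
    logger.info("Request handled")
    return {"statusCode": 200}
```

Note that Tracer relies on the `tracer` extra dependency mentioned in the install section.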
## Environment variables

> **Info**: Explicit parameters take precedence over environment variables.

| Environment variable | Description | Utility | Default |
| --- | --- | --- | --- |
| **POWERTOOLS_SERVICE_NAME** | Sets service name used for tracing namespace, metrics dimension and structured logging | All | `"service_undefined"` |
| **POWERTOOLS_METRICS_NAMESPACE** | Sets namespace used for metrics | [Metrics](core/metrics/) | `None` |
| **POWERTOOLS_METRICS_FUNCTION_NAME** | Function name used as a dimension for the **ColdStart** metric | [Metrics](core/metrics/) | `None` |
| **POWERTOOLS_METRICS_DISABLED** | **Disables** all metrics emitted by Powertools | [Metrics](core/metrics/) | `None` |
| **POWERTOOLS_TRACE_DISABLED** | Explicitly disables tracing | [Tracing](core/tracer/) | `false` |
| **POWERTOOLS_TRACER_CAPTURE_RESPONSE** | Captures Lambda or method return as metadata | [Tracing](core/tracer/) | `true` |
| **POWERTOOLS_TRACER_CAPTURE_ERROR** | Captures Lambda or method exception as metadata | [Tracing](core/tracer/) | `true` |
| **POWERTOOLS_TRACE_MIDDLEWARES** | Creates sub-segment for each custom middleware | [Middleware factory](utilities/middleware_factory/) | `false` |
| **POWERTOOLS_LOGGER_LOG_EVENT** | Logs incoming event | [Logging](core/logger/) | `false` |
| **POWERTOOLS_LOGGER_SAMPLE_RATE** | Debug log sampling | [Logging](core/logger/) | `0` |
| **POWERTOOLS_LOG_DEDUPLICATION_DISABLED** | Disables log deduplication filter protection to use Pytest Live Log feature | [Logging](core/logger/) | `false` |
| **POWERTOOLS_PARAMETERS_MAX_AGE** | Adjusts how long values are kept in cache (in seconds) | [Parameters](utilities/parameters/#adjusting-cache-ttl) | `5` |
| **POWERTOOLS_PARAMETERS_SSM_DECRYPT** | Sets whether to decrypt values retrieved from AWS SSM Parameter Store | [Parameters](utilities/parameters/#ssmprovider) | `false` |
| **POWERTOOLS_DEV** | Increases verbosity across utilities | Multiple; see [POWERTOOLS_DEV effect below](#optimizing-for-non-production-environments) | `false` |
| **POWERTOOLS_LOG_LEVEL** | Sets logging level | [Logging](core/logger/) | `INFO` |
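To illustrate the precedence note above, constructor arguments win over their environment-variable counterparts. A small sketch, with illustrative service and namespace names:

```
from aws_lambda_powertools import Logger, Metrics

# POWERTOOLS_SERVICE_NAME, POWERTOOLS_LOG_LEVEL and POWERTOOLS_METRICS_NAMESPACE
# supply the defaults; the explicit arguments below take precedence over them.
logger = Logger(service="payment", level="DEBUG")
metrics = Metrics(namespace="MyApp", service="payment")
```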
### Optimizing for non-production environments

We will emit a warning when this feature is used, to help you detect misuse in production.

Whether you're prototyping locally or against a non-production environment, you can use `POWERTOOLS_DEV` to increase verbosity across multiple utilities. When `POWERTOOLS_DEV` is set to a truthy value (`1`, `true`), it'll have the following effects:

| Utility | Effect |
| --- | --- |
| **Logger** | Increase JSON indentation to 4. This eases local debugging when running functions locally under emulators or direct calls, while not affecting unit tests. However, the Amazon CloudWatch Logs view will degrade as each new line is treated as a new message. |
| **Event Handler** | Enable full traceback errors in the response, indent request/responses, and enable CORS in dev mode (`*`). |
| **Tracer** | Future-proof safety to disable tracing operations in non-Lambda environments. This already happens automatically in the Tracer utility. |
| **Metrics** | Disables Powertools metrics emission by default. However, this can be overridden by explicitly setting `POWERTOOLS_METRICS_DISABLED=false`, which takes precedence over the dev mode setting. |

## Debug mode

As a best practice for libraries, Powertools module logging statements are suppressed. When necessary, you can use the `POWERTOOLS_DEBUG` environment variable to enable debugging. This will provide additional information on every internal operation.

## Support Powertools for AWS Lambda (Python)

There are many ways you can help us gain future investments to improve everyone's experience:

- **Become a public reference**: Add your company name and logo on our [landing page](https://powertools.aws.dev). [GitHub Issue template](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=customer-reference&template=support_powertools.yml&title=%5BSupport+Lambda+Powertools%5D%3A+%3Cyour+organization+name%3E)
- **Share your work**: Blog posts, videos, and sample projects about Powertools for AWS Lambda. [GitHub Issue template](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=community-content&template=share_your_work.yml&title=%5BI+Made+This%5D%3A+%3CTITLE%3E)
- **Join the community**: Connect, ask questions, and share what features you use. [Discord invite](https://discord.gg/B8zZKbbyET)

### Becoming a reference customer

Knowing which companies are using this library is important to help prioritize the project internally. The following companies, among others, use Powertools:

[**Alma Media**](https://www.almamedia.fi/en/) [**Banxware**](https://www.banxware.com) [**Brsk**](https://www.brsk.co.uk/) [**BusPatrol**](https://buspatrol.com/) [**Capital One**](https://www.capitalone.com/) [**Caylent**](https://caylent.com/) [**CHS Inc.**](https://www.chsinc.com/) [**CPQi (Exadel Financial Services)**](https://cpqi.com/) [**CloudZero**](https://www.cloudzero.com/) [**CyberArk**](https://www.cyberark.com/) [**Flyweight**](https://flyweight.io/) [**globaldatanet**](https://globaldatanet.com/) [**Guild**](https://guild.com/) [**IMS**](https://ims.tech/) [**Jit Security**](https://www.jit.io/) [**LocalStack**](https://www.localstack.cloud/) [**Propellor.ai**](https://www.propellor.ai/) [**Pushpay**](https://pushpay.com/) [**Recast**](https://getrecast.com/) [**TopSport**](https://www.topsport.com.au/) [**Transformity**](https://transformity.tech/) [**Trek10**](https://www.trek10.com/) [**Vertex Pharmaceuticals**](https://www.vrtx.com/)

### Using Lambda Layers

Layers help us understand who uses Powertools for AWS Lambda (Python) in a non-intrusive way. When [using Layers](#lambda-layer), you can add Powertools for AWS Lambda (Python) as a dev dependency so it does not impact the development process. For Layers, we pre-package all dependencies, compile and optimize for storage, and support both x86_64 and ARM architectures.

## Tenets

These are the core principles that guide our decision making.

- **AWS Lambda only**. We optimise for AWS Lambda function environments and supported runtimes only. Utilities might work with web frameworks and non-Lambda environments, though they are not officially supported.
- **Eases the adoption of best practices**. The main priority of the utilities is to facilitate best practices adoption, as defined in the AWS Well-Architected Serverless Lens; all other functionality is optional.
- **Keep it lean**. Additional dependencies are carefully considered for security and ease of maintenance, and must not negatively impact startup time.
- **We strive for backwards compatibility**. New features and changes should keep backwards compatibility. If a breaking change cannot be avoided, the deprecation and migration process should be clearly defined.
- **We work backwards from the community**. We aim to strike a balance of what would work best for 80% of customers. Emerging practices are considered and discussed via Requests for Comment (RFCs).
- **Progressive**. Utilities are designed to be incrementally adoptable for customers at any stage of their Serverless journey. They follow language idioms and their community's common practices.
# Unreleased ## Bug Fixes - **event_handler:** fix OpenAPI schema response for disabled validation ([#6720](https://github.com/aws-powertools/powertools-lambda-python/issues/6720)) ## Features - **event_handler:** enable support for custom deserializer to parse the request body ([#6601](https://github.com/aws-powertools/powertools-lambda-python/issues/6601)) ## Maintenance - **ci:** new pre-release 3.13.1a4 ([#6738](https://github.com/aws-powertools/powertools-lambda-python/issues/6738)) - **ci:** new pre-release 3.13.1a0 ([#6696](https://github.com/aws-powertools/powertools-lambda-python/issues/6696)) - **ci:** new pre-release 3.13.1a1 ([#6704](https://github.com/aws-powertools/powertools-lambda-python/issues/6704)) - **ci:** new pre-release 3.13.1a2 ([#6709](https://github.com/aws-powertools/powertools-lambda-python/issues/6709)) - **ci:** add missing dependency to build docs ([#6717](https://github.com/aws-powertools/powertools-lambda-python/issues/6717)) - **ci:** new pre-release 3.13.1a3 ([#6732](https://github.com/aws-powertools/powertools-lambda-python/issues/6732)) - **deps:** bump pydantic from 2.11.4 to 2.11.5 ([#6711](https://github.com/aws-powertools/powertools-lambda-python/issues/6711)) - **deps:** bump redis from 6.1.0 to 6.2.0 ([#6736](https://github.com/aws-powertools/powertools-lambda-python/issues/6736)) - **deps:** bump datadog-lambda from 6.109.0 to 6.110.0 ([#6714](https://github.com/aws-powertools/powertools-lambda-python/issues/6714)) - **deps:** bump mkdocstrings-python from 1.16.10 to 1.16.11 in /docs ([#6722](https://github.com/aws-powertools/powertools-lambda-python/issues/6722)) - **deps:** bump mkdocstrings-python from 1.16.10 to 1.16.11 ([#6724](https://github.com/aws-powertools/powertools-lambda-python/issues/6724)) - **deps-dev:** bump boto3-stubs from 1.38.22 to 1.38.23 ([#6712](https://github.com/aws-powertools/powertools-lambda-python/issues/6712)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.197.0a0 to 2.198.0a0 ([#6715](https://github.com/aws-powertools/powertools-lambda-python/issues/6715)) - **deps-dev:** bump pytest-xdist from 3.6.1 to 3.7.0 ([#6730](https://github.com/aws-powertools/powertools-lambda-python/issues/6730)) - **deps-dev:** bump cfn-lint from 1.35.1 to 1.35.3 ([#6708](https://github.com/aws-powertools/powertools-lambda-python/issues/6708)) - **deps-dev:** bump ruff from 0.11.10 to 0.11.11 ([#6706](https://github.com/aws-powertools/powertools-lambda-python/issues/6706)) - **deps-dev:** bump boto3-stubs from 1.38.21 to 1.38.22 ([#6707](https://github.com/aws-powertools/powertools-lambda-python/issues/6707)) - **deps-dev:** bump aws-cdk from 2.1016.0 to 2.1016.1 ([#6703](https://github.com/aws-powertools/powertools-lambda-python/issues/6703)) - **deps-dev:** bump coverage from 7.8.1 to 7.8.2 ([#6713](https://github.com/aws-powertools/powertools-lambda-python/issues/6713)) - **deps-dev:** bump aws-cdk-lib from 2.198.0 to 2.199.0 ([#6731](https://github.com/aws-powertools/powertools-lambda-python/issues/6731)) - **deps-dev:** bump coverage from 7.8.0 to 7.8.1 ([#6701](https://github.com/aws-powertools/powertools-lambda-python/issues/6701)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.196.1a0 to 2.197.0a0 ([#6700](https://github.com/aws-powertools/powertools-lambda-python/issues/6700)) - **deps-dev:** bump aws-cdk-lib from 2.196.1 to 2.197.0 ([#6699](https://github.com/aws-powertools/powertools-lambda-python/issues/6699)) - **deps-dev:** bump boto3-stubs from 1.38.19 to 1.38.21 
([#6698](https://github.com/aws-powertools/powertools-lambda-python/issues/6698)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.198.0a0 to 2.199.0a0 ([#6729](https://github.com/aws-powertools/powertools-lambda-python/issues/6729)) - **deps-dev:** bump aws-cdk from 2.1016.1 to 2.1017.0 ([#6734](https://github.com/aws-powertools/powertools-lambda-python/issues/6734)) - **deps-dev:** bump aws-cdk-lib from 2.196.0 to 2.196.1 ([#6695](https://github.com/aws-powertools/powertools-lambda-python/issues/6695)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.196.0a0 to 2.196.1a0 ([#6694](https://github.com/aws-powertools/powertools-lambda-python/issues/6694)) - **deps-dev:** bump boto3-stubs from 1.38.23 to 1.38.25 ([#6735](https://github.com/aws-powertools/powertools-lambda-python/issues/6735)) - **deps-dev:** bump pytest-mock from 3.14.0 to 3.14.1 ([#6723](https://github.com/aws-powertools/powertools-lambda-python/issues/6723)) - **docs:** Add llms.txt to documentation ([#6693](https://github.com/aws-powertools/powertools-lambda-python/issues/6693)) ## [v3.13.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.12.0...v3.13.0) - 2025-05-20 ## Code Refactoring - **idempotency:** replace Redis name with Cache and add valkey-glide support ([#6685](https://github.com/aws-powertools/powertools-lambda-python/issues/6685)) ## Features - **event_source:** add support for tumbling windows in Kinesis and DynamoDB events ([#6658](https://github.com/aws-powertools/powertools-lambda-python/issues/6658)) - **event_source:** export SQSRecord in data_classes module ([#6639](https://github.com/aws-powertools/powertools-lambda-python/issues/6639)) - **parser:** add support to decompress Kinesis CloudWatch logs in Kinesis envelope ([#6656](https://github.com/aws-powertools/powertools-lambda-python/issues/6656)) ## Maintenance - version bump - **ci:** new pre-release 3.12.1a2 ([#6638](https://github.com/aws-powertools/powertools-lambda-python/issues/6638)) - **ci:** include allowed licenses file in dependency review workflow ([#6618](https://github.com/aws-powertools/powertools-lambda-python/issues/6618)) - **ci:** new pre-release 3.12.1a8 ([#6683](https://github.com/aws-powertools/powertools-lambda-python/issues/6683)) - **ci:** new pre-release 3.12.1a3 ([#6647](https://github.com/aws-powertools/powertools-lambda-python/issues/6647)) - **ci:** new pre-release 3.12.1a7 ([#6675](https://github.com/aws-powertools/powertools-lambda-python/issues/6675)) - **ci:** new pre-release 3.12.1a0 ([#6621](https://github.com/aws-powertools/powertools-lambda-python/issues/6621)) - **ci:** new pre-release 3.12.1a6 ([#6670](https://github.com/aws-powertools/powertools-lambda-python/issues/6670)) - **ci:** new pre-release 3.12.1a1 ([#6626](https://github.com/aws-powertools/powertools-lambda-python/issues/6626)) - **ci:** new pre-release 3.12.1a4 ([#6655](https://github.com/aws-powertools/powertools-lambda-python/issues/6655)) - **ci:** new pre-release 3.12.1a5 ([#6664](https://github.com/aws-powertools/powertools-lambda-python/issues/6664)) - **deps:** bump aws-actions/configure-aws-credentials from 4.2.0 to 4.2.1 ([#6667](https://github.com/aws-powertools/powertools-lambda-python/issues/6667)) - **deps:** bump squidfunk/mkdocs-material from `f6c81d5` to `eb04b60` in /docs ([#6659](https://github.com/aws-powertools/powertools-lambda-python/issues/6659)) - **deps:** bump datadog-lambda from 6.107.0 to 6.108.0 ([#6634](https://github.com/aws-powertools/powertools-lambda-python/issues/6634)) 
- **deps:** bump actions/setup-go from 5.4.0 to 5.5.0 ([#6630](https://github.com/aws-powertools/powertools-lambda-python/issues/6630)) - **deps:** bump actions/dependency-review-action from 4.7.0 to 4.7.1 ([#6663](https://github.com/aws-powertools/powertools-lambda-python/issues/6663)) - **deps:** bump redis from 5.2.1 to 6.1.0 ([#6662](https://github.com/aws-powertools/powertools-lambda-python/issues/6662)) - **deps:** bump actions/dependency-review-action from 4.6.0 to 4.7.0 ([#6629](https://github.com/aws-powertools/powertools-lambda-python/issues/6629)) - **deps:** bump codecov/codecov-action from 5.4.2 to 5.4.3 ([#6672](https://github.com/aws-powertools/powertools-lambda-python/issues/6672)) - **deps:** bump squidfunk/mkdocs-material from `95f2ff4` to `f6c81d5` in /docs ([#6650](https://github.com/aws-powertools/powertools-lambda-python/issues/6650)) - **deps:** bump aws-actions/configure-aws-credentials from 4.1.0 to 4.2.0 ([#6619](https://github.com/aws-powertools/powertools-lambda-python/issues/6619)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.24 to 3.0.25 ([#6686](https://github.com/aws-powertools/powertools-lambda-python/issues/6686)) - **deps:** bump datadog-lambda from 6.108.0 to 6.109.0 ([#6641](https://github.com/aws-powertools/powertools-lambda-python/issues/6641)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.308 to 0.1.309 ([#6651](https://github.com/aws-powertools/powertools-lambda-python/issues/6651)) - **deps-dev:** bump boto3-stubs from 1.38.12 to 1.38.13 ([#6644](https://github.com/aws-powertools/powertools-lambda-python/issues/6644)) - **deps-dev:** bump cfn-lint from 1.35.0 to 1.35.1 ([#6642](https://github.com/aws-powertools/powertools-lambda-python/issues/6642)) - **deps-dev:** bump ruff from 0.11.8 to 0.11.9 ([#6643](https://github.com/aws-powertools/powertools-lambda-python/issues/6643)) - **deps-dev:** bump boto3-stubs from 1.38.13 to 1.38.14 ([#6653](https://github.com/aws-powertools/powertools-lambda-python/issues/6653)) - **deps-dev:** bump sentry-sdk from 2.27.0 to 2.28.0 ([#6652](https://github.com/aws-powertools/powertools-lambda-python/issues/6652)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.194.0a0 to 2.195.0a0 ([#6635](https://github.com/aws-powertools/powertools-lambda-python/issues/6635)) - **deps-dev:** bump aws-cdk from 2.1013.0 to 2.1014.0 ([#6636](https://github.com/aws-powertools/powertools-lambda-python/issues/6636)) - **deps-dev:** bump mkdocs-material from 9.6.12 to 9.6.13 ([#6654](https://github.com/aws-powertools/powertools-lambda-python/issues/6654)) - **deps-dev:** bump boto3-stubs from 1.38.11 to 1.38.12 ([#6633](https://github.com/aws-powertools/powertools-lambda-python/issues/6633)) - **deps-dev:** bump aws-cdk-lib from 2.194.0 to 2.195.0 ([#6632](https://github.com/aws-powertools/powertools-lambda-python/issues/6632)) - **deps-dev:** bump boto3-stubs from 1.38.14 to 1.38.15 ([#6660](https://github.com/aws-powertools/powertools-lambda-python/issues/6660)) - **deps-dev:** bump ijson from 3.3.0 to 3.4.0 ([#6631](https://github.com/aws-powertools/powertools-lambda-python/issues/6631)) - **deps-dev:** bump mkdocs-material from 9.6.13 to 9.6.14 ([#6661](https://github.com/aws-powertools/powertools-lambda-python/issues/6661)) - **deps-dev:** bump boto3-stubs from 1.38.15 to 1.38.16 ([#6669](https://github.com/aws-powertools/powertools-lambda-python/issues/6669)) - **deps-dev:** bump aws-cdk from 2.1014.0 to 2.1015.0 
([#6668](https://github.com/aws-powertools/powertools-lambda-python/issues/6668)) - **deps-dev:** bump cfn-lint from 1.34.2 to 1.35.0 ([#6623](https://github.com/aws-powertools/powertools-lambda-python/issues/6623)) - **deps-dev:** bump types-python-dateutil from 2.9.0.20241206 to 2.9.0.20250516 ([#6678](https://github.com/aws-powertools/powertools-lambda-python/issues/6678)) - **deps-dev:** bump ruff from 0.11.9 to 0.11.10 ([#6673](https://github.com/aws-powertools/powertools-lambda-python/issues/6673)) - **deps-dev:** bump boto3-stubs from 1.38.16 to 1.38.17 ([#6674](https://github.com/aws-powertools/powertools-lambda-python/issues/6674)) - **deps-dev:** bump boto3-stubs from 1.38.9 to 1.38.10 ([#6620](https://github.com/aws-powertools/powertools-lambda-python/issues/6620)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.195.0a0 to 2.196.0a0 ([#6677](https://github.com/aws-powertools/powertools-lambda-python/issues/6677)) - **deps-dev:** bump aws-cdk from 2.1015.0 to 2.1016.0 ([#6680](https://github.com/aws-powertools/powertools-lambda-python/issues/6680)) - **deps-dev:** bump boto3-stubs from 1.38.18 to 1.38.19 ([#6687](https://github.com/aws-powertools/powertools-lambda-python/issues/6687)) - **deps-dev:** bump boto3-stubs from 1.38.10 to 1.38.11 ([#6624](https://github.com/aws-powertools/powertools-lambda-python/issues/6624)) ## [v3.12.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.11.0...v3.12.0) - 2025-05-06 ## Documentation - **appsync_events:** improve AppSync events documentation ([#6572](https://github.com/aws-powertools/powertools-lambda-python/issues/6572)) - **community:** add Ran Isenberg blog post ([#6610](https://github.com/aws-powertools/powertools-lambda-python/issues/6610)) - **i-made-this:** adding Michael's MCP server ([#6591](https://github.com/aws-powertools/powertools-lambda-python/issues/6591)) ## Features - **bedrock_agents:** add optional fields to response payload ([#6336](https://github.com/aws-powertools/powertools-lambda-python/issues/6336)) ## Maintenance - version bump - **ci:** new pre-release 3.11.1a5 ([#6598](https://github.com/aws-powertools/powertools-lambda-python/issues/6598)) - **ci:** new pre-release 3.11.1a0 ([#6561](https://github.com/aws-powertools/powertools-lambda-python/issues/6561)) - **ci:** new pre-release 3.11.1a6 ([#6606](https://github.com/aws-powertools/powertools-lambda-python/issues/6606)) - **ci:** new pre-release 3.11.1a1 ([#6574](https://github.com/aws-powertools/powertools-lambda-python/issues/6574)) - **ci:** new pre-release 3.11.1a2 ([#6578](https://github.com/aws-powertools/powertools-lambda-python/issues/6578)) - **ci:** new pre-release 3.11.1a4 ([#6589](https://github.com/aws-powertools/powertools-lambda-python/issues/6589)) - **ci:** new pre-release 3.11.1a3 ([#6582](https://github.com/aws-powertools/powertools-lambda-python/issues/6582)) - **deps:** bump pydantic from 2.11.3 to 2.11.4 ([#6585](https://github.com/aws-powertools/powertools-lambda-python/issues/6585)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.23 to 3.0.24 ([#6611](https://github.com/aws-powertools/powertools-lambda-python/issues/6611)) - **deps-dev:** bump ruff from 0.11.7 to 0.11.8 ([#6595](https://github.com/aws-powertools/powertools-lambda-python/issues/6595)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.306 to 0.1.307 ([#6580](https://github.com/aws-powertools/powertools-lambda-python/issues/6580)) - **deps-dev:** bump boto3-stubs from 1.38.4 to 1.38.5 
([#6581](https://github.com/aws-powertools/powertools-lambda-python/issues/6581)) - **deps-dev:** bump aws-cdk from 2.1012.0 to 2.1013.0 ([#6588](https://github.com/aws-powertools/powertools-lambda-python/issues/6588)) - **deps-dev:** bump boto3-stubs from 1.38.6 to 1.38.7 ([#6594](https://github.com/aws-powertools/powertools-lambda-python/issues/6594)) - **deps-dev:** bump boto3-stubs from 1.38.3 to 1.38.4 ([#6577](https://github.com/aws-powertools/powertools-lambda-python/issues/6577)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.307 to 0.1.308 ([#6597](https://github.com/aws-powertools/powertools-lambda-python/issues/6597)) - **deps-dev:** bump h11 from 0.14.0 to 0.16.0 ([#6575](https://github.com/aws-powertools/powertools-lambda-python/issues/6575)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.192.0a0 to 2.193.0a0 ([#6586](https://github.com/aws-powertools/powertools-lambda-python/issues/6586)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.193.0a0 to 2.194.0a0 ([#6602](https://github.com/aws-powertools/powertools-lambda-python/issues/6602)) - **deps-dev:** bump boto3-stubs from 1.38.2 to 1.38.3 ([#6569](https://github.com/aws-powertools/powertools-lambda-python/issues/6569)) - **deps-dev:** bump cfn-lint from 1.34.1 to 1.34.2 ([#6568](https://github.com/aws-powertools/powertools-lambda-python/issues/6568)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.305 to 0.1.306 ([#6567](https://github.com/aws-powertools/powertools-lambda-python/issues/6567)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.191.0a0 to 2.192.0a0 ([#6566](https://github.com/aws-powertools/powertools-lambda-python/issues/6566)) - **deps-dev:** bump aws-cdk-lib from 2.191.0 to 2.192.0 ([#6565](https://github.com/aws-powertools/powertools-lambda-python/issues/6565)) - **deps-dev:** bump aws-cdk-lib from 2.193.0 to 2.194.0 ([#6603](https://github.com/aws-powertools/powertools-lambda-python/issues/6603)) - **deps-dev:** bump boto3-stubs from 1.38.7 to 1.38.9 ([#6612](https://github.com/aws-powertools/powertools-lambda-python/issues/6612)) - **deps-dev:** bump boto3-stubs from 1.38.5 to 1.38.6 ([#6587](https://github.com/aws-powertools/powertools-lambda-python/issues/6587)) - **docs:** fix youtube embed link in we made this ([#6593](https://github.com/aws-powertools/powertools-lambda-python/issues/6593)) ## [v3.11.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.10.0...v3.11.0) - 2025-04-24 ## Bug Fixes - **logger:** warn customers when the ALC log level is less verbose than log buffer ([#6509](https://github.com/aws-powertools/powertools-lambda-python/issues/6509)) - **parser:** make key attribute optional in Kafka model ([#6523](https://github.com/aws-powertools/powertools-lambda-python/issues/6523)) ## Code Refactoring - **batch:** use standard collections for types ([#6475](https://github.com/aws-powertools/powertools-lambda-python/issues/6475)) - **data_masking:** use standard collections for types ([#6493](https://github.com/aws-powertools/powertools-lambda-python/issues/6493)) - **e2e-tests:** use standard collections for types + refactor code ([#6505](https://github.com/aws-powertools/powertools-lambda-python/issues/6505)) - **event_handler:** use standard collections for types + refactor code ([#6495](https://github.com/aws-powertools/powertools-lambda-python/issues/6495)) - **event_source:** use standard collections for types ([#6479](https://github.com/aws-powertools/powertools-lambda-python/issues/6479)) - 
**feature_flags:** use standard collections for type ([#6489](https://github.com/aws-powertools/powertools-lambda-python/issues/6489)) - **general:** add support for `ruff format` ([#6512](https://github.com/aws-powertools/powertools-lambda-python/issues/6512)) - **idempotency:** use standard collections for types ([#6487](https://github.com/aws-powertools/powertools-lambda-python/issues/6487)) - **logger:** use standard collections for types ([#6471](https://github.com/aws-powertools/powertools-lambda-python/issues/6471)) - **metrics:** use standard collections for types ([#6472](https://github.com/aws-powertools/powertools-lambda-python/issues/6472)) - **middleware_factory:** use standard collections for types ([#6485](https://github.com/aws-powertools/powertools-lambda-python/issues/6485)) - **parameters:** use standard collections for types ([#6481](https://github.com/aws-powertools/powertools-lambda-python/issues/6481)) - **streaming:** use standard collections for types ([#6483](https://github.com/aws-powertools/powertools-lambda-python/issues/6483)) - **tests:** use standard collections for types + refactor code ([#6497](https://github.com/aws-powertools/powertools-lambda-python/issues/6497)) - **tracer:** use standard collections for types ([#6473](https://github.com/aws-powertools/powertools-lambda-python/issues/6473)) - **validation:** use standard collections for types ([#6491](https://github.com/aws-powertools/powertools-lambda-python/issues/6491)) ## Documentation - **bedrock:** fix BedrockServiceRole in template.yaml ([#6436](https://github.com/aws-powertools/powertools-lambda-python/issues/6436)) - **bedrock_agents:** remove Pydantic v1 recommendation ([#6468](https://github.com/aws-powertools/powertools-lambda-python/issues/6468)) - **event_handler:** add docs for AppSync event resolver ([#6557](https://github.com/aws-powertools/powertools-lambda-python/issues/6557)) - **event_handler:** fix typo in api keys swagger url ([#6536](https://github.com/aws-powertools/powertools-lambda-python/issues/6536)) ## Features - **bedrock:** add `openapi_extensions` in BedrockAgentResolver ([#6510](https://github.com/aws-powertools/powertools-lambda-python/issues/6510)) - **data-masking:** add support for Pydantic models, dataclasses, and standard classes ([#6413](https://github.com/aws-powertools/powertools-lambda-python/issues/6413)) - **event_handler:** add AppSync events resolver ([#6558](https://github.com/aws-powertools/powertools-lambda-python/issues/6558)) - **event_handler:** add extras HTTP Error Code Exceptions ([#6454](https://github.com/aws-powertools/powertools-lambda-python/issues/6454)) - **event_handler:** add route-level custom response validation in OpenAPI utility ([#6341](https://github.com/aws-powertools/powertools-lambda-python/issues/6341)) - **logger:** add support for exception notes ([#6465](https://github.com/aws-powertools/powertools-lambda-python/issues/6465)) ## Maintenance - version bump - **ci:** new pre-release 3.10.1a7 ([#6518](https://github.com/aws-powertools/powertools-lambda-python/issues/6518)) - **ci:** new pre-release 3.10.1a0 ([#6431](https://github.com/aws-powertools/powertools-lambda-python/issues/6431)) - **ci:** new pre-release 3.10.1a1 ([#6437](https://github.com/aws-powertools/powertools-lambda-python/issues/6437)) - **ci:** new pre-release 3.10.1a2 ([#6446](https://github.com/aws-powertools/powertools-lambda-python/issues/6446)) - **ci:** new pre-release 3.10.1a10 
([#6538](https://github.com/aws-powertools/powertools-lambda-python/issues/6538)) - **ci:** new pre-release 3.10.1a3 ([#6455](https://github.com/aws-powertools/powertools-lambda-python/issues/6455)) - **ci:** new pre-release 3.10.1a4 ([#6463](https://github.com/aws-powertools/powertools-lambda-python/issues/6463)) - **ci:** new pre-release 3.10.1a9 ([#6533](https://github.com/aws-powertools/powertools-lambda-python/issues/6533)) - **ci:** new pre-release 3.10.1a5 ([#6498](https://github.com/aws-powertools/powertools-lambda-python/issues/6498)) - **ci:** new pre-release 3.10.1a11 ([#6546](https://github.com/aws-powertools/powertools-lambda-python/issues/6546)) - **ci:** new pre-release 3.10.1a8 ([#6526](https://github.com/aws-powertools/powertools-lambda-python/issues/6526)) - **ci:** new pre-release 3.10.1a6 ([#6506](https://github.com/aws-powertools/powertools-lambda-python/issues/6506)) - **deps:** bump pydantic-settings from 2.8.1 to 2.9.1 ([#6530](https://github.com/aws-powertools/powertools-lambda-python/issues/6530)) - **deps:** bump pydantic from 2.11.2 to 2.11.3 ([#6427](https://github.com/aws-powertools/powertools-lambda-python/issues/6427)) - **deps:** bump squidfunk/mkdocs-material from sha256:23b69789b1dd836c53ea25b32f62ef8e1a23366037acd07c90959a219fd1f285 to sha256:95f2ff42251979c043d6cb5b1c82e6ae8189e57e02105813dd1ce124021a418b in /docs ([#6513](https://github.com/aws-powertools/powertools-lambda-python/issues/6513)) - **deps:** bump actions/download-artifact from 4.2.1 to 4.3.0 ([#6550](https://github.com/aws-powertools/powertools-lambda-python/issues/6550)) - **deps:** bump actions/setup-python from 5.5.0 to 5.6.0 ([#6549](https://github.com/aws-powertools/powertools-lambda-python/issues/6549)) - **deps:** bump typing-extensions from 4.13.1 to 4.13.2 ([#6451](https://github.com/aws-powertools/powertools-lambda-python/issues/6451)) - **deps:** bump actions/setup-node from 4.3.0 to 4.4.0 ([#6457](https://github.com/aws-powertools/powertools-lambda-python/issues/6457)) - **deps:** bump codecov/codecov-action from 5.4.0 to 5.4.2 ([#6458](https://github.com/aws-powertools/powertools-lambda-python/issues/6458)) - **deps-dev:** bump mkdocs-material from 9.6.11 to 9.6.12 ([#6514](https://github.com/aws-powertools/powertools-lambda-python/issues/6514)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.302 to 0.1.304 ([#6531](https://github.com/aws-powertools/powertools-lambda-python/issues/6531)) - **deps-dev:** bump sentry-sdk from 2.25.1 to 2.26.1 ([#6477](https://github.com/aws-powertools/powertools-lambda-python/issues/6477)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.189.1a0 to 2.190.0a0 ([#6529](https://github.com/aws-powertools/powertools-lambda-python/issues/6529)) - **deps-dev:** bump boto3-stubs from 1.37.37 to 1.37.38 ([#6537](https://github.com/aws-powertools/powertools-lambda-python/issues/6537)) - **deps-dev:** bump aws-cdk-lib from 2.189.0 to 2.189.1 ([#6461](https://github.com/aws-powertools/powertools-lambda-python/issues/6461)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.189.0a0 to 2.189.1a0 ([#6462](https://github.com/aws-powertools/powertools-lambda-python/issues/6462)) - **deps-dev:** bump boto3-stubs from 1.37.33 to 1.37.34 ([#6459](https://github.com/aws-powertools/powertools-lambda-python/issues/6459)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.301 to 0.1.302 ([#6460](https://github.com/aws-powertools/powertools-lambda-python/issues/6460)) - **deps-dev:** bump cfn-lint from 1.34.0 to 
1.34.1 ([#6528](https://github.com/aws-powertools/powertools-lambda-python/issues/6528)) - **deps-dev:** bump cfn-lint from 1.33.2 to 1.34.0 ([#6502](https://github.com/aws-powertools/powertools-lambda-python/issues/6502)) - **deps-dev:** bump aws-cdk from 2.1010.0 to 2.1012.0 ([#6540](https://github.com/aws-powertools/powertools-lambda-python/issues/6540)) - **deps-dev:** bump mypy-boto3-appconfigdata from 1.37.0 to 1.38.0 in the boto-typing group ([#6541](https://github.com/aws-powertools/powertools-lambda-python/issues/6541)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.304 to 0.1.305 ([#6545](https://github.com/aws-powertools/powertools-lambda-python/issues/6545)) - **deps-dev:** bump cfn-lint from 1.33.1 to 1.33.2 ([#6450](https://github.com/aws-powertools/powertools-lambda-python/issues/6450)) - **deps-dev:** bump boto3-stubs from 1.37.31 to 1.37.33 ([#6449](https://github.com/aws-powertools/powertools-lambda-python/issues/6449)) - **deps-dev:** bump boto3-stubs from 1.37.35 to 1.37.37 ([#6521](https://github.com/aws-powertools/powertools-lambda-python/issues/6521)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.190.0a0 to 2.191.0a0 ([#6543](https://github.com/aws-powertools/powertools-lambda-python/issues/6543)) - **deps-dev:** bump h11 from 0.14.0 to 0.16.0 ([#6548](https://github.com/aws-powertools/powertools-lambda-python/issues/6548)) - **deps-dev:** bump ruff from 0.11.4 to 0.11.5 ([#6443](https://github.com/aws-powertools/powertools-lambda-python/issues/6443)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.188.0a0 to 2.189.0a0 ([#6444](https://github.com/aws-powertools/powertools-lambda-python/issues/6444)) - **deps-dev:** bump aws-cdk-lib from 2.188.0 to 2.189.0 ([#6445](https://github.com/aws-powertools/powertools-lambda-python/issues/6445)) - **deps-dev:** bump cfn-lint from 1.33.0 to 1.33.1 ([#6442](https://github.com/aws-powertools/powertools-lambda-python/issues/6442)) - **deps-dev:** bump ruff from 0.11.5 to 0.11.6 ([#6515](https://github.com/aws-powertools/powertools-lambda-python/issues/6515)) - **deps-dev:** bump aws-cdk from 2.1007.0 to 2.1010.0 ([#6501](https://github.com/aws-powertools/powertools-lambda-python/issues/6501)) - **deps-dev:** bump httpx from 0.25.1 to 0.28.1 ([#6554](https://github.com/aws-powertools/powertools-lambda-python/issues/6554)) - **deps-dev:** bump boto3-stubs from 1.38.1 to 1.38.2 ([#6556](https://github.com/aws-powertools/powertools-lambda-python/issues/6556)) - **deps-dev:** bump boto3-stubs from 1.37.29 to 1.37.31 ([#6433](https://github.com/aws-powertools/powertools-lambda-python/issues/6433)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.187.0a0 to 2.188.0a0 ([#6434](https://github.com/aws-powertools/powertools-lambda-python/issues/6434)) - **deps-dev:** bump ruff from 0.11.3 to 0.11.4 ([#6428](https://github.com/aws-powertools/powertools-lambda-python/issues/6428)) - **deps-dev:** bump pytest-cov from 6.1.0 to 6.1.1 ([#6429](https://github.com/aws-powertools/powertools-lambda-python/issues/6429)) - **deps-dev:** bump cfn-lint from 1.32.4 to 1.33.0 ([#6430](https://github.com/aws-powertools/powertools-lambda-python/issues/6430)) - **deps-dev:** bump multiprocess from 0.70.17 to 0.70.18 ([#6516](https://github.com/aws-powertools/powertools-lambda-python/issues/6516)) - **deps-dev:** bump ruff from 0.11.6 to 0.11.7 ([#6555](https://github.com/aws-powertools/powertools-lambda-python/issues/6555)) - **deps-dev:** bump sentry-sdk from 2.26.1 to 2.27.0 
([#6553](https://github.com/aws-powertools/powertools-lambda-python/issues/6553)) - **deps-dev:** bump boto3-stubs from 1.37.34 to 1.37.35 ([#6504](https://github.com/aws-powertools/powertools-lambda-python/issues/6504)) ## [v3.10.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.9.0...v3.10.0) - 2025-04-08 ## Bug Fixes - **event_source:** Added missing properties in APIGatewayWebSocketEvent class ([#6411](https://github.com/aws-powertools/powertools-lambda-python/issues/6411)) - **event_source:** fix HomeDirectoryDetails type in TransferFamilyAuthorizerResponse method ([#6403](https://github.com/aws-powertools/powertools-lambda-python/issues/6403)) - **logger:** improve behavior with `exc_info=True` to prevent errors ([#6417](https://github.com/aws-powertools/powertools-lambda-python/issues/6417)) ## Documentation - **homepage:** add SAR documentation ([#6347](https://github.com/aws-powertools/powertools-lambda-python/issues/6347)) ## Features - **parser:** add AppSyncResolver model ([#6400](https://github.com/aws-powertools/powertools-lambda-python/issues/6400)) ## Maintenance - version bump - **ci:** new pre-release 3.9.1a4 ([#6377](https://github.com/aws-powertools/powertools-lambda-python/issues/6377)) - **ci:** new pre-release 3.9.1a8 ([#6415](https://github.com/aws-powertools/powertools-lambda-python/issues/6415)) - **ci:** new pre-release 3.9.1a9 ([#6422](https://github.com/aws-powertools/powertools-lambda-python/issues/6422)) - **ci:** new pre-release 3.9.1a0 ([#6354](https://github.com/aws-powertools/powertools-lambda-python/issues/6354)) - **ci:** new pre-release 3.9.1a5 ([#6385](https://github.com/aws-powertools/powertools-lambda-python/issues/6385)) - **ci:** new pre-release 3.9.1a7 ([#6401](https://github.com/aws-powertools/powertools-lambda-python/issues/6401)) - **ci:** new pre-release 3.9.1a1 ([#6356](https://github.com/aws-powertools/powertools-lambda-python/issues/6356)) - **ci:** new pre-release 3.9.1a2 ([#6364](https://github.com/aws-powertools/powertools-lambda-python/issues/6364)) - **ci:** new pre-release 3.9.1a6 ([#6392](https://github.com/aws-powertools/powertools-lambda-python/issues/6392)) - **ci:** new pre-release 3.9.1a3 ([#6369](https://github.com/aws-powertools/powertools-lambda-python/issues/6369)) - **deps:** bump aws-encryption-sdk from 4.0.0 to 4.0.1 ([#6360](https://github.com/aws-powertools/powertools-lambda-python/issues/6360)) - **deps:** bump pydantic from 2.11.1 to 2.11.2 ([#6395](https://github.com/aws-powertools/powertools-lambda-python/issues/6395)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.22 to 3.0.23 ([#6371](https://github.com/aws-powertools/powertools-lambda-python/issues/6371)) - **deps:** bump squidfunk/mkdocs-material from `3555052` to `23b6978` in /docs ([#6404](https://github.com/aws-powertools/powertools-lambda-python/issues/6404)) - **deps:** bump datadog-lambda from 6.106.0 to 6.107.0 ([#6405](https://github.com/aws-powertools/powertools-lambda-python/issues/6405)) - **deps:** bump squidfunk/mkdocs-material from `f226a2d` to `3555052` in /docs ([#6372](https://github.com/aws-powertools/powertools-lambda-python/issues/6372)) - **deps:** bump pydantic from 2.10.6 to 2.11.1 ([#6383](https://github.com/aws-powertools/powertools-lambda-python/issues/6383)) - **deps:** bump typing-extensions from 4.12.2 to 4.13.1 ([#6418](https://github.com/aws-powertools/powertools-lambda-python/issues/6418)) - **deps:** bump actions/setup-python from 5.4.0 to 5.5.0 
([#6349](https://github.com/aws-powertools/powertools-lambda-python/issues/6349)) - **deps:** bump actions/dependency-review-action from 4.5.0 to 4.6.0 ([#6380](https://github.com/aws-powertools/powertools-lambda-python/issues/6380)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.186.0a0 to 2.187.0a0 ([#6382](https://github.com/aws-powertools/powertools-lambda-python/issues/6382)) - **deps-dev:** bump pytest-cov from 6.0.0 to 6.1.0 ([#6381](https://github.com/aws-powertools/powertools-lambda-python/issues/6381)) - **deps-dev:** bump coverage from 7.7.1 to 7.8.0 ([#6376](https://github.com/aws-powertools/powertools-lambda-python/issues/6376)) - **deps-dev:** bump mkdocs-material from 9.6.9 to 9.6.10 ([#6375](https://github.com/aws-powertools/powertools-lambda-python/issues/6375)) - **deps-dev:** bump boto3-stubs from 1.37.23 to 1.37.24 ([#6374](https://github.com/aws-powertools/powertools-lambda-python/issues/6374)) - **deps-dev:** bump boto3-stubs from 1.37.24 to 1.37.25 ([#6384](https://github.com/aws-powertools/powertools-lambda-python/issues/6384)) - **deps-dev:** bump sentry-sdk from 2.24.1 to 2.25.0 ([#6373](https://github.com/aws-powertools/powertools-lambda-python/issues/6373)) - **deps-dev:** bump aws-cdk from 2.1006.0 to 2.1007.0 ([#6387](https://github.com/aws-powertools/powertools-lambda-python/issues/6387)) - **deps-dev:** bump boto3-stubs from 1.37.25 to 1.37.26 ([#6389](https://github.com/aws-powertools/powertools-lambda-python/issues/6389)) - **deps-dev:** bump sentry-sdk from 2.25.0 to 2.25.1 ([#6391](https://github.com/aws-powertools/powertools-lambda-python/issues/6391)) - **deps-dev:** bump boto3-stubs from 1.37.22 to 1.37.23 ([#6366](https://github.com/aws-powertools/powertools-lambda-python/issues/6366)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.298 to 0.1.299 ([#6390](https://github.com/aws-powertools/powertools-lambda-python/issues/6390)) - **deps-dev:** bump cfn-lint from 1.32.1 to 1.32.3 ([#6388](https://github.com/aws-powertools/powertools-lambda-python/issues/6388)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.185.0a0 to 2.186.0a0 ([#6363](https://github.com/aws-powertools/powertools-lambda-python/issues/6363)) - **deps-dev:** bump boto3-stubs from 1.37.20 to 1.37.22 ([#6362](https://github.com/aws-powertools/powertools-lambda-python/issues/6362)) - **deps-dev:** bump testcontainers from 4.9.2 to 4.10.0 ([#6397](https://github.com/aws-powertools/powertools-lambda-python/issues/6397)) - **deps-dev:** bump mkdocstrings-python from 1.16.8 to 1.16.10 ([#6399](https://github.com/aws-powertools/powertools-lambda-python/issues/6399)) - **deps-dev:** bump ruff from 0.11.2 to 0.11.3 ([#6398](https://github.com/aws-powertools/powertools-lambda-python/issues/6398)) - **deps-dev:** bump boto3-stubs from 1.37.26 to 1.37.28 ([#6406](https://github.com/aws-powertools/powertools-lambda-python/issues/6406)) - **deps-dev:** bump pytest-asyncio from 0.25.3 to 0.26.0 ([#6352](https://github.com/aws-powertools/powertools-lambda-python/issues/6352)) - **deps-dev:** bump aws-cdk-lib from 2.187.0 to 2.188.0 ([#6407](https://github.com/aws-powertools/powertools-lambda-python/issues/6407)) - **deps-dev:** bump cfn-lint from 1.32.0 to 1.32.1 ([#6351](https://github.com/aws-powertools/powertools-lambda-python/issues/6351)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.299 to 0.1.300 ([#6408](https://github.com/aws-powertools/powertools-lambda-python/issues/6408)) - **deps-dev:** bump aws-cdk from 2.1005.0 to 2.1006.0 
([#6350](https://github.com/aws-powertools/powertools-lambda-python/issues/6350)) - **deps-dev:** bump cfn-lint from 1.32.3 to 1.32.4 ([#6419](https://github.com/aws-powertools/powertools-lambda-python/issues/6419)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.300 to 0.1.301 ([#6420](https://github.com/aws-powertools/powertools-lambda-python/issues/6420)) - **deps-dev:** bump boto3-stubs from 1.37.28 to 1.37.29 ([#6421](https://github.com/aws-powertools/powertools-lambda-python/issues/6421)) - **deps-dev:** bump boto3-stubs from 1.37.19 to 1.37.20 ([#6353](https://github.com/aws-powertools/powertools-lambda-python/issues/6353)) ## [v3.9.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.8.0...v3.9.0) - 2025-03-25 ## Bug Fixes - **idempotency:** include sk in error msgs when using composite key ([#6325](https://github.com/aws-powertools/powertools-lambda-python/issues/6325)) - **metrics:** ensure proper type conversion for `DD_FLUSH_TO_LOG` env var ([#6280](https://github.com/aws-powertools/powertools-lambda-python/issues/6280)) ## Code Refactoring - **data_classes:** Add base class with common code ([#6297](https://github.com/aws-powertools/powertools-lambda-python/issues/6297)) - **data_classes:** remove duplicated code ([#6288](https://github.com/aws-powertools/powertools-lambda-python/issues/6288)) - **data_classes:** simplify nested data classes ([#6289](https://github.com/aws-powertools/powertools-lambda-python/issues/6289)) - **tests:** add LambdaContext type in tests ([#6214](https://github.com/aws-powertools/powertools-lambda-python/issues/6214)) ## Documentation - **homepage:** update layer instructions link ([#6242](https://github.com/aws-powertools/powertools-lambda-python/issues/6242)) - **public_reference:** add Guild as a public reference ([#6342](https://github.com/aws-powertools/powertools-lambda-python/issues/6342)) ## Features - **data_classes:** add API Gateway Websocket event ([#6287](https://github.com/aws-powertools/powertools-lambda-python/issues/6287)) - **event_handler:** add custom method for OpenAPI configuration ([#6204](https://github.com/aws-powertools/powertools-lambda-python/issues/6204)) - **event_handler:** add custom response validation in OpenAPI utility ([#6189](https://github.com/aws-powertools/powertools-lambda-python/issues/6189)) - **general:** make logger, tracer and metrics utilities aware of provisioned concurrency ([#6324](https://github.com/aws-powertools/powertools-lambda-python/issues/6324)) - **metrics:** allow change ColdStart function_name dimension ([#6315](https://github.com/aws-powertools/powertools-lambda-python/issues/6315)) ## Maintenance - version bump - **ci:** new pre-release 3.8.1a8 ([#6307](https://github.com/aws-powertools/powertools-lambda-python/issues/6307)) - **ci:** new pre-release 3.8.1a11 ([#6340](https://github.com/aws-powertools/powertools-lambda-python/issues/6340)) - **ci:** new pre-release 3.8.1a0 ([#6244](https://github.com/aws-powertools/powertools-lambda-python/issues/6244)) - **ci:** new pre-release 3.8.1a10 ([#6332](https://github.com/aws-powertools/powertools-lambda-python/issues/6332)) - **ci:** new pre-release 3.8.1a1 ([#6250](https://github.com/aws-powertools/powertools-lambda-python/issues/6250)) - **ci:** new pre-release 3.8.1a2 ([#6253](https://github.com/aws-powertools/powertools-lambda-python/issues/6253)) - **ci:** new pre-release 3.8.1a9 ([#6322](https://github.com/aws-powertools/powertools-lambda-python/issues/6322)) - **ci:** new pre-release 3.8.1a3 
([#6259](https://github.com/aws-powertools/powertools-lambda-python/issues/6259)) - **ci:** new pre-release 3.8.1a4 ([#6268](https://github.com/aws-powertools/powertools-lambda-python/issues/6268)) - **ci:** Fix SAR pipeline ([#6313](https://github.com/aws-powertools/powertools-lambda-python/issues/6313)) - **ci:** new pre-release 3.8.1a5 ([#6276](https://github.com/aws-powertools/powertools-lambda-python/issues/6276)) - **ci:** new pre-release 3.8.1a6 ([#6290](https://github.com/aws-powertools/powertools-lambda-python/issues/6290)) - **ci:** new pre-release 3.8.1a7 ([#6298](https://github.com/aws-powertools/powertools-lambda-python/issues/6298)) - **deps:** bump actions/setup-go from 5.3.0 to 5.4.0 ([#6304](https://github.com/aws-powertools/powertools-lambda-python/issues/6304)) - **deps:** bump actions/upload-artifact from 4.6.1 to 4.6.2 ([#6302](https://github.com/aws-powertools/powertools-lambda-python/issues/6302)) - **deps:** bump squidfunk/mkdocs-material from `047452c` to `479a06a` in /docs ([#6261](https://github.com/aws-powertools/powertools-lambda-python/issues/6261)) - **deps:** bump squidfunk/mkdocs-material from `479a06a` to `f226a2d` in /docs ([#6279](https://github.com/aws-powertools/powertools-lambda-python/issues/6279)) - **deps:** bump actions/download-artifact from 4.1.9 to 4.2.0 ([#6294](https://github.com/aws-powertools/powertools-lambda-python/issues/6294)) - **deps:** bump actions/download-artifact from 4.2.0 to 4.2.1 ([#6303](https://github.com/aws-powertools/powertools-lambda-python/issues/6303)) - **deps:** bump actions/setup-node from 4.2.0 to 4.3.0 ([#6278](https://github.com/aws-powertools/powertools-lambda-python/issues/6278)) - **deps-dev:** bump mkdocs-material from 9.6.7 to 9.6.8 ([#6264](https://github.com/aws-powertools/powertools-lambda-python/issues/6264)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.296 to 0.1.297 ([#6281](https://github.com/aws-powertools/powertools-lambda-python/issues/6281)) - **deps-dev:** bump boto3-stubs from 1.37.12 to 1.37.14 ([#6282](https://github.com/aws-powertools/powertools-lambda-python/issues/6282)) - **deps-dev:** bump aws-cdk from 2.1004.0 to 2.1005.0 ([#6301](https://github.com/aws-powertools/powertools-lambda-python/issues/6301)) - **deps-dev:** bump boto3-stubs from 1.37.15 to 1.37.16 ([#6305](https://github.com/aws-powertools/powertools-lambda-python/issues/6305)) - **deps-dev:** bump mkdocs-material from 9.6.8 to 9.6.9 ([#6285](https://github.com/aws-powertools/powertools-lambda-python/issues/6285)) - **deps-dev:** bump cfn-lint from 1.31.0 to 1.31.3 ([#6306](https://github.com/aws-powertools/powertools-lambda-python/issues/6306)) - **deps-dev:** bump ruff from 0.9.10 to 0.11.0 ([#6273](https://github.com/aws-powertools/powertools-lambda-python/issues/6273)) - **deps-dev:** bump sentry-sdk from 2.24.0 to 2.24.1 ([#6339](https://github.com/aws-powertools/powertools-lambda-python/issues/6339)) - **deps-dev:** bump aws-cdk-lib from 2.183.0 to 2.184.1 ([#6272](https://github.com/aws-powertools/powertools-lambda-python/issues/6272)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.183.0a0 to 2.184.1a0 ([#6271](https://github.com/aws-powertools/powertools-lambda-python/issues/6271)) - **deps-dev:** bump filelock from 3.17.0 to 3.18.0 ([#6270](https://github.com/aws-powertools/powertools-lambda-python/issues/6270)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.184.1a0 to 2.185.0a0 ([#6317](https://github.com/aws-powertools/powertools-lambda-python/issues/6317)) - 
**deps-dev:** bump boto3-stubs from 1.37.11 to 1.37.12 ([#6266](https://github.com/aws-powertools/powertools-lambda-python/issues/6266))
- **deps-dev:** bump cfn-lint from 1.31.3 to 1.32.0 ([#6316](https://github.com/aws-powertools/powertools-lambda-python/issues/6316))
- **deps-dev:** bump cfn-lint from 1.30.0 to 1.31.0 ([#6296](https://github.com/aws-powertools/powertools-lambda-python/issues/6296))
- **deps-dev:** bump cfn-lint from 1.29.1 to 1.30.0 ([#6263](https://github.com/aws-powertools/powertools-lambda-python/issues/6263))
- **deps-dev:** bump aws-cdk from 2.1003.0 to 2.1004.0 ([#6262](https://github.com/aws-powertools/powertools-lambda-python/issues/6262))
- **deps-dev:** bump boto3-stubs from 1.37.14 to 1.37.15 ([#6295](https://github.com/aws-powertools/powertools-lambda-python/issues/6295))
- **deps-dev:** bump boto3-stubs from 1.37.8 to 1.37.10 ([#6248](https://github.com/aws-powertools/powertools-lambda-python/issues/6248))
- **deps-dev:** bump mkdocstrings-python from 1.16.6 to 1.16.7 ([#6319](https://github.com/aws-powertools/powertools-lambda-python/issues/6319))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.182.0a0 to 2.183.0a0 ([#6258](https://github.com/aws-powertools/powertools-lambda-python/issues/6258))
- **deps-dev:** bump aws-cdk-lib from 2.182.0 to 2.183.0 ([#6257](https://github.com/aws-powertools/powertools-lambda-python/issues/6257))
- **deps-dev:** bump ruff from 0.11.0 to 0.11.1 ([#6320](https://github.com/aws-powertools/powertools-lambda-python/issues/6320))
- **deps-dev:** bump ruff from 0.11.1 to 0.11.2 ([#6326](https://github.com/aws-powertools/powertools-lambda-python/issues/6326))
- **deps-dev:** bump boto3-stubs from 1.37.10 to 1.37.11 ([#6252](https://github.com/aws-powertools/powertools-lambda-python/issues/6252))
- **deps-dev:** bump coverage from 7.7.0 to 7.7.1 ([#6328](https://github.com/aws-powertools/powertools-lambda-python/issues/6328))
- **deps-dev:** bump cfn-lint from 1.28.0 to 1.29.1 ([#6249](https://github.com/aws-powertools/powertools-lambda-python/issues/6249))
- **deps-dev:** bump boto3-stubs from 1.37.16 to 1.37.18 ([#6327](https://github.com/aws-powertools/powertools-lambda-python/issues/6327))
- **deps-dev:** bump sentry-sdk from 2.23.1 to 2.24.0 ([#6329](https://github.com/aws-powertools/powertools-lambda-python/issues/6329))
- **deps-dev:** bump boto3-stubs from 1.37.18 to 1.37.19 ([#6337](https://github.com/aws-powertools/powertools-lambda-python/issues/6337))
- **deps-dev:** bump mkdocstrings-python from 1.16.7 to 1.16.8 ([#6338](https://github.com/aws-powertools/powertools-lambda-python/issues/6338))
- **deps-dev:** bump ruff from 0.9.9 to 0.9.10 ([#6241](https://github.com/aws-powertools/powertools-lambda-python/issues/6241))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.295 to 0.1.296 ([#6240](https://github.com/aws-powertools/powertools-lambda-python/issues/6240))
- **deps-dev:** bump boto3-stubs from 1.37.7 to 1.37.8 ([#6239](https://github.com/aws-powertools/powertools-lambda-python/issues/6239))
- **deps-dev:** bump coverage from 7.6.12 to 7.7.0 ([#6284](https://github.com/aws-powertools/powertools-lambda-python/issues/6284))
- **documentation:** v2 end of support ([#6343](https://github.com/aws-powertools/powertools-lambda-python/issues/6343))
- **logger:** clear prev request buffers in manual mode ([#6314](https://github.com/aws-powertools/powertools-lambda-python/issues/6314))

## [v3.8.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.7.0...v3.8.0) - 2025-03-07

## Bug Fixes

- **event_handler:** revert regression when validating response ([#6234](https://github.com/aws-powertools/powertools-lambda-python/issues/6234))

## Code Refactoring

- **tracer:** fix capture_lambda_handler return type annotation ([#6197](https://github.com/aws-powertools/powertools-lambda-python/issues/6197))

## Documentation

- **layer:** Fix SSM parameter name for looking up layer ARN ([#6221](https://github.com/aws-powertools/powertools-lambda-python/issues/6221))

## Features

- **logger:** add logger buffer feature ([#6060](https://github.com/aws-powertools/powertools-lambda-python/issues/6060))
- **logger:** add new logic to sample debug logs ([#6142](https://github.com/aws-powertools/powertools-lambda-python/issues/6142))

## Maintenance

- version bump
- **ci:** new pre-release 3.7.1a2 ([#6186](https://github.com/aws-powertools/powertools-lambda-python/issues/6186))
- **ci:** new pre-release 3.7.1a0 ([#6166](https://github.com/aws-powertools/powertools-lambda-python/issues/6166))
- **ci:** new pre-release 3.7.1a6 ([#6229](https://github.com/aws-powertools/powertools-lambda-python/issues/6229))
- **ci:** new pre-release 3.7.1a7 ([#6233](https://github.com/aws-powertools/powertools-lambda-python/issues/6233))
- **ci:** new pre-release 3.7.1a1 ([#6178](https://github.com/aws-powertools/powertools-lambda-python/issues/6178))
- **ci:** enable SAR deployment ([#6104](https://github.com/aws-powertools/powertools-lambda-python/issues/6104))
- **ci:** new pre-release 3.7.1a5 ([#6219](https://github.com/aws-powertools/powertools-lambda-python/issues/6219))
- **ci:** new pre-release 3.7.1a3 ([#6201](https://github.com/aws-powertools/powertools-lambda-python/issues/6201))
- **ci:** new pre-release 3.7.1a4 ([#6211](https://github.com/aws-powertools/powertools-lambda-python/issues/6211))
- **deps:** bump docker/setup-qemu-action from 3.5.0 to 3.6.0 ([#6190](https://github.com/aws-powertools/powertools-lambda-python/issues/6190))
- **deps:** bump actions/download-artifact from 4.1.8 to 4.1.9 ([#6174](https://github.com/aws-powertools/powertools-lambda-python/issues/6174))
- **deps:** bump squidfunk/mkdocs-material from `2615302` to `047452c` in /docs ([#6210](https://github.com/aws-powertools/powertools-lambda-python/issues/6210))
- **deps:** bump docker/setup-qemu-action from 3.4.0 to 3.5.0 ([#6176](https://github.com/aws-powertools/powertools-lambda-python/issues/6176))
- **deps:** bump docker/setup-buildx-action from 3.9.0 to 3.10.0 ([#6175](https://github.com/aws-powertools/powertools-lambda-python/issues/6175))
- **deps:** bump datadog-lambda from 6.105.0 to 6.106.0 ([#6218](https://github.com/aws-powertools/powertools-lambda-python/issues/6218))
- **deps:** bump codecov/codecov-action from 5.3.1 to 5.4.0 ([#6180](https://github.com/aws-powertools/powertools-lambda-python/issues/6180))
- **deps:** bump pydantic-settings from 2.8.0 to 2.8.1 ([#6182](https://github.com/aws-powertools/powertools-lambda-python/issues/6182))
- **deps:** bump jinja2 from 3.1.5 to 3.1.6 in /docs ([#6223](https://github.com/aws-powertools/powertools-lambda-python/issues/6223))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.294 to 0.1.295 ([#6207](https://github.com/aws-powertools/powertools-lambda-python/issues/6207))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.293 to 0.1.294 ([#6193](https://github.com/aws-powertools/powertools-lambda-python/issues/6193))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.181.0a0 to 2.181.1a0
([#6194](https://github.com/aws-powertools/powertools-lambda-python/issues/6194)) - **deps-dev:** bump ruff from 0.9.8 to 0.9.9 ([#6195](https://github.com/aws-powertools/powertools-lambda-python/issues/6195)) - **deps-dev:** bump aws-cdk-lib from 2.181.1 to 2.182.0 ([#6222](https://github.com/aws-powertools/powertools-lambda-python/issues/6222)) - **deps-dev:** bump testcontainers from 4.9.1 to 4.9.2 ([#6225](https://github.com/aws-powertools/powertools-lambda-python/issues/6225)) - **deps-dev:** bump cfn-lint from 1.26.1 to 1.27.0 ([#6192](https://github.com/aws-powertools/powertools-lambda-python/issues/6192)) - **deps-dev:** bump boto3-stubs from 1.37.2 to 1.37.3 ([#6181](https://github.com/aws-powertools/powertools-lambda-python/issues/6181)) - **deps-dev:** bump isort from 6.0.0 to 6.0.1 ([#6183](https://github.com/aws-powertools/powertools-lambda-python/issues/6183)) - **deps-dev:** bump boto3-stubs from 1.37.5 to 1.37.6 ([#6227](https://github.com/aws-powertools/powertools-lambda-python/issues/6227)) - **deps-dev:** bump ruff from 0.9.7 to 0.9.8 ([#6184](https://github.com/aws-powertools/powertools-lambda-python/issues/6184)) - **deps-dev:** bump boto3-stubs from 1.37.4 to 1.37.5 ([#6217](https://github.com/aws-powertools/powertools-lambda-python/issues/6217)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.181.1a0 to 2.182.0a0 ([#6226](https://github.com/aws-powertools/powertools-lambda-python/issues/6226)) - **deps-dev:** bump cfn-lint from 1.27.0 to 1.28.0 ([#6228](https://github.com/aws-powertools/powertools-lambda-python/issues/6228)) - **deps-dev:** bump pytest from 8.3.4 to 8.3.5 ([#6206](https://github.com/aws-powertools/powertools-lambda-python/issues/6206)) - **deps-dev:** bump boto3-stubs from 1.37.0 to 1.37.1 ([#6170](https://github.com/aws-powertools/powertools-lambda-python/issues/6170)) - **deps-dev:** bump boto3-stubs from 1.37.3 to 1.37.4 ([#6205](https://github.com/aws-powertools/powertools-lambda-python/issues/6205)) - **deps-dev:** bump mkdocs-material from 9.6.5 to 9.6.7 ([#6208](https://github.com/aws-powertools/powertools-lambda-python/issues/6208)) - **deps-dev:** bump aws-cdk from 2.1000.3 to 2.1001.0 ([#6173](https://github.com/aws-powertools/powertools-lambda-python/issues/6173)) - **deps-dev:** bump cfn-lint from 1.26.0 to 1.26.1 ([#6169](https://github.com/aws-powertools/powertools-lambda-python/issues/6169)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.180.0a0 to 2.181.0a0 ([#6172](https://github.com/aws-powertools/powertools-lambda-python/issues/6172)) - **deps-dev:** bump jinja2 from 3.1.5 to 3.1.6 ([#6224](https://github.com/aws-powertools/powertools-lambda-python/issues/6224)) - **deps-dev:** bump aws-cdk from 2.1002.0 to 2.1003.0 ([#6232](https://github.com/aws-powertools/powertools-lambda-python/issues/6232)) - **deps-dev:** bump cfn-lint from 1.25.1 to 1.26.0 ([#6164](https://github.com/aws-powertools/powertools-lambda-python/issues/6164)) - **deps-dev:** bump boto3-stubs from 1.36.26 to 1.37.0 ([#6165](https://github.com/aws-powertools/powertools-lambda-python/issues/6165)) - **deps-dev:** bump mypy-boto3-appconfigdata from 1.36.0 to 1.37.0 in the boto-typing group ([#6163](https://github.com/aws-powertools/powertools-lambda-python/issues/6163)) - **deps-dev:** bump aws-cdk from 2.1000.2 to 2.1000.3 ([#6162](https://github.com/aws-powertools/powertools-lambda-python/issues/6162)) - **deps-dev:** bump boto3-stubs from 1.37.6 to 1.37.7 ([#6231](https://github.com/aws-powertools/powertools-lambda-python/issues/6231)) - 
**deps-dev:** bump aws-cdk from 2.1001.0 to 2.1002.0 ([#6209](https://github.com/aws-powertools/powertools-lambda-python/issues/6209))

## [v3.7.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.6.0...v3.7.0) - 2025-02-25

## Bug Fixes

- **logger:** correctly pick powertools or custom handler in custom environments ([#6083](https://github.com/aws-powertools/powertools-lambda-python/issues/6083))
- **openapi:** validate response serialization when falsy ([#6119](https://github.com/aws-powertools/powertools-lambda-python/issues/6119))
- **parser:** fix data types for `sourceIPAddress` and `sequencer` fields in the S3RecordModel model ([#6154](https://github.com/aws-powertools/powertools-lambda-python/issues/6154))
- **parser:** fix EventBridgeModel when working with scheduled events ([#6134](https://github.com/aws-powertools/powertools-lambda-python/issues/6134))
- **security:** fix encryption_context handling in data masking operations ([#6074](https://github.com/aws-powertools/powertools-lambda-python/issues/6074))

## Documentation

- **roadmap:** update roadmap ([#6077](https://github.com/aws-powertools/powertools-lambda-python/issues/6077))

## Features

- **batch:** raise exception for invalid batch event ([#6088](https://github.com/aws-powertools/powertools-lambda-python/issues/6088))
- **event_handler:** add support for defining OpenAPI examples in parameters ([#6086](https://github.com/aws-powertools/powertools-lambda-python/issues/6086))
- **layers:** add new commercial regions ap-southeast-7 and mx-central-1 ([#6109](https://github.com/aws-powertools/powertools-lambda-python/issues/6109))
- **parser:** Event source dataclasses for IoT Core Registry Events ([#6123](https://github.com/aws-powertools/powertools-lambda-python/issues/6123))
- **parser:** Add IoT registry events models ([#5892](https://github.com/aws-powertools/powertools-lambda-python/issues/5892))

## Maintenance

- version bump
- **ci:** new pre-release 3.6.1a9 ([#6157](https://github.com/aws-powertools/powertools-lambda-python/issues/6157))
- **ci:** new pre-release 3.6.1a8 ([#6152](https://github.com/aws-powertools/powertools-lambda-python/issues/6152))
- **ci:** new pre-release 3.6.1a4 ([#6120](https://github.com/aws-powertools/powertools-lambda-python/issues/6120))
- **ci:** new pre-release 3.6.1a3 ([#6107](https://github.com/aws-powertools/powertools-lambda-python/issues/6107))
- **ci:** new pre-release 3.6.1a0 ([#6084](https://github.com/aws-powertools/powertools-lambda-python/issues/6084))
- **ci:** new pre-release 3.6.1a5 ([#6124](https://github.com/aws-powertools/powertools-lambda-python/issues/6124))
- **ci:** new pre-release 3.6.1a7 ([#6139](https://github.com/aws-powertools/powertools-lambda-python/issues/6139))
- **ci:** new pre-release 3.6.1a1 ([#6090](https://github.com/aws-powertools/powertools-lambda-python/issues/6090))
- **ci:** new pre-release 3.6.1a6 ([#6132](https://github.com/aws-powertools/powertools-lambda-python/issues/6132))
- **ci:** new pre-release 3.6.1a2 ([#6098](https://github.com/aws-powertools/powertools-lambda-python/issues/6098))
- **ci:** remove python3.8 runtime when bootstrapping a new region ([#6101](https://github.com/aws-powertools/powertools-lambda-python/issues/6101))
- **deps:** bump squidfunk/mkdocs-material from `f5bcec4` to `2615302` in /docs ([#6135](https://github.com/aws-powertools/powertools-lambda-python/issues/6135))
- **deps:** bump squidfunk/mkdocs-material from `c62453b` to `f5bcec4` in /docs
([#6087](https://github.com/aws-powertools/powertools-lambda-python/issues/6087)) - **deps:** bump actions/upload-artifact from 4.6.0 to 4.6.1 ([#6144](https://github.com/aws-powertools/powertools-lambda-python/issues/6144)) - **deps:** bump aws-actions/configure-aws-credentials from 4.0.3 to 4.1.0 ([#6082](https://github.com/aws-powertools/powertools-lambda-python/issues/6082)) - **deps:** bump pydantic-settings from 2.7.1 to 2.8.0 ([#6147](https://github.com/aws-powertools/powertools-lambda-python/issues/6147)) - **deps:** bump ossf/scorecard-action from 2.4.0 to 2.4.1 ([#6143](https://github.com/aws-powertools/powertools-lambda-python/issues/6143)) - **deps:** bump slsa-framework/slsa-github-generator from 2.0.0 to 2.1.0 ([#6155](https://github.com/aws-powertools/powertools-lambda-python/issues/6155)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.21 to 3.0.22 ([#6113](https://github.com/aws-powertools/powertools-lambda-python/issues/6113)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.292 to 0.1.293 ([#6129](https://github.com/aws-powertools/powertools-lambda-python/issues/6129)) - **deps-dev:** bump sentry-sdk from 2.21.0 to 2.22.0 ([#6114](https://github.com/aws-powertools/powertools-lambda-python/issues/6114)) - **deps-dev:** bump bandit from 1.8.2 to 1.8.3 ([#6117](https://github.com/aws-powertools/powertools-lambda-python/issues/6117)) - **deps-dev:** bump mkdocstrings-python from 1.15.0 to 1.16.0 ([#6118](https://github.com/aws-powertools/powertools-lambda-python/issues/6118)) - **deps-dev:** bump boto3-stubs from 1.36.19 to 1.36.22 ([#6116](https://github.com/aws-powertools/powertools-lambda-python/issues/6116)) - **deps-dev:** bump cfn-lint from 1.24.0 to 1.25.1 ([#6115](https://github.com/aws-powertools/powertools-lambda-python/issues/6115)) - **deps-dev:** bump mkdocstrings-python from 1.16.0 to 1.16.1 ([#6128](https://github.com/aws-powertools/powertools-lambda-python/issues/6128)) - **deps-dev:** bump boto3-stubs from 1.36.22 to 1.36.24 ([#6131](https://github.com/aws-powertools/powertools-lambda-python/issues/6131)) - **deps-dev:** bump aws-cdk from 2.178.2 to 2.1000.2 ([#6126](https://github.com/aws-powertools/powertools-lambda-python/issues/6126)) - **deps-dev:** bump sentry-sdk from 2.20.0 to 2.21.0 ([#6096](https://github.com/aws-powertools/powertools-lambda-python/issues/6096)) - **deps-dev:** bump mkdocs-material from 9.6.3 to 9.6.4 ([#6097](https://github.com/aws-powertools/powertools-lambda-python/issues/6097)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.178.2a0 to 2.179.0a0 ([#6127](https://github.com/aws-powertools/powertools-lambda-python/issues/6127)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.178.1a0 to 2.178.2a0 ([#6095](https://github.com/aws-powertools/powertools-lambda-python/issues/6095)) - **deps-dev:** bump boto3-stubs from 1.36.17 to 1.36.19 ([#6093](https://github.com/aws-powertools/powertools-lambda-python/issues/6093)) - **deps-dev:** bump aws-cdk-lib from 2.178.2 to 2.179.0 ([#6130](https://github.com/aws-powertools/powertools-lambda-python/issues/6130)) - **deps-dev:** bump ruff from 0.9.6 to 0.9.7 ([#6138](https://github.com/aws-powertools/powertools-lambda-python/issues/6138)) - **deps-dev:** bump aws-cdk from 2.178.1 to 2.178.2 ([#6089](https://github.com/aws-powertools/powertools-lambda-python/issues/6089)) - **deps-dev:** bump mkdocs-material from 9.6.4 to 9.6.5 ([#6136](https://github.com/aws-powertools/powertools-lambda-python/issues/6136)) - **deps-dev:** 
bump boto3-stubs from 1.36.24 to 1.36.25 ([#6137](https://github.com/aws-powertools/powertools-lambda-python/issues/6137))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.179.0a0 to 2.180.0a0 ([#6145](https://github.com/aws-powertools/powertools-lambda-python/issues/6145))
- **deps-dev:** bump aws-cdk-lib from 2.179.0 to 2.180.0 ([#6148](https://github.com/aws-powertools/powertools-lambda-python/issues/6148))
- **deps-dev:** bump coverage from 7.6.11 to 7.6.12 ([#6080](https://github.com/aws-powertools/powertools-lambda-python/issues/6080))
- **deps-dev:** bump mkdocstrings-python from 1.14.6 to 1.15.0 ([#6079](https://github.com/aws-powertools/powertools-lambda-python/issues/6079))
- **deps-dev:** bump boto3-stubs from 1.36.16 to 1.36.17 ([#6078](https://github.com/aws-powertools/powertools-lambda-python/issues/6078))
- **deps-dev:** bump boto3-stubs from 1.36.25 to 1.36.26 ([#6146](https://github.com/aws-powertools/powertools-lambda-python/issues/6146))
- **docs:** enable sitemap generation ([#6103](https://github.com/aws-powertools/powertools-lambda-python/issues/6103))

## [v3.6.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.5.0...v3.6.0) - 2025-02-11

## Bug Fixes

- **docs:** typo in a service name in Event Handler ([#5944](https://github.com/aws-powertools/powertools-lambda-python/issues/5944))
- **logger:** child logger must respect log level ([#5950](https://github.com/aws-powertools/powertools-lambda-python/issues/5950))

## Code Refactoring

- **metrics:** Improve type annotations for metrics decorator ([#6000](https://github.com/aws-powertools/powertools-lambda-python/issues/6000))

## Documentation

- **api:** migrating the event handler utility to mkdocstrings ([#6023](https://github.com/aws-powertools/powertools-lambda-python/issues/6023))
- **api:** migrating the metrics utility to mkdocstrings ([#6022](https://github.com/aws-powertools/powertools-lambda-python/issues/6022))
- **api:** migrating the logger utility to mkdocstrings ([#6021](https://github.com/aws-powertools/powertools-lambda-python/issues/6021))
- **api:** migrating the Middleware Factory utility to mkdocstrings ([#6019](https://github.com/aws-powertools/powertools-lambda-python/issues/6019))
- **api:** migrating the tracer utility to mkdocstrings ([#6017](https://github.com/aws-powertools/powertools-lambda-python/issues/6017))
- **api:** migrating the batch utility to mkdocstrings ([#6016](https://github.com/aws-powertools/powertools-lambda-python/issues/6016))
- **api:** migrating the event source data classes utility to mkdocstrings ([#6015](https://github.com/aws-powertools/powertools-lambda-python/issues/6015))
- **api:** migrating the data masking utility to mkdocstrings ([#6013](https://github.com/aws-powertools/powertools-lambda-python/issues/6013))
- **api:** migrating the AppConfig utility to mkdocstrings ([#6008](https://github.com/aws-powertools/powertools-lambda-python/issues/6008))
- **api:** migrating the idempotency utility to mkdocstrings ([#6007](https://github.com/aws-powertools/powertools-lambda-python/issues/6007))
- **api:** migrating the jmespath utility to mkdocstrings ([#6006](https://github.com/aws-powertools/powertools-lambda-python/issues/6006))
- **api:** migrating the parameters utility to mkdocstrings ([#6005](https://github.com/aws-powertools/powertools-lambda-python/issues/6005))
- **api:** migrating the parser utility to mkdocstrings ([#6004](https://github.com/aws-powertools/powertools-lambda-python/issues/6004))
- **api:** migrating the streaming utility to mkdocstrings ([#6003](https://github.com/aws-powertools/powertools-lambda-python/issues/6003))
- **api:** migrating the typing utility to mkdocstrings ([#5996](https://github.com/aws-powertools/powertools-lambda-python/issues/5996))
- **api:** migrating the validation utility to mkdocstrings ([#5972](https://github.com/aws-powertools/powertools-lambda-python/issues/5972))
- **layer:** update layer version number - v3.5.0 ([#5952](https://github.com/aws-powertools/powertools-lambda-python/issues/5952))

## Features

- **data-masking:** add custom mask functionalities ([#5837](https://github.com/aws-powertools/powertools-lambda-python/issues/5837))
- **event_source:** add class APIGatewayAuthorizerResponseWebSocket ([#6058](https://github.com/aws-powertools/powertools-lambda-python/issues/6058))
- **logger:** add clear_state method ([#5956](https://github.com/aws-powertools/powertools-lambda-python/issues/5956))
- **metrics:** disable metrics flush via environment variables ([#6046](https://github.com/aws-powertools/powertools-lambda-python/issues/6046))
- **openapi:** enhance support for tuple return type validation ([#5997](https://github.com/aws-powertools/powertools-lambda-python/issues/5997))

## Maintenance

- version bump
- **ci:** new pre-release 3.5.1a9 ([#6069](https://github.com/aws-powertools/powertools-lambda-python/issues/6069))
- **ci:** new pre-release 3.5.1a0 ([#5945](https://github.com/aws-powertools/powertools-lambda-python/issues/5945))
- **ci:** new pre-release 3.5.1a1 ([#5954](https://github.com/aws-powertools/powertools-lambda-python/issues/5954))
- **ci:** new pre-release 3.5.1a8 ([#6061](https://github.com/aws-powertools/powertools-lambda-python/issues/6061))
- **ci:** install & configure mkdocstrings plugin ([#5959](https://github.com/aws-powertools/powertools-lambda-python/issues/5959))
- **ci:** new pre-release 3.5.1a2 ([#5970](https://github.com/aws-powertools/powertools-lambda-python/issues/5970))
- **ci:** new pre-release 3.5.1a3 ([#5998](https://github.com/aws-powertools/powertools-lambda-python/issues/5998))
- **ci:** new pre-release 3.5.1a7 ([#6044](https://github.com/aws-powertools/powertools-lambda-python/issues/6044))
- **ci:** new pre-release 3.5.1a4 ([#6018](https://github.com/aws-powertools/powertools-lambda-python/issues/6018))
- **ci:** remove pdoc3 library ([#6024](https://github.com/aws-powertools/powertools-lambda-python/issues/6024))
- **ci:** new pre-release 3.5.1a5 ([#6026](https://github.com/aws-powertools/powertools-lambda-python/issues/6026))
- **ci:** add new script to bump Lambda layer version ([#6001](https://github.com/aws-powertools/powertools-lambda-python/issues/6001))
- **ci:** new pre-release 3.5.1a6 ([#6033](https://github.com/aws-powertools/powertools-lambda-python/issues/6033))
- **deps:** bump squidfunk/mkdocs-material from `471695f` to `7e841df` in /docs ([#6012](https://github.com/aws-powertools/powertools-lambda-python/issues/6012))
- **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.20 to 3.0.21 ([#6064](https://github.com/aws-powertools/powertools-lambda-python/issues/6064))
- **deps:** bump actions/setup-python from 5.3.0 to 5.4.0 ([#5960](https://github.com/aws-powertools/powertools-lambda-python/issues/5960))
- **deps:** bump docker/setup-qemu-action from 3.2.0 to 3.3.0 ([#5961](https://github.com/aws-powertools/powertools-lambda-python/issues/5961))
- **deps:** bump codecov/codecov-action from 5.1.2 to 5.3.1
([#5964](https://github.com/aws-powertools/powertools-lambda-python/issues/5964)) - **deps:** bump squidfunk/mkdocs-material from `7e841df` to `c62453b` in /docs ([#6052](https://github.com/aws-powertools/powertools-lambda-python/issues/6052)) - **deps:** bump actions/setup-node from 4.1.0 to 4.2.0 ([#5963](https://github.com/aws-powertools/powertools-lambda-python/issues/5963)) - **deps:** bump actions/upload-artifact from 4.5.0 to 4.6.0 ([#5962](https://github.com/aws-powertools/powertools-lambda-python/issues/5962)) - **deps:** bump release-drafter/release-drafter from 6.0.0 to 6.1.0 ([#5976](https://github.com/aws-powertools/powertools-lambda-python/issues/5976)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.18 to 3.0.20 ([#5977](https://github.com/aws-powertools/powertools-lambda-python/issues/5977)) - **deps:** bump pypa/gh-action-pypi-publish from 1.12.3 to 1.12.4 ([#5980](https://github.com/aws-powertools/powertools-lambda-python/issues/5980)) - **deps:** bump docker/setup-buildx-action from 3.8.0 to 3.9.0 ([#6042](https://github.com/aws-powertools/powertools-lambda-python/issues/6042)) - **deps:** bump docker/setup-qemu-action from 3.3.0 to 3.4.0 ([#6043](https://github.com/aws-powertools/powertools-lambda-python/issues/6043)) - **deps:** bump aws-actions/configure-aws-credentials from 4.0.2 to 4.0.3 ([#5975](https://github.com/aws-powertools/powertools-lambda-python/issues/5975)) - **deps:** bump squidfunk/mkdocs-material from `41942f7` to `471695f` in /docs ([#5979](https://github.com/aws-powertools/powertools-lambda-python/issues/5979)) - **deps:** bump actions/setup-go from 5.2.0 to 5.3.0 ([#5978](https://github.com/aws-powertools/powertools-lambda-python/issues/5978)) - **deps-dev:** bump aws-cdk from 2.178.0 to 2.178.1 ([#6053](https://github.com/aws-powertools/powertools-lambda-python/issues/6053)) - **deps-dev:** bump mkdocstrings-python from 1.13.0 to 1.14.2 ([#6011](https://github.com/aws-powertools/powertools-lambda-python/issues/6011)) - **deps-dev:** bump mkdocs-material from 9.6.1 to 9.6.2 ([#6009](https://github.com/aws-powertools/powertools-lambda-python/issues/6009)) - **deps-dev:** bump aws-cdk-lib from 2.178.0 to 2.178.1 ([#6047](https://github.com/aws-powertools/powertools-lambda-python/issues/6047)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.178.0a0 to 2.178.1a0 ([#6048](https://github.com/aws-powertools/powertools-lambda-python/issues/6048)) - **deps-dev:** bump boto3-stubs from 1.36.14 to 1.36.15 ([#6049](https://github.com/aws-powertools/powertools-lambda-python/issues/6049)) - **deps-dev:** bump boto3-stubs from 1.36.10 to 1.36.11 ([#6010](https://github.com/aws-powertools/powertools-lambda-python/issues/6010)) - **deps-dev:** bump boto3-stubs from 1.36.10 to 1.36.12 ([#6014](https://github.com/aws-powertools/powertools-lambda-python/issues/6014)) - **deps-dev:** bump ruff from 0.9.5 to 0.9.6 ([#6066](https://github.com/aws-powertools/powertools-lambda-python/issues/6066)) - **deps-dev:** bump mkdocstrings-python from 1.14.2 to 1.14.4 ([#6025](https://github.com/aws-powertools/powertools-lambda-python/issues/6025)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.177.0a0 to 2.178.0a0 ([#6041](https://github.com/aws-powertools/powertools-lambda-python/issues/6041)) - **deps-dev:** bump mkdocs-material from 9.5.50 to 9.6.1 ([#5966](https://github.com/aws-powertools/powertools-lambda-python/issues/5966)) - **deps-dev:** bump black from 24.10.0 to 25.1.0 
([#5968](https://github.com/aws-powertools/powertools-lambda-python/issues/5968))
- **deps-dev:** bump ruff from 0.9.3 to 0.9.4 ([#5969](https://github.com/aws-powertools/powertools-lambda-python/issues/5969))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.291 to 0.1.292 ([#6051](https://github.com/aws-powertools/powertools-lambda-python/issues/6051))
- **deps-dev:** bump cfn-lint from 1.22.7 to 1.23.1 ([#5967](https://github.com/aws-powertools/powertools-lambda-python/issues/5967))
- **deps-dev:** bump mkdocstrings-python from 1.14.5 to 1.14.6 ([#6050](https://github.com/aws-powertools/powertools-lambda-python/issues/6050))
- **deps-dev:** bump isort from 5.13.2 to 6.0.0 ([#5965](https://github.com/aws-powertools/powertools-lambda-python/issues/5965))
- **deps-dev:** bump ruff from 0.9.4 to 0.9.5 ([#6039](https://github.com/aws-powertools/powertools-lambda-python/issues/6039))
- **deps-dev:** bump aws-cdk-lib from 2.177.0 to 2.178.0 ([#6038](https://github.com/aws-powertools/powertools-lambda-python/issues/6038))
- **deps-dev:** bump mypy from 1.14.1 to 1.15.0 ([#6028](https://github.com/aws-powertools/powertools-lambda-python/issues/6028))
- **deps-dev:** bump mkdocstrings-python from 1.14.4 to 1.14.5 ([#6032](https://github.com/aws-powertools/powertools-lambda-python/issues/6032))
- **deps-dev:** bump cfn-lint from 1.23.1 to 1.24.0 ([#6030](https://github.com/aws-powertools/powertools-lambda-python/issues/6030))
- **deps-dev:** bump boto3-stubs from 1.36.14 to 1.36.16 ([#6057](https://github.com/aws-powertools/powertools-lambda-python/issues/6057))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.290 to 0.1.291 ([#6031](https://github.com/aws-powertools/powertools-lambda-python/issues/6031))
- **deps-dev:** bump boto3-stubs from 1.36.12 to 1.36.14 ([#6029](https://github.com/aws-powertools/powertools-lambda-python/issues/6029))
- **deps-dev:** bump mkdocs-material from 9.6.2 to 9.6.3 ([#6065](https://github.com/aws-powertools/powertools-lambda-python/issues/6065))
- **deps-dev:** bump coverage from 7.6.10 to 7.6.11 ([#6067](https://github.com/aws-powertools/powertools-lambda-python/issues/6067))
- **deps-dev:** bump aws-cdk from 2.177.0 to 2.178.0 ([#6040](https://github.com/aws-powertools/powertools-lambda-python/issues/6040))
- **docs:** enable privacy plugin in docs ([#6036](https://github.com/aws-powertools/powertools-lambda-python/issues/6036))

## [v3.5.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.4.1...v3.5.0) - 2025-01-28

## Bug Fixes

- **event_handler:** fixes typo in variable name `fronzen_openapi_extensions` ([#5929](https://github.com/aws-powertools/powertools-lambda-python/issues/5929))
- **event_handler:** add tests for PEP 563 compatibility with OpenAPI ([#5886](https://github.com/aws-powertools/powertools-lambda-python/issues/5886))
- **event_handler:** fix forward references resolution in OpenAPI ([#5885](https://github.com/aws-powertools/powertools-lambda-python/issues/5885))
- **parser:** make identitySource optional for ApiGatewayAuthorizerRequestV2 model ([#5880](https://github.com/aws-powertools/powertools-lambda-python/issues/5880))

## Documentation

- **data_classes:** improve Event Source Data Classes documentation ([#5916](https://github.com/aws-powertools/powertools-lambda-python/issues/5916))
- **event_handler:** demonstrate handling optional security routes ([#5895](https://github.com/aws-powertools/powertools-lambda-python/issues/5895))
- **layer:** update layer version number - v3.4.1 ([#5869](https://github.com/aws-powertools/powertools-lambda-python/issues/5869))
- **parser:** improve documentation with Pydantic best practices ([#5925](https://github.com/aws-powertools/powertools-lambda-python/issues/5925))

## Features

- **event_source:** add AWS Transfer Family classes ([#5912](https://github.com/aws-powertools/powertools-lambda-python/issues/5912))
- **idempotency:** add support for custom Idempotency key prefix ([#5898](https://github.com/aws-powertools/powertools-lambda-python/issues/5898))
- **logger:** add context manager for logger keys ([#5883](https://github.com/aws-powertools/powertools-lambda-python/issues/5883))
- **parser:** add AWS Transfer Family model ([#5906](https://github.com/aws-powertools/powertools-lambda-python/issues/5906))

## Maintenance

- version bump
- **ci:** adding poetry export plugin to support v2 ([#5941](https://github.com/aws-powertools/powertools-lambda-python/issues/5941))
- **ci:** adding poetry export plugin to support v2 ([#5938](https://github.com/aws-powertools/powertools-lambda-python/issues/5938))
- **ci:** adjust token permission ([#5867](https://github.com/aws-powertools/powertools-lambda-python/issues/5867))
- **ci:** new pre-release 3.4.2a0 ([#5873](https://github.com/aws-powertools/powertools-lambda-python/issues/5873))
- **ci:** make `pyproject.toml` fully compatible with Poetry v2 ([#5902](https://github.com/aws-powertools/powertools-lambda-python/issues/5902))
- **ci:** drop support for Python 3.8 ([#5896](https://github.com/aws-powertools/powertools-lambda-python/issues/5896))
- **ci:** update poetry version to v2 ([#5936](https://github.com/aws-powertools/powertools-lambda-python/issues/5936))
- **ci:** fix permissions for gh pages ([#5866](https://github.com/aws-powertools/powertools-lambda-python/issues/5866))
- **deps:** bump pydantic from 2.10.5 to 2.10.6 ([#5918](https://github.com/aws-powertools/powertools-lambda-python/issues/5918))
- **deps:** bump squidfunk/mkdocs-material from `ba73db5` to `41942f7` in /docs ([#5890](https://github.com/aws-powertools/powertools-lambda-python/issues/5890))
- **deps-dev:** bump boto3-stubs from 1.36.4 to 1.36.5 ([#5919](https://github.com/aws-powertools/powertools-lambda-python/issues/5919))
- **deps-dev:** bump boto3-stubs from 1.36.4 to 1.36.6 ([#5923](https://github.com/aws-powertools/powertools-lambda-python/issues/5923))
- **deps-dev:** bump cfn-lint from 1.22.6 to 1.22.7 ([#5910](https://github.com/aws-powertools/powertools-lambda-python/issues/5910))
- **deps-dev:** bump testcontainers from 3.7.1 to 4.9.1 ([#5907](https://github.com/aws-powertools/powertools-lambda-python/issues/5907))
- **deps-dev:** bump pytest-benchmark from 4.0.0 to 5.1.0 ([#5909](https://github.com/aws-powertools/powertools-lambda-python/issues/5909))
- **deps-dev:** bump aws-cdk from 2.176.0 to 2.177.0 ([#5930](https://github.com/aws-powertools/powertools-lambda-python/issues/5930))
- **deps-dev:** bump pytest-cov from 5.0.0 to 6.0.0 ([#5908](https://github.com/aws-powertools/powertools-lambda-python/issues/5908))
- **deps-dev:** bump aws-cdk-lib from 2.176.0 to 2.177.0 ([#5931](https://github.com/aws-powertools/powertools-lambda-python/issues/5931))
- **deps-dev:** bump cfn-lint from 1.22.5 to 1.22.6 ([#5900](https://github.com/aws-powertools/powertools-lambda-python/issues/5900))
- **deps-dev:** bump boto3-stubs from 1.36.6 to 1.36.7 ([#5932](https://github.com/aws-powertools/powertools-lambda-python/issues/5932))
- **deps-dev:** bump boto3-stubs from 1.36.2 to 1.36.3 ([#5894](https://github.com/aws-powertools/powertools-lambda-python/issues/5894))
- **deps-dev:** bump pytest-asyncio from 0.24.0 to 0.25.2 ([#5920](https://github.com/aws-powertools/powertools-lambda-python/issues/5920))
- **deps-dev:** bump mkdocs-material from 9.5.49 to 9.5.50 ([#5889](https://github.com/aws-powertools/powertools-lambda-python/issues/5889))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.175.1a0 to 2.176.0a0 ([#5882](https://github.com/aws-powertools/powertools-lambda-python/issues/5882))
- **deps-dev:** bump boto3-stubs from 1.36.1 to 1.36.2 ([#5881](https://github.com/aws-powertools/powertools-lambda-python/issues/5881))
- **deps-dev:** bump aws-cdk from 2.175.1 to 2.176.0 ([#5878](https://github.com/aws-powertools/powertools-lambda-python/issues/5878))
- **deps-dev:** bump ruff from 0.9.1 to 0.9.2 ([#5877](https://github.com/aws-powertools/powertools-lambda-python/issues/5877))
- **deps-dev:** bump aws-cdk-lib from 2.175.1 to 2.176.0 ([#5876](https://github.com/aws-powertools/powertools-lambda-python/issues/5876))
- **deps-dev:** bump mypy-boto3-appconfigdata from 1.35.93 to 1.36.0 in the boto-typing group ([#5875](https://github.com/aws-powertools/powertools-lambda-python/issues/5875))
- **deps-dev:** bump sentry-sdk from 2.19.2 to 2.20.0 ([#5870](https://github.com/aws-powertools/powertools-lambda-python/issues/5870))
- **deps-dev:** bump boto3-stubs from 1.35.97 to 1.35.99 ([#5874](https://github.com/aws-powertools/powertools-lambda-python/issues/5874))
- **deps-dev:** bump cfn-lint from 1.22.4 to 1.22.5 ([#5872](https://github.com/aws-powertools/powertools-lambda-python/issues/5872))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.176.0a0 to 2.177.0a0 ([#5933](https://github.com/aws-powertools/powertools-lambda-python/issues/5933))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.289 to 0.1.290 ([#5917](https://github.com/aws-powertools/powertools-lambda-python/issues/5917))
- **deps-dev:** bump ruff from 0.9.2 to 0.9.3 ([#5911](https://github.com/aws-powertools/powertools-lambda-python/issues/5911))

## [v3.4.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.4.0...v3.4.1) - 2025-01-14

## Bug Fixes

- **appsync:** enhance consistency for custom resolver field naming in AppSync ([#5801](https://github.com/aws-powertools/powertools-lambda-python/issues/5801))
- **idempotency:** add support for Optional type when serializing output ([#5590](https://github.com/aws-powertools/powertools-lambda-python/issues/5590))

## Documentation

- **community:** data masking blog post ([#5831](https://github.com/aws-powertools/powertools-lambda-python/issues/5831))
- **home:** fix date typo and shorten message ([#5798](https://github.com/aws-powertools/powertools-lambda-python/issues/5798))
- **layer:** update layer version number - v3.4.0 ([#5785](https://github.com/aws-powertools/powertools-lambda-python/issues/5785))

## Maintenance

- version bump
- **ci:** new pre-release 3.4.1a7 ([#5816](https://github.com/aws-powertools/powertools-lambda-python/issues/5816))
- **ci:** new pre-release 3.4.1a0 ([#5783](https://github.com/aws-powertools/powertools-lambda-python/issues/5783))
- **ci:** change token permissions ([#5862](https://github.com/aws-powertools/powertools-lambda-python/issues/5862))
- **ci:** change token permissions / update aws-credentials action ([#5861](https://github.com/aws-powertools/powertools-lambda-python/issues/5861))
- **ci:** fix dependency resolution ([#5859](https://github.com/aws-powertools/powertools-lambda-python/issues/5859))
- **ci:** fix dependency resolution ([#5858](https://github.com/aws-powertools/powertools-lambda-python/issues/5858))
- **ci:** change token permissions ([#5865](https://github.com/aws-powertools/powertools-lambda-python/issues/5865))
- **ci:** new pre-release 3.4.1a1 ([#5789](https://github.com/aws-powertools/powertools-lambda-python/issues/5789))
- **ci:** new pre-release 3.4.1a2 ([#5791](https://github.com/aws-powertools/powertools-lambda-python/issues/5791))
- **ci:** new pre-release 3.4.1a3 ([#5794](https://github.com/aws-powertools/powertools-lambda-python/issues/5794))
- **ci:** new pre-release 3.4.1a10 ([#5845](https://github.com/aws-powertools/powertools-lambda-python/issues/5845))
- **ci:** new pre-release 3.4.1a4 ([#5796](https://github.com/aws-powertools/powertools-lambda-python/issues/5796))
- **ci:** new pre-release 3.4.1a5 ([#5807](https://github.com/aws-powertools/powertools-lambda-python/issues/5807))
- **ci:** new pre-release 3.4.1a8 ([#5818](https://github.com/aws-powertools/powertools-lambda-python/issues/5818))
- **ci:** new pre-release 3.4.1a6 ([#5813](https://github.com/aws-powertools/powertools-lambda-python/issues/5813))
- **ci:** new pre-release 3.4.1a9 ([#5822](https://github.com/aws-powertools/powertools-lambda-python/issues/5822))
- **deps:** bump pydantic from 2.10.4 to 2.10.5 ([#5848](https://github.com/aws-powertools/powertools-lambda-python/issues/5848))
- **deps:** bump jinja2 from 3.1.4 to 3.1.5 in /docs ([#5787](https://github.com/aws-powertools/powertools-lambda-python/issues/5787))
- **deps:** bump pydantic-settings from 2.7.0 to 2.7.1 ([#5815](https://github.com/aws-powertools/powertools-lambda-python/issues/5815))
- **deps-dev:** bump ruff from 0.8.4 to 0.8.6 ([#5833](https://github.com/aws-powertools/powertools-lambda-python/issues/5833))
- **deps-dev:** bump boto3-stubs from 1.35.90 to 1.35.92 ([#5827](https://github.com/aws-powertools/powertools-lambda-python/issues/5827))
- **deps-dev:** bump aws-cdk from 2.173.4 to 2.174.0 ([#5832](https://github.com/aws-powertools/powertools-lambda-python/issues/5832))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.173.2a0 to 2.173.4a0 ([#5811](https://github.com/aws-powertools/powertools-lambda-python/issues/5811))
- **deps-dev:** bump cfn-lint from 1.22.2 to 1.22.3 ([#5810](https://github.com/aws-powertools/powertools-lambda-python/issues/5810))
- **deps-dev:** bump boto3-stubs from 1.35.89 to 1.35.90 ([#5809](https://github.com/aws-powertools/powertools-lambda-python/issues/5809))
- **deps-dev:** bump mypy from 1.14.0 to 1.14.1 ([#5812](https://github.com/aws-powertools/powertools-lambda-python/issues/5812))
- **deps-dev:** bump boto3-stubs from 1.35.92 to 1.35.93 ([#5835](https://github.com/aws-powertools/powertools-lambda-python/issues/5835))
- **deps-dev:** bump aws-cdk-lib from 2.173.4 to 2.174.1 ([#5838](https://github.com/aws-powertools/powertools-lambda-python/issues/5838))
- **deps-dev:** bump mypy-boto3-appconfigdata from 1.35.0 to 1.35.93 in the boto-typing group ([#5840](https://github.com/aws-powertools/powertools-lambda-python/issues/5840))
- **deps-dev:** bump aws-cdk-lib from 2.173.2 to 2.173.4 ([#5803](https://github.com/aws-powertools/powertools-lambda-python/issues/5803))
- **deps-dev:** bump aws-cdk from 2.173.2 to 2.173.4 ([#5802](https://github.com/aws-powertools/powertools-lambda-python/issues/5802))
- **deps-dev:** bump boto3-stubs from 1.35.87 to 1.35.89 ([#5804](https://github.com/aws-powertools/powertools-lambda-python/issues/5804))
- **deps-dev:** bump jinja2 from 3.1.4 to 3.1.5 ([#5788](https://github.com/aws-powertools/powertools-lambda-python/issues/5788))
- **deps-dev:** bump aws-cdk from 2.174.0 to 2.174.1 ([#5841](https://github.com/aws-powertools/powertools-lambda-python/issues/5841))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.173.4a0 to 2.174.1a0 ([#5842](https://github.com/aws-powertools/powertools-lambda-python/issues/5842))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.287 to 0.1.288 ([#5793](https://github.com/aws-powertools/powertools-lambda-python/issues/5793))
- **deps-dev:** bump boto3-stubs from 1.35.93 to 1.35.94 ([#5844](https://github.com/aws-powertools/powertools-lambda-python/issues/5844))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.288 to 0.1.289 ([#5843](https://github.com/aws-powertools/powertools-lambda-python/issues/5843))
- **deps-dev:** bump boto3-stubs from 1.35.94 to 1.35.95 ([#5847](https://github.com/aws-powertools/powertools-lambda-python/issues/5847))
- **deps-dev:** bump cfn-lint from 1.22.3 to 1.22.4 ([#5849](https://github.com/aws-powertools/powertools-lambda-python/issues/5849))
- **deps-dev:** bump boto3-stubs from 1.35.95 to 1.35.96 ([#5850](https://github.com/aws-powertools/powertools-lambda-python/issues/5850))
- **deps-dev:** bump boto3-stubs from 1.35.96 to 1.35.97 ([#5852](https://github.com/aws-powertools/powertools-lambda-python/issues/5852))
- **deps-dev:** bump boto3-stubs from 1.35.86 to 1.35.87 ([#5786](https://github.com/aws-powertools/powertools-lambda-python/issues/5786))
- **deps-dev:** bump aws-cdk from 2.174.1 to 2.175.0 ([#5854](https://github.com/aws-powertools/powertools-lambda-python/issues/5854))
- **deps-dev:** bump aws-cdk from 2.175.0 to 2.175.1 ([#5863](https://github.com/aws-powertools/powertools-lambda-python/issues/5863))
- **deps-dev:** bump boto3-stubs from 1.35.85 to 1.35.86 ([#5780](https://github.com/aws-powertools/powertools-lambda-python/issues/5780))
- **deps-dev:** bump mypy from 1.13.0 to 1.14.0 ([#5779](https://github.com/aws-powertools/powertools-lambda-python/issues/5779))
- **deps-dev:** bump ruff from 0.8.6 to 0.9.1 ([#5853](https://github.com/aws-powertools/powertools-lambda-python/issues/5853))
- **deps-dev:** bump aws-cdk-lib from 2.174.1 to 2.175.1 ([#5856](https://github.com/aws-powertools/powertools-lambda-python/issues/5856))

## [v3.4.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.3.0...v3.4.0) - 2024-12-20

## Bug Fixes

- **ci:** add overwrite to SSM workflow ([#5775](https://github.com/aws-powertools/powertools-lambda-python/issues/5775))
- **docs:** typo in homepage extra dependencies command ([#5681](https://github.com/aws-powertools/powertools-lambda-python/issues/5681))
- **openapi:** Allow values of any type in the examples of the Schema Object ([#5575](https://github.com/aws-powertools/powertools-lambda-python/issues/5575))
- **parser:** remove AttributeError validation from event_parser function ([#5742](https://github.com/aws-powertools/powertools-lambda-python/issues/5742))
- **parser:** remove 'aws:' prefix from SelfManagedKafka model ([#5584](https://github.com/aws-powertools/powertools-lambda-python/issues/5584))

## Code Refactoring

- **event_handler:** add type annotations for router decorators ([#5601](https://github.com/aws-powertools/powertools-lambda-python/issues/5601))
- **event_handler:** add type annotations for `resolve` function ([#5602](https://github.com/aws-powertools/powertools-lambda-python/issues/5602))

## Documentation

- **layer:** update layer version number - v3.3.0 ([#5562](https://github.com/aws-powertools/powertools-lambda-python/issues/5562))

## Features

- **event_handler:** mark API operation as deprecated for OpenAPI documentation ([#5732](https://github.com/aws-powertools/powertools-lambda-python/issues/5732))
- **event_handler:** add exception handling mechanism for AppSyncResolver ([#5588](https://github.com/aws-powertools/powertools-lambda-python/issues/5588))
- **event_source:** Extend CodePipeline Artifact Capabilities ([#5448](https://github.com/aws-powertools/powertools-lambda-python/issues/5448))
- **layer:** add new ap-southeast-5 region ([#5769](https://github.com/aws-powertools/powertools-lambda-python/issues/5769))
- **metrics:** warn when overwriting dimension ([#5653](https://github.com/aws-powertools/powertools-lambda-python/issues/5653))
- **parser:** add models for API GW Websockets events ([#5597](https://github.com/aws-powertools/powertools-lambda-python/issues/5597))
- **ssm:** Parameters for resolving to versioned layers ([#5754](https://github.com/aws-powertools/powertools-lambda-python/issues/5754))

## Maintenance

- version bump
- **ci:** new pre-release 3.3.1a14 ([#5713](https://github.com/aws-powertools/powertools-lambda-python/issues/5713))
- **ci:** new pre-release 3.3.1a21 ([#5773](https://github.com/aws-powertools/powertools-lambda-python/issues/5773))
- **ci:** new pre-release 3.3.1a0 ([#5565](https://github.com/aws-powertools/powertools-lambda-python/issues/5565))
- **ci:** new pre-release 3.3.1a1 ([#5577](https://github.com/aws-powertools/powertools-lambda-python/issues/5577))
- **ci:** disable dry run in layer balancing workflow ([#5768](https://github.com/aws-powertools/powertools-lambda-python/issues/5768))
- **ci:** new pre-release 3.3.1a20 ([#5766](https://github.com/aws-powertools/powertools-lambda-python/issues/5766))
- **ci:** new pre-release 3.3.1a10 ([#5679](https://github.com/aws-powertools/powertools-lambda-python/issues/5679))
- **ci:** add workflow to balance layers per region ([#5752](https://github.com/aws-powertools/powertools-lambda-python/issues/5752))
- **ci:** new pre-release 3.3.1a9 ([#5668](https://github.com/aws-powertools/powertools-lambda-python/issues/5668))
- **ci:** new pre-release 3.3.1a19 ([#5757](https://github.com/aws-powertools/powertools-lambda-python/issues/5757))
- **ci:** new pre-release 3.3.1a8 ([#5663](https://github.com/aws-powertools/powertools-lambda-python/issues/5663))
- **ci:** adding missing region in matrix ([#5777](https://github.com/aws-powertools/powertools-lambda-python/issues/5777))
- **ci:** new pre-release 3.3.1a2
([#5585](https://github.com/aws-powertools/powertools-lambda-python/issues/5585)) - **ci:** new pre-release 3.3.1a11 ([#5688](https://github.com/aws-powertools/powertools-lambda-python/issues/5688)) - **ci:** new pre-release 3.3.1a3 ([#5598](https://github.com/aws-powertools/powertools-lambda-python/issues/5598)) - **ci:** new pre-release 3.3.1a7 ([#5656](https://github.com/aws-powertools/powertools-lambda-python/issues/5656)) - **ci:** new pre-release 3.3.1a6 ([#5650](https://github.com/aws-powertools/powertools-lambda-python/issues/5650)) - **ci:** new pre-release 3.3.1a12 ([#5697](https://github.com/aws-powertools/powertools-lambda-python/issues/5697)) - **ci:** new pre-release 3.3.1a18 ([#5739](https://github.com/aws-powertools/powertools-lambda-python/issues/5739)) - **ci:** replace closed-issue-message action with powertools action ([#5641](https://github.com/aws-powertools/powertools-lambda-python/issues/5641)) - **ci:** new pre-release 3.3.1a17 ([#5733](https://github.com/aws-powertools/powertools-lambda-python/issues/5733)) - **ci:** new pre-release 3.3.1a4 ([#5612](https://github.com/aws-powertools/powertools-lambda-python/issues/5612)) - **ci:** new pre-release 3.3.1a13 ([#5707](https://github.com/aws-powertools/powertools-lambda-python/issues/5707)) - **ci:** new pre-release 3.3.1a16 ([#5725](https://github.com/aws-powertools/powertools-lambda-python/issues/5725)) - **ci:** remove poetry cache in quality check pipeline ([#5626](https://github.com/aws-powertools/powertools-lambda-python/issues/5626)) - **ci:** revert closed issue action update ([#5637](https://github.com/aws-powertools/powertools-lambda-python/issues/5637)) - **ci:** new pre-release 3.3.1a15 ([#5720](https://github.com/aws-powertools/powertools-lambda-python/issues/5720)) - **ci:** new pre-release 3.3.1a5 ([#5639](https://github.com/aws-powertools/powertools-lambda-python/issues/5639)) - **deps:** bump squidfunk/mkdocs-material from `ef0b45e` to `d063d84` in /docs ([#5649](https://github.com/aws-powertools/powertools-lambda-python/issues/5649)) - **deps:** bump pydantic from 2.10.0 to 2.10.1 ([#5632](https://github.com/aws-powertools/powertools-lambda-python/issues/5632)) - **deps:** bump pypa/gh-action-pypi-publish from 1.12.2 to 1.12.3 ([#5709](https://github.com/aws-powertools/powertools-lambda-python/issues/5709)) - **deps:** bump codecov/codecov-action from 5.0.3 to 5.0.7 ([#5617](https://github.com/aws-powertools/powertools-lambda-python/issues/5617)) - **deps:** bump actions/dependency-review-action from 4.4.0 to 4.5.0 ([#5616](https://github.com/aws-powertools/powertools-lambda-python/issues/5616)) - **deps:** bump squidfunk/mkdocs-material from `ce587cb` to `ef0b45e` in /docs ([#5603](https://github.com/aws-powertools/powertools-lambda-python/issues/5603)) - **deps:** bump squidfunk/mkdocs-material from `d063d84` to `3f571e7` in /docs ([#5678](https://github.com/aws-powertools/powertools-lambda-python/issues/5678)) - **deps:** bump redis from 5.2.0 to 5.2.1 ([#5701](https://github.com/aws-powertools/powertools-lambda-python/issues/5701)) - **deps:** bump pydantic-settings from 2.6.1 to 2.7.0 ([#5735](https://github.com/aws-powertools/powertools-lambda-python/issues/5735)) - **deps:** bump aws-actions/closed-issue-message from 80edfc24bdf1283400eb04d20a8a605ae8bf7d48 to 37548691e7cc75ba58f85c9f873f9eee43590449 ([#5606](https://github.com/aws-powertools/powertools-lambda-python/issues/5606)) - **deps:** bump pydantic from 2.9.2 to 2.10.0 
([#5611](https://github.com/aws-powertools/powertools-lambda-python/issues/5611)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.17 to 3.0.18 ([#5743](https://github.com/aws-powertools/powertools-lambda-python/issues/5743)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.16 to 3.0.17 ([#5643](https://github.com/aws-powertools/powertools-lambda-python/issues/5643)) - **deps:** bump squidfunk/mkdocs-material from `3f571e7` to `d485eb6` in /docs ([#5710](https://github.com/aws-powertools/powertools-lambda-python/issues/5710)) - **deps:** bump codecov/codecov-action from 5.0.7 to 5.1.0 ([#5692](https://github.com/aws-powertools/powertools-lambda-python/issues/5692)) - **deps:** bump pydantic from 2.10.1 to 2.10.2 ([#5654](https://github.com/aws-powertools/powertools-lambda-python/issues/5654)) - **deps:** bump squidfunk/mkdocs-material from `d485eb6` to `ba73db5` in /docs ([#5746](https://github.com/aws-powertools/powertools-lambda-python/issues/5746)) - **deps:** bump docker/setup-buildx-action from 3.7.1 to 3.8.0 ([#5744](https://github.com/aws-powertools/powertools-lambda-python/issues/5744)) - **deps:** bump datadog-lambda from 6.101.0 to 6.102.0 ([#5570](https://github.com/aws-powertools/powertools-lambda-python/issues/5570)) - **deps:** bump pydantic from 2.10.2 to 2.10.3 ([#5682](https://github.com/aws-powertools/powertools-lambda-python/issues/5682)) - **deps:** bump aws-encryption-sdk from 3.3.0 to 4.0.0 ([#5564](https://github.com/aws-powertools/powertools-lambda-python/issues/5564)) - **deps:** bump pydantic from 2.10.3 to 2.10.4 ([#5760](https://github.com/aws-powertools/powertools-lambda-python/issues/5760)) - **deps:** bump actions/upload-artifact from 4.4.3 to 4.5.0 ([#5763](https://github.com/aws-powertools/powertools-lambda-python/issues/5763)) - **deps:** bump codecov/codecov-action from 5.1.1 to 5.1.2 ([#5764](https://github.com/aws-powertools/powertools-lambda-python/issues/5764)) - **deps:** bump codecov/codecov-action from 4.6.0 to 5.0.2 ([#5567](https://github.com/aws-powertools/powertools-lambda-python/issues/5567)) - **deps:** bump fastjsonschema from 2.20.0 to 2.21.1 ([#5676](https://github.com/aws-powertools/powertools-lambda-python/issues/5676)) - **deps:** bump datadog-lambda from 6.102.0 to 6.104.0 ([#5631](https://github.com/aws-powertools/powertools-lambda-python/issues/5631)) - **deps:** bump codecov/codecov-action from 5.1.0 to 5.1.1 ([#5703](https://github.com/aws-powertools/powertools-lambda-python/issues/5703)) - **deps:** bump codecov/codecov-action from 5.0.2 to 5.0.3 ([#5592](https://github.com/aws-powertools/powertools-lambda-python/issues/5592)) - **deps-dev:** bump httpx from 0.27.2 to 0.28.0 ([#5665](https://github.com/aws-powertools/powertools-lambda-python/issues/5665)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.171.0a0 to 2.171.1a0 ([#5666](https://github.com/aws-powertools/powertools-lambda-python/issues/5666)) - **deps-dev:** bump aws-cdk from 2.171.0 to 2.171.1 ([#5662](https://github.com/aws-powertools/powertools-lambda-python/issues/5662)) - **deps-dev:** bump aws-cdk-lib from 2.171.0 to 2.171.1 ([#5661](https://github.com/aws-powertools/powertools-lambda-python/issues/5661)) - **deps-dev:** bump boto3-stubs from 1.35.69 to 1.35.71 ([#5660](https://github.com/aws-powertools/powertools-lambda-python/issues/5660)) - **deps-dev:** bump cfn-lint from 1.20.0 to 1.20.1 ([#5659](https://github.com/aws-powertools/powertools-lambda-python/issues/5659)) - **deps-dev:** bump 
mkdocs-material from 9.5.46 to 9.5.47 ([#5677](https://github.com/aws-powertools/powertools-lambda-python/issues/5677)) - **deps-dev:** bump cfn-lint from 1.20.1 to 1.20.2 ([#5686](https://github.com/aws-powertools/powertools-lambda-python/issues/5686)) - **deps-dev:** bump boto3-stubs from 1.35.71 to 1.35.74 ([#5691](https://github.com/aws-powertools/powertools-lambda-python/issues/5691)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.284 to 0.1.285 ([#5642](https://github.com/aws-powertools/powertools-lambda-python/issues/5642)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.170.0a0 to 2.171.0a0 ([#5655](https://github.com/aws-powertools/powertools-lambda-python/issues/5655)) - **deps-dev:** bump ruff from 0.8.1 to 0.8.2 ([#5693](https://github.com/aws-powertools/powertools-lambda-python/issues/5693)) - **deps-dev:** bump pytest from 8.3.3 to 8.3.4 ([#5695](https://github.com/aws-powertools/powertools-lambda-python/issues/5695)) - **deps-dev:** bump mkdocs-material from 9.5.45 to 9.5.46 ([#5645](https://github.com/aws-powertools/powertools-lambda-python/issues/5645)) - **deps-dev:** bump sentry-sdk from 2.19.0 to 2.19.1 ([#5694](https://github.com/aws-powertools/powertools-lambda-python/issues/5694)) - **deps-dev:** bump aws-cdk-lib from 2.170.0 to 2.171.0 ([#5647](https://github.com/aws-powertools/powertools-lambda-python/issues/5647)) - **deps-dev:** bump aws-cdk from 2.170.0 to 2.171.0 ([#5648](https://github.com/aws-powertools/powertools-lambda-python/issues/5648)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.285 to 0.1.287 ([#5685](https://github.com/aws-powertools/powertools-lambda-python/issues/5685)) - **deps-dev:** bump boto3-stubs from 1.35.67 to 1.35.69 ([#5652](https://github.com/aws-powertools/powertools-lambda-python/issues/5652)) - **deps-dev:** bump sentry-sdk from 2.19.1 to 2.19.2 ([#5699](https://github.com/aws-powertools/powertools-lambda-python/issues/5699)) - **deps-dev:** bump ruff from 0.7.4 to 0.8.0 ([#5630](https://github.com/aws-powertools/powertools-lambda-python/issues/5630)) - **deps-dev:** bump types-python-dateutil from 2.9.0.20241003 to 2.9.0.20241206 ([#5700](https://github.com/aws-powertools/powertools-lambda-python/issues/5700)) - **deps-dev:** bump httpx from 0.28.0 to 0.28.1 ([#5702](https://github.com/aws-powertools/powertools-lambda-python/issues/5702)) - **deps-dev:** bump aws-cdk from 2.171.1 to 2.172.0 ([#5712](https://github.com/aws-powertools/powertools-lambda-python/issues/5712)) - **deps-dev:** bump cfn-lint from 1.20.2 to 1.21.0 ([#5711](https://github.com/aws-powertools/powertools-lambda-python/issues/5711)) - **deps-dev:** bump boto3-stubs from 1.35.76 to 1.35.77 ([#5716](https://github.com/aws-powertools/powertools-lambda-python/issues/5716)) - **deps-dev:** bump aws-cdk-lib from 2.171.1 to 2.172.0 ([#5719](https://github.com/aws-powertools/powertools-lambda-python/issues/5719)) - **deps-dev:** bump cfn-lint from 1.21.0 to 1.22.0 ([#5718](https://github.com/aws-powertools/powertools-lambda-python/issues/5718)) - **deps-dev:** bump aws-cdk from 2.169.0 to 2.170.0 ([#5628](https://github.com/aws-powertools/powertools-lambda-python/issues/5628)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.167.2a0 to 2.170.0a0 ([#5629](https://github.com/aws-powertools/powertools-lambda-python/issues/5629)) - **deps-dev:** bump boto3-stubs from 1.35.77 to 1.35.78 ([#5723](https://github.com/aws-powertools/powertools-lambda-python/issues/5723)) - **deps-dev:** bump sentry-sdk from 2.18.0 to 
2.19.0 ([#5633](https://github.com/aws-powertools/powertools-lambda-python/issues/5633)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.171.1a0 to 2.172.0a0 ([#5724](https://github.com/aws-powertools/powertools-lambda-python/issues/5724)) - **deps-dev:** bump aws-cdk from 2.172.0 to 2.173.0 ([#5727](https://github.com/aws-powertools/powertools-lambda-python/issues/5727)) - **deps-dev:** bump mkdocs-material from 9.5.44 to 9.5.45 ([#5610](https://github.com/aws-powertools/powertools-lambda-python/issues/5610)) - **deps-dev:** bump ruff from 0.8.2 to 0.8.3 ([#5728](https://github.com/aws-powertools/powertools-lambda-python/issues/5728)) - **deps-dev:** bump boto3-stubs from 1.35.64 to 1.35.67 ([#5621](https://github.com/aws-powertools/powertools-lambda-python/issues/5621)) - **deps-dev:** bump aws-cdk-lib from 2.167.2 to 2.170.0 ([#5622](https://github.com/aws-powertools/powertools-lambda-python/issues/5622)) - **deps-dev:** bump cfn-lint from 1.22.0 to 1.22.1 ([#5729](https://github.com/aws-powertools/powertools-lambda-python/issues/5729)) - **deps-dev:** bump aws-cdk from 2.167.2 to 2.169.0 ([#5618](https://github.com/aws-powertools/powertools-lambda-python/issues/5618)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.282 to 0.1.284 ([#5607](https://github.com/aws-powertools/powertools-lambda-python/issues/5607)) - **deps-dev:** bump boto3-stubs from 1.35.78 to 1.35.80 ([#5730](https://github.com/aws-powertools/powertools-lambda-python/issues/5730)) - **deps-dev:** bump aws-cdk-lib from 2.172.0 to 2.173.0 ([#5731](https://github.com/aws-powertools/powertools-lambda-python/issues/5731)) - **deps-dev:** bump mkdocs-material from 9.5.47 to 9.5.48 ([#5717](https://github.com/aws-powertools/powertools-lambda-python/issues/5717)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.172.0a0 to 2.173.0a0 ([#5736](https://github.com/aws-powertools/powertools-lambda-python/issues/5736)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.167.1a0 to 2.167.2a0 ([#5619](https://github.com/aws-powertools/powertools-lambda-python/issues/5619)) - **deps-dev:** bump boto3-stubs from 1.35.80 to 1.35.81 ([#5750](https://github.com/aws-powertools/powertools-lambda-python/issues/5750)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.281 to 0.1.282 ([#5594](https://github.com/aws-powertools/powertools-lambda-python/issues/5594)) - **deps-dev:** bump cfn-lint from 1.19.0 to 1.20.0 ([#5595](https://github.com/aws-powertools/powertools-lambda-python/issues/5595)) - **deps-dev:** bump aws-cdk from 2.167.1 to 2.167.2 ([#5593](https://github.com/aws-powertools/powertools-lambda-python/issues/5593)) - **deps-dev:** bump cfn-lint from 1.22.1 to 1.22.2 ([#5749](https://github.com/aws-powertools/powertools-lambda-python/issues/5749)) - **deps-dev:** bump aws-cdk-lib from 2.167.1 to 2.167.2 ([#5596](https://github.com/aws-powertools/powertools-lambda-python/issues/5596)) - **deps-dev:** bump aws-cdk from 2.173.0 to 2.173.1 ([#5745](https://github.com/aws-powertools/powertools-lambda-python/issues/5745)) - **deps-dev:** bump boto3-stubs from 1.35.63 to 1.35.64 ([#5582](https://github.com/aws-powertools/powertools-lambda-python/issues/5582)) - **deps-dev:** bump mkdocs-material from 9.5.48 to 9.5.49 ([#5748](https://github.com/aws-powertools/powertools-lambda-python/issues/5748)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.167.0a0 to 2.167.1a0 ([#5583](https://github.com/aws-powertools/powertools-lambda-python/issues/5583)) - **deps-dev:** 
bump aws-cdk-lib from 2.173.0 to 2.173.1 ([#5747](https://github.com/aws-powertools/powertools-lambda-python/issues/5747)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.173.0a0 to 2.173.1a0 ([#5755](https://github.com/aws-powertools/powertools-lambda-python/issues/5755)) - **deps-dev:** bump aws-cdk from 2.173.1 to 2.173.2 ([#5762](https://github.com/aws-powertools/powertools-lambda-python/issues/5762)) - **deps-dev:** bump boto3-stubs from 1.35.81 to 1.35.84 ([#5765](https://github.com/aws-powertools/powertools-lambda-python/issues/5765)) - **deps-dev:** bump boto3-stubs from 1.35.60 to 1.35.63 ([#5581](https://github.com/aws-powertools/powertools-lambda-python/issues/5581)) - **deps-dev:** bump ruff from 0.8.0 to 0.8.1 ([#5671](https://github.com/aws-powertools/powertools-lambda-python/issues/5671)) - **deps-dev:** bump aws-cdk from 2.167.0 to 2.167.1 ([#5572](https://github.com/aws-powertools/powertools-lambda-python/issues/5572)) - **deps-dev:** bump boto3-stubs from 1.35.84 to 1.35.85 ([#5770](https://github.com/aws-powertools/powertools-lambda-python/issues/5770)) - **deps-dev:** bump ruff from 0.7.3 to 0.7.4 ([#5569](https://github.com/aws-powertools/powertools-lambda-python/issues/5569)) - **deps-dev:** bump aws-cdk-lib from 2.167.0 to 2.167.1 ([#5568](https://github.com/aws-powertools/powertools-lambda-python/issues/5568)) - **deps-dev:** bump ruff from 0.8.3 to 0.8.4 ([#5772](https://github.com/aws-powertools/powertools-lambda-python/issues/5772)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.173.1a0 to 2.173.2a0 ([#5771](https://github.com/aws-powertools/powertools-lambda-python/issues/5771)) - **deps-dev:** bump aws-cdk-lib from 2.173.1 to 2.173.2 ([#5759](https://github.com/aws-powertools/powertools-lambda-python/issues/5759)) - **layers:** balance Python 3.13 layers in GovCloud partition ([#5579](https://github.com/aws-powertools/powertools-lambda-python/issues/5579)) ## [v3.3.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.2.0...v3.3.0) - 2024-11-14 ## Bug Fixes - **appsync:** make contextual data accessible for async functions ([#5317](https://github.com/aws-powertools/powertools-lambda-python/issues/5317)) - **ci:** Update output to something easily copy/pasteable ([#5435](https://github.com/aws-powertools/powertools-lambda-python/issues/5435)) - **ci:** remove space ([#5433](https://github.com/aws-powertools/powertools-lambda-python/issues/5433)) - **metrics:** add warning for invalid dimension values; prevent their addition to EMF blobs ([#5542](https://github.com/aws-powertools/powertools-lambda-python/issues/5542)) - **parameters:** fix force_fetch feature when working with get_parameters ([#5515](https://github.com/aws-powertools/powertools-lambda-python/issues/5515)) - **parser:** support TypeAdapter instances as models ([#5535](https://github.com/aws-powertools/powertools-lambda-python/issues/5535)) ## Documentation - **layer:** update layer version number - v3.2.0 ([#5426](https://github.com/aws-powertools/powertools-lambda-python/issues/5426)) - **parser:** change parser documentation ([#5262](https://github.com/aws-powertools/powertools-lambda-python/issues/5262)) ## Features - **event_handler:** mutualTLS Security Scheme for OpenAPI ([#5484](https://github.com/aws-powertools/powertools-lambda-python/issues/5484)) - **layers:** introduce new CDK Python constructor for Powertools Lambda Layer ([#5320](https://github.com/aws-powertools/powertools-lambda-python/issues/5320)) - **runtime:** add Python 3.13 support 
([#5527](https://github.com/aws-powertools/powertools-lambda-python/issues/5527)) ## Maintenance - version bump - **ci:** Bump CDK version to build layers and fix imports ([#5555](https://github.com/aws-powertools/powertools-lambda-python/issues/5555)) - **ci:** new pre-release 3.2.1a0 ([#5434](https://github.com/aws-powertools/powertools-lambda-python/issues/5434)) - **ci:** new pre-release 3.2.1a15 ([#5551](https://github.com/aws-powertools/powertools-lambda-python/issues/5551)) - **ci:** new pre-release 3.2.1a14 ([#5545](https://github.com/aws-powertools/powertools-lambda-python/issues/5545)) - **ci:** fix imports to build Lambda layer ([#5557](https://github.com/aws-powertools/powertools-lambda-python/issues/5557)) - **ci:** new pre-release 3.2.1a1 ([#5443](https://github.com/aws-powertools/powertools-lambda-python/issues/5443)) - **ci:** bump minimum required pydantic version ([#5446](https://github.com/aws-powertools/powertools-lambda-python/issues/5446)) - **ci:** new pre-release 3.2.1a2 ([#5456](https://github.com/aws-powertools/powertools-lambda-python/issues/5456)) - **ci:** new pre-release 3.2.1a12 ([#5524](https://github.com/aws-powertools/powertools-lambda-python/issues/5524)) - **ci:** new pre-release 3.2.1a3 ([#5465](https://github.com/aws-powertools/powertools-lambda-python/issues/5465)) - **ci:** new pre-release 3.2.1a4 ([#5470](https://github.com/aws-powertools/powertools-lambda-python/issues/5470)) - **ci:** new pre-release 3.2.1a5 ([#5473](https://github.com/aws-powertools/powertools-lambda-python/issues/5473)) - **ci:** new pre-release 3.2.1a11 ([#5517](https://github.com/aws-powertools/powertools-lambda-python/issues/5517)) - **ci:** new pre-release 3.2.1a6 ([#5480](https://github.com/aws-powertools/powertools-lambda-python/issues/5480)) - **ci:** new pre-release 3.2.1a7 ([#5488](https://github.com/aws-powertools/powertools-lambda-python/issues/5488)) - **ci:** new pre-release 3.2.1a10 ([#5509](https://github.com/aws-powertools/powertools-lambda-python/issues/5509)) - **ci:** new pre-release 3.2.1a8 ([#5497](https://github.com/aws-powertools/powertools-lambda-python/issues/5497)) - **ci:** new pre-release 3.2.1a9 ([#5504](https://github.com/aws-powertools/powertools-lambda-python/issues/5504)) - **ci:** new pre-release 3.2.1a13 ([#5537](https://github.com/aws-powertools/powertools-lambda-python/issues/5537)) - **deps:** bump pypa/gh-action-pypi-publish from 1.10.3 to 1.11.0 ([#5477](https://github.com/aws-powertools/powertools-lambda-python/issues/5477)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.15 to 3.0.16 ([#5499](https://github.com/aws-powertools/powertools-lambda-python/issues/5499)) - **deps:** bump actions/dependency-review-action from 4.3.4 to 4.3.5 ([#5431](https://github.com/aws-powertools/powertools-lambda-python/issues/5431)) - **deps:** bump actions/setup-python from 5.2.0 to 5.3.0 ([#5529](https://github.com/aws-powertools/powertools-lambda-python/issues/5529)) - **deps:** bump datadog-lambda from 6.99.0 to 6.100.0 ([#5491](https://github.com/aws-powertools/powertools-lambda-python/issues/5491)) - **deps:** bump actions/checkout from 4.2.1 to 4.2.2 ([#5438](https://github.com/aws-powertools/powertools-lambda-python/issues/5438)) - **deps:** bump actions/checkout from 4.2.0 to 4.2.2 ([#5531](https://github.com/aws-powertools/powertools-lambda-python/issues/5531)) - **deps:** bump actions/setup-node from 4.0.4 to 4.1.0 ([#5450](https://github.com/aws-powertools/powertools-lambda-python/issues/5450)) - **deps:** bump 
squidfunk/mkdocs-material from `2c2802b` to `ce587cb` in /docs ([#5507](https://github.com/aws-powertools/powertools-lambda-python/issues/5507)) - **deps:** bump actions/setup-python from 5.2.0 to 5.3.0 ([#5449](https://github.com/aws-powertools/powertools-lambda-python/issues/5449)) - **deps:** bump redis from 5.1.1 to 5.2.0 ([#5454](https://github.com/aws-powertools/powertools-lambda-python/issues/5454)) - **deps:** bump docker/setup-buildx-action from 2.4.1 to 3.7.1 ([#5530](https://github.com/aws-powertools/powertools-lambda-python/issues/5530)) - **deps:** bump squidfunk/mkdocs-material from `31eb7f7` to `2c2802b` in /docs ([#5487](https://github.com/aws-powertools/powertools-lambda-python/issues/5487)) - **deps:** bump docker/setup-qemu-action from 2.1.0 to 3.2.0 ([#5528](https://github.com/aws-powertools/powertools-lambda-python/issues/5528)) - **deps:** bump actions/dependency-review-action from 4.3.5 to 4.4.0 ([#5469](https://github.com/aws-powertools/powertools-lambda-python/issues/5469)) - **deps:** bump datadog-lambda from 6.100.0 to 6.101.0 ([#5513](https://github.com/aws-powertools/powertools-lambda-python/issues/5513)) - **deps:** bump pypa/gh-action-pypi-publish from 1.11.0 to 1.12.1 ([#5514](https://github.com/aws-powertools/powertools-lambda-python/issues/5514)) - **deps:** bump pypa/gh-action-pypi-publish from 1.12.1 to 1.12.2 ([#5519](https://github.com/aws-powertools/powertools-lambda-python/issues/5519)) - **deps-dev:** bump sentry-sdk from 2.17.0 to 2.18.0 ([#5502](https://github.com/aws-powertools/powertools-lambda-python/issues/5502)) - **deps-dev:** bump boto3-stubs from 1.35.51 to 1.35.52 ([#5478](https://github.com/aws-powertools/powertools-lambda-python/issues/5478)) - **deps-dev:** bump mkdocs-material from 9.5.43 to 9.5.44 ([#5506](https://github.com/aws-powertools/powertools-lambda-python/issues/5506)) - **deps-dev:** bump cfn-lint from 1.18.2 to 1.18.3 ([#5479](https://github.com/aws-powertools/powertools-lambda-python/issues/5479)) - **deps-dev:** bump boto3-stubs from 1.35.49 to 1.35.51 ([#5472](https://github.com/aws-powertools/powertools-lambda-python/issues/5472)) - **deps-dev:** bump aws-cdk from 2.165.0 to 2.166.0 ([#5520](https://github.com/aws-powertools/powertools-lambda-python/issues/5520)) - **deps-dev:** bump aws-cdk-lib from 2.165.0 to 2.166.0 ([#5522](https://github.com/aws-powertools/powertools-lambda-python/issues/5522)) - **deps-dev:** bump boto3-stubs from 1.35.52 to 1.35.53 ([#5485](https://github.com/aws-powertools/powertools-lambda-python/issues/5485)) - **deps-dev:** bump cfn-lint from 1.18.1 to 1.18.2 ([#5468](https://github.com/aws-powertools/powertools-lambda-python/issues/5468)) - **deps-dev:** bump boto3-stubs from 1.35.54 to 1.35.56 ([#5523](https://github.com/aws-powertools/powertools-lambda-python/issues/5523)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.163.1a0 to 2.164.1a0 ([#5467](https://github.com/aws-powertools/powertools-lambda-python/issues/5467)) - **deps-dev:** bump mkdocs-material from 9.5.42 to 9.5.43 ([#5486](https://github.com/aws-powertools/powertools-lambda-python/issues/5486)) - **deps-dev:** bump aws-cdk from 2.164.0 to 2.164.1 ([#5462](https://github.com/aws-powertools/powertools-lambda-python/issues/5462)) - **deps-dev:** bump boto3-stubs from 1.35.46 to 1.35.49 ([#5460](https://github.com/aws-powertools/powertools-lambda-python/issues/5460)) - **deps-dev:** bump aws-cdk-lib from 2.164.0 to 2.164.1 ([#5459](https://github.com/aws-powertools/powertools-lambda-python/issues/5459)) - 
**deps-dev:** bump ruff from 0.7.0 to 0.7.1 ([#5451](https://github.com/aws-powertools/powertools-lambda-python/issues/5451)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.278 to 0.1.279 ([#5512](https://github.com/aws-powertools/powertools-lambda-python/issues/5512)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.165.0a0 to 2.166.0a0 ([#5533](https://github.com/aws-powertools/powertools-lambda-python/issues/5533)) - **deps-dev:** bump aws-cdk-lib from 2.163.1 to 2.164.0 ([#5453](https://github.com/aws-powertools/powertools-lambda-python/issues/5453)) - **deps-dev:** bump aws-cdk from 2.163.1 to 2.164.0 ([#5452](https://github.com/aws-powertools/powertools-lambda-python/issues/5452)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.279 to 0.1.281 ([#5548](https://github.com/aws-powertools/powertools-lambda-python/issues/5548)) - **deps-dev:** bump aws-cdk-lib from 2.164.1 to 2.165.0 ([#5490](https://github.com/aws-powertools/powertools-lambda-python/issues/5490)) - **deps-dev:** bump boto3-stubs from 1.35.53 to 1.35.54 ([#5493](https://github.com/aws-powertools/powertools-lambda-python/issues/5493)) - **deps-dev:** bump aws-cdk from 2.164.1 to 2.165.0 ([#5494](https://github.com/aws-powertools/powertools-lambda-python/issues/5494)) - **deps-dev:** bump mypy from 1.11.2 to 1.13.0 ([#5440](https://github.com/aws-powertools/powertools-lambda-python/issues/5440)) - **deps-dev:** bump ruff from 0.7.2 to 0.7.3 ([#5532](https://github.com/aws-powertools/powertools-lambda-python/issues/5532)) - **deps-dev:** bump boto3-stubs from 1.35.56 to 1.35.58 ([#5540](https://github.com/aws-powertools/powertools-lambda-python/issues/5540)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.162.1a0 to 2.163.1a0 ([#5441](https://github.com/aws-powertools/powertools-lambda-python/issues/5441)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.277 to 0.1.278 ([#5439](https://github.com/aws-powertools/powertools-lambda-python/issues/5439)) - **deps-dev:** bump cfn-lint from 1.18.3 to 1.18.4 ([#5501](https://github.com/aws-powertools/powertools-lambda-python/issues/5501)) - **deps-dev:** bump cfn-lint from 1.18.4 to 1.19.0 ([#5544](https://github.com/aws-powertools/powertools-lambda-python/issues/5544)) - **deps-dev:** bump ruff from 0.7.1 to 0.7.2 ([#5492](https://github.com/aws-powertools/powertools-lambda-python/issues/5492)) - **deps-dev:** bump aws-cdk-lib from 2.162.1 to 2.163.1 ([#5429](https://github.com/aws-powertools/powertools-lambda-python/issues/5429)) - **deps-dev:** bump boto3-stubs from 1.35.45 to 1.35.46 ([#5430](https://github.com/aws-powertools/powertools-lambda-python/issues/5430)) - **deps-dev:** bump aws-cdk from 2.162.1 to 2.163.1 ([#5432](https://github.com/aws-powertools/powertools-lambda-python/issues/5432)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.164.1a0 to 2.165.0a0 ([#5500](https://github.com/aws-powertools/powertools-lambda-python/issues/5500)) - **deps-dev:** bump xenon from 0.9.1 to 0.9.3 ([#5428](https://github.com/aws-powertools/powertools-lambda-python/issues/5428)) - **deps-dev:** bump boto3-stubs from 1.35.58 to 1.35.59 ([#5549](https://github.com/aws-powertools/powertools-lambda-python/issues/5549)) - **layers:** add pydantic-settings package to v3 Layer ([#5516](https://github.com/aws-powertools/powertools-lambda-python/issues/5516)) ## [v3.2.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.1.0...v3.2.0) - 2024-10-22 ## Bug Fixes - test command in verify 
step ([#5381](https://github.com/aws-powertools/powertools-lambda-python/issues/5381)) - **ci:** Tables are nicer ([#5416](https://github.com/aws-powertools/powertools-lambda-python/issues/5416)) - **ci:** GovCloud layer verification ([#5382](https://github.com/aws-powertools/powertools-lambda-python/issues/5382)) - **ci:** Update partition name ([#5380](https://github.com/aws-powertools/powertools-lambda-python/issues/5380)) - **layer:** update partition name in the GovCloud workflow ([#5379](https://github.com/aws-powertools/powertools-lambda-python/issues/5379)) ## Documentation - Add GovCloud layer info ([#5414](https://github.com/aws-powertools/powertools-lambda-python/issues/5414)) - **event_handler:** add Terraform payload info for API Gateway HTTP API ([#5351](https://github.com/aws-powertools/powertools-lambda-python/issues/5351)) - **examples:** temporarily fix SAR version to v2.x ([#5360](https://github.com/aws-powertools/powertools-lambda-python/issues/5360)) - **layer:** update layer version number ([#5344](https://github.com/aws-powertools/powertools-lambda-python/issues/5344)) - **upgrade_guide:** update Lambda layer name ([#5347](https://github.com/aws-powertools/powertools-lambda-python/issues/5347)) ## Features - **ci:** GovCloud Layer Workflow ([#5261](https://github.com/aws-powertools/powertools-lambda-python/issues/5261)) - **logger:** add thread safe logging keys ([#5141](https://github.com/aws-powertools/powertools-lambda-python/issues/5141)) ## Maintenance - version bump - **ci:** new pre-release 3.1.1a0 ([#5353](https://github.com/aws-powertools/powertools-lambda-python/issues/5353)) - **ci:** Add dump of govcloud layer info in verify step ([#5415](https://github.com/aws-powertools/powertools-lambda-python/issues/5415)) - **deps:** bump squidfunk/mkdocs-material from `f9cb76d` to `0d4e687` in /docs ([#5395](https://github.com/aws-powertools/powertools-lambda-python/issues/5395)) - **deps:** bump actions/upload-artifact from 4.4.1 to 4.4.3 ([#5357](https://github.com/aws-powertools/powertools-lambda-python/issues/5357)) - **deps:** bump squidfunk/mkdocs-material from `8e8b333` to `f9cb76d` in /docs ([#5366](https://github.com/aws-powertools/powertools-lambda-python/issues/5366)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.14 to 3.0.15 ([#5418](https://github.com/aws-powertools/powertools-lambda-python/issues/5418)) - **deps:** bump jsonpath-ng from 1.6.1 to 1.7.0 ([#5369](https://github.com/aws-powertools/powertools-lambda-python/issues/5369)) - **deps:** bump squidfunk/mkdocs-material from `0d4e687` to `31eb7f7` in /docs ([#5417](https://github.com/aws-powertools/powertools-lambda-python/issues/5417)) - **deps:** bump actions/upload-artifact from 4.4.0 to 4.4.3 ([#5373](https://github.com/aws-powertools/powertools-lambda-python/issues/5373)) - **deps-dev:** bump boto3-stubs from 1.35.38 to 1.35.39 ([#5370](https://github.com/aws-powertools/powertools-lambda-python/issues/5370)) - **deps-dev:** bump boto3-stubs from 1.35.39 to 1.35.41 ([#5392](https://github.com/aws-powertools/powertools-lambda-python/issues/5392)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.161.1a0 to 2.162.1a0 ([#5386](https://github.com/aws-powertools/powertools-lambda-python/issues/5386)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.274 to 0.1.275 ([#5406](https://github.com/aws-powertools/powertools-lambda-python/issues/5406)) - **deps-dev:** bump boto3-stubs from 1.35.43 to 1.35.44 
([#5407](https://github.com/aws-powertools/powertools-lambda-python/issues/5407)) - **deps-dev:** bump cfn-lint from 1.17.2 to 1.18.1 ([#5423](https://github.com/aws-powertools/powertools-lambda-python/issues/5423)) - **deps-dev:** bump cfn-lint from 1.17.1 to 1.17.2 ([#5408](https://github.com/aws-powertools/powertools-lambda-python/issues/5408)) - **deps-dev:** bump aws-cdk-lib from 2.161.1 to 2.162.1 ([#5371](https://github.com/aws-powertools/powertools-lambda-python/issues/5371)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.273 to 0.1.274 ([#5394](https://github.com/aws-powertools/powertools-lambda-python/issues/5394)) - **deps-dev:** bump aws-cdk from 2.161.1 to 2.162.1 ([#5372](https://github.com/aws-powertools/powertools-lambda-python/issues/5372)) - **deps-dev:** bump boto3-stubs from 1.35.41 to 1.35.42 ([#5397](https://github.com/aws-powertools/powertools-lambda-python/issues/5397)) - **deps-dev:** bump cfn-lint from 1.16.1 to 1.17.1 ([#5404](https://github.com/aws-powertools/powertools-lambda-python/issues/5404)) - **deps-dev:** bump mkdocs-material from 9.5.40 to 9.5.41 ([#5393](https://github.com/aws-powertools/powertools-lambda-python/issues/5393)) - **deps-dev:** bump cfn-lint from 1.16.0 to 1.16.1 ([#5363](https://github.com/aws-powertools/powertools-lambda-python/issues/5363)) - **deps-dev:** bump boto3-stubs from 1.35.37 to 1.35.38 ([#5364](https://github.com/aws-powertools/powertools-lambda-python/issues/5364)) - **deps-dev:** bump mkdocs-material from 9.5.39 to 9.5.40 ([#5365](https://github.com/aws-powertools/powertools-lambda-python/issues/5365)) - **deps-dev:** bump ruff from 0.6.9 to 0.7.0 ([#5403](https://github.com/aws-powertools/powertools-lambda-python/issues/5403)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.275 to 0.1.277 ([#5419](https://github.com/aws-powertools/powertools-lambda-python/issues/5419)) - **deps-dev:** bump boto3-stubs from 1.35.42 to 1.35.43 ([#5402](https://github.com/aws-powertools/powertools-lambda-python/issues/5402)) - **deps-dev:** bump boto3-stubs from 1.35.36 to 1.35.37 ([#5356](https://github.com/aws-powertools/powertools-lambda-python/issues/5356)) - **deps-dev:** bump nox from 2024.4.15 to 2024.10.9 ([#5355](https://github.com/aws-powertools/powertools-lambda-python/issues/5355)) - **deps-dev:** bump mkdocs-material from 9.5.41 to 9.5.42 ([#5420](https://github.com/aws-powertools/powertools-lambda-python/issues/5420)) - **deps-dev:** bump boto3-stubs from 1.35.44 to 1.35.45 ([#5421](https://github.com/aws-powertools/powertools-lambda-python/issues/5421)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.161.0a0 to 2.161.1a0 ([#5349](https://github.com/aws-powertools/powertools-lambda-python/issues/5349)) - **deps-dev:** bump boto3-stubs from 1.35.35 to 1.35.36 ([#5350](https://github.com/aws-powertools/powertools-lambda-python/issues/5350)) - **deps-dev:** bump sentry-sdk from 2.15.0 to 2.16.0 ([#5348](https://github.com/aws-powertools/powertools-lambda-python/issues/5348)) - **deps-dev:** bump sentry-sdk from 2.16.0 to 2.17.0 ([#5400](https://github.com/aws-powertools/powertools-lambda-python/issues/5400)) - **docs:** remove layer callout from data masking docs ([#5377](https://github.com/aws-powertools/powertools-lambda-python/issues/5377)) ## [v3.1.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v3.0.0...v3.1.0) - 2024-10-08 ## Bug Fixes - **ci:** Layer Rename Fix ([#5291](https://github.com/aws-powertools/powertools-lambda-python/issues/5291)) - **ci:** 
layer rename ([#5283](https://github.com/aws-powertools/powertools-lambda-python/issues/5283))
- **idempotency:** fix response hook invocation when function returns None ([#5251](https://github.com/aws-powertools/powertools-lambda-python/issues/5251))
- **layer:** reverting SSM parameter name ([#5340](https://github.com/aws-powertools/powertools-lambda-python/issues/5340))
- **layers:** rename Lambda layer name from x86 to x86_64 ([#5226](https://github.com/aws-powertools/powertools-lambda-python/issues/5226))
- **parser:** fallback to `validate_python` when using `type[Model]` and nested models ([#5313](https://github.com/aws-powertools/powertools-lambda-python/issues/5313))
- **parser:** revert a regression in v3 when raising ValidationError ([#5259](https://github.com/aws-powertools/powertools-lambda-python/issues/5259))
- **parser:** make size and etag optional for LifecycleExpiration events in S3 ([#5250](https://github.com/aws-powertools/powertools-lambda-python/issues/5250))

## Code Refactoring

- **examples:** fix issues reported by SonarCloud and Scorecard ([#5315](https://github.com/aws-powertools/powertools-lambda-python/issues/5315))

## Documentation

- **idempotency:** fix description in `Advanced` table ([#5191](https://github.com/aws-powertools/powertools-lambda-python/issues/5191))
- **metrics:** fix test references ([#5265](https://github.com/aws-powertools/powertools-lambda-python/issues/5265))
- **public_reference:** add Flyweight as a public reference ([#5322](https://github.com/aws-powertools/powertools-lambda-python/issues/5322))
- **upgrade_guide:** update upgrade guide with Pydantic information ([#5316](https://github.com/aws-powertools/powertools-lambda-python/issues/5316))
- **v3:** fix small things in the documentation ([#5224](https://github.com/aws-powertools/powertools-lambda-python/issues/5224))
- **versioning:** add v2 maintenance mode banner ([#5240](https://github.com/aws-powertools/powertools-lambda-python/issues/5240))

## Features

- **event_source:** add CodeDeploy Lifecycle Hook event ([#5219](https://github.com/aws-powertools/powertools-lambda-python/issues/5219))
- **openapi:** enable direct list input in Examples model ([#5318](https://github.com/aws-powertools/powertools-lambda-python/issues/5318))

## Maintenance

- version bump
- **ci:** new pre-release 3.0.1a7 ([#5299](https://github.com/aws-powertools/powertools-lambda-python/issues/5299))
- **ci:** new pre-release 3.0.1a3 ([#5270](https://github.com/aws-powertools/powertools-lambda-python/issues/5270))
- **ci:** new pre-release 3.0.1a4 ([#5277](https://github.com/aws-powertools/powertools-lambda-python/issues/5277))
- **ci:** new pre-release 3.0.1a2 ([#5258](https://github.com/aws-powertools/powertools-lambda-python/issues/5258))
- **ci:** new pre-release 3.0.1a5 ([#5288](https://github.com/aws-powertools/powertools-lambda-python/issues/5288))
- **ci:** new pre-release 3.0.1a9 ([#5337](https://github.com/aws-powertools/powertools-lambda-python/issues/5337))
- **ci:** new pre-release 3.0.1a8 ([#5323](https://github.com/aws-powertools/powertools-lambda-python/issues/5323))
- **ci:** new pre-release 3.0.1a0 ([#5220](https://github.com/aws-powertools/powertools-lambda-python/issues/5220))
- **ci:** new pre-release 3.0.1a1 ([#5247](https://github.com/aws-powertools/powertools-lambda-python/issues/5247))
- **ci:** new pre-release 3.0.1a6 ([#5293](https://github.com/aws-powertools/powertools-lambda-python/issues/5293))
- **deps:** bump actions/download-artifact from 4.1.7 to 4.1.8
([#5203](https://github.com/aws-powertools/powertools-lambda-python/issues/5203)) - **deps:** bump squidfunk/mkdocs-material from `22a429f` to `08fbf58` in /docs ([#5243](https://github.com/aws-powertools/powertools-lambda-python/issues/5243)) - **deps:** bump docker/setup-buildx-action from 3.6.1 to 3.7.0 ([#5298](https://github.com/aws-powertools/powertools-lambda-python/issues/5298)) - **deps:** bump actions/checkout from 4.1.7 to 4.2.0 ([#5244](https://github.com/aws-powertools/powertools-lambda-python/issues/5244)) - **deps:** bump actions/setup-node from 4.0.3 to 4.0.4 ([#5186](https://github.com/aws-powertools/powertools-lambda-python/issues/5186)) - **deps:** bump docker/setup-buildx-action from 3.7.0 to 3.7.1 ([#5310](https://github.com/aws-powertools/powertools-lambda-python/issues/5310)) - **deps:** bump pypa/gh-action-pypi-publish from 1.10.2 to 1.10.3 ([#5311](https://github.com/aws-powertools/powertools-lambda-python/issues/5311)) - **deps:** bump squidfunk/mkdocs-material from `a2e3a31` to `22a429f` in /docs ([#5201](https://github.com/aws-powertools/powertools-lambda-python/issues/5201)) - **deps:** bump pypa/gh-action-pypi-publish from 1.10.1 to 1.10.2 ([#5202](https://github.com/aws-powertools/powertools-lambda-python/issues/5202)) - **deps:** bump actions/checkout from 4.2.0 to 4.2.1 ([#5329](https://github.com/aws-powertools/powertools-lambda-python/issues/5329)) - **deps:** bump squidfunk/mkdocs-material from `08fbf58` to `7aea359` in /docs ([#5253](https://github.com/aws-powertools/powertools-lambda-python/issues/5253)) - **deps:** bump actions/setup-python from 5.1.0 to 5.2.0 ([#5204](https://github.com/aws-powertools/powertools-lambda-python/issues/5204)) - **deps:** bump codecov/codecov-action from 4.5.0 to 4.6.0 ([#5287](https://github.com/aws-powertools/powertools-lambda-python/issues/5287)) - **deps:** bump redis from 5.1.0 to 5.1.1 ([#5331](https://github.com/aws-powertools/powertools-lambda-python/issues/5331)) - **deps:** bump actions/checkout from 4.1.6 to 4.1.7 ([#5206](https://github.com/aws-powertools/powertools-lambda-python/issues/5206)) - **deps:** bump actions/upload-artifact from 4.4.0 to 4.4.1 ([#5328](https://github.com/aws-powertools/powertools-lambda-python/issues/5328)) - **deps:** bump actions/upload-artifact from 4.3.3 to 4.4.0 ([#5217](https://github.com/aws-powertools/powertools-lambda-python/issues/5217)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.12 to 3.0.13 ([#5276](https://github.com/aws-powertools/powertools-lambda-python/issues/5276)) - **deps:** bump redis from 5.0.8 to 5.1.0 ([#5264](https://github.com/aws-powertools/powertools-lambda-python/issues/5264)) - **deps:** bump datadog-lambda from 6.98.0 to 6.99.0 ([#5333](https://github.com/aws-powertools/powertools-lambda-python/issues/5333)) - **deps:** bump squidfunk/mkdocs-material from `7aea359` to `8e8b333` in /docs ([#5272](https://github.com/aws-powertools/powertools-lambda-python/issues/5272)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.13 to 3.0.14 ([#5330](https://github.com/aws-powertools/powertools-lambda-python/issues/5330)) - **deps:** bump docker/setup-qemu-action from 3.0.0 to 3.2.0 ([#5205](https://github.com/aws-powertools/powertools-lambda-python/issues/5205)) - **deps-dev:** bump mkdocs-material from 9.5.38 to 9.5.39 ([#5273](https://github.com/aws-powertools/powertools-lambda-python/issues/5273)) - **deps-dev:** bump cfn-lint from 1.15.1 to 1.15.2 
([#5274](https://github.com/aws-powertools/powertools-lambda-python/issues/5274)) - **deps-dev:** bump boto3-stubs from 1.35.28 to 1.35.29 ([#5263](https://github.com/aws-powertools/powertools-lambda-python/issues/5263)) - **deps-dev:** bump boto3-stubs from 1.35.34 to 1.35.35 ([#5334](https://github.com/aws-powertools/powertools-lambda-python/issues/5334)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.270 to 0.1.271 ([#5284](https://github.com/aws-powertools/powertools-lambda-python/issues/5284)) - **deps-dev:** bump mkdocs-material from 9.5.37 to 9.5.38 ([#5255](https://github.com/aws-powertools/powertools-lambda-python/issues/5255)) - **deps-dev:** bump ruff from 0.6.7 to 0.6.8 ([#5254](https://github.com/aws-powertools/powertools-lambda-python/issues/5254)) - **deps-dev:** bump boto3-stubs from 1.35.27 to 1.35.28 ([#5256](https://github.com/aws-powertools/powertools-lambda-python/issues/5256)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.269 to 0.1.270 ([#5257](https://github.com/aws-powertools/powertools-lambda-python/issues/5257)) - **deps-dev:** bump sentry-sdk from 2.14.0 to 2.15.0 ([#5285](https://github.com/aws-powertools/powertools-lambda-python/issues/5285)) - **deps-dev:** bump boto3-stubs from 1.35.29 to 1.35.31 ([#5286](https://github.com/aws-powertools/powertools-lambda-python/issues/5286)) - **deps-dev:** bump boto3-stubs from 1.35.31 to 1.35.32 ([#5292](https://github.com/aws-powertools/powertools-lambda-python/issues/5292)) - **deps-dev:** bump aws-cdk-lib from 2.161.0 to 2.161.1 ([#5335](https://github.com/aws-powertools/powertools-lambda-python/issues/5335)) - **deps-dev:** bump boto3-stubs from 1.35.32 to 1.35.33 ([#5295](https://github.com/aws-powertools/powertools-lambda-python/issues/5295)) - **deps-dev:** bump types-python-dateutil from 2.9.0.20240906 to 2.9.0.20241003 ([#5296](https://github.com/aws-powertools/powertools-lambda-python/issues/5296)) - **deps-dev:** bump boto3-stubs from 1.35.26 to 1.35.27 ([#5242](https://github.com/aws-powertools/powertools-lambda-python/issues/5242)) - **deps-dev:** bump mkdocs-material from 9.5.36 to 9.5.37 ([#5241](https://github.com/aws-powertools/powertools-lambda-python/issues/5241)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.271 to 0.1.272 ([#5297](https://github.com/aws-powertools/powertools-lambda-python/issues/5297)) - **deps-dev:** bump boto3-stubs from 1.35.25 to 1.35.26 ([#5234](https://github.com/aws-powertools/powertools-lambda-python/issues/5234)) - **deps-dev:** bump aws-cdk from 2.159.1 to 2.160.0 ([#5233](https://github.com/aws-powertools/powertools-lambda-python/issues/5233)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.159.1a0 to 2.160.0a0 ([#5235](https://github.com/aws-powertools/powertools-lambda-python/issues/5235)) - **deps-dev:** bump aws-cdk-lib from 2.159.1 to 2.160.0 ([#5230](https://github.com/aws-powertools/powertools-lambda-python/issues/5230)) - **deps-dev:** bump cfn-lint from 1.15.0 to 1.15.1 ([#5232](https://github.com/aws-powertools/powertools-lambda-python/issues/5232)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.158.0a0 to 2.159.1a0 ([#5231](https://github.com/aws-powertools/powertools-lambda-python/issues/5231)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.268 to 0.1.269 ([#5229](https://github.com/aws-powertools/powertools-lambda-python/issues/5229)) - **deps-dev:** bump aws-cdk-lib from 2.160.0 to 2.161.0 
([#5304](https://github.com/aws-powertools/powertools-lambda-python/issues/5304))
- **deps-dev:** bump boto3-stubs from 1.35.33 to 1.35.34 ([#5306](https://github.com/aws-powertools/powertools-lambda-python/issues/5306))
- **deps-dev:** bump types-redis from 4.6.0.20240903 to 4.6.0.20241004 ([#5307](https://github.com/aws-powertools/powertools-lambda-python/issues/5307))
- **deps-dev:** bump aws-cdk-lib from 2.158.0 to 2.159.1 ([#5208](https://github.com/aws-powertools/powertools-lambda-python/issues/5208))
- **deps-dev:** bump ruff from 0.6.4 to 0.6.7 ([#5207](https://github.com/aws-powertools/powertools-lambda-python/issues/5207))
- **deps-dev:** bump aws-cdk from 2.157.0 to 2.159.1 ([#5194](https://github.com/aws-powertools/powertools-lambda-python/issues/5194))
- **deps-dev:** bump aws-cdk from 2.160.0 to 2.161.0 ([#5309](https://github.com/aws-powertools/powertools-lambda-python/issues/5309))
- **deps-dev:** bump ruff from 0.6.8 to 0.6.9 ([#5308](https://github.com/aws-powertools/powertools-lambda-python/issues/5308))
- **deps-dev:** bump cfn-lint from 1.15.2 to 1.16.0 ([#5305](https://github.com/aws-powertools/powertools-lambda-python/issues/5305))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.160.0a0 to 2.161.0a0 ([#5332](https://github.com/aws-powertools/powertools-lambda-python/issues/5332))
- **deps-dev:** bump aws-cdk from 2.161.0 to 2.161.1 ([#5327](https://github.com/aws-powertools/powertools-lambda-python/issues/5327))
- **deps-dev:** bump mkdocs-material from 9.5.34 to 9.5.36 ([#5210](https://github.com/aws-powertools/powertools-lambda-python/issues/5210))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.272 to 0.1.273 ([#5336](https://github.com/aws-powertools/powertools-lambda-python/issues/5336))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.264 to 0.1.268 ([#5216](https://github.com/aws-powertools/powertools-lambda-python/issues/5216))
- **deps-dev:** bump multiprocess from 0.70.16 to 0.70.17 ([#5275](https://github.com/aws-powertools/powertools-lambda-python/issues/5275))
- **deps-dev:** bump boto3-stubs from 1.35.17 to 1.35.25 ([#5218](https://github.com/aws-powertools/powertools-lambda-python/issues/5218))
- **deps-dev:** bump bandit from 1.7.9 to 1.7.10 ([#5214](https://github.com/aws-powertools/powertools-lambda-python/issues/5214))
- **deps-dev:** bump cfn-lint from 1.12.4 to 1.15.0 ([#5215](https://github.com/aws-powertools/powertools-lambda-python/issues/5215))
- **docs:** recreate requirements.txt file for mkdocs container ([#5246](https://github.com/aws-powertools/powertools-lambda-python/issues/5246))
- **tests:** fix e2e tests in Idempotency utility ([#5280](https://github.com/aws-powertools/powertools-lambda-python/issues/5280))

## [v3.0.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.43.1...v3.0.0) - 2024-09-23

## Bug Fixes

- **v3:** revert unnecessary changes that impact v3 ([#5087](https://github.com/aws-powertools/powertools-lambda-python/issues/5087))

## Code Refactoring

- **batch:** add from `__future__` import annotations ([#4993](https://github.com/aws-powertools/powertools-lambda-python/issues/4993))
- **batch_processing:** mark batch_processor and async_batch_processor as deprecated ([#4910](https://github.com/aws-powertools/powertools-lambda-python/issues/4910))
- **data_classes:** add from `__future__` import annotations ([#4939](https://github.com/aws-powertools/powertools-lambda-python/issues/4939))
- **data_masking:** add from `__future__` import annotations ([#4945](https://github.com/aws-powertools/powertools-lambda-python/issues/4945))
- **event_handler:** add from `__future__` import annotations ([#4992](https://github.com/aws-powertools/powertools-lambda-python/issues/4992))
- **event_handler:** add from `__future__` import annotations in the Middlewares ([#4975](https://github.com/aws-powertools/powertools-lambda-python/issues/4975))
- **feature_flags:** add from `__future__` import annotations ([#4960](https://github.com/aws-powertools/powertools-lambda-python/issues/4960))
- **general:** drop pydantic v1 ([#4305](https://github.com/aws-powertools/powertools-lambda-python/issues/4305))
- **idempotency:** add from `__future__` import annotations ([#4961](https://github.com/aws-powertools/powertools-lambda-python/issues/4961))
- **jmespath_utils:** deprecate extract_data_from_envelope in favor of query ([#4907](https://github.com/aws-powertools/powertools-lambda-python/issues/4907))
- **jmespath_utils:** add from `__future__` import annotations ([#4962](https://github.com/aws-powertools/powertools-lambda-python/issues/4962))
- **logging:** add from `__future__` import annotations ([#4940](https://github.com/aws-powertools/powertools-lambda-python/issues/4940))
- **metrics:** add from `__future__` import annotations ([#4944](https://github.com/aws-powertools/powertools-lambda-python/issues/4944))
- **middleware_factory:** add from `__future__` import annotations ([#4941](https://github.com/aws-powertools/powertools-lambda-python/issues/4941))
- **openapi:** add from `__future__` import annotations ([#4990](https://github.com/aws-powertools/powertools-lambda-python/issues/4990))
- **parameters:** deprecate the config parameter in favor of boto_config ([#4893](https://github.com/aws-powertools/powertools-lambda-python/issues/4893))
- **parameters:** add top-level get_multiple method in SSMProvider class ([#4785](https://github.com/aws-powertools/powertools-lambda-python/issues/4785))
- **parameters:** add from `__future__` import annotations ([#4976](https://github.com/aws-powertools/powertools-lambda-python/issues/4976))
- **parameters:** increase default max_age (cache) to 5 minutes ([#4279](https://github.com/aws-powertools/powertools-lambda-python/issues/4279))
- **parser:** add from `__future__` import annotations ([#4977](https://github.com/aws-powertools/powertools-lambda-python/issues/4977))
- **parser:** add from `__future__` import annotations ([#4983](https://github.com/aws-powertools/powertools-lambda-python/issues/4983))
- **shared:** add from `__future__` import annotations ([#4942](https://github.com/aws-powertools/powertools-lambda-python/issues/4942))
- **streaming:** add from `__future__` import annotations ([#4987](https://github.com/aws-powertools/powertools-lambda-python/issues/4987))
- **tracing:** add from `__future__` import annotations ([#4943](https://github.com/aws-powertools/powertools-lambda-python/issues/4943))
- **typing:** add from `__future__` import annotations ([#4985](https://github.com/aws-powertools/powertools-lambda-python/issues/4985))
- **typing:** enable TCH, UP and FA100 ruff rules ([#5017](https://github.com/aws-powertools/powertools-lambda-python/issues/5017))
- **typing:** reduce aws_lambda_powertools.shared.types usage ([#4896](https://github.com/aws-powertools/powertools-lambda-python/issues/4896))
- **typing:** enable boto3 implicit type annotations ([#4692](https://github.com/aws-powertools/powertools-lambda-python/issues/4692))
- **typing:** move more types into TYPE_CHECKING ([#5088](https://github.com/aws-powertools/powertools-lambda-python/issues/5088))
- **validation:** add from `__future__` import annotations ([#4984](https://github.com/aws-powertools/powertools-lambda-python/issues/4984))

## Documentation

- **upgrade_guide:** create upgrade guide from v2 to v3 ([#5028](https://github.com/aws-powertools/powertools-lambda-python/issues/5028))

## Features

- **data_classes:** return empty dict or list instead of None ([#4606](https://github.com/aws-powertools/powertools-lambda-python/issues/4606))
- **event_handler:** Ensure Bedrock Agents resolver works with Pydantic v2 ([#5156](https://github.com/aws-powertools/powertools-lambda-python/issues/5156))
- **idempotency:** simplify access to expiration time in `DataRecord` class ([#5082](https://github.com/aws-powertools/powertools-lambda-python/issues/5082))
- **lambda-layer:** add pipeline to build Lambda layer in v3 ([#4826](https://github.com/aws-powertools/powertools-lambda-python/issues/4826))
- **parser:** Adds DDB deserialization to DynamoDBStreamChangedRecordModel ([#4401](https://github.com/aws-powertools/powertools-lambda-python/issues/4401))
- **parser:** Allow primitive data types to be parsed using TypeAdapter ([#4502](https://github.com/aws-powertools/powertools-lambda-python/issues/4502))
- **v3:** merging develop into v3 ([#5160](https://github.com/aws-powertools/powertools-lambda-python/issues/5160))

## Maintenance

- version bump
- **ci:** fix bump poetry version ([#5211](https://github.com/aws-powertools/powertools-lambda-python/issues/5211))
- **ci:** fix working-directory in v3 layer pipeline ([#5199](https://github.com/aws-powertools/powertools-lambda-python/issues/5199))
- **ci:** fix Redis e2e tests in v3 branch ([#4852](https://github.com/aws-powertools/powertools-lambda-python/issues/4852))
- **ci:** fix e2e tests in v3 branch ([#4848](https://github.com/aws-powertools/powertools-lambda-python/issues/4848))
- **ci:** add the aws-encryption-sdk dependency in the Lambda layer ([#4630](https://github.com/aws-powertools/powertools-lambda-python/issues/4630))
- **ci:** bump pydantic library to 2.0+ and boto3 to 1.34.32 ([#4235](https://github.com/aws-powertools/powertools-lambda-python/issues/4235))
- **v3:** merging develop into v3 - 15/05/2024 ([#4335](https://github.com/aws-powertools/powertools-lambda-python/issues/4335))
- **v3:** merging develop into v3 ([#4267](https://github.com/aws-powertools/powertools-lambda-python/issues/4267))

## [v2.43.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.43.0...v2.43.1) - 2024-08-12

## Bug Fixes

- **event_source:** fix regression when working with zero numbers in DynamoDBStreamEvent ([#4932](https://github.com/aws-powertools/powertools-lambda-python/issues/4932))

## Maintenance

- version bump
- **ci:** new pre-release 2.43.1a0 ([#4920](https://github.com/aws-powertools/powertools-lambda-python/issues/4920))
- **ci:** new pre-release 2.43.1a1 ([#4926](https://github.com/aws-powertools/powertools-lambda-python/issues/4926))
- **ci:** new pre-release 2.42.1a9 ([#4912](https://github.com/aws-powertools/powertools-lambda-python/issues/4912))
- **deps-dev:** bump ruff from 0.5.6 to 0.5.7 ([#4918](https://github.com/aws-powertools/powertools-lambda-python/issues/4918))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.234 to 0.1.238 ([#4917](https://github.com/aws-powertools/powertools-lambda-python/issues/4917))
- **deps-dev:** bump mypy-boto3-ssm from 1.34.132 to 1.34.158 in the boto-typing group
([#4921](https://github.com/aws-powertools/powertools-lambda-python/issues/4921)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.238 to 0.1.242 ([#4922](https://github.com/aws-powertools/powertools-lambda-python/issues/4922)) - **deps-dev:** bump cfn-lint from 1.9.6 to 1.9.7 ([#4923](https://github.com/aws-powertools/powertools-lambda-python/issues/4923)) - **deps-dev:** bump cfn-lint from 1.9.5 to 1.9.6 ([#4916](https://github.com/aws-powertools/powertools-lambda-python/issues/4916)) ## [v2.43.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.42.0...v2.43.0) - 2024-08-08 ## Bug Fixes - **data_class:** ensure DynamoDBStreamEvent conforms to decimal limits ([#4863](https://github.com/aws-powertools/powertools-lambda-python/issues/4863)) ## Code Refactoring - **test:** make CORS test consistent with expected behavior ([#4882](https://github.com/aws-powertools/powertools-lambda-python/issues/4882)) - **tracer:** make capture_lambda_handler type more generic ([#4796](https://github.com/aws-powertools/powertools-lambda-python/issues/4796)) ## Documentation - fix type vs. field in comment ([#4832](https://github.com/aws-powertools/powertools-lambda-python/issues/4832)) - **public_reference:** add CHS Inc. as a public reference ([#4885](https://github.com/aws-powertools/powertools-lambda-python/issues/4885)) - **public_reference:** add LocalStack as a public reference ([#4858](https://github.com/aws-powertools/powertools-lambda-python/issues/4858)) - **public_reference:** add Caylent as a public reference ([#4822](https://github.com/aws-powertools/powertools-lambda-python/issues/4822)) ## Features - **metrics:** add unit None for CloudWatch EMF Metrics ([#4904](https://github.com/aws-powertools/powertools-lambda-python/issues/4904)) - **validation:** returns output from validate function ([#4839](https://github.com/aws-powertools/powertools-lambda-python/issues/4839)) ## Maintenance - version bump - **ci:** new pre-release 2.42.1a5 ([#4868](https://github.com/aws-powertools/powertools-lambda-python/issues/4868)) - **ci:** new pre-release 2.42.1a8 ([#4903](https://github.com/aws-powertools/powertools-lambda-python/issues/4903)) - **ci:** new pre-release 2.42.1a0 ([#4827](https://github.com/aws-powertools/powertools-lambda-python/issues/4827)) - **ci:** new pre-release 2.42.1a7 ([#4894](https://github.com/aws-powertools/powertools-lambda-python/issues/4894)) - **ci:** new pre-release 2.42.1a1 ([#4837](https://github.com/aws-powertools/powertools-lambda-python/issues/4837)) - **ci:** new pre-release 2.42.1a3 ([#4856](https://github.com/aws-powertools/powertools-lambda-python/issues/4856)) - **ci:** new pre-release 2.42.1a4 ([#4864](https://github.com/aws-powertools/powertools-lambda-python/issues/4864)) - **ci:** new pre-release 2.42.1a6 ([#4884](https://github.com/aws-powertools/powertools-lambda-python/issues/4884)) - **ci:** new pre-release 2.42.1a2 ([#4847](https://github.com/aws-powertools/powertools-lambda-python/issues/4847)) - **deps:** bump golang.org/x/sync from 0.7.0 to 0.8.0 in /layer/scripts/layer-balancer in the layer-balancer group ([#4892](https://github.com/aws-powertools/powertools-lambda-python/issues/4892)) - **deps:** bump actions/upload-artifact from 4.3.5 to 4.3.6 ([#4901](https://github.com/aws-powertools/powertools-lambda-python/issues/4901)) - **deps:** bump actions/upload-artifact from 4.3.4 to 4.3.5 ([#4871](https://github.com/aws-powertools/powertools-lambda-python/issues/4871)) - **deps:** bump ossf/scorecard-action from 2.3.3 
to 2.4.0 ([#4829](https://github.com/aws-powertools/powertools-lambda-python/issues/4829)) - **deps:** bump squidfunk/mkdocs-material from `257eca8` to `9919d6e` in /docs ([#4878](https://github.com/aws-powertools/powertools-lambda-python/issues/4878)) - **deps:** bump docker/setup-buildx-action from 3.5.0 to 3.6.1 ([#4844](https://github.com/aws-powertools/powertools-lambda-python/issues/4844)) - **deps:** bump redis from 5.0.7 to 5.0.8 ([#4854](https://github.com/aws-powertools/powertools-lambda-python/issues/4854)) - **deps-dev:** bump ruff from 0.5.5 to 0.5.6 ([#4874](https://github.com/aws-powertools/powertools-lambda-python/issues/4874)) - **deps-dev:** bump mypy-boto3-cloudwatch from 1.34.83 to 1.34.153 in the boto-typing group ([#4887](https://github.com/aws-powertools/powertools-lambda-python/issues/4887)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.224 to 0.1.228 ([#4867](https://github.com/aws-powertools/powertools-lambda-python/issues/4867)) - **deps-dev:** bump cfn-lint from 1.9.1 to 1.9.3 ([#4866](https://github.com/aws-powertools/powertools-lambda-python/issues/4866)) - **deps-dev:** bump sentry-sdk from 2.11.0 to 2.12.0 ([#4861](https://github.com/aws-powertools/powertools-lambda-python/issues/4861)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.228 to 0.1.230 ([#4876](https://github.com/aws-powertools/powertools-lambda-python/issues/4876)) - **deps-dev:** bump black from 24.4.2 to 24.8.0 ([#4873](https://github.com/aws-powertools/powertools-lambda-python/issues/4873)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.223 to 0.1.224 ([#4855](https://github.com/aws-powertools/powertools-lambda-python/issues/4855)) - **deps-dev:** bump mypy-boto3-logs from 1.34.66 to 1.34.151 in the boto-typing group ([#4853](https://github.com/aws-powertools/powertools-lambda-python/issues/4853)) - **deps-dev:** bump coverage from 7.6.0 to 7.6.1 ([#4888](https://github.com/aws-powertools/powertools-lambda-python/issues/4888)) - **deps-dev:** bump cfn-lint from 1.8.2 to 1.9.1 ([#4851](https://github.com/aws-powertools/powertools-lambda-python/issues/4851)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.150.0a0 to 2.151.0a0 ([#4889](https://github.com/aws-powertools/powertools-lambda-python/issues/4889)) - **deps-dev:** bump aws-cdk from 2.150.0 to 2.151.0 ([#4872](https://github.com/aws-powertools/powertools-lambda-python/issues/4872)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.219 to 0.1.222 ([#4836](https://github.com/aws-powertools/powertools-lambda-python/issues/4836)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.222 to 0.1.223 ([#4843](https://github.com/aws-powertools/powertools-lambda-python/issues/4843)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.233 to 0.1.234 ([#4909](https://github.com/aws-powertools/powertools-lambda-python/issues/4909)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.230 to 0.1.231 ([#4891](https://github.com/aws-powertools/powertools-lambda-python/issues/4891)) - **deps-dev:** bump cfn-lint from 1.9.3 to 1.9.5 ([#4890](https://github.com/aws-powertools/powertools-lambda-python/issues/4890)) - **deps-dev:** bump pytest from 8.3.1 to 8.3.2 ([#4824](https://github.com/aws-powertools/powertools-lambda-python/issues/4824)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.231 to 0.1.233 ([#4900](https://github.com/aws-powertools/powertools-lambda-python/issues/4900)) - **deps-dev:** bump 
mkdocs-material from 9.5.30 to 9.5.31 ([#4877](https://github.com/aws-powertools/powertools-lambda-python/issues/4877))
- **deps-dev:** bump types-redis from 4.6.0.20240425 to 4.6.0.20240726 ([#4831](https://github.com/aws-powertools/powertools-lambda-python/issues/4831))
- **deps-dev:** bump ruff from 0.5.4 to 0.5.5 ([#4823](https://github.com/aws-powertools/powertools-lambda-python/issues/4823))
- **deps-dev:** bump aws-cdk-lib from 2.150.0 to 2.151.0 ([#4875](https://github.com/aws-powertools/powertools-lambda-python/issues/4875))
- **deps-dev:** bump types-redis from 4.6.0.20240726 to 4.6.0.20240806 ([#4899](https://github.com/aws-powertools/powertools-lambda-python/issues/4899))
- **maintenance:** add Banxware customer reference ([#4841](https://github.com/aws-powertools/powertools-lambda-python/issues/4841))

## [v2.42.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.41.0...v2.42.0) - 2024-07-25

## Bug Fixes

- **idempotency:** ensure in_progress_expiration field is set on Lambda timeout ([#4773](https://github.com/aws-powertools/powertools-lambda-python/issues/4773))

## Documentation

- **idempotency:** improve navigation, wording, and new section on guarantees ([#4613](https://github.com/aws-powertools/powertools-lambda-python/issues/4613))

## Features

- **event_handler:** add OpenAPI extensions ([#4703](https://github.com/aws-powertools/powertools-lambda-python/issues/4703))

## Maintenance

- version bump
- **ci:** new pre-release 2.41.1a4 ([#4772](https://github.com/aws-powertools/powertools-lambda-python/issues/4772))
- **ci:** new pre-release 2.41.1a0 ([#4749](https://github.com/aws-powertools/powertools-lambda-python/issues/4749))
- **ci:** new pre-release 2.41.1a1 ([#4756](https://github.com/aws-powertools/powertools-lambda-python/issues/4756))
- **ci:** new pre-release 2.41.1a2 ([#4758](https://github.com/aws-powertools/powertools-lambda-python/issues/4758))
- **ci:** new pre-release 2.41.1a9 ([#4808](https://github.com/aws-powertools/powertools-lambda-python/issues/4808))
- **ci:** new pre-release 2.41.1a3 ([#4766](https://github.com/aws-powertools/powertools-lambda-python/issues/4766))
- **ci:** new pre-release 2.41.1a8 ([#4802](https://github.com/aws-powertools/powertools-lambda-python/issues/4802))
- **ci:** new pre-release 2.41.1a5 ([#4777](https://github.com/aws-powertools/powertools-lambda-python/issues/4777))
- **ci:** new pre-release 2.41.1a6 ([#4783](https://github.com/aws-powertools/powertools-lambda-python/issues/4783))
- **ci:** new pre-release 2.41.1a7 ([#4792](https://github.com/aws-powertools/powertools-lambda-python/issues/4792))
- **deps:** bump github.com/aws/aws-sdk-go-v2/config from 1.27.26 to 1.27.27 in /layer/scripts/layer-balancer in the layer-balancer group ([#4779](https://github.com/aws-powertools/powertools-lambda-python/issues/4779))
- **deps:** bump aws-actions/closed-issue-message from 8b6324312193476beecf11f8e8539d73a3553bf4 to 80edfc24bdf1283400eb04d20a8a605ae8bf7d48 ([#4786](https://github.com/aws-powertools/powertools-lambda-python/issues/4786))
- **deps:** bump actions/dependency-review-action from 4.3.3 to 4.3.4 ([#4753](https://github.com/aws-powertools/powertools-lambda-python/issues/4753))
- **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4745](https://github.com/aws-powertools/powertools-lambda-python/issues/4745))
- **deps:** bump datadog-lambda from 6.96.0 to 6.97.0 ([#4770](https://github.com/aws-powertools/powertools-lambda-python/issues/4770))
- **deps:** bump
docker/setup-buildx-action from 3.4.0 to 3.5.0 ([#4801](https://github.com/aws-powertools/powertools-lambda-python/issues/4801)) - **deps:** bump docker/setup-qemu-action from 3.1.0 to 3.2.0 ([#4800](https://github.com/aws-powertools/powertools-lambda-python/issues/4800)) - **deps-dev:** bump cfn-lint from 1.8.1 to 1.8.2 ([#4788](https://github.com/aws-powertools/powertools-lambda-python/issues/4788)) - **deps-dev:** bump pytest-asyncio from 0.23.7 to 0.23.8 ([#4776](https://github.com/aws-powertools/powertools-lambda-python/issues/4776)) - **deps-dev:** bump pytest from 8.2.2 to 8.3.1 ([#4799](https://github.com/aws-powertools/powertools-lambda-python/issues/4799)) - **deps-dev:** bump aws-cdk-lib from 2.148.1 to 2.150.0 ([#4806](https://github.com/aws-powertools/powertools-lambda-python/issues/4806)) - **deps-dev:** bump ruff from 0.5.3 to 0.5.4 ([#4798](https://github.com/aws-powertools/powertools-lambda-python/issues/4798)) - **deps-dev:** bump cfn-lint from 1.6.1 to 1.8.1 ([#4780](https://github.com/aws-powertools/powertools-lambda-python/issues/4780)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.211 to 0.1.212 ([#4769](https://github.com/aws-powertools/powertools-lambda-python/issues/4769)) - **deps-dev:** bump ruff from 0.5.2 to 0.5.3 ([#4781](https://github.com/aws-powertools/powertools-lambda-python/issues/4781)) - **deps-dev:** bump mkdocs-material from 9.5.28 to 9.5.29 ([#4764](https://github.com/aws-powertools/powertools-lambda-python/issues/4764)) - **deps-dev:** bump aws-cdk from 2.148.0 to 2.149.0 ([#4765](https://github.com/aws-powertools/powertools-lambda-python/issues/4765)) - **deps-dev:** bump ruff from 0.5.1 to 0.5.2 ([#4762](https://github.com/aws-powertools/powertools-lambda-python/issues/4762)) - **deps-dev:** bump sentry-sdk from 2.9.0 to 2.10.0 ([#4763](https://github.com/aws-powertools/powertools-lambda-python/issues/4763)) - **deps-dev:** bump aws-cdk from 2.149.0 to 2.150.0 ([#4805](https://github.com/aws-powertools/powertools-lambda-python/issues/4805)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.207 to 0.1.211 ([#4760](https://github.com/aws-powertools/powertools-lambda-python/issues/4760)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.34.131 to 1.34.148 in the boto-typing group ([#4812](https://github.com/aws-powertools/powertools-lambda-python/issues/4812)) - **deps-dev:** bump sentry-sdk from 2.10.0 to 2.11.0 ([#4815](https://github.com/aws-powertools/powertools-lambda-python/issues/4815)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.212 to 0.1.219 ([#4817](https://github.com/aws-powertools/powertools-lambda-python/issues/4817)) - **deps-dev:** bump cfn-lint from 1.6.0 to 1.6.1 ([#4751](https://github.com/aws-powertools/powertools-lambda-python/issues/4751)) - **deps-dev:** bump mkdocs-material from 9.5.29 to 9.5.30 ([#4807](https://github.com/aws-powertools/powertools-lambda-python/issues/4807)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.148.1a0 to 2.150.0a0 ([#4813](https://github.com/aws-powertools/powertools-lambda-python/issues/4813)) - **deps-dev:** bump cfn-lint from 1.5.3 to 1.6.0 ([#4747](https://github.com/aws-powertools/powertools-lambda-python/issues/4747)) - **deps-dev:** bump coverage from 7.5.4 to 7.6.0 ([#4746](https://github.com/aws-powertools/powertools-lambda-python/issues/4746)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.206 to 0.1.207 ([#4748](https://github.com/aws-powertools/powertools-lambda-python/issues/4748)) - 
**deps-dev:** bump mypy-boto3-secretsmanager from 1.34.128 to 1.34.145 in the boto-typing group ([#4787](https://github.com/aws-powertools/powertools-lambda-python/issues/4787))
- **docs:** Add lambda layer policy to versioning docs ([#4811](https://github.com/aws-powertools/powertools-lambda-python/issues/4811))
- **logger:** use package logger over source logger to reduce noise ([#4793](https://github.com/aws-powertools/powertools-lambda-python/issues/4793))

## [v2.41.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.40.1...v2.41.0) - 2024-07-11

## Bug Fixes

- **event_handler:** make the max_age attribute comply with RFC specification ([#4731](https://github.com/aws-powertools/powertools-lambda-python/issues/4731))
- **event_handler:** disable allow-credentials header when origin allow_origin is * ([#4638](https://github.com/aws-powertools/powertools-lambda-python/issues/4638))
- **event_handler:** convert null body to empty string in ALBResolver to avoid HTTP 502 ([#4683](https://github.com/aws-powertools/powertools-lambda-python/issues/4683))
- **event_handler:** custom serializer recursive values when using data validation ([#4664](https://github.com/aws-powertools/powertools-lambda-python/issues/4664))

## Documentation

- **i-made-this:** Bedrock agents with Powertools for AWS Lambda ([#4705](https://github.com/aws-powertools/powertools-lambda-python/issues/4705))
- **public_reference:** add BusPatrol as a public reference ([#4713](https://github.com/aws-powertools/powertools-lambda-python/issues/4713))

## Features

- **batch:** add option to not raise `BatchProcessingError` exception when the entire batch fails ([#4719](https://github.com/aws-powertools/powertools-lambda-python/issues/4719))
- **feature_flags:** allow customers to bring their own boto3 client and session ([#4717](https://github.com/aws-powertools/powertools-lambda-python/issues/4717))
- **parser:** add support for API Gateway Lambda authorizer events ([#4718](https://github.com/aws-powertools/powertools-lambda-python/issues/4718))

## Maintenance

- version bump
- Add token to codecov action ([#4682](https://github.com/aws-powertools/powertools-lambda-python/issues/4682))
- **ci:** new pre-release 2.40.2a5 ([#4706](https://github.com/aws-powertools/powertools-lambda-python/issues/4706))
- **ci:** new pre-release 2.40.2a0 ([#4665](https://github.com/aws-powertools/powertools-lambda-python/issues/4665))
- **ci:** new pre-release 2.40.2a8 ([#4737](https://github.com/aws-powertools/powertools-lambda-python/issues/4737))
- **ci:** new pre-release 2.40.2a7 ([#4726](https://github.com/aws-powertools/powertools-lambda-python/issues/4726))
- **ci:** new pre-release 2.40.2a1 ([#4669](https://github.com/aws-powertools/powertools-lambda-python/issues/4669))
- **ci:** new pre-release 2.40.2a2 ([#4679](https://github.com/aws-powertools/powertools-lambda-python/issues/4679))
- **ci:** new pre-release 2.40.2a3 ([#4688](https://github.com/aws-powertools/powertools-lambda-python/issues/4688))
- **ci:** new pre-release 2.40.2a6 ([#4715](https://github.com/aws-powertools/powertools-lambda-python/issues/4715))
- **ci:** new pre-release 2.40.2a4 ([#4694](https://github.com/aws-powertools/powertools-lambda-python/issues/4694))
- **deps:** bump docker/setup-qemu-action from 3.0.0 to 3.1.0 ([#4685](https://github.com/aws-powertools/powertools-lambda-python/issues/4685))
- **deps:** bump actions/setup-python from 5.1.0 to 5.1.1 ([#4732](https://github.com/aws-powertools/powertools-lambda-python/issues/4732))
- **deps:**
bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4733](https://github.com/aws-powertools/powertools-lambda-python/issues/4733)) - **deps:** bump actions/upload-artifact from 4.3.3 to 4.3.4 ([#4698](https://github.com/aws-powertools/powertools-lambda-python/issues/4698)) - **deps:** bump actions/download-artifact from 4.1.7 to 4.1.8 ([#4699](https://github.com/aws-powertools/powertools-lambda-python/issues/4699)) - **deps:** bump actions/setup-node from 4.0.2 to 4.0.3 ([#4725](https://github.com/aws-powertools/powertools-lambda-python/issues/4725)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.9 to 3.0.10 ([#4678](https://github.com/aws-powertools/powertools-lambda-python/issues/4678)) - **deps:** bump docker/setup-buildx-action from 3.3.0 to 3.4.0 ([#4693](https://github.com/aws-powertools/powertools-lambda-python/issues/4693)) - **deps:** bump zipp from 3.17.0 to 3.19.1 in /docs ([#4720](https://github.com/aws-powertools/powertools-lambda-python/issues/4720)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4659](https://github.com/aws-powertools/powertools-lambda-python/issues/4659)) - **deps:** bump certifi from 2024.6.2 to 2024.7.4 ([#4700](https://github.com/aws-powertools/powertools-lambda-python/issues/4700)) - **deps:** bump github.com/aws/aws-sdk-go-v2/config from 1.27.23 to 1.27.24 in /layer/scripts/layer-balancer in the layer-balancer group ([#4684](https://github.com/aws-powertools/powertools-lambda-python/issues/4684)) - **deps-dev:** bump mkdocs-material from 9.5.27 to 9.5.28 ([#4676](https://github.com/aws-powertools/powertools-lambda-python/issues/4676)) - **deps-dev:** bump cfn-lint from 1.4.2 to 1.5.0 ([#4675](https://github.com/aws-powertools/powertools-lambda-python/issues/4675)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.147.3a0 to 2.148.0a0 ([#4722](https://github.com/aws-powertools/powertools-lambda-python/issues/4722)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.200 to 0.1.201 ([#4687](https://github.com/aws-powertools/powertools-lambda-python/issues/4687)) - **deps-dev:** bump aws-cdk-lib from 2.147.2 to 2.147.3 ([#4674](https://github.com/aws-powertools/powertools-lambda-python/issues/4674)) - **deps-dev:** bump zipp from 3.17.0 to 3.19.1 in /layer ([#4721](https://github.com/aws-powertools/powertools-lambda-python/issues/4721)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.202 to 0.1.205 ([#4723](https://github.com/aws-powertools/powertools-lambda-python/issues/4723)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.147.2a0 to 2.147.3a0 ([#4686](https://github.com/aws-powertools/powertools-lambda-python/issues/4686)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.199 to 0.1.200 ([#4677](https://github.com/aws-powertools/powertools-lambda-python/issues/4677)) - **deps-dev:** bump aws-cdk-lib from 2.147.3 to 2.148.0 ([#4710](https://github.com/aws-powertools/powertools-lambda-python/issues/4710)) - **deps-dev:** bump aws-cdk from 2.147.2 to 2.147.3 ([#4672](https://github.com/aws-powertools/powertools-lambda-python/issues/4672)) - **deps-dev:** bump mypy-boto3-s3 from 1.34.120 to 1.34.138 in the boto-typing group ([#4673](https://github.com/aws-powertools/powertools-lambda-python/issues/4673)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.201 to 0.1.202 ([#4696](https://github.com/aws-powertools/powertools-lambda-python/issues/4696)) - **deps-dev:** 
bump cfn-lint from 1.5.1 to 1.5.2 ([#4724](https://github.com/aws-powertools/powertools-lambda-python/issues/4724))
- **deps-dev:** bump ruff from 0.5.0 to 0.5.1 ([#4697](https://github.com/aws-powertools/powertools-lambda-python/issues/4697))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.198 to 0.1.199 ([#4668](https://github.com/aws-powertools/powertools-lambda-python/issues/4668))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.147.1a0 to 2.147.2a0 ([#4667](https://github.com/aws-powertools/powertools-lambda-python/issues/4667))
- **deps-dev:** bump aws-cdk from 2.147.3 to 2.148.0 ([#4708](https://github.com/aws-powertools/powertools-lambda-python/issues/4708))
- **deps-dev:** bump cfn-lint from 1.5.2 to 1.5.3 ([#4734](https://github.com/aws-powertools/powertools-lambda-python/issues/4734))
- **deps-dev:** bump sentry-sdk from 2.8.0 to 2.9.0 ([#4735](https://github.com/aws-powertools/powertools-lambda-python/issues/4735))
- **deps-dev:** bump cfn-lint from 1.4.1 to 1.4.2 ([#4660](https://github.com/aws-powertools/powertools-lambda-python/issues/4660))
- **deps-dev:** bump aws-cdk-lib from 2.147.1 to 2.147.2 ([#4661](https://github.com/aws-powertools/powertools-lambda-python/issues/4661))
- **deps-dev:** bump cfn-lint from 1.5.0 to 1.5.1 ([#4711](https://github.com/aws-powertools/powertools-lambda-python/issues/4711))
- **deps-dev:** bump aws-cdk from 2.147.1 to 2.147.2 ([#4657](https://github.com/aws-powertools/powertools-lambda-python/issues/4657))
- **deps-dev:** bump ruff from 0.4.10 to 0.5.0 ([#4644](https://github.com/aws-powertools/powertools-lambda-python/issues/4644))
- **deps-dev:** bump sentry-sdk from 2.7.1 to 2.8.0 ([#4712](https://github.com/aws-powertools/powertools-lambda-python/issues/4712))
- **layers:** downgrade aws cdk to 2.145.0 ([#4739](https://github.com/aws-powertools/powertools-lambda-python/issues/4739))

## [v2.40.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.40.0...v2.40.1) - 2024-06-28

## Bug Fixes

- **event_handler:** current_event regression AppSyncResolver Router ([#4652](https://github.com/aws-powertools/powertools-lambda-python/issues/4652))

## Maintenance

- version bump
- **ci:** new pre-release 2.40.1a1 ([#4653](https://github.com/aws-powertools/powertools-lambda-python/issues/4653))
- **ci:** new pre-release 2.40.1a0 ([#4648](https://github.com/aws-powertools/powertools-lambda-python/issues/4648))
- **deps-dev:** bump cfn-lint from 1.3.7 to 1.4.1 ([#4646](https://github.com/aws-powertools/powertools-lambda-python/issues/4646))
- **deps-dev:** bump sentry-sdk from 2.7.0 to 2.7.1 ([#4645](https://github.com/aws-powertools/powertools-lambda-python/issues/4645))

## [v2.40.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.39.1...v2.40.0) - 2024-06-27

## Bug Fixes

- **event_sources:** change partition and offset field types in KafkaEventRecord ([#4515](https://github.com/aws-powertools/powertools-lambda-python/issues/4515))

## Documentation

- **homepage:** Fix homepage link ([#4587](https://github.com/aws-powertools/powertools-lambda-python/issues/4587))
- **i-made-this:** add new article about best practices for accelerating serverless development ([#4518](https://github.com/aws-powertools/powertools-lambda-python/issues/4518))
- **public reference:** add Brsk as a public reference ([#4597](https://github.com/aws-powertools/powertools-lambda-python/issues/4597))

## Features

- **event-handler:** add appsync batch resolvers ([#1998](https://github.com/aws-powertools/powertools-lambda-python/issues/1998))
- **validation:** support JSON Schema referencing in validation utils ([#4508](https://github.com/aws-powertools/powertools-lambda-python/issues/4508))
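
The JSON Schema referencing entry above is easiest to picture with a schema that resolves part of its shape through a `$ref`. The sketch below is illustrative only: the schema and payload are made up, it uses the long-standing `validate` helper from the validation utility, and it defers to [#4508](https://github.com/aws-powertools/powertools-lambda-python/issues/4508) for the exact scope of the referencing support (for example, externally bundled schemas).

```
from aws_lambda_powertools.utilities.validation import validate

# Illustrative schema: the "item" property is resolved via $ref into the
# shared "definitions" section instead of being declared inline.
SCHEMA = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {"item": {"$ref": "#/definitions/item"}},
    "required": ["item"],
    "definitions": {
        "item": {
            "type": "object",
            "properties": {"id": {"type": "string"}},
            "required": ["id"],
        }
    },
}


def lambda_handler(event, context):
    # Raises SchemaValidationError if the payload does not match the schema.
    validate(event=event, schema=SCHEMA)
    return {"statusCode": 200}
```
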

## Maintenance

- version bump
- **ci:** add the Metrics feature to nox tests ([#4552](https://github.com/aws-powertools/powertools-lambda-python/issues/4552))
- **ci:** new pre-release 2.39.2a5 ([#4636](https://github.com/aws-powertools/powertools-lambda-python/issues/4636))
- **ci:** add the Streaming feature to nox tests ([#4575](https://github.com/aws-powertools/powertools-lambda-python/issues/4575))
- **ci:** new pre-release 2.39.2a4 ([#4629](https://github.com/aws-powertools/powertools-lambda-python/issues/4629))
- **ci:** new pre-release 2.39.2a3 ([#4620](https://github.com/aws-powertools/powertools-lambda-python/issues/4620))
- **ci:** add the Event Handler feature to nox tests ([#4581](https://github.com/aws-powertools/powertools-lambda-python/issues/4581))
- **ci:** add the Data Class feature to nox tests ([#4583](https://github.com/aws-powertools/powertools-lambda-python/issues/4583))
- **ci:** add the Parser feature to nox tests ([#4584](https://github.com/aws-powertools/powertools-lambda-python/issues/4584))
- **ci:** add the Idempotency feature to nox tests ([#4585](https://github.com/aws-powertools/powertools-lambda-python/issues/4585))
- **ci:** new pre-release 2.39.2a2 ([#4610](https://github.com/aws-powertools/powertools-lambda-python/issues/4610))
- **ci:** introduce tests with Nox ([#4537](https://github.com/aws-powertools/powertools-lambda-python/issues/4537))
- **ci:** new pre-release 2.39.2a1 ([#4598](https://github.com/aws-powertools/powertools-lambda-python/issues/4598))
- **ci:** add the Tracer feature to nox tests ([#4567](https://github.com/aws-powertools/powertools-lambda-python/issues/4567))
- **ci:** add the Middleware Factory feature to nox tests ([#4568](https://github.com/aws-powertools/powertools-lambda-python/issues/4568))
- **ci:** add the Parameters feature to nox tests ([#4569](https://github.com/aws-powertools/powertools-lambda-python/issues/4569))
- **ci:** add the Batch Processor feature to nox tests ([#4586](https://github.com/aws-powertools/powertools-lambda-python/issues/4586))
- **ci:** add the Feature Flags feature to nox tests ([#4570](https://github.com/aws-powertools/powertools-lambda-python/issues/4570))
- **ci:** add the Validation feature to nox tests ([#4571](https://github.com/aws-powertools/powertools-lambda-python/issues/4571))
- **ci:** introduce daily pre-releases ([#4535](https://github.com/aws-powertools/powertools-lambda-python/issues/4535))
- **ci:** new pre-release 2.39.2a0 ([#4590](https://github.com/aws-powertools/powertools-lambda-python/issues/4590))
- **ci:** add the Data Masking feature to nox tests ([#4574](https://github.com/aws-powertools/powertools-lambda-python/issues/4574))
- **ci:** add the Typing feature to nox tests ([#4572](https://github.com/aws-powertools/powertools-lambda-python/issues/4572))
- **deps:** bump pypa/gh-action-pypi-publish from 1.8.14 to 1.9.0 ([#4592](https://github.com/aws-powertools/powertools-lambda-python/issues/4592))
- **deps:** bump pydantic from 1.10.16 to 1.10.17 ([#4595](https://github.com/aws-powertools/powertools-lambda-python/issues/4595))
- **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4565](https://github.com/aws-powertools/powertools-lambda-python/issues/4565))
- **deps:** bump squidfunk/mkdocs-material
from `96abcbb` to `257eca8` in /docs ([#4540](https://github.com/aws-powertools/powertools-lambda-python/issues/4540)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.7 to 3.0.9 ([#4539](https://github.com/aws-powertools/powertools-lambda-python/issues/4539)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4546](https://github.com/aws-powertools/powertools-lambda-python/issues/4546)) - **deps:** bump redis from 5.0.5 to 5.0.6 ([#4527](https://github.com/aws-powertools/powertools-lambda-python/issues/4527)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4580](https://github.com/aws-powertools/powertools-lambda-python/issues/4580)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#4635](https://github.com/aws-powertools/powertools-lambda-python/issues/4635)) - **deps:** bump codecov/codecov-action from 4.4.1 to 4.5.0 ([#4514](https://github.com/aws-powertools/powertools-lambda-python/issues/4514)) - **deps:** bump pypa/gh-action-pypi-publish from 1.8.14 to 1.9.0 ([#4538](https://github.com/aws-powertools/powertools-lambda-python/issues/4538)) - **deps:** bump fastjsonschema from 2.19.1 to 2.20.0 ([#4543](https://github.com/aws-powertools/powertools-lambda-python/issues/4543)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.189 to 0.1.192 ([#4578](https://github.com/aws-powertools/powertools-lambda-python/issues/4578)) - **deps-dev:** bump sentry-sdk from 2.5.1 to 2.6.0 ([#4579](https://github.com/aws-powertools/powertools-lambda-python/issues/4579)) - **deps-dev:** bump cfn-lint from 0.87.7 to 1.3.0 ([#4577](https://github.com/aws-powertools/powertools-lambda-python/issues/4577)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.192 to 0.1.193 ([#4596](https://github.com/aws-powertools/powertools-lambda-python/issues/4596)) - **deps-dev:** bump ruff from 0.4.9 to 0.4.10 ([#4594](https://github.com/aws-powertools/powertools-lambda-python/issues/4594)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.193 to 0.1.194 ([#4601](https://github.com/aws-powertools/powertools-lambda-python/issues/4601)) - **deps-dev:** bump aws-cdk from 2.146.0 to 2.147.0 ([#4604](https://github.com/aws-powertools/powertools-lambda-python/issues/4604)) - **deps-dev:** bump aws-cdk-lib from 2.146.0 to 2.147.0 ([#4603](https://github.com/aws-powertools/powertools-lambda-python/issues/4603)) - **deps-dev:** bump filelock from 3.15.1 to 3.15.3 ([#4576](https://github.com/aws-powertools/powertools-lambda-python/issues/4576)) - **deps-dev:** bump hvac from 2.2.0 to 2.3.0 ([#4563](https://github.com/aws-powertools/powertools-lambda-python/issues/4563)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.188 to 0.1.189 ([#4564](https://github.com/aws-powertools/powertools-lambda-python/issues/4564)) - **deps-dev:** bump cfn-lint from 1.3.0 to 1.3.3 ([#4602](https://github.com/aws-powertools/powertools-lambda-python/issues/4602)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.194 to 0.1.198 ([#4627](https://github.com/aws-powertools/powertools-lambda-python/issues/4627)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.146.0a0 to 2.147.0a0 ([#4619](https://github.com/aws-powertools/powertools-lambda-python/issues/4619)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.184 to 0.1.188 
([#4550](https://github.com/aws-powertools/powertools-lambda-python/issues/4550)) - **deps-dev:** bump mkdocs-material from 9.5.26 to 9.5.27 ([#4544](https://github.com/aws-powertools/powertools-lambda-python/issues/4544)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.145.0a0 to 2.146.0a0 ([#4542](https://github.com/aws-powertools/powertools-lambda-python/issues/4542)) - **deps-dev:** bump urllib3 from 1.26.18 to 1.26.19 in /layer ([#4547](https://github.com/aws-powertools/powertools-lambda-python/issues/4547)) - **deps-dev:** bump aws-cdk-lib from 2.145.0 to 2.146.0 ([#4526](https://github.com/aws-powertools/powertools-lambda-python/issues/4526)) - **deps-dev:** bump aws-cdk from 2.147.0 to 2.147.1 ([#4614](https://github.com/aws-powertools/powertools-lambda-python/issues/4614)) - **deps-dev:** bump coverage from 7.5.3 to 7.5.4 ([#4617](https://github.com/aws-powertools/powertools-lambda-python/issues/4617)) - **deps-dev:** bump aws-cdk-lib from 2.147.0 to 2.147.1 ([#4615](https://github.com/aws-powertools/powertools-lambda-python/issues/4615)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.34.125 to 1.34.128 in the boto-typing group ([#4541](https://github.com/aws-powertools/powertools-lambda-python/issues/4541)) - **deps-dev:** bump pdoc3 from 0.10.0 to 0.11.0 ([#4618](https://github.com/aws-powertools/powertools-lambda-python/issues/4618)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.34.109 to 1.34.125 in the boto-typing group ([#4509](https://github.com/aws-powertools/powertools-lambda-python/issues/4509)) - **deps-dev:** bump mike from 2.1.1 to 2.1.2 ([#4616](https://github.com/aws-powertools/powertools-lambda-python/issues/4616)) - **deps-dev:** bump mypy from 1.10.0 to 1.10.1 ([#4624](https://github.com/aws-powertools/powertools-lambda-python/issues/4624)) - **deps-dev:** bump filelock from 3.15.3 to 3.15.4 ([#4626](https://github.com/aws-powertools/powertools-lambda-python/issues/4626)) - **deps-dev:** bump ruff from 0.4.8 to 0.4.9 ([#4528](https://github.com/aws-powertools/powertools-lambda-python/issues/4528)) - **deps-dev:** bump cfn-lint from 1.3.3 to 1.3.5 ([#4628](https://github.com/aws-powertools/powertools-lambda-python/issues/4628)) - **deps-dev:** bump mypy-boto3-ssm from 1.34.91 to 1.34.132 in the boto-typing group ([#4623](https://github.com/aws-powertools/powertools-lambda-python/issues/4623)) - **deps-dev:** bump aws-cdk from 2.145.0 to 2.146.0 ([#4525](https://github.com/aws-powertools/powertools-lambda-python/issues/4525)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.182 to 0.1.184 ([#4529](https://github.com/aws-powertools/powertools-lambda-python/issues/4529)) - **deps-dev:** bump bandit from 1.7.8 to 1.7.9 ([#4511](https://github.com/aws-powertools/powertools-lambda-python/issues/4511)) - **deps-dev:** bump cfn-lint from 0.87.6 to 0.87.7 ([#4513](https://github.com/aws-powertools/powertools-lambda-python/issues/4513)) - **deps-dev:** bump filelock from 3.14.0 to 3.15.1 ([#4512](https://github.com/aws-powertools/powertools-lambda-python/issues/4512)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.179 to 0.1.182 ([#4510](https://github.com/aws-powertools/powertools-lambda-python/issues/4510)) - **deps-dev:** bump cfn-lint from 1.3.5 to 1.3.7 ([#4634](https://github.com/aws-powertools/powertools-lambda-python/issues/4634)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.34.114 to 1.34.131 in the boto-typing group ([#4593](https://github.com/aws-powertools/powertools-lambda-python/issues/4593)) 
- **governance:** fix errors when creating Gitpod environment ([#4532](https://github.com/aws-powertools/powertools-lambda-python/issues/4532))
- **layers:** downgrade aws cdk to 2.145.0 ([#4640](https://github.com/aws-powertools/powertools-lambda-python/issues/4640))

## [v2.39.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.39.0...v2.39.1) - 2024-06-13

## Bug Fixes

- **event_handler:** regression making pydantic required (it should not) ([#4500](https://github.com/aws-powertools/powertools-lambda-python/issues/4500))

## Maintenance

- version bump

## [v2.39.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.38.1...v2.39.0) - 2024-06-13

## Bug Fixes

- **event_handler:** do not skip middleware and exception handlers on 404 error ([#4492](https://github.com/aws-powertools/powertools-lambda-python/issues/4492))
- **event_handler:** raise more specific SerializationError exception for unsupported types in data validation ([#4415](https://github.com/aws-powertools/powertools-lambda-python/issues/4415))
- **event_handler:** security scheme unhashable list when working with router ([#4421](https://github.com/aws-powertools/powertools-lambda-python/issues/4421))
- **event_handler:** CORS Origin for ALBResolver multi-headers ([#4385](https://github.com/aws-powertools/powertools-lambda-python/issues/4385))
- **idempotency:** POWERTOOLS_IDEMPOTENCY_DISABLED should respect truthy values ([#4391](https://github.com/aws-powertools/powertools-lambda-python/issues/4391))

## Documentation

- **homepage:** Change installation to CDK v2 ([#4351](https://github.com/aws-powertools/powertools-lambda-python/issues/4351))
- **public reference:** add Recast as a public reference ([#4491](https://github.com/aws-powertools/powertools-lambda-python/issues/4491))

## Features

- **event_source:** add CloudFormationCustomResourceEvent data class. ([#4342](https://github.com/aws-powertools/powertools-lambda-python/issues/4342))
- **events:** Update and Add Cognito User Pool Events ([#4423](https://github.com/aws-powertools/powertools-lambda-python/issues/4423))
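
To make the new CloudFormationCustomResourceEvent data class above more concrete, here is a minimal sketch of consuming a CloudFormation custom resource event with it. The import path and property names (`request_type`, `resource_properties`, `physical_resource_id`) follow the usual data-class conventions but are assumptions on my part; confirm them against [#4342](https://github.com/aws-powertools/powertools-lambda-python/issues/4342) and the data classes documentation.

```
# Import path assumed; the class may also be re-exported from
# aws_lambda_powertools.utilities.data_classes -- check the docs.
from aws_lambda_powertools.utilities.data_classes import (
    CloudFormationCustomResourceEvent,
    event_source,
)


@event_source(data_class=CloudFormationCustomResourceEvent)
def lambda_handler(event: CloudFormationCustomResourceEvent, context):
    # Assumed properties: the Create/Update/Delete request type and the
    # resource properties passed from the CloudFormation template.
    if event.request_type == "Create":
        name = event.resource_properties.get("Name", "unnamed")
        return {"PhysicalResourceId": f"my-resource-{name}"}
    return {"PhysicalResourceId": event.physical_resource_id}
```
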

## Maintenance

- version bump
- **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4369](https://github.com/aws-powertools/powertools-lambda-python/issues/4369))
- **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4468](https://github.com/aws-powertools/powertools-lambda-python/issues/4468))
- **deps:** bump datadog-lambda from 5.94.0 to 6.95.0 ([#4471](https://github.com/aws-powertools/powertools-lambda-python/issues/4471))
- **deps:** bump redis from 5.0.4 to 5.0.5 ([#4464](https://github.com/aws-powertools/powertools-lambda-python/issues/4464))
- **deps:** bump aws-encryption-sdk from 3.2.0 to 3.3.0 ([#4393](https://github.com/aws-powertools/powertools-lambda-python/issues/4393))
- **deps:** bump codecov/codecov-action from 4.4.0 to 4.4.1 ([#4376](https://github.com/aws-powertools/powertools-lambda-python/issues/4376))
- **deps:** bump squidfunk/mkdocs-material from `8a87f05` to `96abcbb` in /docs ([#4461](https://github.com/aws-powertools/powertools-lambda-python/issues/4461))
- **deps:** bump typing-extensions from 4.12.1 to 4.12.2 ([#4470](https://github.com/aws-powertools/powertools-lambda-python/issues/4470))
- **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#4396](https://github.com/aws-powertools/powertools-lambda-python/issues/4396))
- **deps:** bump aws-xray-sdk from 2.13.0 to 2.13.1 ([#4379](https://github.com/aws-powertools/powertools-lambda-python/issues/4379))
- **deps:** bump actions/dependency-review-action from 4.3.2 to 4.3.3 ([#4456](https://github.com/aws-powertools/powertools-lambda-python/issues/4456))
- **deps:** bump aws-xray-sdk from 2.13.1 to 2.14.0 ([#4453](https://github.com/aws-powertools/powertools-lambda-python/issues/4453))
- **deps:** bump typing-extensions from 4.11.0 to 4.12.0 ([#4404](https://github.com/aws-powertools/powertools-lambda-python/issues/4404))
- **deps:** bump squidfunk/mkdocs-material from `5358893` to `8a87f05` in /docs ([#4408](https://github.com/aws-powertools/powertools-lambda-python/issues/4408))
- **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.6 to 3.0.7 ([#4478](https://github.com/aws-powertools/powertools-lambda-python/issues/4478))
- **deps:** bump squidfunk/mkdocs-material from `48d1914` to `5358893` in /docs ([#4377](https://github.com/aws-powertools/powertools-lambda-python/issues/4377))
- **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4444](https://github.com/aws-powertools/powertools-lambda-python/issues/4444))
- **deps:** bump pydantic from 1.10.15 to 1.10.16 ([#4485](https://github.com/aws-powertools/powertools-lambda-python/issues/4485))
- **deps:** bump datadog-lambda from 6.95.0 to 6.96.0 ([#4489](https://github.com/aws-powertools/powertools-lambda-python/issues/4489))
- **deps:** bump actions/checkout from 4.1.6 to 4.1.7 ([#4493](https://github.com/aws-powertools/powertools-lambda-python/issues/4493))
- **deps:** bump typing-extensions from 4.12.0 to 4.12.1 ([#4440](https://github.com/aws-powertools/powertools-lambda-python/issues/4440))
- **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.5 to 3.0.6 ([#4445](https://github.com/aws-powertools/powertools-lambda-python/issues/4445))
-
**deps:** bump requests from 2.31.0 to 2.32.0 ([#4383](https://github.com/aws-powertools/powertools-lambda-python/issues/4383)) - **deps-dev:** bump aws-cdk from 2.143.1 to 2.144.0 ([#4443](https://github.com/aws-powertools/powertools-lambda-python/issues/4443)) - **deps-dev:** bump aws-cdk-lib from 2.143.1 to 2.144.0 ([#4441](https://github.com/aws-powertools/powertools-lambda-python/issues/4441)) - **deps-dev:** bump ruff from 0.4.6 to 0.4.7 ([#4435](https://github.com/aws-powertools/powertools-lambda-python/issues/4435)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.143.0a0 to 2.143.1a0 ([#4433](https://github.com/aws-powertools/powertools-lambda-python/issues/4433)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.164 to 0.1.169 ([#4442](https://github.com/aws-powertools/powertools-lambda-python/issues/4442)) - **deps-dev:** bump pytest from 8.2.1 to 8.2.2 ([#4450](https://github.com/aws-powertools/powertools-lambda-python/issues/4450)) - **deps-dev:** bump aws-cdk from 2.143.0 to 2.143.1 ([#4430](https://github.com/aws-powertools/powertools-lambda-python/issues/4430)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.163 to 0.1.164 ([#4428](https://github.com/aws-powertools/powertools-lambda-python/issues/4428)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.161 to 0.1.163 ([#4425](https://github.com/aws-powertools/powertools-lambda-python/issues/4425)) - **deps-dev:** bump cfn-lint from 0.87.5 to 0.87.6 ([#4486](https://github.com/aws-powertools/powertools-lambda-python/issues/4486)) - **deps-dev:** bump sentry-sdk from 2.3.1 to 2.4.0 ([#4449](https://github.com/aws-powertools/powertools-lambda-python/issues/4449)) - **deps-dev:** bump ruff from 0.4.5 to 0.4.6 ([#4417](https://github.com/aws-powertools/powertools-lambda-python/issues/4417)) - **deps-dev:** bump cfn-lint from 0.87.3 to 0.87.4 ([#4419](https://github.com/aws-powertools/powertools-lambda-python/issues/4419)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.159 to 0.1.161 ([#4420](https://github.com/aws-powertools/powertools-lambda-python/issues/4420)) - **deps-dev:** bump coverage from 7.5.2 to 7.5.3 ([#4418](https://github.com/aws-powertools/powertools-lambda-python/issues/4418)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.34.113 to 1.34.114 in the boto-typing group ([#4416](https://github.com/aws-powertools/powertools-lambda-python/issues/4416)) - **deps-dev:** bump mkdocs-material from 9.5.24 to 9.5.25 ([#4411](https://github.com/aws-powertools/powertools-lambda-python/issues/4411)) - **deps-dev:** bump aws-cdk-lib from 2.143.0 to 2.143.1 ([#4429](https://github.com/aws-powertools/powertools-lambda-python/issues/4429)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.142.1a0 to 2.143.0a0 ([#4410](https://github.com/aws-powertools/powertools-lambda-python/issues/4410)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.34.97 to 1.34.113 in the boto-typing group ([#4409](https://github.com/aws-powertools/powertools-lambda-python/issues/4409)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.158 to 0.1.159 ([#4412](https://github.com/aws-powertools/powertools-lambda-python/issues/4412)) - **deps-dev:** bump coverage from 7.5.1 to 7.5.2 ([#4413](https://github.com/aws-powertools/powertools-lambda-python/issues/4413)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.143.1a0 to 2.144.0a0 ([#4448](https://github.com/aws-powertools/powertools-lambda-python/issues/4448)) - **deps-dev:** bump 
mypy-boto3-s3 from 1.34.105 to 1.34.120 in the boto-typing group ([#4452](https://github.com/aws-powertools/powertools-lambda-python/issues/4452)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.169 to 0.1.173 ([#4459](https://github.com/aws-powertools/powertools-lambda-python/issues/4459)) - **deps-dev:** bump aws-cdk-lib from 2.142.1 to 2.143.0 ([#4403](https://github.com/aws-powertools/powertools-lambda-python/issues/4403)) - **deps-dev:** bump aws-cdk from 2.142.1 to 2.143.0 ([#4402](https://github.com/aws-powertools/powertools-lambda-python/issues/4402)) - **deps-dev:** bump ruff from 0.4.4 to 0.4.5 ([#4399](https://github.com/aws-powertools/powertools-lambda-python/issues/4399)) - **deps-dev:** bump sentry-sdk from 2.2.1 to 2.3.1 ([#4398](https://github.com/aws-powertools/powertools-lambda-python/issues/4398)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.157 to 0.1.158 ([#4397](https://github.com/aws-powertools/powertools-lambda-python/issues/4397)) - **deps-dev:** bump ruff from 0.4.7 to 0.4.8 ([#4455](https://github.com/aws-powertools/powertools-lambda-python/issues/4455)) - **deps-dev:** bump sentry-sdk from 2.4.0 to 2.5.0 ([#4462](https://github.com/aws-powertools/powertools-lambda-python/issues/4462)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.155 to 0.1.157 ([#4394](https://github.com/aws-powertools/powertools-lambda-python/issues/4394)) - **deps-dev:** bump mkdocs-material from 9.5.25 to 9.5.26 ([#4463](https://github.com/aws-powertools/powertools-lambda-python/issues/4463)) - **deps-dev:** bump mypy-boto3-cloudformation from 1.34.84 to 1.34.111 in the boto-typing group ([#4392](https://github.com/aws-powertools/powertools-lambda-python/issues/4392)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.154 to 0.1.155 ([#4386](https://github.com/aws-powertools/powertools-lambda-python/issues/4386)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.173 to 0.1.174 ([#4466](https://github.com/aws-powertools/powertools-lambda-python/issues/4466)) - **deps-dev:** bump pytest-asyncio from 0.23.6 to 0.23.7 ([#4387](https://github.com/aws-powertools/powertools-lambda-python/issues/4387)) - **deps-dev:** bump sentry-sdk from 2.2.0 to 2.2.1 ([#4388](https://github.com/aws-powertools/powertools-lambda-python/issues/4388)) - **deps-dev:** bump ijson from 3.2.3 to 3.3.0 ([#4465](https://github.com/aws-powertools/powertools-lambda-python/issues/4465)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.152 to 0.1.154 ([#4382](https://github.com/aws-powertools/powertools-lambda-python/issues/4382)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.174 to 0.1.175 ([#4472](https://github.com/aws-powertools/powertools-lambda-python/issues/4472)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.34.107 to 1.34.109 in the boto-typing group ([#4378](https://github.com/aws-powertools/powertools-lambda-python/issues/4378)) - **deps-dev:** bump sentry-sdk from 2.5.0 to 2.5.1 ([#4469](https://github.com/aws-powertools/powertools-lambda-python/issues/4469)) - **deps-dev:** bump cfn-lint from 0.87.4 to 0.87.5 ([#4479](https://github.com/aws-powertools/powertools-lambda-python/issues/4479)) - **deps-dev:** bump mkdocs-material from 9.5.23 to 9.5.24 ([#4380](https://github.com/aws-powertools/powertools-lambda-python/issues/4380)) - **deps-dev:** bump pytest from 8.2.0 to 8.2.1 ([#4381](https://github.com/aws-powertools/powertools-lambda-python/issues/4381)) - **deps-dev:** 
bump aws-cdk from 2.144.0 to 2.145.0 ([#4482](https://github.com/aws-powertools/powertools-lambda-python/issues/4482))
- **deps-dev:** bump aws-cdk-lib from 2.144.0 to 2.145.0 ([#4481](https://github.com/aws-powertools/powertools-lambda-python/issues/4481))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.141.0a0 to 2.142.1a0 ([#4367](https://github.com/aws-powertools/powertools-lambda-python/issues/4367))
- **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.144.0a0 to 2.145.0a0 ([#4487](https://github.com/aws-powertools/powertools-lambda-python/issues/4487))
- **deps-dev:** bump aws-cdk from 2.142.0 to 2.142.1 ([#4366](https://github.com/aws-powertools/powertools-lambda-python/issues/4366))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.150 to 0.1.152 ([#4368](https://github.com/aws-powertools/powertools-lambda-python/issues/4368))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.176 to 0.1.179 ([#4488](https://github.com/aws-powertools/powertools-lambda-python/issues/4488))
- **deps-dev:** bump cfn-lint from 0.87.2 to 0.87.3 ([#4370](https://github.com/aws-powertools/powertools-lambda-python/issues/4370))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.175 to 0.1.176 ([#4480](https://github.com/aws-powertools/powertools-lambda-python/issues/4480))
- **libraries:** add jmespath as a required dependency ([#4422](https://github.com/aws-powertools/powertools-lambda-python/issues/4422))

## [v2.38.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.38.0...v2.38.1) - 2024-05-17

## Bug Fixes

- **logger:** reverting logger child modification ([#4363](https://github.com/aws-powertools/powertools-lambda-python/issues/4363))

## Maintenance

- version bump

## [v2.38.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.37.0...v2.38.0) - 2024-05-17

## Bug Fixes

- **ci:** apply lessons learned to monthly roadmap reminder cross-repo ([#4078](https://github.com/aws-powertools/powertools-lambda-python/issues/4078))
- **event-sources:** sane defaults for authorizer v1 and v2 ([#4298](https://github.com/aws-powertools/powertools-lambda-python/issues/4298))
- **logger:** correctly pick powertools or custom handler in custom environments ([#4295](https://github.com/aws-powertools/powertools-lambda-python/issues/4295))
- **parser:** make etag optional field on S3 notification events ([#4173](https://github.com/aws-powertools/powertools-lambda-python/issues/4173))
- **typing:** resolved_headers_field is not Optional ([#4148](https://github.com/aws-powertools/powertools-lambda-python/issues/4148))

## Code Refactoring

- **data-masking:** remove Non-GA comments ([#4334](https://github.com/aws-powertools/powertools-lambda-python/issues/4334))
- **parser:** only infer type hints when necessary ([#4183](https://github.com/aws-powertools/powertools-lambda-python/issues/4183))

## Documentation

- **general:** update documentation to add info about v3 ([#4234](https://github.com/aws-powertools/powertools-lambda-python/issues/4234))
- **homepage:** add link to new and official workshop ([#4292](https://github.com/aws-powertools/powertools-lambda-python/issues/4292))
- **idempotency:** fix highlight and import path ([#4154](https://github.com/aws-powertools/powertools-lambda-python/issues/4154))
- **roadmap:** april updates ([#4181](https://github.com/aws-powertools/powertools-lambda-python/issues/4181))

## Features

- **event_handler:** add support for persisting authorization session in OpenAPI ([#4312](https://github.com/aws-powertools/powertools-lambda-python/issues/4312))
- **event_handler:** add decorator for HTTP HEAD verb ([#4275](https://github.com/aws-powertools/powertools-lambda-python/issues/4275))
- **logger-utils:** preserve log level for discovered third-party top-level loggers ([#4299](https://github.com/aws-powertools/powertools-lambda-python/issues/4299))
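
The HEAD-verb entry above rounds out the existing verb helpers (`get`, `post`, and so on) on the REST resolver. A minimal sketch is below; it assumes the new decorator is exposed as `head()` and mirrors its siblings, so treat the name and behavior as an assumption and check [#4275](https://github.com/aws-powertools/powertools-lambda-python/issues/4275) for the definitive API.

```
from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response

app = APIGatewayRestResolver()


# Assumed decorator name, mirroring app.get()/app.post().
@app.head("/items")
def items_head():
    # HEAD responses carry headers only, so return an empty body with metadata.
    return Response(
        status_code=200,
        content_type="application/json",
        body="",
        headers={"X-Total-Count": "42"},
    )


@app.get("/items")
def items_get():
    return {"items": []}


def lambda_handler(event, context):
    return app.resolve(event, context)
```
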

## Maintenance

- version bump
- **ci:** bump upload artifact action to v4 ([#4355](https://github.com/aws-powertools/powertools-lambda-python/issues/4355))
- **ci:** add branch v3 to quality check and e2e actions ([#4232](https://github.com/aws-powertools/powertools-lambda-python/issues/4232))
- **ci:** bump download artifact action to v4 ([#4358](https://github.com/aws-powertools/powertools-lambda-python/issues/4358))
- **deps:** bump actions/download-artifact from 4.1.4 to 4.1.5 ([#4161](https://github.com/aws-powertools/powertools-lambda-python/issues/4161))
- **deps:** bump actions/checkout from 4.1.3 to 4.1.4 ([#4206](https://github.com/aws-powertools/powertools-lambda-python/issues/4206))
- **deps:** bump ossf/scorecard-action from 2.3.1 to 2.3.3 ([#4315](https://github.com/aws-powertools/powertools-lambda-python/issues/4315))
- **deps:** bump github.com/aws/aws-sdk-go-v2/config from 1.27.12 to 1.27.13 in /layer/scripts/layer-balancer in the layer-balancer group ([#4319](https://github.com/aws-powertools/powertools-lambda-python/issues/4319))
- **deps:** bump actions/download-artifact from 4.1.6 to 4.1.7 ([#4205](https://github.com/aws-powertools/powertools-lambda-python/issues/4205))
- **deps:** bump squidfunk/mkdocs-material from `521644b` to `e309089` in /docs ([#4216](https://github.com/aws-powertools/powertools-lambda-python/issues/4216))
- **deps:** bump squidfunk/mkdocs-material from `11d7ec0` to `8ef47d7` in /docs ([#4323](https://github.com/aws-powertools/powertools-lambda-python/issues/4323))
- **deps:** bump datadog-lambda from 5.92.0 to 5.93.0 ([#4211](https://github.com/aws-powertools/powertools-lambda-python/issues/4211))
- **deps:** bump redis from 5.0.3 to 5.0.4 ([#4187](https://github.com/aws-powertools/powertools-lambda-python/issues/4187))
- **deps:** bump actions/upload-artifact from 4.3.2 to 4.3.3 ([#4177](https://github.com/aws-powertools/powertools-lambda-python/issues/4177))
- **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#4302](https://github.com/aws-powertools/powertools-lambda-python/issues/4302))
- **deps:** bump squidfunk/mkdocs-material from `8ef47d7` to `48d1914` in /docs ([#4336](https://github.com/aws-powertools/powertools-lambda-python/issues/4336))
- **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4337](https://github.com/aws-powertools/powertools-lambda-python/issues/4337))
- **deps:** bump squidfunk/mkdocs-material from `e309089` to `98c9809` in /docs ([#4236](https://github.com/aws-powertools/powertools-lambda-python/issues/4236))
- **deps:** bump actions/dependency-review-action from 4.3.1 to 4.3.2 ([#4244](https://github.com/aws-powertools/powertools-lambda-python/issues/4244))
- **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.4 to 3.0.5 ([#4281](https://github.com/aws-powertools/powertools-lambda-python/issues/4281))
- **deps:** bump actions/checkout from 4.1.4 to 4.1.5 ([#4282](https://github.com/aws-powertools/powertools-lambda-python/issues/4282))
- **deps:** bump jinja2 from 3.1.3 to 3.1.4 in /docs
([#4284](https://github.com/aws-powertools/powertools-lambda-python/issues/4284)) - **deps:** bump codecov/codecov-action from 4.3.0 to 4.3.1 ([#4252](https://github.com/aws-powertools/powertools-lambda-python/issues/4252)) - **deps:** bump datadog-lambda from 5.93.0 to 5.94.0 ([#4253](https://github.com/aws-powertools/powertools-lambda-python/issues/4253)) - **deps:** bump actions/checkout from 4.1.2 to 4.1.3 ([#4168](https://github.com/aws-powertools/powertools-lambda-python/issues/4168)) - **deps:** bump actions/dependency-review-action from 4.2.5 to 4.3.1 ([#4240](https://github.com/aws-powertools/powertools-lambda-python/issues/4240)) - **deps:** bump actions/checkout from 4.1.5 to 4.1.6 ([#4344](https://github.com/aws-powertools/powertools-lambda-python/issues/4344)) - **deps:** bump squidfunk/mkdocs-material from `98c9809` to `11d7ec0` in /docs ([#4269](https://github.com/aws-powertools/powertools-lambda-python/issues/4269)) - **deps:** bump actions/upload-artifact from 4.3.1 to 4.3.2 ([#4162](https://github.com/aws-powertools/powertools-lambda-python/issues/4162)) - **deps:** bump codecov/codecov-action from 4.3.1 to 4.4.0 ([#4328](https://github.com/aws-powertools/powertools-lambda-python/issues/4328)) - **deps:** bump slsa-framework/slsa-github-generator from 1.10.0 to 2.0.0 ([#4179](https://github.com/aws-powertools/powertools-lambda-python/issues/4179)) - **deps:** bump actions/download-artifact from 4.1.5 to 4.1.6 ([#4178](https://github.com/aws-powertools/powertools-lambda-python/issues/4178)) - **deps-dev:** bump mkdocs-material from 9.5.20 to 9.5.21 ([#4271](https://github.com/aws-powertools/powertools-lambda-python/issues/4271)) - **deps-dev:** bump mike from 2.1.0 to 2.1.1 ([#4268](https://github.com/aws-powertools/powertools-lambda-python/issues/4268)) - **deps-dev:** bump cfn-lint from 0.87.0 to 0.87.1 ([#4272](https://github.com/aws-powertools/powertools-lambda-python/issues/4272)) - **deps-dev:** bump mike from 1.1.2 to 2.1.0 ([#4258](https://github.com/aws-powertools/powertools-lambda-python/issues/4258)) - **deps-dev:** bump aws-cdk-lib from 2.139.1 to 2.140.0 ([#4259](https://github.com/aws-powertools/powertools-lambda-python/issues/4259)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.139.1a0 to 2.140.0a0 ([#4270](https://github.com/aws-powertools/powertools-lambda-python/issues/4270)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.139.0a0 to 2.139.1a0 ([#4261](https://github.com/aws-powertools/powertools-lambda-python/issues/4261)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.133 to 0.1.134 ([#4260](https://github.com/aws-powertools/powertools-lambda-python/issues/4260)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.134 to 0.1.135 ([#4273](https://github.com/aws-powertools/powertools-lambda-python/issues/4273)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.34.91 to 1.34.97 in the boto-typing group ([#4257](https://github.com/aws-powertools/powertools-lambda-python/issues/4257)) - **deps-dev:** bump sentry-sdk from 2.0.1 to 2.1.1 ([#4287](https://github.com/aws-powertools/powertools-lambda-python/issues/4287)) - **deps-dev:** bump aws-cdk-lib from 2.139.0 to 2.139.1 ([#4248](https://github.com/aws-powertools/powertools-lambda-python/issues/4248)) - **deps-dev:** bump cfn-lint from 0.86.4 to 0.87.0 ([#4249](https://github.com/aws-powertools/powertools-lambda-python/issues/4249)) - **deps-dev:** bump pytest-xdist from 3.5.0 to 3.6.1 
([#4247](https://github.com/aws-powertools/powertools-lambda-python/issues/4247)) - **deps-dev:** bump ruff from 0.4.2 to 0.4.3 ([#4286](https://github.com/aws-powertools/powertools-lambda-python/issues/4286)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.132 to 0.1.133 ([#4246](https://github.com/aws-powertools/powertools-lambda-python/issues/4246)) - **deps-dev:** bump jinja2 from 3.1.3 to 3.1.4 ([#4283](https://github.com/aws-powertools/powertools-lambda-python/issues/4283)) - **deps-dev:** bump aws-cdk from 2.139.0 to 2.139.1 ([#4245](https://github.com/aws-powertools/powertools-lambda-python/issues/4245)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.135 to 0.1.136 ([#4285](https://github.com/aws-powertools/powertools-lambda-python/issues/4285)) - **deps-dev:** bump filelock from 3.13.4 to 3.14.0 ([#4241](https://github.com/aws-powertools/powertools-lambda-python/issues/4241)) - **deps-dev:** bump hvac from 2.1.0 to 2.2.0 ([#4238](https://github.com/aws-powertools/powertools-lambda-python/issues/4238)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.131 to 0.1.132 ([#4239](https://github.com/aws-powertools/powertools-lambda-python/issues/4239)) - **deps-dev:** bump mkdocs-material from 9.5.19 to 9.5.20 ([#4242](https://github.com/aws-powertools/powertools-lambda-python/issues/4242)) - **deps-dev:** bump aws-cdk from 2.139.1 to 2.140.0 ([#4256](https://github.com/aws-powertools/powertools-lambda-python/issues/4256)) - **deps-dev:** bump pytest from 8.1.1 to 8.2.0 ([#4237](https://github.com/aws-powertools/powertools-lambda-python/issues/4237)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.136 to 0.1.139 ([#4293](https://github.com/aws-powertools/powertools-lambda-python/issues/4293)) - **deps-dev:** bump aws-cdk-lib from 2.141.0 to 2.142.1 ([#4352](https://github.com/aws-powertools/powertools-lambda-python/issues/4352)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.139 to 0.1.140 ([#4301](https://github.com/aws-powertools/powertools-lambda-python/issues/4301)) - **deps-dev:** bump sentry-sdk from 1.45.0 to 2.0.1 ([#4223](https://github.com/aws-powertools/powertools-lambda-python/issues/4223)) - **deps-dev:** bump mkdocs-material from 9.5.18 to 9.5.19 ([#4224](https://github.com/aws-powertools/powertools-lambda-python/issues/4224)) - **deps-dev:** bump black from 24.4.1 to 24.4.2 ([#4222](https://github.com/aws-powertools/powertools-lambda-python/issues/4222)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.138.0a0 to 2.139.0a0 ([#4225](https://github.com/aws-powertools/powertools-lambda-python/issues/4225)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.130 to 0.1.131 ([#4221](https://github.com/aws-powertools/powertools-lambda-python/issues/4221)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.140 to 0.1.142 ([#4307](https://github.com/aws-powertools/powertools-lambda-python/issues/4307)) - **deps-dev:** bump ruff from 0.4.1 to 0.4.2 ([#4212](https://github.com/aws-powertools/powertools-lambda-python/issues/4212)) - **deps-dev:** bump aws-cdk-lib from 2.138.0 to 2.139.0 ([#4213](https://github.com/aws-powertools/powertools-lambda-python/issues/4213)) - **deps-dev:** bump aws-cdk from 2.138.0 to 2.139.0 ([#4215](https://github.com/aws-powertools/powertools-lambda-python/issues/4215)) - **deps-dev:** bump aws-cdk from 2.140.0 to 2.141.0 ([#4306](https://github.com/aws-powertools/powertools-lambda-python/issues/4306)) - **deps-dev:** 
bump types-redis from 4.6.0.20240423 to 4.6.0.20240425 ([#4214](https://github.com/aws-powertools/powertools-lambda-python/issues/4214)) - **deps-dev:** bump aws-cdk-lib from 2.140.0 to 2.141.0 ([#4308](https://github.com/aws-powertools/powertools-lambda-python/issues/4308)) - **deps-dev:** bump the boto-typing group with 2 updates ([#4210](https://github.com/aws-powertools/powertools-lambda-python/issues/4210)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.126 to 0.1.130 ([#4209](https://github.com/aws-powertools/powertools-lambda-python/issues/4209)) - **deps-dev:** bump ruff from 0.4.3 to 0.4.4 ([#4309](https://github.com/aws-powertools/powertools-lambda-python/issues/4309)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.140.0a0 to 2.141.0a0 ([#4318](https://github.com/aws-powertools/powertools-lambda-python/issues/4318)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.142 to 0.1.144 ([#4316](https://github.com/aws-powertools/powertools-lambda-python/issues/4316)) - **deps-dev:** bump black from 24.4.0 to 24.4.1 ([#4203](https://github.com/aws-powertools/powertools-lambda-python/issues/4203)) - **deps-dev:** bump mypy from 1.9.0 to 1.10.0 ([#4202](https://github.com/aws-powertools/powertools-lambda-python/issues/4202)) - **deps-dev:** bump mypy-boto3-ssm from 1.34.61 to 1.34.91 in the boto-typing group ([#4201](https://github.com/aws-powertools/powertools-lambda-python/issues/4201)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.123 to 0.1.126 ([#4188](https://github.com/aws-powertools/powertools-lambda-python/issues/4188)) - **deps-dev:** bump cfn-lint from 0.87.1 to 0.87.2 ([#4317](https://github.com/aws-powertools/powertools-lambda-python/issues/4317)) - **deps-dev:** bump coverage from 7.4.4 to 7.5.0 ([#4186](https://github.com/aws-powertools/powertools-lambda-python/issues/4186)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.144 to 0.1.145 ([#4325](https://github.com/aws-powertools/powertools-lambda-python/issues/4325)) - **deps-dev:** bump types-redis from 4.6.0.20240417 to 4.6.0.20240423 ([#4185](https://github.com/aws-powertools/powertools-lambda-python/issues/4185)) - **deps-dev:** bump mkdocs-material from 9.5.21 to 9.5.22 ([#4324](https://github.com/aws-powertools/powertools-lambda-python/issues/4324)) - **deps-dev:** bump mypy-boto3-s3 from 1.34.91 to 1.34.105 in the boto-typing group ([#4329](https://github.com/aws-powertools/powertools-lambda-python/issues/4329)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.145 to 0.1.146 ([#4330](https://github.com/aws-powertools/powertools-lambda-python/issues/4330)) - **deps-dev:** bump cfn-lint from 0.86.3 to 0.86.4 ([#4180](https://github.com/aws-powertools/powertools-lambda-python/issues/4180)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.121 to 0.1.123 ([#4176](https://github.com/aws-powertools/powertools-lambda-python/issues/4176)) - **deps-dev:** bump mkdocs-material from 9.5.22 to 9.5.23 ([#4338](https://github.com/aws-powertools/powertools-lambda-python/issues/4338)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.137.0a0 to 2.138.0a0 ([#4169](https://github.com/aws-powertools/powertools-lambda-python/issues/4169)) - **deps-dev:** bump aws-cdk from 2.141.0 to 2.142.0 ([#4343](https://github.com/aws-powertools/powertools-lambda-python/issues/4343)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.119 to 0.1.121 
([#4167](https://github.com/aws-powertools/powertools-lambda-python/issues/4167))
- **deps-dev:** bump ruff from 0.3.7 to 0.4.1 ([#4166](https://github.com/aws-powertools/powertools-lambda-python/issues/4166))
- **deps-dev:** bump mypy-boto3-secretsmanager from 1.34.72 to 1.34.107 in the boto-typing group ([#4345](https://github.com/aws-powertools/powertools-lambda-python/issues/4345))
- **deps-dev:** bump aws-cdk from 2.137.0 to 2.138.0 ([#4157](https://github.com/aws-powertools/powertools-lambda-python/issues/4157))
- **deps-dev:** bump aws-cdk-lib from 2.137.0 to 2.138.0 ([#4160](https://github.com/aws-powertools/powertools-lambda-python/issues/4160))
- **deps-dev:** bump sentry-sdk from 2.1.1 to 2.2.0 ([#4348](https://github.com/aws-powertools/powertools-lambda-python/issues/4348))
- **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.146 to 0.1.150 ([#4346](https://github.com/aws-powertools/powertools-lambda-python/issues/4346))
- **deps-dev:** bump coverage from 7.5.0 to 7.5.1 ([#4288](https://github.com/aws-powertools/powertools-lambda-python/issues/4288))
- **governance:** add FastAPI third party license attribution ([#4297](https://github.com/aws-powertools/powertools-lambda-python/issues/4297))

## [v2.37.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.36.0...v2.37.0) - 2024-04-18

## Bug Fixes

- **docs:** clarified usage of validation with fine grained responses ([#4101](https://github.com/aws-powertools/powertools-lambda-python/issues/4101))
- **event_source:** fix typo in physicalname attribute for AmazonMQ events ([#4053](https://github.com/aws-powertools/powertools-lambda-python/issues/4053))
- **typing:** make the case_sensitive field a boolean only ([#4128](https://github.com/aws-powertools/powertools-lambda-python/issues/4128))
- **typing:** improve overloads to ensure the return type follows the default_value type ([#4114](https://github.com/aws-powertools/powertools-lambda-python/issues/4114))

## Documentation

- **we-made-this:** new article on how to stream data with AWS Lambda & Powertools for AWS Lambda ([#4068](https://github.com/aws-powertools/powertools-lambda-python/issues/4068))

## Features

- **Idempotency:** add feature for manipulating idempotent responses ([#4037](https://github.com/aws-powertools/powertools-lambda-python/issues/4037))
- **event_handler:** add support for OpenAPI security schemes ([#4103](https://github.com/aws-powertools/powertools-lambda-python/issues/4103))
- **logger:** add method to return currently configured keys ([#4033](https://github.com/aws-powertools/powertools-lambda-python/issues/4033))

## Maintenance

- version bump
- **ci:** add monthly roadmap reminder workflow ([#4075](https://github.com/aws-powertools/powertools-lambda-python/issues/4075))
- **ci:** prevent deprecated custom runner from being used ([#4061](https://github.com/aws-powertools/powertools-lambda-python/issues/4061))
- **deps:** bump squidfunk/mkdocs-material from `065f3af` to `6b124e1` in /docs ([#4055](https://github.com/aws-powertools/powertools-lambda-python/issues/4055))
- **deps:** bump squidfunk/mkdocs-material from `3307665` to `065f3af` in /docs ([#4052](https://github.com/aws-powertools/powertools-lambda-python/issues/4052))
- **deps:** bump idna from 3.6 to 3.7 ([#4121](https://github.com/aws-powertools/powertools-lambda-python/issues/4121))
- **deps:** bump sqlparse from 0.4.4 to 0.5.0 ([#4138](https://github.com/aws-powertools/powertools-lambda-python/issues/4138))
- **deps:** bump squidfunk/mkdocs-material from
`6b124e1` to `521644b` in /docs ([#4141](https://github.com/aws-powertools/powertools-lambda-python/issues/4141)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#4066](https://github.com/aws-powertools/powertools-lambda-python/issues/4066)) - **deps:** bump pydantic from 1.10.14 to 1.10.15 ([#4064](https://github.com/aws-powertools/powertools-lambda-python/issues/4064)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#4042](https://github.com/aws-powertools/powertools-lambda-python/issues/4042)) - **deps:** bump golang.org/x/sync from 0.6.0 to 0.7.0 in /layer/scripts/layer-balancer in the layer-balancer group ([#4071](https://github.com/aws-powertools/powertools-lambda-python/issues/4071)) - **deps:** bump codecov/codecov-action from 4.1.1 to 4.2.0 ([#4072](https://github.com/aws-powertools/powertools-lambda-python/issues/4072)) - **deps:** bump datadog-lambda from 5.91.0 to 5.92.0 ([#4038](https://github.com/aws-powertools/powertools-lambda-python/issues/4038)) - **deps:** bump github.com/aws/aws-sdk-go-v2/config from 1.27.10 to 1.27.11 in /layer/scripts/layer-balancer in the layer-balancer group ([#4079](https://github.com/aws-powertools/powertools-lambda-python/issues/4079)) - **deps:** bump typing-extensions from 4.10.0 to 4.11.0 ([#4080](https://github.com/aws-powertools/powertools-lambda-python/issues/4080)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.3 to 3.0.4 ([#4099](https://github.com/aws-powertools/powertools-lambda-python/issues/4099)) - **deps:** bump codecov/codecov-action from 4.2.0 to 4.3.0 ([#4098](https://github.com/aws-powertools/powertools-lambda-python/issues/4098)) - **deps:** bump docker/setup-buildx-action from 3.2.0 to 3.3.0 ([#4091](https://github.com/aws-powertools/powertools-lambda-python/issues/4091)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.112 to 0.1.113 ([#4136](https://github.com/aws-powertools/powertools-lambda-python/issues/4136)) - **deps-dev:** bump aws-cdk from 2.135.0 to 2.136.0 ([#4090](https://github.com/aws-powertools/powertools-lambda-python/issues/4090)) - **deps-dev:** bump types-redis from 4.6.0.20240311 to 4.6.0.20240409 ([#4094](https://github.com/aws-powertools/powertools-lambda-python/issues/4094)) - **deps-dev:** bump aws-cdk-lib from 2.135.0 to 2.136.0 ([#4092](https://github.com/aws-powertools/powertools-lambda-python/issues/4092)) - **deps-dev:** bump cfn-lint from 0.86.1 to 0.86.2 ([#4081](https://github.com/aws-powertools/powertools-lambda-python/issues/4081)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.135.0a0 to 2.136.0a0 ([#4095](https://github.com/aws-powertools/powertools-lambda-python/issues/4095)) - **deps-dev:** bump filelock from 3.13.3 to 3.13.4 ([#4096](https://github.com/aws-powertools/powertools-lambda-python/issues/4096)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.106 to 0.1.107 ([#4082](https://github.com/aws-powertools/powertools-lambda-python/issues/4082)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.107 to 0.1.110 ([#4097](https://github.com/aws-powertools/powertools-lambda-python/issues/4097)) - **deps-dev:** bump aws-cdk from 2.136.0 to 2.136.1 ([#4106](https://github.com/aws-powertools/powertools-lambda-python/issues/4106)) - **deps-dev:** bump aws-cdk-lib from 2.136.0 to 2.136.1 ([#4107](https://github.com/aws-powertools/powertools-lambda-python/issues/4107)) - **deps-dev:** bump sentry-sdk from 1.44.1 to 
1.45.0 ([#4108](https://github.com/aws-powertools/powertools-lambda-python/issues/4108)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.110 to 0.1.112 ([#4109](https://github.com/aws-powertools/powertools-lambda-python/issues/4109)) - **deps-dev:** bump sentry-sdk from 1.44.0 to 1.44.1 ([#4065](https://github.com/aws-powertools/powertools-lambda-python/issues/4065)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.134.0a0 to 2.135.0a0 ([#4063](https://github.com/aws-powertools/powertools-lambda-python/issues/4063)) - **deps-dev:** bump aws-cdk from 2.136.1 to 2.137.0 ([#4115](https://github.com/aws-powertools/powertools-lambda-python/issues/4115)) - **deps-dev:** bump the boto-typing group with 2 updates ([#4062](https://github.com/aws-powertools/powertools-lambda-python/issues/4062)) - **deps-dev:** bump mypy-boto3-cloudwatch from 1.34.75 to 1.34.83 in the boto-typing group ([#4116](https://github.com/aws-powertools/powertools-lambda-python/issues/4116)) - **deps-dev:** bump ruff from 0.3.5 to 0.3.7 ([#4123](https://github.com/aws-powertools/powertools-lambda-python/issues/4123)) - **deps-dev:** bump aws-cdk-lib from 2.136.1 to 2.137.0 ([#4119](https://github.com/aws-powertools/powertools-lambda-python/issues/4119)) - **deps-dev:** bump aws-cdk from 2.134.0 to 2.135.0 ([#4058](https://github.com/aws-powertools/powertools-lambda-python/issues/4058)) - **deps-dev:** bump aws-cdk-lib from 2.134.0 to 2.135.0 ([#4057](https://github.com/aws-powertools/powertools-lambda-python/issues/4057)) - **deps-dev:** bump mkdocs-material from 9.5.16 to 9.5.17 ([#4056](https://github.com/aws-powertools/powertools-lambda-python/issues/4056)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.136.0a0 to 2.137.0a0 ([#4124](https://github.com/aws-powertools/powertools-lambda-python/issues/4124)) - **deps-dev:** bump mypy-boto3-cloudformation from 1.34.77 to 1.34.84 in the boto-typing group ([#4126](https://github.com/aws-powertools/powertools-lambda-python/issues/4126)) - **deps-dev:** bump ruff from 0.3.4 to 0.3.5 ([#4049](https://github.com/aws-powertools/powertools-lambda-python/issues/4049)) - **deps-dev:** bump mkdocs-material from 9.5.15 to 9.5.16 ([#4050](https://github.com/aws-powertools/powertools-lambda-python/issues/4050)) - **deps-dev:** bump the boto-typing group with 1 update ([#4047](https://github.com/aws-powertools/powertools-lambda-python/issues/4047)) - **deps-dev:** bump black from 24.3.0 to 24.4.0 ([#4135](https://github.com/aws-powertools/powertools-lambda-python/issues/4135)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.105 to 0.1.106 ([#4048](https://github.com/aws-powertools/powertools-lambda-python/issues/4048)) - **deps-dev:** bump cfn-lint from 0.86.2 to 0.86.3 ([#4137](https://github.com/aws-powertools/powertools-lambda-python/issues/4137)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.113 to 0.1.115 ([#4142](https://github.com/aws-powertools/powertools-lambda-python/issues/4142)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.133.0a0 to 2.134.0a0 ([#4039](https://github.com/aws-powertools/powertools-lambda-python/issues/4039)) - **deps-dev:** bump mkdocs-material from 9.5.17 to 9.5.18 ([#4143](https://github.com/aws-powertools/powertools-lambda-python/issues/4143)) - **deps-dev:** bump types-redis from 4.6.0.20240409 to 4.6.0.20240417 ([#4145](https://github.com/aws-powertools/powertools-lambda-python/issues/4145)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.115 
to 0.1.119 ([#4150](https://github.com/aws-powertools/powertools-lambda-python/issues/4150)) - **deps-dev:** bump aws-cdk-lib from 2.133.0 to 2.134.0 ([#4031](https://github.com/aws-powertools/powertools-lambda-python/issues/4031)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.104 to 0.1.105 ([#4030](https://github.com/aws-powertools/powertools-lambda-python/issues/4030)) - **deps-dev:** bump aws-cdk from 2.133.0 to 2.134.0 ([#4032](https://github.com/aws-powertools/powertools-lambda-python/issues/4032)) - **deps-dev:** bump the boto-typing group with 1 update ([#4029](https://github.com/aws-powertools/powertools-lambda-python/issues/4029)) - **deps-dev:** bump sentry-sdk from 1.43.0 to 1.44.0 ([#4040](https://github.com/aws-powertools/powertools-lambda-python/issues/4040)) - **docs:** update highlighted lines on the Typing examples ([#4131](https://github.com/aws-powertools/powertools-lambda-python/issues/4131)) ## [v2.36.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.35.1...v2.36.0) - 2024-03-27 ## Bug Fixes - **event_handler:** always add 422 response to the schema ([#3995](https://github.com/aws-powertools/powertools-lambda-python/issues/3995)) - **event_handler:** make decoded_body field optional in ApiGateway resolver ([#3937](https://github.com/aws-powertools/powertools-lambda-python/issues/3937)) - **tracer:** add name sanitization for X-Ray subsegments ([#4005](https://github.com/aws-powertools/powertools-lambda-python/issues/4005)) ## Code Refactoring - **logger:** add type annotation for append_keys method ([#3988](https://github.com/aws-powertools/powertools-lambda-python/issues/3988)) - **parameters:** improve typing for get_secret method ([#3910](https://github.com/aws-powertools/powertools-lambda-python/issues/3910)) ## Documentation - **batch:** improved the example demonstrating how to create a custom partial processor. 
([#4024](https://github.com/aws-powertools/powertools-lambda-python/issues/4024)) - **bedrock-agents:** fix type in Bedrock operation example ([#3948](https://github.com/aws-powertools/powertools-lambda-python/issues/3948)) - **tutorial:** fix "Simplifying with Tracer" section in the tutorial ([#3962](https://github.com/aws-powertools/powertools-lambda-python/issues/3962)) ## Features - **batch:** add flag in SqsFifoProcessor to enable continuous message processing ([#3954](https://github.com/aws-powertools/powertools-lambda-python/issues/3954)) - **data_classes:** Add CloudWatchAlarmEvent data class ([#3868](https://github.com/aws-powertools/powertools-lambda-python/issues/3868)) - **event-handler:** add compress option when serving Swagger HTML ([#3946](https://github.com/aws-powertools/powertools-lambda-python/issues/3946)) - **event_handler:** define exception_handler directly from the router ([#3979](https://github.com/aws-powertools/powertools-lambda-python/issues/3979)) - **metrics:** allow custom timestamps for metrics ([#4006](https://github.com/aws-powertools/powertools-lambda-python/issues/4006)) - **parameters:** add feature for creating and updating Parameters and Secrets ([#2858](https://github.com/aws-powertools/powertools-lambda-python/issues/2858)) - **tracer:** auto-disable tracer for AWS SAM and Chalice environments ([#3949](https://github.com/aws-powertools/powertools-lambda-python/issues/3949)) ## Maintenance - version bump - **deps:** bump squidfunk/mkdocs-material from `3678304` to `6c81a89` in /docs ([#3973](https://github.com/aws-powertools/powertools-lambda-python/issues/3973)) - **deps:** bump datadog-lambda from 5.89.0 to 5.90.0 ([#3941](https://github.com/aws-powertools/powertools-lambda-python/issues/3941)) - **deps:** bump actions/checkout from 4.1.1 to 4.1.2 ([#3939](https://github.com/aws-powertools/powertools-lambda-python/issues/3939)) - **deps:** bump redis from 5.0.2 to 5.0.3 ([#3929](https://github.com/aws-powertools/powertools-lambda-python/issues/3929)) - **deps:** bump slsa-framework/slsa-github-generator from 1.9.0 to 1.10.0 ([#3997](https://github.com/aws-powertools/powertools-lambda-python/issues/3997)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#4001](https://github.com/aws-powertools/powertools-lambda-python/issues/4001)) - **deps:** bump actions/dependency-review-action from 4.2.3 to 4.2.4 ([#4012](https://github.com/aws-powertools/powertools-lambda-python/issues/4012)) - **deps:** bump docker/setup-buildx-action from 3.1.0 to 3.2.0 ([#3955](https://github.com/aws-powertools/powertools-lambda-python/issues/3955)) - **deps:** bump actions/dependency-review-action from 4.1.3 to 4.2.3 ([#3993](https://github.com/aws-powertools/powertools-lambda-python/issues/3993)) - **deps:** bump datadog-lambda from 5.90.0 to 5.91.0 ([#3958](https://github.com/aws-powertools/powertools-lambda-python/issues/3958)) - **deps:** bump pypa/gh-action-pypi-publish from 1.8.12 to 1.8.14 ([#3918](https://github.com/aws-powertools/powertools-lambda-python/issues/3918)) - **deps:** bump squidfunk/mkdocs-material from `6c81a89` to `3307665` in /docs ([#4017](https://github.com/aws-powertools/powertools-lambda-python/issues/4017)) - **deps:** bump actions/dependency-review-action from 4.2.4 to 4.2.5 ([#4023](https://github.com/aws-powertools/powertools-lambda-python/issues/4023)) - **deps:** bump aws-encryption-sdk from 3.1.1 to 3.2.0 ([#3983](https://github.com/aws-powertools/powertools-lambda-python/issues/3983)) -
**deps:** bump actions/setup-python from 5.0.0 to 5.1.0 ([#4022](https://github.com/aws-powertools/powertools-lambda-python/issues/4022)) - **deps:** bump codecov/codecov-action from 4.1.0 to 4.1.1 ([#4021](https://github.com/aws-powertools/powertools-lambda-python/issues/4021)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3972](https://github.com/aws-powertools/powertools-lambda-python/issues/3972)) - **deps-dev:** bump filelock from 3.13.1 to 3.13.3 ([#4014](https://github.com/aws-powertools/powertools-lambda-python/issues/4014)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.90 to 0.1.91 ([#3975](https://github.com/aws-powertools/powertools-lambda-python/issues/3975)) - **deps-dev:** bump types-python-dateutil from 2.9.0.20240315 to 2.9.0.20240316 ([#3977](https://github.com/aws-powertools/powertools-lambda-python/issues/3977)) - **deps-dev:** bump mkdocs-material from 9.5.13 to 9.5.14 ([#3978](https://github.com/aws-powertools/powertools-lambda-python/issues/3978)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.132.1a0 to 2.133.0a0 ([#3976](https://github.com/aws-powertools/powertools-lambda-python/issues/3976)) - **deps-dev:** bump the boto-typing group with 2 updates ([#3974](https://github.com/aws-powertools/powertools-lambda-python/issues/3974)) - **deps-dev:** bump the boto-typing group with 2 updates ([#3982](https://github.com/aws-powertools/powertools-lambda-python/issues/3982)) - **deps-dev:** bump ruff from 0.3.2 to 0.3.3 ([#3967](https://github.com/aws-powertools/powertools-lambda-python/issues/3967)) - **deps-dev:** bump aws-cdk-lib from 2.132.1 to 2.133.0 ([#3965](https://github.com/aws-powertools/powertools-lambda-python/issues/3965)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.91 to 0.1.94 ([#3985](https://github.com/aws-powertools/powertools-lambda-python/issues/3985)) - **deps-dev:** bump black from 24.2.0 to 24.3.0 ([#3968](https://github.com/aws-powertools/powertools-lambda-python/issues/3968)) - **deps-dev:** bump types-python-dateutil from 2.8.19.20240311 to 2.9.0.20240315 ([#3966](https://github.com/aws-powertools/powertools-lambda-python/issues/3966)) - **deps-dev:** bump aws-cdk from 2.132.1 to 2.133.0 ([#3963](https://github.com/aws-powertools/powertools-lambda-python/issues/3963)) - **deps-dev:** bump the boto-typing group with 1 update ([#3964](https://github.com/aws-powertools/powertools-lambda-python/issues/3964)) - **deps-dev:** bump pytest-asyncio from 0.23.5.post1 to 0.23.6 ([#3984](https://github.com/aws-powertools/powertools-lambda-python/issues/3984)) - **deps-dev:** bump the boto-typing group with 1 update ([#3991](https://github.com/aws-powertools/powertools-lambda-python/issues/3991)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.89 to 0.1.90 ([#3957](https://github.com/aws-powertools/powertools-lambda-python/issues/3957)) - **deps-dev:** bump the boto-typing group with 1 update ([#3956](https://github.com/aws-powertools/powertools-lambda-python/issues/3956)) - **deps-dev:** bump sentry-sdk from 1.42.0 to 1.43.0 ([#3992](https://github.com/aws-powertools/powertools-lambda-python/issues/3992)) - **deps-dev:** bump coverage from 7.4.3 to 7.4.4 ([#3959](https://github.com/aws-powertools/powertools-lambda-python/issues/3959)) - **deps-dev:** bump ruff from 0.3.3 to 0.3.4 ([#3996](https://github.com/aws-powertools/powertools-lambda-python/issues/3996)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.88 to 0.1.89 
([#3952](https://github.com/aws-powertools/powertools-lambda-python/issues/3952)) - **deps-dev:** bump sentry-sdk from 1.41.0 to 1.42.0 ([#3951](https://github.com/aws-powertools/powertools-lambda-python/issues/3951)) - **deps-dev:** bump the boto-typing group with 1 update ([#3950](https://github.com/aws-powertools/powertools-lambda-python/issues/3950)) - **deps-dev:** bump pytest-mock from 3.12.0 to 3.13.0 ([#3999](https://github.com/aws-powertools/powertools-lambda-python/issues/3999)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.94 to 0.1.96 ([#4002](https://github.com/aws-powertools/powertools-lambda-python/issues/4002)) - **deps-dev:** bump the boto-typing group with 2 updates ([#3940](https://github.com/aws-powertools/powertools-lambda-python/issues/3940)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.87 to 0.1.88 ([#3942](https://github.com/aws-powertools/powertools-lambda-python/issues/3942)) - **deps-dev:** bump pytest from 8.0.2 to 8.1.1 ([#3943](https://github.com/aws-powertools/powertools-lambda-python/issues/3943)) - **deps-dev:** bump aws-cdk-aws-lambda-python-alpha from 2.131.0a0 to 2.132.1a0 ([#3944](https://github.com/aws-powertools/powertools-lambda-python/issues/3944)) - **deps-dev:** bump cfn-lint from 0.86.0 to 0.86.1 ([#3998](https://github.com/aws-powertools/powertools-lambda-python/issues/3998)) - **deps-dev:** bump aws-cdk from 2.132.0 to 2.132.1 ([#3938](https://github.com/aws-powertools/powertools-lambda-python/issues/3938)) - **deps-dev:** bump aws-cdk-lib from 2.131.0 to 2.132.1 ([#3936](https://github.com/aws-powertools/powertools-lambda-python/issues/3936)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.96 to 0.1.99 ([#4008](https://github.com/aws-powertools/powertools-lambda-python/issues/4008)) - **deps-dev:** bump aws-cdk from 2.131.0 to 2.132.0 ([#3928](https://github.com/aws-powertools/powertools-lambda-python/issues/3928)) - **deps-dev:** bump types-redis from 4.6.0.20240218 to 4.6.0.20240311 ([#3931](https://github.com/aws-powertools/powertools-lambda-python/issues/3931)) - **deps-dev:** bump types-python-dateutil from 2.8.19.20240106 to 2.8.19.20240311 ([#3932](https://github.com/aws-powertools/powertools-lambda-python/issues/3932)) - **deps-dev:** bump pytest-mock from 3.13.0 to 3.14.0 ([#4007](https://github.com/aws-powertools/powertools-lambda-python/issues/4007)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.99 to 0.1.101 ([#4015](https://github.com/aws-powertools/powertools-lambda-python/issues/4015)) - **deps-dev:** bump ruff from 0.3.0 to 0.3.2 ([#3925](https://github.com/aws-powertools/powertools-lambda-python/issues/3925)) - **deps-dev:** bump mypy from 1.8.0 to 1.9.0 ([#3921](https://github.com/aws-powertools/powertools-lambda-python/issues/3921)) - **deps-dev:** bump bandit from 1.7.7 to 1.7.8 ([#3920](https://github.com/aws-powertools/powertools-lambda-python/issues/3920)) - **deps-dev:** bump pytest-cov from 4.1.0 to 5.0.0 ([#4013](https://github.com/aws-powertools/powertools-lambda-python/issues/4013)) - **deps-dev:** bump pytest-asyncio from 0.23.5 to 0.23.5.post1 ([#3923](https://github.com/aws-powertools/powertools-lambda-python/issues/3923)) - **deps-dev:** bump mkdocs-material from 9.5.14 to 9.5.15 ([#4016](https://github.com/aws-powertools/powertools-lambda-python/issues/4016)) - **deps-dev:** bump the boto-typing group with 2 updates ([#3919](https://github.com/aws-powertools/powertools-lambda-python/issues/3919)) - **deps-dev:** bump 
cdklabs-generative-ai-cdk-constructs from 0.1.101 to 0.1.104 ([#4020](https://github.com/aws-powertools/powertools-lambda-python/issues/4020)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.83 to 0.1.87 ([#3930](https://github.com/aws-powertools/powertools-lambda-python/issues/3930)) ## [v2.35.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.35.0...v2.35.1) - 2024-03-08 ## Bug Fixes - **data_sources:** ensure correct types on SQSMessageAttributes ([#3898](https://github.com/aws-powertools/powertools-lambda-python/issues/3898)) - **event_handler:** validate POST bodies on BedrockAgentResolver ([#3903](https://github.com/aws-powertools/powertools-lambda-python/issues/3903)) - **internal:** call ruff with correct args ([#3901](https://github.com/aws-powertools/powertools-lambda-python/issues/3901)) ## Features - **event_handler:** use custom serializer during openapi serialization ([#3900](https://github.com/aws-powertools/powertools-lambda-python/issues/3900)) ## Maintenance - version bump - **deps:** bump aws-xray-sdk from 2.12.1 to 2.13.0 ([#3906](https://github.com/aws-powertools/powertools-lambda-python/issues/3906)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3911](https://github.com/aws-powertools/powertools-lambda-python/issues/3911)) - **deps:** bump squidfunk/mkdocs-material from `7be068b` to `3678304` in /docs ([#3894](https://github.com/aws-powertools/powertools-lambda-python/issues/3894)) - **deps:** bump datadog-lambda from 5.88.0 to 5.89.0 ([#3907](https://github.com/aws-powertools/powertools-lambda-python/issues/3907)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.81 to 0.1.82 ([#3896](https://github.com/aws-powertools/powertools-lambda-python/issues/3896)) - **deps-dev:** bump sentry-sdk from 1.40.6 to 1.41.0 ([#3905](https://github.com/aws-powertools/powertools-lambda-python/issues/3905)) - **deps-dev:** bump mkdocs-material from 9.5.12 to 9.5.13 ([#3895](https://github.com/aws-powertools/powertools-lambda-python/issues/3895)) - **deps-dev:** bump cdklabs-generative-ai-cdk-constructs from 0.1.82 to 0.1.83 ([#3908](https://github.com/aws-powertools/powertools-lambda-python/issues/3908)) - **deps-dev:** bump the boto-typing group with 1 update ([#3904](https://github.com/aws-powertools/powertools-lambda-python/issues/3904)) ## [v2.35.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.34.2...v2.35.0) - 2024-03-06 ## Bug Fixes - **event_handler:** OpenAPI schema version respects Pydantic version ([#3860](https://github.com/aws-powertools/powertools-lambda-python/issues/3860)) ## Code Refactoring - **logger:** improve typing ([#3869](https://github.com/aws-powertools/powertools-lambda-python/issues/3869)) ## Documentation - **event_handler:** add bedrock agent resolver documentation ([#3602](https://github.com/aws-powertools/powertools-lambda-python/issues/3602)) ## Maintenance - version bump - **deps:** bump docker/setup-buildx-action from 3.0.0 to 3.1.0 ([#3864](https://github.com/aws-powertools/powertools-lambda-python/issues/3864)) - **deps:** bump actions/download-artifact from 4.1.3 to 4.1.4 ([#3875](https://github.com/aws-powertools/powertools-lambda-python/issues/3875)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3884](https://github.com/aws-powertools/powertools-lambda-python/issues/3884)) - **deps:** bump squidfunk/mkdocs-material from `49d1bfd` to `7be068b` in /docs 
([#3872](https://github.com/aws-powertools/powertools-lambda-python/issues/3872)) - **deps:** bump squidfunk/mkdocs-material from `43b898a` to `49d1bfd` in /docs ([#3857](https://github.com/aws-powertools/powertools-lambda-python/issues/3857)) - **deps:** bump codecov/codecov-action from 4.0.2 to 4.1.0 ([#3856](https://github.com/aws-powertools/powertools-lambda-python/issues/3856)) - **deps:** bump redis from 5.0.1 to 5.0.2 ([#3867](https://github.com/aws-powertools/powertools-lambda-python/issues/3867)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3887](https://github.com/aws-powertools/powertools-lambda-python/issues/3887)) - **deps:** bump actions/download-artifact from 4.1.2 to 4.1.3 ([#3862](https://github.com/aws-powertools/powertools-lambda-python/issues/3862)) - **deps:** bump pypa/gh-action-pypi-publish from 1.8.11 to 1.8.12 ([#3863](https://github.com/aws-powertools/powertools-lambda-python/issues/3863)) - **deps-dev:** bump aws-cdk-lib from 2.130.0 to 2.131.0 ([#3881](https://github.com/aws-powertools/powertools-lambda-python/issues/3881)) - **deps-dev:** bump cfn-lint from 0.85.3 to 0.86.0 ([#3882](https://github.com/aws-powertools/powertools-lambda-python/issues/3882)) - **deps-dev:** bump black from 24.1.1 to 24.2.0 ([#3760](https://github.com/aws-powertools/powertools-lambda-python/issues/3760)) - **deps-dev:** bump cfn-lint from 0.85.2 to 0.85.3 ([#3861](https://github.com/aws-powertools/powertools-lambda-python/issues/3861)) - **deps-dev:** bump aws-cdk from 2.130.0 to 2.131.0 ([#3883](https://github.com/aws-powertools/powertools-lambda-python/issues/3883)) - **deps-dev:** bump mkdocs-material from 9.5.11 to 9.5.12 ([#3870](https://github.com/aws-powertools/powertools-lambda-python/issues/3870)) - **deps-dev:** bump ruff from 0.2.2 to 0.3.0 ([#3871](https://github.com/aws-powertools/powertools-lambda-python/issues/3871)) - **docs:** add Bedrock Agents to feature list ([#3889](https://github.com/aws-powertools/powertools-lambda-python/issues/3889)) ## [v2.34.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.34.1...v2.34.2) - 2024-02-26 ## Bug Fixes - **typing:** ensure return type is a str when default_value is set ([#3840](https://github.com/aws-powertools/powertools-lambda-python/issues/3840)) ## Documentation - **install:** make minimum install the default option then extra ([#3834](https://github.com/aws-powertools/powertools-lambda-python/issues/3834)) ## Features - **event-source:** add function to get multi-value query string params by name ([#3846](https://github.com/aws-powertools/powertools-lambda-python/issues/3846)) ## Maintenance - version bump - **ci:** remove aws-encryption-sdk from Lambda layer due to cffi being tied to python version ([#3853](https://github.com/aws-powertools/powertools-lambda-python/issues/3853)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3844](https://github.com/aws-powertools/powertools-lambda-python/issues/3844)) - **deps:** bump cryptography from 42.0.2 to 42.0.4 ([#3827](https://github.com/aws-powertools/powertools-lambda-python/issues/3827)) - **deps:** bump codecov/codecov-action from 4.0.1 to 4.0.2 ([#3842](https://github.com/aws-powertools/powertools-lambda-python/issues/3842)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3835](https://github.com/aws-powertools/powertools-lambda-python/issues/3835)) - **deps-dev:** bump httpx from 0.26.0 to 0.27.0 
([#3828](https://github.com/aws-powertools/powertools-lambda-python/issues/3828)) - **deps-dev:** bump aws-cdk from 2.128.0 to 2.129.0 ([#3831](https://github.com/aws-powertools/powertools-lambda-python/issues/3831)) - **deps-dev:** bump the boto-typing group with 1 update ([#3836](https://github.com/aws-powertools/powertools-lambda-python/issues/3836)) - **deps-dev:** bump aws-cdk from 2.129.0 to 2.130.0 ([#3843](https://github.com/aws-powertools/powertools-lambda-python/issues/3843)) - **deps-dev:** bump aws-cdk-lib from 2.128.0 to 2.130.0 ([#3838](https://github.com/aws-powertools/powertools-lambda-python/issues/3838)) ## [v2.34.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.34.0...v2.34.1) - 2024-02-21 ## Bug Fixes - **ci:** inject PR_LABELS env for PR Label automation ([#3819](https://github.com/aws-powertools/powertools-lambda-python/issues/3819)) - **ci:** revert layer version bump write-only back to append ([#3818](https://github.com/aws-powertools/powertools-lambda-python/issues/3818)) - **event-handler:** return dict on missing multi_value_headers ([#3824](https://github.com/aws-powertools/powertools-lambda-python/issues/3824)) - **idempotency:** validate before saving to cache ([#3822](https://github.com/aws-powertools/powertools-lambda-python/issues/3822)) ## Maintenance - version bump - **deps-dev:** bump ruff from 0.2.1 to 0.2.2 ([#3802](https://github.com/aws-powertools/powertools-lambda-python/issues/3802)) - **deps-dev:** bump the boto-typing group with 2 updates ([#3810](https://github.com/aws-powertools/powertools-lambda-python/issues/3810)) ## [v2.34.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.33.1...v2.34.0) - 2024-02-21 ## Bug Fixes - **ci:** create one layer artifact per region & merge ([#3808](https://github.com/aws-powertools/powertools-lambda-python/issues/3808)) - **event-handler:** multi-value query string and validation of scalar parameters ([#3795](https://github.com/aws-powertools/powertools-lambda-python/issues/3795)) - **event-handler:** swagger schema respects api stage ([#3796](https://github.com/aws-powertools/powertools-lambda-python/issues/3796)) - **event-handler:** handle aliased parameters e.g., Query(alias="categoryType") ([#3766](https://github.com/aws-powertools/powertools-lambda-python/issues/3766)) ## Code Refactoring - **feature-flags:** add intersection tests; structure refinement ([#3775](https://github.com/aws-powertools/powertools-lambda-python/issues/3775)) ## Documentation - **feature_flags:** fix incorrect line markers and envelope name ([#3792](https://github.com/aws-powertools/powertools-lambda-python/issues/3792)) - **home:** update layer version to 62 for package version 2.33.1 ([#3778](https://github.com/aws-powertools/powertools-lambda-python/issues/3778)) - **home:** add note about POWERTOOLS_DEV side effects in CloudWatch Logs ([#3770](https://github.com/aws-powertools/powertools-lambda-python/issues/3770)) - **homepage:** discord flat badge style; remove former devax email ([#3768](https://github.com/aws-powertools/powertools-lambda-python/issues/3768)) - **homepage:** remove leftover announcement banner ([#3783](https://github.com/aws-powertools/powertools-lambda-python/issues/3783)) - **roadmap:** latest roadmap update; use new grid to de-clutter homepage ([#3755](https://github.com/aws-powertools/powertools-lambda-python/issues/3755)) - **we-made-this:** add swagger post ([#3799](https://github.com/aws-powertools/powertools-lambda-python/issues/3799)) - **we-made-this:** 
add reinvent 2023 session ([#3790](https://github.com/aws-powertools/powertools-lambda-python/issues/3790)) ## Features - **feature_flags:** add intersect actions for conditions ([#3692](https://github.com/aws-powertools/powertools-lambda-python/issues/3692)) ## Maintenance - version bump - **deps:** bump actions/dependency-review-action from 4.1.2 to 4.1.3 ([#3813](https://github.com/aws-powertools/powertools-lambda-python/issues/3813)) - **deps:** bump actions/dependency-review-action from 4.1.0 to 4.1.2 ([#3800](https://github.com/aws-powertools/powertools-lambda-python/issues/3800)) - **deps:** bump actions/dependency-review-action from 4.0.0 to 4.1.0 ([#3771](https://github.com/aws-powertools/powertools-lambda-python/issues/3771)) - **deps:** bump squidfunk/mkdocs-material from `62d3668` to `43b898a` in /docs ([#3801](https://github.com/aws-powertools/powertools-lambda-python/issues/3801)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3764](https://github.com/aws-powertools/powertools-lambda-python/issues/3764)) - **deps:** bump squidfunk/mkdocs-material from `6a72238` to `62d3668` in /docs ([#3756](https://github.com/aws-powertools/powertools-lambda-python/issues/3756)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#3814](https://github.com/aws-powertools/powertools-lambda-python/issues/3814)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3784](https://github.com/aws-powertools/powertools-lambda-python/issues/3784)) - **deps-dev:** bump pytest from 8.0.0 to 8.0.1 ([#3812](https://github.com/aws-powertools/powertools-lambda-python/issues/3812)) - **deps-dev:** bump aws-cdk from 2.127.0 to 2.128.0 ([#3776](https://github.com/aws-powertools/powertools-lambda-python/issues/3776)) - **deps-dev:** bump the boto-typing group with 2 updates ([#3797](https://github.com/aws-powertools/powertools-lambda-python/issues/3797)) - **deps-dev:** bump cfn-lint from 0.85.1 to 0.85.2 ([#3786](https://github.com/aws-powertools/powertools-lambda-python/issues/3786)) - **deps-dev:** bump pytest-asyncio from 0.21.1 to 0.23.5 ([#3773](https://github.com/aws-powertools/powertools-lambda-python/issues/3773)) - **deps-dev:** bump aws-cdk-lib from 2.127.0 to 2.128.0 ([#3777](https://github.com/aws-powertools/powertools-lambda-python/issues/3777)) - **deps-dev:** bump sentry-sdk from 1.40.3 to 1.40.4 ([#3765](https://github.com/aws-powertools/powertools-lambda-python/issues/3765)) - **deps-dev:** bump sentry-sdk from 1.40.4 to 1.40.5 ([#3805](https://github.com/aws-powertools/powertools-lambda-python/issues/3805)) - **deps-dev:** bump mkdocs-material from 9.5.9 to 9.5.10 ([#3803](https://github.com/aws-powertools/powertools-lambda-python/issues/3803)) - **deps-dev:** bump types-redis from 4.6.0.20240106 to 4.6.0.20240218 ([#3804](https://github.com/aws-powertools/powertools-lambda-python/issues/3804)) - **deps-dev:** bump the boto-typing group with 1 update ([#3757](https://github.com/aws-powertools/powertools-lambda-python/issues/3757)) - **deps-dev:** bump aws-cdk-lib from 2.126.0 to 2.127.0 ([#3758](https://github.com/aws-powertools/powertools-lambda-python/issues/3758)) - **deps-dev:** bump aws-cdk from 2.126.0 to 2.127.0 ([#3761](https://github.com/aws-powertools/powertools-lambda-python/issues/3761)) - **deps-dev:** bump mkdocs-material from 9.5.8 to 9.5.9 ([#3759](https://github.com/aws-powertools/powertools-lambda-python/issues/3759)) - **deps-dev:** bump sentry-sdk 
from 1.40.2 to 1.40.3 ([#3750](https://github.com/aws-powertools/powertools-lambda-python/issues/3750)) - **deps-dev:** bump cfn-lint from 0.85.0 to 0.85.1 ([#3749](https://github.com/aws-powertools/powertools-lambda-python/issues/3749)) - **deps-dev:** bump coverage from 7.4.1 to 7.4.2 ([#3811](https://github.com/aws-powertools/powertools-lambda-python/issues/3811)) - **tests:** increase idempotency coverage with nested payload tampering tests ([#3809](https://github.com/aws-powertools/powertools-lambda-python/issues/3809)) ## [v2.33.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.33.0...v2.33.1) - 2024-02-09 ## Bug Fixes - **typing:** make Response headers covariant ([#3745](https://github.com/aws-powertools/powertools-lambda-python/issues/3745)) ## Documentation - Add nathan hanks post community ([#3727](https://github.com/aws-powertools/powertools-lambda-python/issues/3727)) ## Maintenance - version bump - **ci:** drop support for Python 3.7 ([#3638](https://github.com/aws-powertools/powertools-lambda-python/issues/3638)) - **ci:** enable Redis e2e tests ([#3718](https://github.com/aws-powertools/powertools-lambda-python/issues/3718)) - **deps:** bump actions/setup-node from 4.0.1 to 4.0.2 ([#3737](https://github.com/aws-powertools/powertools-lambda-python/issues/3737)) - **deps:** bump squidfunk/mkdocs-material from `e0d6c67` to `6a72238` in /docs ([#3735](https://github.com/aws-powertools/powertools-lambda-python/issues/3735)) - **deps:** bump actions/dependency-review-action from 3.1.5 to 4.0.0 ([#3646](https://github.com/aws-powertools/powertools-lambda-python/issues/3646)) - **deps:** bump release-drafter/release-drafter from 5.25.0 to 6.0.0 ([#3699](https://github.com/aws-powertools/powertools-lambda-python/issues/3699)) - **deps:** bump actions/download-artifact from 4.1.1 to 4.1.2 ([#3725](https://github.com/aws-powertools/powertools-lambda-python/issues/3725)) - **deps:** bump squidfunk/mkdocs-material from `a4a2029` to `e0d6c67` in /docs ([#3708](https://github.com/aws-powertools/powertools-lambda-python/issues/3708)) - **deps:** bump codecov/codecov-action from 3.1.6 to 4.0.1 ([#3700](https://github.com/aws-powertools/powertools-lambda-python/issues/3700)) - **deps:** bump actions/download-artifact from 3.0.2 to 4.1.1 ([#3612](https://github.com/aws-powertools/powertools-lambda-python/issues/3612)) - **deps:** revert aws-cdk-lib as a runtime dep ([#3730](https://github.com/aws-powertools/powertools-lambda-python/issues/3730)) - **deps:** bump actions/upload-artifact from 3.1.3 to 4.3.1 ([#3714](https://github.com/aws-powertools/powertools-lambda-python/issues/3714)) - **deps-dev:** bump cfn-lint from 0.83.8 to 0.85.0 ([#3724](https://github.com/aws-powertools/powertools-lambda-python/issues/3724)) - **deps-dev:** bump httpx from 0.24.1 to 0.26.0 ([#3712](https://github.com/aws-powertools/powertools-lambda-python/issues/3712)) - **deps-dev:** bump pytest from 7.4.4 to 8.0.0 ([#3711](https://github.com/aws-powertools/powertools-lambda-python/issues/3711)) - **deps-dev:** bump sentry-sdk from 1.40.1 to 1.40.2 ([#3740](https://github.com/aws-powertools/powertools-lambda-python/issues/3740)) - **deps-dev:** bump coverage from 7.2.7 to 7.4.1 ([#3713](https://github.com/aws-powertools/powertools-lambda-python/issues/3713)) - **deps-dev:** bump the boto-typing group with 7 updates ([#3709](https://github.com/aws-powertools/powertools-lambda-python/issues/3709)) - **deps-dev:** bump types-python-dateutil from 2.8.19.14 to 2.8.19.20240106 
([#3720](https://github.com/aws-powertools/powertools-lambda-python/issues/3720)) - **deps-dev:** bump mypy from 1.4.1 to 1.8.0 ([#3710](https://github.com/aws-powertools/powertools-lambda-python/issues/3710)) - **deps-dev:** bump ruff from 0.2.0 to 0.2.1 ([#3742](https://github.com/aws-powertools/powertools-lambda-python/issues/3742)) - **deps-dev:** bump isort from 5.11.5 to 5.13.2 ([#3723](https://github.com/aws-powertools/powertools-lambda-python/issues/3723)) - **deps-dev:** bump pytest-socket from 0.6.0 to 0.7.0 ([#3721](https://github.com/aws-powertools/powertools-lambda-python/issues/3721)) - **deps-dev:** bump ruff from 0.1.15 to 0.2.0 ([#3702](https://github.com/aws-powertools/powertools-lambda-python/issues/3702)) - **deps-dev:** bump aws-cdk from 2.125.0 to 2.126.0 ([#3701](https://github.com/aws-powertools/powertools-lambda-python/issues/3701)) - **deps-dev:** bump hvac from 1.2.1 to 2.1.0 ([#3738](https://github.com/aws-powertools/powertools-lambda-python/issues/3738)) - **deps-dev:** bump black from 23.12.1 to 24.1.1 ([#3739](https://github.com/aws-powertools/powertools-lambda-python/issues/3739)) ## [v2.33.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.32.0...v2.33.0) - 2024-02-02 ## Bug Fixes - **data-masking:** fix and improve e2e tests for DataMasking ([#3695](https://github.com/aws-powertools/powertools-lambda-python/issues/3695)) - **event-handler:** strip whitespace from Content-Type headers during OpenAPI schema validation ([#3677](https://github.com/aws-powertools/powertools-lambda-python/issues/3677)) ## Documentation - **data-masking:** add docs for data masking utility ([#3186](https://github.com/aws-powertools/powertools-lambda-python/issues/3186)) - **metrics:** fix empty metric warning filter ([#3660](https://github.com/aws-powertools/powertools-lambda-python/issues/3660)) - **process:** add versioning and maintenance policy ([#3682](https://github.com/aws-powertools/powertools-lambda-python/issues/3682)) ## Features - **event_handler:** support Header parameter validation in OpenAPI schema ([#3687](https://github.com/aws-powertools/powertools-lambda-python/issues/3687)) - **event_handler:** add support for multiValueQueryStringParameters in OpenAPI schema ([#3667](https://github.com/aws-powertools/powertools-lambda-python/issues/3667)) ## Maintenance - version bump - **deps:** bump codecov/codecov-action from 3.1.5 to 3.1.6 ([#3683](https://github.com/aws-powertools/powertools-lambda-python/issues/3683)) - **deps:** bump codecov/codecov-action from 3.1.4 to 3.1.5 ([#3674](https://github.com/aws-powertools/powertools-lambda-python/issues/3674)) - **deps:** bump pydantic from 1.10.13 to 1.10.14 ([#3655](https://github.com/aws-powertools/powertools-lambda-python/issues/3655)) - **deps:** bump squidfunk/mkdocs-material from `58eef6c` to `9aad7af` in /docs ([#3670](https://github.com/aws-powertools/powertools-lambda-python/issues/3670)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3665](https://github.com/aws-powertools/powertools-lambda-python/issues/3665)) - **deps:** bump squidfunk/mkdocs-material from `9aad7af` to `a4a2029` in /docs ([#3679](https://github.com/aws-powertools/powertools-lambda-python/issues/3679)) - **deps-dev:** bump sentry-sdk from 1.39.2 to 1.40.0 ([#3684](https://github.com/aws-powertools/powertools-lambda-python/issues/3684)) - **deps-dev:** bump ruff from 0.1.14 to 0.1.15 ([#3685](https://github.com/aws-powertools/powertools-lambda-python/issues/3685)) -
**deps-dev:** bump ruff from 0.1.13 to 0.1.14 ([#3656](https://github.com/aws-powertools/powertools-lambda-python/issues/3656)) - **deps-dev:** bump aws-cdk from 2.122.0 to 2.123.0 ([#3673](https://github.com/aws-powertools/powertools-lambda-python/issues/3673)) - **deps-dev:** bump aws-cdk from 2.124.0 to 2.125.0 ([#3693](https://github.com/aws-powertools/powertools-lambda-python/issues/3693)) - **deps-dev:** bump aws-cdk from 2.123.0 to 2.124.0 ([#3678](https://github.com/aws-powertools/powertools-lambda-python/issues/3678)) ## [v2.32.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.31.0...v2.32.0) - 2024-01-19 ## Bug Fixes - **event_handler:** escape OpenAPI schema on Swagger UI ([#3606](https://github.com/aws-powertools/powertools-lambda-python/issues/3606)) ## Code Refactoring - **event-handler:** Inject CSS and JS files into SwaggerUI route when no custom CDN is used. ([#3562](https://github.com/aws-powertools/powertools-lambda-python/issues/3562)) - **event_handler:** fix BedrockAgentResolver docstring ([#3645](https://github.com/aws-powertools/powertools-lambda-python/issues/3645)) ## Documentation - **homepage:** add banner about Python 3.7 deprecation ([#3618](https://github.com/aws-powertools/powertools-lambda-python/issues/3618)) - **i-made-this:** added new article on how to create a serverless API with CDK and Powertools ([#3605](https://github.com/aws-powertools/powertools-lambda-python/issues/3605)) ## Features - **event_handler:** add support for additional response models ([#3591](https://github.com/aws-powertools/powertools-lambda-python/issues/3591)) - **event_handler:** add support to download OpenAPI spec file ([#3571](https://github.com/aws-powertools/powertools-lambda-python/issues/3571)) - **event_source:** Add support for S3 batch operations ([#3572](https://github.com/aws-powertools/powertools-lambda-python/issues/3572)) - **event_source:** Add support for policyLevel field in CloudWatch Logs event and parser ([#3624](https://github.com/aws-powertools/powertools-lambda-python/issues/3624)) - **idempotency:** leverage new DynamoDB Failed conditional writes behavior with ReturnValuesOnConditionCheckFailure ([#3446](https://github.com/aws-powertools/powertools-lambda-python/issues/3446)) - **idempotency:** adding redis as idempotency backend ([#2567](https://github.com/aws-powertools/powertools-lambda-python/issues/2567)) ## Maintenance - version bump - **ci:** Disable Redis e2e until we drop Python 3.7 ([#3652](https://github.com/aws-powertools/powertools-lambda-python/issues/3652)) - **ci:** update boto3 library version to 1.26.164+ ([#3632](https://github.com/aws-powertools/powertools-lambda-python/issues/3632)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3649](https://github.com/aws-powertools/powertools-lambda-python/issues/3649)) - **deps:** bump jinja2 from 3.1.2 to 3.1.3 in /docs ([#3620](https://github.com/aws-powertools/powertools-lambda-python/issues/3620)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3639](https://github.com/aws-powertools/powertools-lambda-python/issues/3639)) - **deps:** bump gitpython from 3.1.37 to 3.1.41 in /docs ([#3610](https://github.com/aws-powertools/powertools-lambda-python/issues/3610)) - **deps:** bump squidfunk/mkdocs-material from `2f29d71` to `58eef6c` in /docs ([#3633](https://github.com/aws-powertools/powertools-lambda-python/issues/3633)) - **deps:** bump redis from 4.6.0 to 5.0.1 
([#3613](https://github.com/aws-powertools/powertools-lambda-python/issues/3613)) - **deps-dev:** bump gitpython from 3.1.40 to 3.1.41 ([#3611](https://github.com/aws-powertools/powertools-lambda-python/issues/3611)) - **deps-dev:** bump sentry-sdk from 1.39.1 to 1.39.2 ([#3614](https://github.com/aws-powertools/powertools-lambda-python/issues/3614)) - **deps-dev:** bump aws-cdk from 2.120.0 to 2.121.1 ([#3634](https://github.com/aws-powertools/powertools-lambda-python/issues/3634)) - **deps-dev:** bump jinja2 from 3.1.2 to 3.1.3 ([#3619](https://github.com/aws-powertools/powertools-lambda-python/issues/3619)) - **deps-dev:** bump cfn-lint from 0.83.7 to 0.83.8 ([#3603](https://github.com/aws-powertools/powertools-lambda-python/issues/3603)) - **deps-dev:** bump aws-cdk from 2.121.1 to 2.122.0 ([#3648](https://github.com/aws-powertools/powertools-lambda-python/issues/3648)) - **deps-dev:** bump ruff from 0.1.11 to 0.1.13 ([#3625](https://github.com/aws-powertools/powertools-lambda-python/issues/3625)) - **deps-dev:** bump aws-cdk from 2.118.0 to 2.120.0 ([#3627](https://github.com/aws-powertools/powertools-lambda-python/issues/3627)) ## [v2.31.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.30.2...v2.31.0) - 2024-01-05 ## Bug Fixes - **ci:** fail dispatch analytics job when Lambda call fails ([#3579](https://github.com/aws-powertools/powertools-lambda-python/issues/3579)) ## Code Refactoring - **parameters:** add overload signatures for get_parameter and get_parameters ([#3534](https://github.com/aws-powertools/powertools-lambda-python/issues/3534)) - **parser:** Improve error message when parsing models and envelopes ([#3587](https://github.com/aws-powertools/powertools-lambda-python/issues/3587)) ## Documentation - **middleware-factory:** Fix and improve typing ([#3569](https://github.com/aws-powertools/powertools-lambda-python/issues/3569)) ## Features - **event-handler:** add description to request body in OpenAPI schema ([#3548](https://github.com/aws-powertools/powertools-lambda-python/issues/3548)) - **event_handler:** support richer top level Tags ([#3543](https://github.com/aws-powertools/powertools-lambda-python/issues/3543)) - **layers:** add new commercial region Canada (ca-west-1) ([#3549](https://github.com/aws-powertools/powertools-lambda-python/issues/3549)) ## Maintenance - version bump - **ci:** Remove dev dependencies locked to Pydantic v1 within the Pydantic v2 workflow
([#3582](https://github.com/aws-powertools/powertools-lambda-python/issues/3582)) - **deps:** bump squidfunk/mkdocs-material from `9af3b7e` to `2f29d71` in /docs ([#3559](https://github.com/aws-powertools/powertools-lambda-python/issues/3559)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 4 updates ([#3593](https://github.com/aws-powertools/powertools-lambda-python/issues/3593)) - **deps:** bump actions/setup-node from 4.0.0 to 4.0.1 ([#3535](https://github.com/aws-powertools/powertools-lambda-python/issues/3535)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.2 to 3.0.3 ([#3536](https://github.com/aws-powertools/powertools-lambda-python/issues/3536)) - **deps:** bump actions/dependency-review-action from 3.1.4 to 3.1.5 ([#3592](https://github.com/aws-powertools/powertools-lambda-python/issues/3592)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#3544](https://github.com/aws-powertools/powertools-lambda-python/issues/3544)) - **deps:** bump fastjsonschema from 2.19.0 to 2.19.1 ([#3567](https://github.com/aws-powertools/powertools-lambda-python/issues/3567)) - **deps-dev:** bump ruff from 0.1.8 to 0.1.9 ([#3550](https://github.com/aws-powertools/powertools-lambda-python/issues/3550)) - **deps-dev:** bump aws-cdk from 2.115.0 to 2.116.1 ([#3553](https://github.com/aws-powertools/powertools-lambda-python/issues/3553)) - **deps-dev:** bump aws-cdk from 2.117.0 to 2.118.0 ([#3589](https://github.com/aws-powertools/powertools-lambda-python/issues/3589)) - **deps-dev:** bump cfn-lint from 0.83.6 to 0.83.7 ([#3554](https://github.com/aws-powertools/powertools-lambda-python/issues/3554)) - **deps-dev:** bump ruff from 0.1.9 to 0.1.10 ([#3583](https://github.com/aws-powertools/powertools-lambda-python/issues/3583)) - **deps-dev:** bump pytest from 7.4.3 to 7.4.4 ([#3576](https://github.com/aws-powertools/powertools-lambda-python/issues/3576)) - **deps-dev:** bump aws-cdk from 2.116.1 to 2.117.0 ([#3565](https://github.com/aws-powertools/powertools-lambda-python/issues/3565)) - **deps-dev:** bump ruff from 0.1.10 to 0.1.11 ([#3588](https://github.com/aws-powertools/powertools-lambda-python/issues/3588)) ## [v2.30.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.30.1...v2.30.2) - 2023-12-18 ## Bug Fixes - **event-handler:** fix operation tags schema generation ([#3528](https://github.com/aws-powertools/powertools-lambda-python/issues/3528)) - **event-handler:** set default OpenAPI version to 3.0.0 ([#3527](https://github.com/aws-powertools/powertools-lambda-python/issues/3527)) - **event-handler:** upgrade Swagger UI to fix regressions ([#3526](https://github.com/aws-powertools/powertools-lambda-python/issues/3526)) ## Maintenance - version bump - **deps-dev:** bump cfn-lint from 0.83.5 to 0.83.6 ([#3521](https://github.com/aws-powertools/powertools-lambda-python/issues/3521)) ## [v2.30.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.30.0...v2.30.1) - 2023-12-15 ## Bug Fixes - **event_handler:** allow responses and metadata when using Router ([#3514](https://github.com/aws-powertools/powertools-lambda-python/issues/3514)) ## Maintenance - version bump - **deps-dev:** bump aws-cdk from 2.114.1 to 2.115.0 ([#3508](https://github.com/aws-powertools/powertools-lambda-python/issues/3508)) - **deps-dev:** bump the boto-typing group with 11 updates ([#3509](https://github.com/aws-powertools/powertools-lambda-python/issues/3509)) - **deps-dev:** bump 
sentry-sdk from 1.39.0 to 1.39.1 ([#3512](https://github.com/aws-powertools/powertools-lambda-python/issues/3512)) ## [v2.30.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.29.1...v2.30.0) - 2023-12-14 ## Bug Fixes - **docs:** make the Lambda Layer version consistent ([#3498](https://github.com/aws-powertools/powertools-lambda-python/issues/3498)) ## Documentation - **customer-reference:** add Transformity as a customer reference ([#3497](https://github.com/aws-powertools/powertools-lambda-python/issues/3497)) ## Features - **general:** add support for Python 3.12 ([#3304](https://github.com/aws-powertools/powertools-lambda-python/issues/3304)) ## Maintenance - version bump - **deps:** bump squidfunk/mkdocs-material from `876b39c` to `9af3b7e` in /docs ([#3486](https://github.com/aws-powertools/powertools-lambda-python/issues/3486)) - **deps-dev:** bump sentry-sdk from 1.38.0 to 1.39.0 ([#3495](https://github.com/aws-powertools/powertools-lambda-python/issues/3495)) - **deps-dev:** bump cfn-lint from 0.83.4 to 0.83.5 ([#3487](https://github.com/aws-powertools/powertools-lambda-python/issues/3487)) - **deps-dev:** bump ruff from 0.1.7 to 0.1.8 ([#3501](https://github.com/aws-powertools/powertools-lambda-python/issues/3501)) - **deps-dev:** bump the boto-typing group with 1 update ([#3500](https://github.com/aws-powertools/powertools-lambda-python/issues/3500)) - **tests:** temporarily disable E2E parallelism ([#3484](https://github.com/aws-powertools/powertools-lambda-python/issues/3484)) ## [v2.29.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.29.0...v2.29.1) - 2023-12-11 ## Bug Fixes - **logger:** log non-ascii characters as is when JSON stringifying ([#3475](https://github.com/aws-powertools/powertools-lambda-python/issues/3475)) ## Maintenance - version bump - **deps:** bump squidfunk/mkdocs-material from `8c72011` to `20241c6` in /docs ([#3470](https://github.com/aws-powertools/powertools-lambda-python/issues/3470)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#3476](https://github.com/aws-powertools/powertools-lambda-python/issues/3476)) - **deps:** bump actions/setup-python from 4.8.0 to 5.0.0 ([#3465](https://github.com/aws-powertools/powertools-lambda-python/issues/3465)) - **deps:** bump squidfunk/mkdocs-material from `20241c6` to `876b39c` in /docs ([#3477](https://github.com/aws-powertools/powertools-lambda-python/issues/3477)) - **deps:** bump datadog-lambda from 5.84.0 to 5.85.0 ([#3466](https://github.com/aws-powertools/powertools-lambda-python/issues/3466)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#3467](https://github.com/aws-powertools/powertools-lambda-python/issues/3467)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3469](https://github.com/aws-powertools/powertools-lambda-python/issues/3469)) - **deps-dev:** bump aws-cdk from 2.113.0 to 2.114.1 ([#3464](https://github.com/aws-powertools/powertools-lambda-python/issues/3464)) - **deps-dev:** bump the boto-typing group with 1 update ([#3478](https://github.com/aws-powertools/powertools-lambda-python/issues/3478)) ## [v2.29.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.28.1...v2.29.0) - 2023-12-06 ## Bug Fixes - **event_handler:** serialize pydantic/dataclasses in exception handler ([#3455](https://github.com/aws-powertools/powertools-lambda-python/issues/3455)) - **metrics:** lambda_handler typing, and 
`**kwargs` preservation in all middlewares ([#3460](https://github.com/aws-powertools/powertools-lambda-python/issues/3460)) ## Features - **event_sources:** add get_context() to standardize API Gateway Lambda Authorizer context in v1 and v2 ([#3454](https://github.com/aws-powertools/powertools-lambda-python/issues/3454)) ## Maintenance - version bump - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3441](https://github.com/aws-powertools/powertools-lambda-python/issues/3441)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.1 to 3.0.2 ([#3449](https://github.com/aws-powertools/powertools-lambda-python/issues/3449)) - **deps:** bump datadog-lambda from 5.83.0 to 5.84.0 ([#3438](https://github.com/aws-powertools/powertools-lambda-python/issues/3438)) - **deps:** bump cryptography from 41.0.4 to 41.0.6 ([#3431](https://github.com/aws-powertools/powertools-lambda-python/issues/3431)) - **deps:** bump squidfunk/mkdocs-material from `fc42bac` to `8c72011` in /docs ([#3416](https://github.com/aws-powertools/powertools-lambda-python/issues/3416)) - **deps:** bump actions/dependency-review-action from 3.1.3 to 3.1.4 ([#3426](https://github.com/aws-powertools/powertools-lambda-python/issues/3426)) - **deps:** bump actions/setup-python from 4.7.1 to 4.8.0 ([#3456](https://github.com/aws-powertools/powertools-lambda-python/issues/3456)) - **deps:** bump pypa/gh-action-pypi-publish from 1.8.10 to 1.8.11 ([#3433](https://github.com/aws-powertools/powertools-lambda-python/issues/3433)) - **deps-dev:** bump cfn-lint from 0.83.3 to 0.83.4 ([#3450](https://github.com/aws-powertools/powertools-lambda-python/issues/3450)) - **deps-dev:** bump ruff from 0.1.6 to 0.1.7 ([#3458](https://github.com/aws-powertools/powertools-lambda-python/issues/3458)) - **deps-dev:** bump sentry-sdk from 1.36.0 to 1.38.0 ([#3435](https://github.com/aws-powertools/powertools-lambda-python/issues/3435)) - **deps-dev:** bump aws-cdk-lib from 2.110.1 to 2.111.0 ([#3428](https://github.com/aws-powertools/powertools-lambda-python/issues/3428)) - **deps-dev:** bump aws-cdk from 2.112.0 to 2.113.0 ([#3448](https://github.com/aws-powertools/powertools-lambda-python/issues/3448)) - **deps-dev:** bump aws-cdk from 2.110.1 to 2.111.0 ([#3418](https://github.com/aws-powertools/powertools-lambda-python/issues/3418)) - **deps-dev:** bump the boto-typing group with 11 updates ([#3427](https://github.com/aws-powertools/powertools-lambda-python/issues/3427)) - **deps-dev:** bump the boto-typing group with 1 update ([#3457](https://github.com/aws-powertools/powertools-lambda-python/issues/3457)) - **deps-dev:** bump aws-cdk from 2.111.0 to 2.112.0 ([#3440](https://github.com/aws-powertools/powertools-lambda-python/issues/3440)) - **layers:** Update log retention to 10 years ([#3424](https://github.com/aws-powertools/powertools-lambda-python/issues/3424)) ## [v2.28.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.28.0...v2.28.1) - 2023-11-28 ## Bug Fixes - **event_handler:** fix compress handling ([#3420](https://github.com/aws-powertools/powertools-lambda-python/issues/3420)) ## Maintenance - version bump ## [v2.28.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.27.1...v2.28.0) - 2023-11-23 ## Bug Fixes - **event_handler:** hide error details by default ([#3406](https://github.com/aws-powertools/powertools-lambda-python/issues/3406)) - **event_handler:** fix format for OpenAPI path templating
([#3399](https://github.com/aws-powertools/powertools-lambda-python/issues/3399)) - **event_handler:** lazy load Pydantic to improve cold start ([#3397](https://github.com/aws-powertools/powertools-lambda-python/issues/3397)) - **event_handler:** allow fine grained Response with data validation ([#3394](https://github.com/aws-powertools/powertools-lambda-python/issues/3394)) - **event_handler:** apply serialization as the last operation for middlewares ([#3392](https://github.com/aws-powertools/powertools-lambda-python/issues/3392)) ## Documentation - **event_handlers:** new data validation and OpenAPI feature ([#3386](https://github.com/aws-powertools/powertools-lambda-python/issues/3386)) ## Features - **event_handler:** allow customers to catch request validation errors ([#3396](https://github.com/aws-powertools/powertools-lambda-python/issues/3396)) ## Maintenance - version bump - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3389](https://github.com/aws-powertools/powertools-lambda-python/issues/3389)) - **deps:** bump datadog-lambda from 4.82.0 to 5.83.0 ([#3401](https://github.com/aws-powertools/powertools-lambda-python/issues/3401)) - **deps-dev:** bump aws-cdk-lib from 2.110.0 to 2.110.1 ([#3402](https://github.com/aws-powertools/powertools-lambda-python/issues/3402)) - **deps-dev:** bump pytest-xdist from 3.4.0 to 3.5.0 ([#3387](https://github.com/aws-powertools/powertools-lambda-python/issues/3387)) - **deps-dev:** bump the boto-typing group with 1 update ([#3400](https://github.com/aws-powertools/powertools-lambda-python/issues/3400)) - **deps-dev:** bump sentry-sdk from 1.35.0 to 1.36.0 ([#3388](https://github.com/aws-powertools/powertools-lambda-python/issues/3388)) - **deps-dev:** bump aws-cdk from 2.110.0 to 2.110.1 ([#3403](https://github.com/aws-powertools/powertools-lambda-python/issues/3403)) ## [v2.27.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.27.0...v2.27.1) - 2023-11-21 ## Bug Fixes - **logger:** allow custom JMESPath functions to extract correlation ID ([#3382](https://github.com/aws-powertools/powertools-lambda-python/issues/3382)) ## Documentation - **event_handlers:** note that CORS and */* binary mime type don't work in API Gateway ([#3383](https://github.com/aws-powertools/powertools-lambda-python/issues/3383)) - **logger:** improve ALC messaging in the PT context ([#3359](https://github.com/aws-powertools/powertools-lambda-python/issues/3359)) - **logger:** Fix ALC link ([#3352](https://github.com/aws-powertools/powertools-lambda-python/issues/3352)) ## Features - **logger:** implement addFilter/removeFilter to address static typing errors ([#3380](https://github.com/aws-powertools/powertools-lambda-python/issues/3380)) ## Maintenance - version bump - **ci:** lint and type checking removal in Pydantic v2 quality check ([#3360](https://github.com/aws-powertools/powertools-lambda-python/issues/3360)) - **deps:** bump actions/github-script from 7.0.0 to 7.0.1 ([#3377](https://github.com/aws-powertools/powertools-lambda-python/issues/3377)) - **deps:** bump squidfunk/mkdocs-material from `2c57e4d` to `fc42bac` in /docs ([#3375](https://github.com/aws-powertools/powertools-lambda-python/issues/3375)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#3353](https://github.com/aws-powertools/powertools-lambda-python/issues/3353)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update 
([#3374](https://github.com/aws-powertools/powertools-lambda-python/issues/3374)) - **deps:** bump squidfunk/mkdocs-material from `f486dc9` to `2c57e4d` in /docs ([#3366](https://github.com/aws-powertools/powertools-lambda-python/issues/3366)) - **deps-dev:** bump cfn-lint from 0.83.2 to 0.83.3 ([#3363](https://github.com/aws-powertools/powertools-lambda-python/issues/3363)) - **deps-dev:** bump the boto-typing group with 11 updates ([#3362](https://github.com/aws-powertools/powertools-lambda-python/issues/3362)) - **deps-dev:** bump aws-cdk-lib from 2.108.1 to 2.110.0 ([#3365](https://github.com/aws-powertools/powertools-lambda-python/issues/3365)) - **deps-dev:** bump aws-cdk from 2.108.1 to 2.109.0 ([#3354](https://github.com/aws-powertools/powertools-lambda-python/issues/3354)) - **deps-dev:** bump aws-cdk from 2.109.0 to 2.110.0 ([#3361](https://github.com/aws-powertools/powertools-lambda-python/issues/3361)) - **deps-dev:** bump the boto-typing group with 2 updates ([#3376](https://github.com/aws-powertools/powertools-lambda-python/issues/3376)) - **deps-dev:** bump ruff from 0.1.5 to 0.1.6 ([#3364](https://github.com/aws-powertools/powertools-lambda-python/issues/3364)) ## [v2.27.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.26.1...v2.27.0) - 2023-11-16 ## Features - **logger:** Adding support to new env variables ([#3348](https://github.com/aws-powertools/powertools-lambda-python/issues/3348)) ## Maintenance - version bump - **deps:** bump actions/github-script from 6.4.1 to 7.0.0 ([#3330](https://github.com/aws-powertools/powertools-lambda-python/issues/3330)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#3340](https://github.com/aws-powertools/powertools-lambda-python/issues/3340)) - **deps:** bump fastjsonschema from 2.18.1 to 2.19.0 ([#3337](https://github.com/aws-powertools/powertools-lambda-python/issues/3337)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3345](https://github.com/aws-powertools/powertools-lambda-python/issues/3345)) - **deps:** bump actions/dependency-review-action from 3.1.2 to 3.1.3 ([#3331](https://github.com/aws-powertools/powertools-lambda-python/issues/3331)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3329](https://github.com/aws-powertools/powertools-lambda-python/issues/3329)) - **deps:** bump datadog-lambda from 4.81.0 to 4.82.0 ([#3338](https://github.com/aws-powertools/powertools-lambda-python/issues/3338)) - **deps-dev:** bump cfn-lint from 0.83.1 to 0.83.2 ([#3335](https://github.com/aws-powertools/powertools-lambda-python/issues/3335)) - **deps-dev:** bump aws-cdk from 2.108.0 to 2.108.1 ([#3344](https://github.com/aws-powertools/powertools-lambda-python/issues/3344)) - **deps-dev:** bump sentry-sdk from 1.34.0 to 1.35.0 ([#3334](https://github.com/aws-powertools/powertools-lambda-python/issues/3334)) - **deps-dev:** bump pytest-xdist from 3.3.1 to 3.4.0 ([#3332](https://github.com/aws-powertools/powertools-lambda-python/issues/3332)) - **deps-dev:** bump aws-cdk-lib from 2.107.0 to 2.108.1 ([#3343](https://github.com/aws-powertools/powertools-lambda-python/issues/3343)) - **deps-dev:** bump aws-cdk from 2.106.0 to 2.106.1 ([#3328](https://github.com/aws-powertools/powertools-lambda-python/issues/3328)) - **deps-dev:** bump aws-cdk-lib from 2.105.0 to 2.106.0 ([#3319](https://github.com/aws-powertools/powertools-lambda-python/issues/3319)) - **deps-dev:** bump aws-cdk from 
2.105.0 to 2.106.0 ([#3320](https://github.com/aws-powertools/powertools-lambda-python/issues/3320)) - **deps-dev:** bump aws-cdk from 2.106.1 to 2.108.0 ([#3341](https://github.com/aws-powertools/powertools-lambda-python/issues/3341)) - **deps-dev:** bump aws-cdk-lib from 2.106.0 to 2.107.0 ([#3333](https://github.com/aws-powertools/powertools-lambda-python/issues/3333)) ## [v2.26.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.26.0...v2.26.1) - 2023-11-10 ## Bug Fixes - **event-handler:** enable path parameters on Bedrock handler ([#3312](https://github.com/aws-powertools/powertools-lambda-python/issues/3312)) - **event_handler:** Router prefix mismatch regression after Middleware feat ([#3302](https://github.com/aws-powertools/powertools-lambda-python/issues/3302)) - **event_source:** kinesis subsequenceNumber str type to int ([#3275](https://github.com/aws-powertools/powertools-lambda-python/issues/3275)) - **parameters:** Respect POWERTOOLS_PARAMETERS_SSM_DECRYPT environment variable when getting multiple ssm parameters. ([#3241](https://github.com/aws-powertools/powertools-lambda-python/issues/3241)) ## Documentation - **customer-reference:** add Vertex Pharmaceuticals as a customer reference ([#3210](https://github.com/aws-powertools/powertools-lambda-python/issues/3210)) - **event-handler:** fixed SchemaValidationMiddleware link ([#3247](https://github.com/aws-powertools/powertools-lambda-python/issues/3247)) ## Features - **data_classes:** add support for Bedrock Agents event ([#3262](https://github.com/aws-powertools/powertools-lambda-python/issues/3262)) - **event_handler:** add Bedrock Agent event handler ([#3285](https://github.com/aws-powertools/powertools-lambda-python/issues/3285)) - **event_handler:** add ability to expose a Swagger UI ([#3254](https://github.com/aws-powertools/powertools-lambda-python/issues/3254)) - **event_handler:** generate OpenAPI specifications and validate input/output ([#3109](https://github.com/aws-powertools/powertools-lambda-python/issues/3109)) - **parser:** add BedrockEventModel parser and envelope ([#3286](https://github.com/aws-powertools/powertools-lambda-python/issues/3286)) ## Maintenance - version bump - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.1.5 to 3.0.0 ([#3289](https://github.com/aws-powertools/powertools-lambda-python/issues/3289)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3287](https://github.com/aws-powertools/powertools-lambda-python/issues/3287)) - **deps:** bump actions/checkout from 4.1.0 to 4.1.1 ([#3220](https://github.com/aws-powertools/powertools-lambda-python/issues/3220)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3282](https://github.com/aws-powertools/powertools-lambda-python/issues/3282)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.1.4 to 2.1.5 ([#3281](https://github.com/aws-powertools/powertools-lambda-python/issues/3281)) - **deps:** bump release-drafter/release-drafter from 5.24.0 to 5.25.0 ([#3219](https://github.com/aws-powertools/powertools-lambda-python/issues/3219)) - **deps:** bump squidfunk/mkdocs-material from `cb38dc2` to `df9409b` in /docs ([#3216](https://github.com/aws-powertools/powertools-lambda-python/issues/3216)) - **deps:** bump urllib3 from 1.26.17 to 1.26.18 ([#3222](https://github.com/aws-powertools/powertools-lambda-python/issues/3222)) - **deps:** bump the layer-balancer group in 
/layer/scripts/layer-balancer with 2 updates ([#3298](https://github.com/aws-powertools/powertools-lambda-python/issues/3298)) - **deps:** bump squidfunk/mkdocs-material from `772e14e` to `f486dc9` in /docs ([#3299](https://github.com/aws-powertools/powertools-lambda-python/issues/3299)) - **deps:** bump datadog-lambda from 4.80.0 to 4.81.0 ([#3228](https://github.com/aws-powertools/powertools-lambda-python/issues/3228)) - **deps:** bump actions/setup-node from 3.8.1 to 4.0.0 ([#3244](https://github.com/aws-powertools/powertools-lambda-python/issues/3244)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 3.0.0 to 3.0.1 ([#3300](https://github.com/aws-powertools/powertools-lambda-python/issues/3300)) - **deps:** bump actions/dependency-review-action from 3.1.0 to 3.1.1 ([#3301](https://github.com/aws-powertools/powertools-lambda-python/issues/3301)) - **deps:** bump squidfunk/mkdocs-material from `df9409b` to `772e14e` in /docs ([#3265](https://github.com/aws-powertools/powertools-lambda-python/issues/3265)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3305](https://github.com/aws-powertools/powertools-lambda-python/issues/3305)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#3248](https://github.com/aws-powertools/powertools-lambda-python/issues/3248)) - **deps:** bump actions/dependency-review-action from 3.1.1 to 3.1.2 ([#3308](https://github.com/aws-powertools/powertools-lambda-python/issues/3308)) - **deps:** bump ossf/scorecard-action from 2.3.0 to 2.3.1 ([#3245](https://github.com/aws-powertools/powertools-lambda-python/issues/3245)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3310](https://github.com/aws-powertools/powertools-lambda-python/issues/3310)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3215](https://github.com/aws-powertools/powertools-lambda-python/issues/3215)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3313](https://github.com/aws-powertools/powertools-lambda-python/issues/3313)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3278](https://github.com/aws-powertools/powertools-lambda-python/issues/3278)) - **deps-dev:** bump pytest from 7.4.2 to 7.4.3 ([#3249](https://github.com/aws-powertools/powertools-lambda-python/issues/3249)) - **deps-dev:** bump ruff from 0.1.1 to 0.1.2 ([#3250](https://github.com/aws-powertools/powertools-lambda-python/issues/3250)) - **deps-dev:** bump the boto-typing group with 2 updates ([#3242](https://github.com/aws-powertools/powertools-lambda-python/issues/3242)) - **deps-dev:** bump aws-cdk-lib from 2.102.0 to 2.103.0 ([#3258](https://github.com/aws-powertools/powertools-lambda-python/issues/3258)) - **deps-dev:** bump cfn-lint from 0.82.2 to 0.83.0 ([#3243](https://github.com/aws-powertools/powertools-lambda-python/issues/3243)) - **deps-dev:** bump ruff from 0.1.2 to 0.1.3 ([#3257](https://github.com/aws-powertools/powertools-lambda-python/issues/3257)) - **deps-dev:** bump aws-cdk from 2.102.0 to 2.103.0 ([#3259](https://github.com/aws-powertools/powertools-lambda-python/issues/3259)) - **deps-dev:** bump ruff from 0.1.0 to 0.1.1 ([#3235](https://github.com/aws-powertools/powertools-lambda-python/issues/3235)) - **deps-dev:** bump aws-cdk-lib from 2.103.0 to 2.103.1 
([#3263](https://github.com/aws-powertools/powertools-lambda-python/issues/3263)) - **deps-dev:** bump the boto-typing group with 1 update ([#3231](https://github.com/aws-powertools/powertools-lambda-python/issues/3231)) - **deps-dev:** bump aws-cdk from 2.101.1 to 2.102.0 ([#3232](https://github.com/aws-powertools/powertools-lambda-python/issues/3232)) - **deps-dev:** bump aws-cdk from 2.103.0 to 2.103.1 ([#3264](https://github.com/aws-powertools/powertools-lambda-python/issues/3264)) - **deps-dev:** bump cfn-lint from 0.82.0 to 0.82.2 ([#3229](https://github.com/aws-powertools/powertools-lambda-python/issues/3229)) - **deps-dev:** bump cfn-lint from 0.83.0 to 0.83.1 ([#3274](https://github.com/aws-powertools/powertools-lambda-python/issues/3274)) - **deps-dev:** bump the boto-typing group with 1 update ([#3273](https://github.com/aws-powertools/powertools-lambda-python/issues/3273)) - **deps-dev:** bump cfn-lint from 0.81.0 to 0.82.0 ([#3224](https://github.com/aws-powertools/powertools-lambda-python/issues/3224)) - **deps-dev:** bump aws-cdk from 2.101.0 to 2.101.1 ([#3223](https://github.com/aws-powertools/powertools-lambda-python/issues/3223)) - **deps-dev:** bump sentry-sdk from 1.32.0 to 1.33.1 ([#3277](https://github.com/aws-powertools/powertools-lambda-python/issues/3277)) - **deps-dev:** bump urllib3 from 1.26.17 to 1.26.18 in /layer ([#3221](https://github.com/aws-powertools/powertools-lambda-python/issues/3221)) - **deps-dev:** bump aws-cdk from 2.103.1 to 2.104.0 ([#3288](https://github.com/aws-powertools/powertools-lambda-python/issues/3288)) - **deps-dev:** bump sentry-sdk from 1.33.1 to 1.34.0 ([#3290](https://github.com/aws-powertools/powertools-lambda-python/issues/3290)) - **deps-dev:** bump aws-cdk-lib from 2.103.1 to 2.104.0 ([#3291](https://github.com/aws-powertools/powertools-lambda-python/issues/3291)) - **deps-dev:** bump aws-cdk-lib from 2.100.0 to 2.101.1 ([#3217](https://github.com/aws-powertools/powertools-lambda-python/issues/3217)) - **deps-dev:** bump aws-cdk from 2.100.0 to 2.101.0 ([#3214](https://github.com/aws-powertools/powertools-lambda-python/issues/3214)) - **deps-dev:** bump aws-cdk from 2.104.0 to 2.105.0 ([#3307](https://github.com/aws-powertools/powertools-lambda-python/issues/3307)) - **deps-dev:** bump ruff from 0.1.3 to 0.1.4 ([#3297](https://github.com/aws-powertools/powertools-lambda-python/issues/3297)) - **deps-dev:** bump aws-cdk-lib from 2.104.0 to 2.105.0 ([#3309](https://github.com/aws-powertools/powertools-lambda-python/issues/3309)) - **deps-dev:** bump the boto-typing group with 2 updates ([#3211](https://github.com/aws-powertools/powertools-lambda-python/issues/3211)) - **deps-dev:** bump the boto-typing group with 3 updates ([#3314](https://github.com/aws-powertools/powertools-lambda-python/issues/3314)) - **deps-dev:** bump ruff from 0.1.4 to 0.1.5 ([#3315](https://github.com/aws-powertools/powertools-lambda-python/issues/3315)) - **deps-dev:** bump ruff from 0.0.292 to 0.1.0 ([#3213](https://github.com/aws-powertools/powertools-lambda-python/issues/3213)) ## [v2.26.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.25.1...v2.26.0) - 2023-10-13 ## Bug Fixes - **logger:** force Logger to use local timezone when UTC flag is not set ([#3168](https://github.com/aws-powertools/powertools-lambda-python/issues/3168)) - **parameter:** improve AppConfig cached configuration retrieval ([#3195](https://github.com/aws-powertools/powertools-lambda-python/issues/3195)) ## Code Refactoring - **data-masking:** disable e2e 
tests. ([#3204](https://github.com/aws-powertools/powertools-lambda-python/issues/3204)) - **data_masking:** move Data Masking utility to a private folder ([#3202](https://github.com/aws-powertools/powertools-lambda-python/issues/3202)) ## Documentation - **contributing:** initial structure for revamped contributing guide ([#3133](https://github.com/aws-powertools/powertools-lambda-python/issues/3133)) - **event_handler:** add information about case-insensitive header lookup function ([#3183](https://github.com/aws-powertools/powertools-lambda-python/issues/3183)) ## Features - **data_masking:** add new sensitive data masking utility ([#2197](https://github.com/aws-powertools/powertools-lambda-python/issues/2197)) - **event_handler:** add support to VPC Lattice payload v2 ([#3153](https://github.com/aws-powertools/powertools-lambda-python/issues/3153)) - **layers:** add arm64 support in more regions ([#3151](https://github.com/aws-powertools/powertools-lambda-python/issues/3151)) - **logger:** new stack_trace field with rich exception details ([#3147](https://github.com/aws-powertools/powertools-lambda-python/issues/3147)) - **parser:** infer model from type hint ([#3181](https://github.com/aws-powertools/powertools-lambda-python/issues/3181)) ## Maintenance - version bump - **deps:** bump squidfunk/mkdocs-material from `cbfecae` to `a4cfa88` in /docs ([#3175](https://github.com/aws-powertools/powertools-lambda-python/issues/3175)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3174](https://github.com/aws-powertools/powertools-lambda-python/issues/3174)) - **deps:** bump squidfunk/mkdocs-material from `b41ba6d` to `06673a1` in /docs ([#3124](https://github.com/aws-powertools/powertools-lambda-python/issues/3124)) - **deps:** bump ossf/scorecard-action from 2.2.0 to 2.3.0 ([#3178](https://github.com/aws-powertools/powertools-lambda-python/issues/3178)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3198](https://github.com/aws-powertools/powertools-lambda-python/issues/3198)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#3177](https://github.com/aws-powertools/powertools-lambda-python/issues/3177)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3127](https://github.com/aws-powertools/powertools-lambda-python/issues/3127)) - **deps:** bump urllib3 from 1.26.16 to 1.26.17 ([#3162](https://github.com/aws-powertools/powertools-lambda-python/issues/3162)) - **deps:** bump aws-xray-sdk from 2.12.0 to 2.12.1 ([#3197](https://github.com/aws-powertools/powertools-lambda-python/issues/3197)) - **deps:** bump fastjsonschema from 2.18.0 to 2.18.1 ([#3159](https://github.com/aws-powertools/powertools-lambda-python/issues/3159)) - **deps:** bump actions/setup-python from 4.7.0 to 4.7.1 ([#3158](https://github.com/aws-powertools/powertools-lambda-python/issues/3158)) - **deps:** bump actions/checkout from 4.0.0 to 4.1.0 ([#3128](https://github.com/aws-powertools/powertools-lambda-python/issues/3128)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3156](https://github.com/aws-powertools/powertools-lambda-python/issues/3156)) - **deps:** bump squidfunk/mkdocs-material from `e5f28aa` to `cbfecae` in /docs ([#3157](https://github.com/aws-powertools/powertools-lambda-python/issues/3157)) - **deps:** bump squidfunk/mkdocs-material from `06673a1` to `e5f28aa` in /docs 
([#3134](https://github.com/aws-powertools/powertools-lambda-python/issues/3134)) - **deps:** bump squidfunk/mkdocs-material from `a4cfa88` to `cb38dc2` in /docs ([#3189](https://github.com/aws-powertools/powertools-lambda-python/issues/3189)) - **deps:** bump pydantic from 1.10.12 to 1.10.13 ([#3144](https://github.com/aws-powertools/powertools-lambda-python/issues/3144)) - **deps:** bump gitpython from 3.1.35 to 3.1.37 in /docs ([#3188](https://github.com/aws-powertools/powertools-lambda-python/issues/3188)) - **deps-dev:** bump types-requests from 2.31.0.5 to 2.31.0.6 ([#3145](https://github.com/aws-powertools/powertools-lambda-python/issues/3145)) - **deps-dev:** bump aws-cdk from 2.98.0 to 2.99.0 ([#3148](https://github.com/aws-powertools/powertools-lambda-python/issues/3148)) - **deps-dev:** bump the boto-typing group with 2 updates ([#3143](https://github.com/aws-powertools/powertools-lambda-python/issues/3143)) - **deps-dev:** bump aws-cdk from 2.99.1 to 2.100.0 ([#3185](https://github.com/aws-powertools/powertools-lambda-python/issues/3185)) - **deps-dev:** bump aws-cdk from 2.97.0 to 2.98.0 ([#3139](https://github.com/aws-powertools/powertools-lambda-python/issues/3139)) - **deps-dev:** bump aws-cdk from 2.96.2 to 2.97.0 ([#3129](https://github.com/aws-powertools/powertools-lambda-python/issues/3129)) - **deps-dev:** bump types-requests from 2.31.0.3 to 2.31.0.5 ([#3136](https://github.com/aws-powertools/powertools-lambda-python/issues/3136)) - **deps-dev:** bump the boto-typing group with 1 update ([#3135](https://github.com/aws-powertools/powertools-lambda-python/issues/3135)) - **deps-dev:** bump ruff from 0.0.291 to 0.0.292 ([#3161](https://github.com/aws-powertools/powertools-lambda-python/issues/3161)) - **deps-dev:** bump ruff from 0.0.290 to 0.0.291 ([#3126](https://github.com/aws-powertools/powertools-lambda-python/issues/3126)) - **deps-dev:** bump aws-cdk from 2.99.0 to 2.99.1 ([#3155](https://github.com/aws-powertools/powertools-lambda-python/issues/3155)) - **deps-dev:** bump sentry-sdk from 1.31.0 to 1.32.0 ([#3192](https://github.com/aws-powertools/powertools-lambda-python/issues/3192)) - **deps-dev:** bump urllib3 from 1.26.16 to 1.26.17 in /layer ([#3163](https://github.com/aws-powertools/powertools-lambda-python/issues/3163)) - **deps-dev:** bump cfn-lint from 0.80.3 to 0.80.4 ([#3166](https://github.com/aws-powertools/powertools-lambda-python/issues/3166)) - **deps-dev:** bump cfn-lint from 0.80.2 to 0.80.3 ([#3125](https://github.com/aws-powertools/powertools-lambda-python/issues/3125)) - **deps-dev:** bump cfn-lint from 0.80.4 to 0.81.0 ([#3179](https://github.com/aws-powertools/powertools-lambda-python/issues/3179)) - **deps-dev:** bump the boto-typing group with 1 update ([#3196](https://github.com/aws-powertools/powertools-lambda-python/issues/3196)) - **deps-dev:** bump the boto-typing group with 1 update ([#3170](https://github.com/aws-powertools/powertools-lambda-python/issues/3170)) ## [v2.25.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.25.0...v2.25.1) - 2023-09-22 ## Bug Fixes - **logger:** add explicit None return type annotations ([#3113](https://github.com/aws-powertools/powertools-lambda-python/issues/3113)) - **metrics:** support additional arguments in functions wrapped with log_metrics decorator ([#3120](https://github.com/aws-powertools/powertools-lambda-python/issues/3120)) ## Maintenance - version bump - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update 
([#3108](https://github.com/aws-powertools/powertools-lambda-python/issues/3108)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3115](https://github.com/aws-powertools/powertools-lambda-python/issues/3115)) - **deps:** bump squidfunk/mkdocs-material from `4ff781e` to `b41ba6d` in /docs ([#3117](https://github.com/aws-powertools/powertools-lambda-python/issues/3117)) - **deps:** bump squidfunk/mkdocs-material from `c4890ab` to `4ff781e` in /docs ([#3110](https://github.com/aws-powertools/powertools-lambda-python/issues/3110)) - **deps-dev:** bump ruff from 0.0.289 to 0.0.290 ([#3105](https://github.com/aws-powertools/powertools-lambda-python/issues/3105)) - **deps-dev:** bump aws-cdk from 2.96.1 to 2.96.2 ([#3102](https://github.com/aws-powertools/powertools-lambda-python/issues/3102)) - **deps-dev:** bump the boto-typing group with 3 updates ([#3118](https://github.com/aws-powertools/powertools-lambda-python/issues/3118)) - **deps-dev:** bump the boto-typing group with 1 update ([#3101](https://github.com/aws-powertools/powertools-lambda-python/issues/3101)) - **deps-dev:** bump cfn-lint from 0.79.11 to 0.80.2 ([#3107](https://github.com/aws-powertools/powertools-lambda-python/issues/3107)) - **deps-dev:** bump types-requests from 2.31.0.2 to 2.31.0.3 ([#3114](https://github.com/aws-powertools/powertools-lambda-python/issues/3114)) ## [v2.25.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.24.0...v2.25.0) - 2023-09-15 ## Code Refactoring - **parameters:** BaseProvider.\_get to also support Dict ([#3090](https://github.com/aws-powertools/powertools-lambda-python/issues/3090)) ## Documentation - **event_handler:** fix typing in micro function example ([#3098](https://github.com/aws-powertools/powertools-lambda-python/issues/3098)) - **event_handler:** add micro function examples ([#3056](https://github.com/aws-powertools/powertools-lambda-python/issues/3056)) - **we-made-this:** fix broken Twitch video embeds ([#3096](https://github.com/aws-powertools/powertools-lambda-python/issues/3096)) ## Features - **event_source:** add Kinesis Firehose Data Transformation data class ([#3029](https://github.com/aws-powertools/powertools-lambda-python/issues/3029)) - **event_sources:** add Secrets Manager secret rotation event ([#3061](https://github.com/aws-powertools/powertools-lambda-python/issues/3061)) ## Maintenance - version bump - **automation:** remove previous labels when PR is updated ([#3066](https://github.com/aws-powertools/powertools-lambda-python/issues/3066)) - **deps:** bump actions/dependency-review-action from 3.0.8 to 3.1.0 ([#3071](https://github.com/aws-powertools/powertools-lambda-python/issues/3071)) - **deps:** bump docker/setup-qemu-action from 2.2.0 to 3.0.0 ([#3081](https://github.com/aws-powertools/powertools-lambda-python/issues/3081)) - **deps:** bump docker/setup-buildx-action from 2.10.0 to 3.0.0 ([#3083](https://github.com/aws-powertools/powertools-lambda-python/issues/3083)) - **deps:** bump squidfunk/mkdocs-material from `dd1770c` to `c4890ab` in /docs ([#3078](https://github.com/aws-powertools/powertools-lambda-python/issues/3078)) - **deps-dev:** bump cfn-lint from 0.79.9 to 0.79.10 ([#3077](https://github.com/aws-powertools/powertools-lambda-python/issues/3077)) - **deps-dev:** bump hvac from 1.2.0 to 1.2.1 ([#3075](https://github.com/aws-powertools/powertools-lambda-python/issues/3075)) - **deps-dev:** bump ruff from 0.0.288 to 0.0.289 
([#3080](https://github.com/aws-powertools/powertools-lambda-python/issues/3080)) - **deps-dev:** bump ruff from 0.0.287 to 0.0.288 ([#3076](https://github.com/aws-powertools/powertools-lambda-python/issues/3076)) - **deps-dev:** bump aws-cdk from 2.95.0 to 2.95.1 ([#3074](https://github.com/aws-powertools/powertools-lambda-python/issues/3074)) - **deps-dev:** bump the boto-typing group with 1 update ([#3085](https://github.com/aws-powertools/powertools-lambda-python/issues/3085)) - **deps-dev:** bump aws-cdk from 2.95.1 to 2.96.0 ([#3087](https://github.com/aws-powertools/powertools-lambda-python/issues/3087)) - **deps-dev:** bump sentry-sdk from 1.30.0 to 1.31.0 ([#3086](https://github.com/aws-powertools/powertools-lambda-python/issues/3086)) - **deps-dev:** bump aws-cdk from 2.94.0 to 2.95.0 ([#3070](https://github.com/aws-powertools/powertools-lambda-python/issues/3070)) - **deps-dev:** bump cfn-lint from 0.79.10 to 0.79.11 ([#3088](https://github.com/aws-powertools/powertools-lambda-python/issues/3088)) - **deps-dev:** bump aws-cdk from 2.96.0 to 2.96.1 ([#3093](https://github.com/aws-powertools/powertools-lambda-python/issues/3093)) - **typing:** move backwards compat types to shared types ([#3092](https://github.com/aws-powertools/powertools-lambda-python/issues/3092)) ## [v2.24.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.23.1...v2.24.0) - 2023-09-08 ## Bug Fixes - **event_handler:** expanding safe URI characters to include +$& ([#3026](https://github.com/aws-powertools/powertools-lambda-python/issues/3026)) - **parser:** change ApproximateCreationDateTime field to datetime in DynamoDBStreamChangedRecordModel ([#3049](https://github.com/aws-powertools/powertools-lambda-python/issues/3049)) ## Code Refactoring - **batch:** type response() method ([#3023](https://github.com/aws-powertools/powertools-lambda-python/issues/3023)) ## Documentation - **event_handler:** demonstrate how to combine logger correlation ID and middleware ([#3064](https://github.com/aws-powertools/powertools-lambda-python/issues/3064)) - **event_handler:** use correct correlation_id for logger in middleware example ([#3063](https://github.com/aws-powertools/powertools-lambda-python/issues/3063)) - **idempotency:** use tab navigation, improves custom serializer example, and additional explanations ([#3067](https://github.com/aws-powertools/powertools-lambda-python/issues/3067)) ## Features - **event_handler:** add Middleware support for REST Event Handler ([#2917](https://github.com/aws-powertools/powertools-lambda-python/issues/2917)) - **idempotency:** add support to custom serialization/deserialization on idempotency decorator ([#2951](https://github.com/aws-powertools/powertools-lambda-python/issues/2951)) ## Maintenance - version bump - **deps:** bump squidfunk/mkdocs-material from `b1f7f94` to `f4764d1` in /docs ([#3031](https://github.com/aws-powertools/powertools-lambda-python/issues/3031)) - **deps:** bump gitpython from 3.1.32 to 3.1.35 in /docs ([#3059](https://github.com/aws-powertools/powertools-lambda-python/issues/3059)) - **deps:** bump squidfunk/mkdocs-material from `f4764d1` to `dd1770c` in /docs ([#3044](https://github.com/aws-powertools/powertools-lambda-python/issues/3044)) - **deps:** bump actions/checkout from 3.6.0 to 4.0.0 ([#3041](https://github.com/aws-powertools/powertools-lambda-python/issues/3041)) - **deps:** bump squidfunk/mkdocs-material from `97da15b` to `b1f7f94` in /docs 
([#3021](https://github.com/aws-powertools/powertools-lambda-python/issues/3021)) - **deps:** bump docker/setup-buildx-action from 2.9.1 to 2.10.0 ([#3022](https://github.com/aws-powertools/powertools-lambda-python/issues/3022)) - **deps:** bump actions/upload-artifact from 3.1.2 to 3.1.3 ([#3053](https://github.com/aws-powertools/powertools-lambda-python/issues/3053)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 1 update ([#3052](https://github.com/aws-powertools/powertools-lambda-python/issues/3052)) - **deps-dev:** bump mkdocs-material from 9.2.6 to 9.2.7 ([#3043](https://github.com/aws-powertools/powertools-lambda-python/issues/3043)) - **deps-dev:** bump cfn-lint from 0.79.7 to 0.79.8 ([#3033](https://github.com/aws-powertools/powertools-lambda-python/issues/3033)) - **deps-dev:** bump mkdocs-material from 9.2.5 to 9.2.6 ([#3032](https://github.com/aws-powertools/powertools-lambda-python/issues/3032)) - **deps-dev:** bump ruff from 0.0.286 to 0.0.287 ([#3035](https://github.com/aws-powertools/powertools-lambda-python/issues/3035)) - **deps-dev:** bump sentry-sdk from 1.29.2 to 1.30.0 ([#3028](https://github.com/aws-powertools/powertools-lambda-python/issues/3028)) - **deps-dev:** bump the boto-typing group with 11 updates ([#3027](https://github.com/aws-powertools/powertools-lambda-python/issues/3027)) - **deps-dev:** bump pytest from 7.4.1 to 7.4.2 ([#3057](https://github.com/aws-powertools/powertools-lambda-python/issues/3057)) - **deps-dev:** bump hvac from 1.1.1 to 1.2.0 ([#3054](https://github.com/aws-powertools/powertools-lambda-python/issues/3054)) - **deps-dev:** bump cfn-lint from 0.79.8 to 0.79.9 ([#3046](https://github.com/aws-powertools/powertools-lambda-python/issues/3046)) - **deps-dev:** bump the boto-typing group with 1 update ([#3013](https://github.com/aws-powertools/powertools-lambda-python/issues/3013)) - **deps-dev:** bump pytest from 7.4.0 to 7.4.1 ([#3042](https://github.com/aws-powertools/powertools-lambda-python/issues/3042)) - **deps-dev:** bump ruff from 0.0.285 to 0.0.286 ([#3014](https://github.com/aws-powertools/powertools-lambda-python/issues/3014)) - **deps-dev:** bump gitpython from 3.1.32 to 3.1.35 ([#3060](https://github.com/aws-powertools/powertools-lambda-python/issues/3060)) - **deps-dev:** bump aws-cdk from 2.93.0 to 2.94.0 ([#3036](https://github.com/aws-powertools/powertools-lambda-python/issues/3036)) ## [v2.23.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.23.0...v2.23.1) - 2023-08-25 ## Bug Fixes - **ci:** revert aws credentials action ([#3010](https://github.com/aws-powertools/powertools-lambda-python/issues/3010)) - **ci:** change SAR assume role options ([#3005](https://github.com/aws-powertools/powertools-lambda-python/issues/3005)) - **event_handler:** make invalid chars a raw str to fix DeprecationWarning ([#2982](https://github.com/aws-powertools/powertools-lambda-python/issues/2982)) - **metrics:** preserve default_tags when metric-specific tag is set in Datadog provider ([#2997](https://github.com/aws-powertools/powertools-lambda-python/issues/2997)) ## Maintenance - version bump - **deps:** bump squidfunk/mkdocs-material from `cd3a522` to `97da15b` in /docs ([#2987](https://github.com/aws-powertools/powertools-lambda-python/issues/2987)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#2978](https://github.com/aws-powertools/powertools-lambda-python/issues/2978)) - **deps:** bump aws-actions/configure-aws-credentials from 2.2.0 
to 3.0.0 ([#3000](https://github.com/aws-powertools/powertools-lambda-python/issues/3000)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#2983](https://github.com/aws-powertools/powertools-lambda-python/issues/2983)) - **deps:** bump slsa-framework/slsa-github-generator from 1.8.0 to 1.9.0 ([#2992](https://github.com/aws-powertools/powertools-lambda-python/issues/2992)) - **deps:** bump actions/checkout from 3.5.3 to 3.6.0 ([#2999](https://github.com/aws-powertools/powertools-lambda-python/issues/2999)) - **deps-dev:** bump ruff from 0.0.284 to 0.0.285 ([#2977](https://github.com/aws-powertools/powertools-lambda-python/issues/2977)) - **deps-dev:** bump aws-cdk from 2.92.0 to 2.93.0 ([#2993](https://github.com/aws-powertools/powertools-lambda-python/issues/2993)) - **deps-dev:** bump mkdocs-material from 9.1.21 to 9.2.0 ([#2984](https://github.com/aws-powertools/powertools-lambda-python/issues/2984)) - **deps-dev:** bump mkdocs-material from 9.2.0 to 9.2.3 ([#2988](https://github.com/aws-powertools/powertools-lambda-python/issues/2988)) ## [v2.23.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.22.0...v2.23.0) - 2023-08-18 ## Bug Fixes - **logger:** strip xray_trace_id when explicitly disabled ([#2852](https://github.com/aws-powertools/powertools-lambda-python/issues/2852)) - **metrics:** proxy service and namespace attrs to provider ([#2910](https://github.com/aws-powertools/powertools-lambda-python/issues/2910)) - **parser:** API Gateway V2 request context scope field should be optional ([#2961](https://github.com/aws-powertools/powertools-lambda-python/issues/2961)) ## Code Refactoring - **e2e:** support fail fast in get_lambda_response ([#2912](https://github.com/aws-powertools/powertools-lambda-python/issues/2912)) - **metrics:** move from protocol to ABC; split provider tests ([#2934](https://github.com/aws-powertools/powertools-lambda-python/issues/2934)) ## Documentation - **batch:** new visuals and error handling section ([#2857](https://github.com/aws-powertools/powertools-lambda-python/issues/2857)) - **batch:** explain record type discrepancy in failure and success handler ([#2868](https://github.com/aws-powertools/powertools-lambda-python/issues/2868)) - **metrics:** update Datadog integration diagram ([#2954](https://github.com/aws-powertools/powertools-lambda-python/issues/2954)) - **navigation:** remove nofollow attribute for internal links ([#2867](https://github.com/aws-powertools/powertools-lambda-python/issues/2867)) - **navigation:** add nofollow attribute ([#2842](https://github.com/aws-powertools/powertools-lambda-python/issues/2842)) - **roadmap:** update roadmap themes ([#2915](https://github.com/aws-powertools/powertools-lambda-python/issues/2915)) - **roadmap:** add GovCloud and China region item ([#2960](https://github.com/aws-powertools/powertools-lambda-python/issues/2960)) - **tutorial:** add support for Python 3.11 ([#2860](https://github.com/aws-powertools/powertools-lambda-python/issues/2860)) ## Features - **event_handler:** allow stripping route prefixes using regexes ([#2521](https://github.com/aws-powertools/powertools-lambda-python/issues/2521)) - **layers:** add new commercial region Israel (Tel Aviv) ([#2907](https://github.com/aws-powertools/powertools-lambda-python/issues/2907)) - **metrics:** add Datadog observability provider ([#2906](https://github.com/aws-powertools/powertools-lambda-python/issues/2906))
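A minimal sketch of the Datadog observability provider introduced in this release, assuming the `datadog-lambda` client is available in the function environment; the metric name and `environment` tag below are illustrative placeholders:

```
from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics

# Uses the datadog-lambda client under the hood; see the Metrics docs for flush options
metrics = DatadogMetrics()

@metrics.log_metrics  # serializes and flushes captured metrics when the handler returns
def lambda_handler(event: dict, context) -> dict:
    # "SuccessfulBooking" and the environment tag are placeholder values
    metrics.add_metric(name="SuccessfulBooking", value=1, environment="dev")
    return {"statusCode": 200}
```

- **metrics:** support to bring your own metrics provider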
([#2194](https://github.com/aws-powertools/powertools-lambda-python/issues/2194)) ## Maintenance - version bump - **ci:** enable protected branch auditing ([#2913](https://github.com/aws-powertools/powertools-lambda-python/issues/2913)) - **ci:** group dependabot updates ([#2896](https://github.com/aws-powertools/powertools-lambda-python/issues/2896)) - **deps:** bump github.com/aws/aws-sdk-go-v2 from 1.19.0 to 1.19.1 in /layer/scripts/layer-balancer ([#2877](https://github.com/aws-powertools/powertools-lambda-python/issues/2877)) - **deps:** bump pypa/gh-action-pypi-publish from 1.8.8 to 1.8.9 ([#2943](https://github.com/aws-powertools/powertools-lambda-python/issues/2943)) - **deps:** bump github.com/aws/aws-sdk-go-v2/service/lambda from 1.38.0 to 1.38.1 in /layer/scripts/layer-balancer ([#2876](https://github.com/aws-powertools/powertools-lambda-python/issues/2876)) - **deps:** bump actions/dependency-review-action from 3.0.6 to 3.0.7 ([#2941](https://github.com/aws-powertools/powertools-lambda-python/issues/2941)) - **deps:** bump github.com/aws/aws-sdk-go-v2/config from 1.18.29 to 1.18.30 in /layer/scripts/layer-balancer ([#2875](https://github.com/aws-powertools/powertools-lambda-python/issues/2875)) - **deps:** bump actions/dependency-review-action from 3.0.7 to 3.0.8 ([#2963](https://github.com/aws-powertools/powertools-lambda-python/issues/2963)) - **deps:** bump gitpython from 3.1.31 to 3.1.32 in /docs ([#2948](https://github.com/aws-powertools/powertools-lambda-python/issues/2948)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 2 updates ([#2904](https://github.com/aws-powertools/powertools-lambda-python/issues/2904)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#2933](https://github.com/aws-powertools/powertools-lambda-python/issues/2933)) - **deps:** bump pypa/gh-action-pypi-publish from 1.8.9 to 1.8.10 ([#2946](https://github.com/aws-powertools/powertools-lambda-python/issues/2946)) - **deps:** bump actions/setup-node from 3.7.0 to 3.8.0 ([#2957](https://github.com/aws-powertools/powertools-lambda-python/issues/2957)) - **deps:** bump slsa-framework/slsa-github-generator from 1.7.0 to 1.8.0 ([#2927](https://github.com/aws-powertools/powertools-lambda-python/issues/2927)) - **deps:** bump github.com/aws/aws-sdk-go-v2/config from 1.18.28 to 1.18.29 in /layer/scripts/layer-balancer ([#2844](https://github.com/aws-powertools/powertools-lambda-python/issues/2844)) - **deps:** bump github.com/aws/aws-sdk-go-v2/service/lambda from 1.37.1 to 1.38.0 in /layer/scripts/layer-balancer ([#2843](https://github.com/aws-powertools/powertools-lambda-python/issues/2843)) - **deps:** bump pydantic from 1.10.11 to 1.10.12 ([#2846](https://github.com/aws-powertools/powertools-lambda-python/issues/2846)) - **deps:** bump the layer-balancer group in /layer/scripts/layer-balancer with 3 updates ([#2971](https://github.com/aws-powertools/powertools-lambda-python/issues/2971)) - **deps:** bump actions/setup-node from 3.8.0 to 3.8.1 ([#2970](https://github.com/aws-powertools/powertools-lambda-python/issues/2970)) - **deps:** bump github.com/aws/aws-sdk-go-v2/config from 1.18.30 to 1.18.31 in /layer/scripts/layer-balancer ([#2889](https://github.com/aws-powertools/powertools-lambda-python/issues/2889)) - **deps:** bump github.com/aws/aws-sdk-go-v2/service/lambda from 1.38.1 to 1.39.0 in /layer/scripts/layer-balancer ([#2890](https://github.com/aws-powertools/powertools-lambda-python/issues/2890)) - **deps:** bump 
squidfunk/mkdocs-material from `33e28bd` to `cd3a522` in /docs ([#2859](https://github.com/aws-powertools/powertools-lambda-python/issues/2859)) - **deps-dev:** bump ruff from 0.0.283 to 0.0.284 ([#2940](https://github.com/aws-powertools/powertools-lambda-python/issues/2940)) - **deps-dev:** bump cfn-lint from 0.79.5 to 0.79.6 ([#2899](https://github.com/aws-powertools/powertools-lambda-python/issues/2899)) - **deps-dev:** bump the boto-typing group with 11 updates ([#2901](https://github.com/aws-powertools/powertools-lambda-python/issues/2901)) - **deps-dev:** bump ruff from 0.0.281 to 0.0.282 ([#2905](https://github.com/aws-powertools/powertools-lambda-python/issues/2905)) - **deps-dev:** bump aws-cdk from 2.88.0 to 2.89.0 ([#2887](https://github.com/aws-powertools/powertools-lambda-python/issues/2887)) - **deps-dev:** bump aws-cdk from 2.89.0 to 2.90.0 ([#2932](https://github.com/aws-powertools/powertools-lambda-python/issues/2932)) - **deps-dev:** bump mkdocs-material from 9.1.19 to 9.1.21 ([#2894](https://github.com/aws-powertools/powertools-lambda-python/issues/2894)) - **deps-dev:** bump the boto-typing group with 3 updates ([#2967](https://github.com/aws-powertools/powertools-lambda-python/issues/2967)) - **deps-dev:** bump radon from 5.1.0 to 6.0.1 ([#2964](https://github.com/aws-powertools/powertools-lambda-python/issues/2964)) - **deps-dev:** bump the boto-typing group with 4 updates ([#2928](https://github.com/aws-powertools/powertools-lambda-python/issues/2928)) - **deps-dev:** bump mypy-boto3-logs from 1.28.1 to 1.28.15 ([#2880](https://github.com/aws-powertools/powertools-lambda-python/issues/2880)) - **deps-dev:** bump mypy-boto3-appconfigdata from 1.28.0 to 1.28.15 ([#2879](https://github.com/aws-powertools/powertools-lambda-python/issues/2879)) - **deps-dev:** bump mypy-boto3-lambda from 1.28.11 to 1.28.15 ([#2878](https://github.com/aws-powertools/powertools-lambda-python/issues/2878)) - **deps-dev:** bump mypy-boto3-xray from 1.28.0 to 1.28.15 ([#2881](https://github.com/aws-powertools/powertools-lambda-python/issues/2881)) - **deps-dev:** bump ruff from 0.0.282 to 0.0.283 ([#2937](https://github.com/aws-powertools/powertools-lambda-python/issues/2937)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.28.0 to 1.28.11 ([#2847](https://github.com/aws-powertools/powertools-lambda-python/issues/2847)) - **deps-dev:** bump sentry-sdk from 1.28.1 to 1.29.0 ([#2900](https://github.com/aws-powertools/powertools-lambda-python/issues/2900)) - **deps-dev:** bump cfn-lint from 0.79.4 to 0.79.5 ([#2870](https://github.com/aws-powertools/powertools-lambda-python/issues/2870)) - **deps-dev:** bump the boto-typing group with 1 update ([#2944](https://github.com/aws-powertools/powertools-lambda-python/issues/2944)) - **deps-dev:** bump mypy-boto3-cloudformation from 1.28.10 to 1.28.12 ([#2864](https://github.com/aws-powertools/powertools-lambda-python/issues/2864)) - **deps-dev:** bump mypy-boto3-cloudwatch from 1.28.0 to 1.28.12 ([#2865](https://github.com/aws-powertools/powertools-lambda-python/issues/2865)) - **deps-dev:** bump cfn-lint from 0.79.3 to 0.79.4 ([#2862](https://github.com/aws-powertools/powertools-lambda-python/issues/2862)) - **deps-dev:** bump mypy-boto3-appconfig from 1.28.0 to 1.28.12 ([#2861](https://github.com/aws-powertools/powertools-lambda-python/issues/2861)) - **deps-dev:** bump mypy-boto3-ssm from 1.28.0 to 1.28.12 ([#2863](https://github.com/aws-powertools/powertools-lambda-python/issues/2863)) - **deps-dev:** bump aws-cdk from 2.90.0 to 2.91.0 
([#2947](https://github.com/aws-powertools/powertools-lambda-python/issues/2947)) - **deps-dev:** bump xenon from 0.9.0 to 0.9.1 ([#2955](https://github.com/aws-powertools/powertools-lambda-python/issues/2955)) - **deps-dev:** bump cfn-lint from 0.78.2 to 0.79.3 ([#2854](https://github.com/aws-powertools/powertools-lambda-python/issues/2854)) - **deps-dev:** bump mypy-boto3-lambda from 1.28.0 to 1.28.11 ([#2845](https://github.com/aws-powertools/powertools-lambda-python/issues/2845)) - **deps-dev:** bump cfn-lint from 0.79.6 to 0.79.7 ([#2956](https://github.com/aws-powertools/powertools-lambda-python/issues/2956)) - **deps-dev:** bump aws-cdk from 2.91.0 to 2.92.0 ([#2965](https://github.com/aws-powertools/powertools-lambda-python/issues/2965)) - **deps-dev:** bump ruff from 0.0.280 to 0.0.281 ([#2891](https://github.com/aws-powertools/powertools-lambda-python/issues/2891)) - **docs:** include the environment variables section in the utilities documentation ([#2925](https://github.com/aws-powertools/powertools-lambda-python/issues/2925)) - **docs:** disable line length rule using older syntax ([#2920](https://github.com/aws-powertools/powertools-lambda-python/issues/2920)) - **maintenance:** enables publishing docs and changelog, running e2e tests only in the main repository ([#2924](https://github.com/aws-powertools/powertools-lambda-python/issues/2924)) ## [v2.22.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.21.0...v2.22.0) - 2023-07-25 ## Bug Fixes - **parameters:** distinct cache key for single vs path with same name ([#2839](https://github.com/aws-powertools/powertools-lambda-python/issues/2839)) ## Documentation - **community:** new batch processing article ([#2828](https://github.com/aws-powertools/powertools-lambda-python/issues/2828)) - **parameters:** improve readability on error handling get_parameter… ([#2833](https://github.com/aws-powertools/powertools-lambda-python/issues/2833)) ## Features - **general:** add support for Python 3.11 ([#2820](https://github.com/aws-powertools/powertools-lambda-python/issues/2820)) ## Maintenance - version bump - **ci:** add baking time for layer build ([#2834](https://github.com/aws-powertools/powertools-lambda-python/issues/2834)) - **ci:** build changelog on a schedule only ([#2832](https://github.com/aws-powertools/powertools-lambda-python/issues/2832)) - **deps:** bump actions/setup-python from 4.6.1 to 4.7.0 ([#2821](https://github.com/aws-powertools/powertools-lambda-python/issues/2821)) - **deps-dev:** bump ruff from 0.0.278 to 0.0.279 ([#2822](https://github.com/aws-powertools/powertools-lambda-python/issues/2822)) - **deps-dev:** bump cfn-lint from 0.78.1 to 0.78.2 ([#2823](https://github.com/aws-powertools/powertools-lambda-python/issues/2823)) - **deps-dev:** bump ruff from 0.0.279 to 0.0.280 ([#2836](https://github.com/aws-powertools/powertools-lambda-python/issues/2836)) - **deps-dev:** bump mypy-boto3-cloudformation from 1.28.0 to 1.28.10 ([#2837](https://github.com/aws-powertools/powertools-lambda-python/issues/2837)) ## [v2.21.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.20.0...v2.21.0) - 2023-07-21 ## Bug Fixes - **docs:** remove redundant code ([#2796](https://github.com/aws-powertools/powertools-lambda-python/issues/2796)) ## Documentation - **customer-reference:** add Jit Security as a customer reference ([#2801](https://github.com/aws-powertools/powertools-lambda-python/issues/2801)) ## Features - **parser:** add support for Pydantic v2 
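([#2733](https://github.com/aws-powertools/powertools-lambda-python/issues/2733))

Since Pydantic v2 support is the headline change of v2.21.0, a minimal sketch of the parser utility with a typed model follows; `Order` and its fields are illustrative, and the intent of the change is that the same handler code works whether Pydantic v1 or v2 is installed (e.g. via `pip install "aws-lambda-powertools[parser]"`):

```
from aws_lambda_powertools.utilities.parser import BaseModel, event_parser
from aws_lambda_powertools.utilities.typing import LambdaContext


class Order(BaseModel):
    # placeholder model; any Pydantic v1 or v2 model definition works here
    order_id: int
    amount: float


@event_parser(model=Order)  # parses and validates the incoming event into Order
def lambda_handler(event: Order, context: LambdaContext) -> dict:
    return {"order_id": event.order_id, "received": True}
```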
## Maintenance - version bump - **deps:** bump squidfunk/mkdocs-material from `a28ed81` to `33e28bd` in /docs ([#2797](https://github.com/aws-powertools/powertools-lambda-python/issues/2797)) - **deps-dev:** bump mypy-boto3-s3 from 1.28.3.post2 to 1.28.8 ([#2808](https://github.com/aws-powertools/powertools-lambda-python/issues/2808)) - **deps-dev:** bump types-python-dateutil from 2.8.19.13 to 2.8.19.14 ([#2807](https://github.com/aws-powertools/powertools-lambda-python/issues/2807)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.28.3.post1 to 1.28.3.post2 ([#2794](https://github.com/aws-powertools/powertools-lambda-python/issues/2794)) - **deps-dev:** bump types-requests from 2.31.0.1 to 2.31.0.2 ([#2806](https://github.com/aws-powertools/powertools-lambda-python/issues/2806)) - **deps-dev:** bump mypy-boto3-s3 from 1.28.3.post1 to 1.28.3.post2 ([#2793](https://github.com/aws-powertools/powertools-lambda-python/issues/2793)) - **deps-dev:** bump aws-cdk from 2.87.0 to 2.88.0 ([#2812](https://github.com/aws-powertools/powertools-lambda-python/issues/2812)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.28.3 to 1.28.3.post1 ([#2785](https://github.com/aws-powertools/powertools-lambda-python/issues/2785)) - **deps-dev:** bump mypy-boto3-s3 from 1.28.3 to 1.28.3.post1 ([#2786](https://github.com/aws-powertools/powertools-lambda-python/issues/2786)) - **deps-dev:** bump mkdocs-material from 9.1.18 to 9.1.19 ([#2798](https://github.com/aws-powertools/powertools-lambda-python/issues/2798)) - **security:** improve debugging for provenance script ([#2784](https://github.com/aws-powertools/powertools-lambda-python/issues/2784)) ## [v2.20.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.19.0...v2.20.0) - 2023-07-14 ## Bug Fixes - **docs:** ensure alias is applied to versioned releases ([#2644](https://github.com/aws-powertools/powertools-lambda-python/issues/2644)) - **docs:** ensure version alias is in an array to prevent "you're not viewing the latest version" incorrect message ([#2629](https://github.com/aws-powertools/powertools-lambda-python/issues/2629)) - **logger:** ensure logs stream to stdout by default, not stderr ([#2736](https://github.com/aws-powertools/powertools-lambda-python/issues/2736)) ## Code Refactoring - **parser:** convert functional tests to unit tests ([#2656](https://github.com/aws-powertools/powertools-lambda-python/issues/2656)) ## Documentation - **batch:** fix custom batch processor example ([#2714](https://github.com/aws-powertools/powertools-lambda-python/issues/2714)) - **contributing:** add code integration journey graph ([#2685](https://github.com/aws-powertools/powertools-lambda-python/issues/2685)) - **maintainers:** add cicd pipeline diagram ([#2692](https://github.com/aws-powertools/powertools-lambda-python/issues/2692)) - **process:** explain our integration automated checks; revamp navigation ([#2764](https://github.com/aws-powertools/powertools-lambda-python/issues/2764)) ## Features - **metrics:** support to set default dimension in EphemeralMetrics ([#2748](https://github.com/aws-powertools/powertools-lambda-python/issues/2748)) ## Maintenance - version bump - **ci:** enforce pip --require-hashes to maybe satisfy scorecard ([#2679](https://github.com/aws-powertools/powertools-lambda-python/issues/2679)) - **ci:** prevent merging PRs that do not meet minimum requirements
([#2639](https://github.com/aws-powertools/powertools-lambda-python/issues/2639)) - **ci:** enforce top-level permission to minimum fail-safe permission as per openssf ([#2638](https://github.com/aws-powertools/powertools-lambda-python/issues/2638)) - **ci:** propagate checkout permission to nested workflows ([#2642](https://github.com/aws-powertools/powertools-lambda-python/issues/2642)) - **ci:** improves dependabot based on ossf scorecard recommendations ([#2647](https://github.com/aws-powertools/powertools-lambda-python/issues/2647)) - **ci:** use deps sha for docs and gitpod images based on ossf findings ([#2662](https://github.com/aws-powertools/powertools-lambda-python/issues/2662)) - **ci:** use sast on every commit on any supported language ([#2646](https://github.com/aws-powertools/powertools-lambda-python/issues/2646)) - **ci:** add gitleaks in pre-commit hooks as an extra safety measure ([#2677](https://github.com/aws-powertools/powertools-lambda-python/issues/2677)) - **ci:** address ossf scorecard findings on npm, pip, and top-level permission leftover ([#2694](https://github.com/aws-powertools/powertools-lambda-python/issues/2694)) - **ci:** prevent sast codeql to run in forks ([#2711](https://github.com/aws-powertools/powertools-lambda-python/issues/2711)) - **ci:** introduce provenance and attestation in release ([#2746](https://github.com/aws-powertools/powertools-lambda-python/issues/2746)) - **deps:** bump pypa/gh-action-pypi-publish from 1.8.7 to 1.8.8 ([#2754](https://github.com/aws-powertools/powertools-lambda-python/issues/2754)) - **deps:** bump github.com/aws/aws-sdk-go-v2/service/lambda from 1.37.0 to 1.37.1 in /layer/scripts/layer-balancer ([#2769](https://github.com/aws-powertools/powertools-lambda-python/issues/2769)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.1.3 to 2.1.4 ([#2738](https://github.com/aws-powertools/powertools-lambda-python/issues/2738)) - **deps:** bump github.com/aws/aws-sdk-go-v2/config from 1.18.27 to 1.18.28 in /layer/scripts/layer-balancer ([#2770](https://github.com/aws-powertools/powertools-lambda-python/issues/2770)) - **deps:** bump actions/setup-python from 4.6.1 to 4.7.0 ([#2768](https://github.com/aws-powertools/powertools-lambda-python/issues/2768)) - **deps:** bump github.com/aws/aws-sdk-go-v2 from 1.16.16 to 1.18.1 in /layer/scripts/layer-balancer ([#2654](https://github.com/aws-powertools/powertools-lambda-python/issues/2654)) - **deps:** bump golang.org/x/sync from 0.1.0 to 0.3.0 in /layer/scripts/layer-balancer ([#2649](https://github.com/aws-powertools/powertools-lambda-python/issues/2649)) - **deps:** bump github.com/aws/aws-sdk-go-v2/service/lambda from 1.24.6 to 1.37.0 in /layer/scripts/layer-balancer ([#2653](https://github.com/aws-powertools/powertools-lambda-python/issues/2653)) - **deps:** bump docker/setup-buildx-action from 2.8.0 to 2.9.0 ([#2718](https://github.com/aws-powertools/powertools-lambda-python/issues/2718)) - **deps:** bump github.com/aws/aws-sdk-go-v2/config from 1.17.8 to 1.18.27 in /layer/scripts/layer-balancer ([#2651](https://github.com/aws-powertools/powertools-lambda-python/issues/2651)) - **deps:** bump github.com/aws/aws-sdk-go-v2 from 1.18.1 to 1.19.0 in /layer/scripts/layer-balancer ([#2771](https://github.com/aws-powertools/powertools-lambda-python/issues/2771)) - **deps:** migrate from retry to retry2 to address CVE-2022-42969 ([#2665](https://github.com/aws-powertools/powertools-lambda-python/issues/2665)) - **deps:** bump pydantic from 1.10.9 to 1.10.10 
([#2624](https://github.com/aws-powertools/powertools-lambda-python/issues/2624)) - **deps:** bump squidfunk/mkdocs-material from `3837c0f` to `a28ed81` in /docs ([#2669](https://github.com/aws-powertools/powertools-lambda-python/issues/2669)) - **deps:** bump pydantic from 1.10.10 to 1.10.11 ([#2671](https://github.com/aws-powertools/powertools-lambda-python/issues/2671)) - **deps:** bump docker/setup-buildx-action from 2.9.0 to 2.9.1 ([#2755](https://github.com/aws-powertools/powertools-lambda-python/issues/2755)) - **deps:** bump actions/dependency-review-action from 2.5.1 to 3.0.6 ([#2650](https://github.com/aws-powertools/powertools-lambda-python/issues/2650)) - **deps:** bump actions/setup-node from 3.6.0 to 3.7.0 ([#2689](https://github.com/aws-powertools/powertools-lambda-python/issues/2689)) - **deps-dev:** bump mypy-boto3-lambda from 1.27.0 to 1.28.0 ([#2698](https://github.com/aws-powertools/powertools-lambda-python/issues/2698)) - **deps-dev:** bump sentry-sdk from 1.27.0 to 1.27.1 ([#2701](https://github.com/aws-powertools/powertools-lambda-python/issues/2701)) - **deps-dev:** bump mypy-boto3-cloudformation from 1.27.0 to 1.28.0 ([#2700](https://github.com/aws-powertools/powertools-lambda-python/issues/2700)) - **deps-dev:** bump mypy-boto3-appconfigdata from 1.27.0 to 1.28.0 ([#2699](https://github.com/aws-powertools/powertools-lambda-python/issues/2699)) - **deps-dev:** bump ruff from 0.0.276 to 0.0.277 ([#2682](https://github.com/aws-powertools/powertools-lambda-python/issues/2682)) - **deps-dev:** bump pytest-asyncio from 0.21.0 to 0.21.1 ([#2756](https://github.com/aws-powertools/powertools-lambda-python/issues/2756)) - **deps-dev:** bump cfn-lint from 0.77.10 to 0.78.1 ([#2757](https://github.com/aws-powertools/powertools-lambda-python/issues/2757)) - **deps-dev:** bump aws-cdk from 2.86.0 to 2.87.0 ([#2696](https://github.com/aws-powertools/powertools-lambda-python/issues/2696)) - **deps-dev:** bump typed-ast from 1.5.4 to 1.5.5 ([#2670](https://github.com/aws-powertools/powertools-lambda-python/issues/2670)) - **deps-dev:** bump mypy-boto3-cloudwatch from 1.27.0 to 1.28.0 ([#2697](https://github.com/aws-powertools/powertools-lambda-python/issues/2697)) - **deps-dev:** bump ruff from 0.0.275 to 0.0.276 ([#2655](https://github.com/aws-powertools/powertools-lambda-python/issues/2655)) - **deps-dev:** bump sentry-sdk from 1.26.0 to 1.27.0 ([#2652](https://github.com/aws-powertools/powertools-lambda-python/issues/2652)) - **deps-dev:** bump ruff from 0.0.277 to 0.0.278 ([#2758](https://github.com/aws-powertools/powertools-lambda-python/issues/2758)) - **deps-dev:** bump mypy-boto3-s3 from 1.28.0 to 1.28.3 ([#2774](https://github.com/aws-powertools/powertools-lambda-python/issues/2774)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.26.158 to 1.26.164 ([#2622](https://github.com/aws-powertools/powertools-lambda-python/issues/2622)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.28.0 to 1.28.3 ([#2773](https://github.com/aws-powertools/powertools-lambda-python/issues/2773)) - **deps-dev:** bump sentry-sdk from 1.27.1 to 1.28.0 ([#2741](https://github.com/aws-powertools/powertools-lambda-python/issues/2741)) - **deps-dev:** bump mypy-boto3-s3 from 1.27.0 to 1.28.0 ([#2721](https://github.com/aws-powertools/powertools-lambda-python/issues/2721)) - **deps-dev:** bump mypy-boto3-appconfig from 1.27.0 to 1.28.0 ([#2722](https://github.com/aws-powertools/powertools-lambda-python/issues/2722)) - **deps-dev:** bump mypy-boto3-logs from 1.27.0 to 1.28.1 
- **deps-dev:** bump mypy-boto3-ssm from 1.27.0 to 1.28.0 ([#2724](https://github.com/aws-powertools/powertools-lambda-python/issues/2724))
- **deps-dev:** bump mypy-boto3-xray from 1.27.0 to 1.28.0 ([#2720](https://github.com/aws-powertools/powertools-lambda-python/issues/2720))
- **deps-dev:** bump sentry-sdk from 1.28.0 to 1.28.1 ([#2772](https://github.com/aws-powertools/powertools-lambda-python/issues/2772))
- **deps-dev:** bump mypy-boto3-dynamodb from 1.27.0 to 1.28.0 ([#2740](https://github.com/aws-powertools/powertools-lambda-python/issues/2740))
- **deps-dev:** bump mypy-boto3-appconfigdata from 1.26.70 to 1.27.0 ([#2636](https://github.com/aws-powertools/powertools-lambda-python/issues/2636))
- **deps-dev:** bump mypy-boto3-secretsmanager from 1.27.0 to 1.28.0 ([#2739](https://github.com/aws-powertools/powertools-lambda-python/issues/2739))
- **governance:** update active maintainers list ([#2715](https://github.com/aws-powertools/powertools-lambda-python/issues/2715))
- **streaming:** replace deprecated Version classes from distutils ([#2752](https://github.com/aws-powertools/powertools-lambda-python/issues/2752))
- **user-agent:** support patching botocore session ([#2614](https://github.com/aws-powertools/powertools-lambda-python/issues/2614))

## [v2.19.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.18.0...v2.19.0) - 2023-06-30

## Bug Fixes

- **e2e:** fix idempotency tests ([#2576](https://github.com/aws-powertools/powertools-lambda-python/issues/2576))

## Code Refactoring

- **event_source:** convert functional tests to unit tests ([#2506](https://github.com/aws-powertools/powertools-lambda-python/issues/2506))

## Documentation

- **i-made-this:** added new article on idempotency ([#2582](https://github.com/aws-powertools/powertools-lambda-python/issues/2582))
- **i-made-this:** article on idempotency w/ CDK and Powertools ([#2569](https://github.com/aws-powertools/powertools-lambda-python/issues/2569))
- **idempotency:** split snippets, improve wording and lint examples ([#2492](https://github.com/aws-powertools/powertools-lambda-python/issues/2492))

## Features

- **event_handler:** add VPCLatticeResolver ([#2601](https://github.com/aws-powertools/powertools-lambda-python/issues/2601))
- **event_source:** decode nested messages on SQS events ([#2349](https://github.com/aws-powertools/powertools-lambda-python/issues/2349))
- **parser:** add support to VpcLatticeModel ([#2584](https://github.com/aws-powertools/powertools-lambda-python/issues/2584))

## Maintenance

- version bump
- **analytics:** update docs base origin url ([#2560](https://github.com/aws-powertools/powertools-lambda-python/issues/2560))
- **ci:** replace flake8 with Ruff as a linter ([#2495](https://github.com/aws-powertools/powertools-lambda-python/issues/2495))
- **ci:** enable Ruff rule E501 and fix errors ([#2587](https://github.com/aws-powertools/powertools-lambda-python/issues/2587))
- **ci:** enable Ruff rule COM812 and fix the errors ([#2595](https://github.com/aws-powertools/powertools-lambda-python/issues/2595))
- **ci:** enable Ruff rules PLW, PLR, PLC and PLE and fix the errors ([#2593](https://github.com/aws-powertools/powertools-lambda-python/issues/2593))
- **ci:** enable Ruff autofix rules ([#2599](https://github.com/aws-powertools/powertools-lambda-python/issues/2599))
- **ci:** enable Ruff rules ISC, I001, B018 and fix the errors ([#2597](https://github.com/aws-powertools/powertools-lambda-python/issues/2597))
- **ci:** enable Ruff rule ERA001 and fix errors ([#2591](https://github.com/aws-powertools/powertools-lambda-python/issues/2591))
- **deps:** bump pypa/gh-action-pypi-publish from 1.8.6 to 1.8.7 ([#2573](https://github.com/aws-powertools/powertools-lambda-python/issues/2573))
- **deps:** bump docker/setup-buildx-action from 2.7.0 to 2.8.0 ([#2604](https://github.com/aws-powertools/powertools-lambda-python/issues/2604))
- **deps:** bump ossf/scorecard-action from 2.1.3 to 2.2.0 ([#2563](https://github.com/aws-powertools/powertools-lambda-python/issues/2563))
- **deps:** bump release-drafter/release-drafter from 5.23.0 to 5.24.0 ([#2603](https://github.com/aws-powertools/powertools-lambda-python/issues/2603))
- **deps-dev:** bump mypy-boto3-s3 from 1.26.155 to 1.26.163 ([#2608](https://github.com/aws-powertools/powertools-lambda-python/issues/2608))
- **deps-dev:** bump mypy-boto3-lambda from 1.26.157 to 1.26.163 ([#2607](https://github.com/aws-powertools/powertools-lambda-python/issues/2607))
- **deps-dev:** bump mkdocs-material from 9.1.16 to 9.1.17 ([#2564](https://github.com/aws-powertools/powertools-lambda-python/issues/2564))
- **deps-dev:** bump ruff from 0.0.272 to 0.0.275 ([#2586](https://github.com/aws-powertools/powertools-lambda-python/issues/2586))
- **deps-dev:** bump mypy-boto3-ssm from 1.26.97 to 1.26.162 ([#2606](https://github.com/aws-powertools/powertools-lambda-python/issues/2606))
- **deps-dev:** bump pytest from 7.3.2 to 7.4.0 ([#2557](https://github.com/aws-powertools/powertools-lambda-python/issues/2557))
- **deps-dev:** bump aws-cdk from 2.85.0 to 2.86.0 ([#2613](https://github.com/aws-powertools/powertools-lambda-python/issues/2613))
- **deps-dev:** bump mypy from 1.4.0 to 1.4.1 ([#2574](https://github.com/aws-powertools/powertools-lambda-python/issues/2574))

## [v2.18.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.17.0...v2.18.0) - 2023-06-23

## Bug Fixes

- **docs:** ensure versions.json is updated ([#2505](https://github.com/aws-powertools/powertools-lambda-python/issues/2505))
- **event_source:** centralizing helper functions for query, header and base64 ([#2496](https://github.com/aws-powertools/powertools-lambda-python/issues/2496))

## Documentation

- **homepage:** fix .NET repository link ([#2549](https://github.com/aws-powertools/powertools-lambda-python/issues/2549))
- **homepage:** add Open Source Security Foundation badge; update links to new url ([#2545](https://github.com/aws-powertools/powertools-lambda-python/issues/2545))
- **navigation:** make Key Feature the first section ([#2517](https://github.com/aws-powertools/powertools-lambda-python/issues/2517))

## Features

- **event_handler:** support to enable or disable compression in custom responses ([#2544](https://github.com/aws-powertools/powertools-lambda-python/issues/2544))
- **feature_flags:** add modulo range condition for segmented experimentation support ([#2331](https://github.com/aws-powertools/powertools-lambda-python/issues/2331))

## Maintenance

- version bump
- **ci:** fix changelog build permissions ([#2519](https://github.com/aws-powertools/powertools-lambda-python/issues/2519))
- **ci:** remove GH pages action ([#2501](https://github.com/aws-powertools/powertools-lambda-python/issues/2501))
- **ci:** updates runner names in workflows ([#2510](https://github.com/aws-powertools/powertools-lambda-python/issues/2510))
- **ci:** introduces OSSF Scorecard ([#2512](https://github.com/aws-powertools/powertools-lambda-python/issues/2512))
- **ci:** fix codeowners team name ([#2516](https://github.com/aws-powertools/powertools-lambda-python/issues/2516))
- **deps:** bump actions/upload-artifact from 3.1.0 to 3.1.2 ([#2522](https://github.com/aws-powertools/powertools-lambda-python/issues/2522))
- **deps:** bump actions/checkout from 3.1.0 to 3.5.3 ([#2523](https://github.com/aws-powertools/powertools-lambda-python/issues/2523))
- **deps-dev:** bump mypy-boto3-s3 from 1.26.153 to 1.26.155 ([#2498](https://github.com/aws-powertools/powertools-lambda-python/issues/2498))
- **deps-dev:** bump aws-cdk from 2.84.0 to 2.85.0 ([#2524](https://github.com/aws-powertools/powertools-lambda-python/issues/2524))
- **deps-dev:** bump mypy-boto3-lambda from 1.26.147 to 1.26.157 ([#2507](https://github.com/aws-powertools/powertools-lambda-python/issues/2507))
- **deps-dev:** bump cfn-lint from 0.77.9 to 0.77.10 ([#2508](https://github.com/aws-powertools/powertools-lambda-python/issues/2508))
- **deps-dev:** bump mypy-boto3-cloudformation from 1.26.149 to 1.26.156 ([#2503](https://github.com/aws-powertools/powertools-lambda-python/issues/2503))
- **deps-dev:** bump sentry-sdk from 1.25.1 to 1.26.0 ([#2527](https://github.com/aws-powertools/powertools-lambda-python/issues/2527))
- **deps-dev:** bump hvac from 1.1.0 to 1.1.1 ([#2497](https://github.com/aws-powertools/powertools-lambda-python/issues/2497))
- **deps-dev:** bump flake8-variables-names from 0.0.5 to 0.0.6 ([#2525](https://github.com/aws-powertools/powertools-lambda-python/issues/2525))
- **deps-dev:** bump ijson from 3.2.1 to 3.2.2 ([#2526](https://github.com/aws-powertools/powertools-lambda-python/issues/2526))
- **deps-dev:** bump pytest-mock from 3.10.0 to 3.11.1 ([#2485](https://github.com/aws-powertools/powertools-lambda-python/issues/2485))
- **deps-dev:** bump mypy-boto3-dynamodb from 1.26.152 to 1.26.158 ([#2528](https://github.com/aws-powertools/powertools-lambda-python/issues/2528))
- **deps-dev:** bump mypy from 1.3.0 to 1.4.0 ([#2509](https://github.com/aws-powertools/powertools-lambda-python/issues/2509))
- **documentation:** updating repository URL and name to the new location ([#2499](https://github.com/aws-powertools/powertools-lambda-python/issues/2499))

## [v2.17.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.16.2...v2.17.0) - 2023-06-16

## Bug Fixes

- **event_handler:** prioritize static over dynamic route to prevent order of route registration mismatch ([#2458](https://github.com/aws-powertools/powertools-lambda-python/issues/2458))
- **idempotency:** treat missing idempotency key as non-idempotent transaction (no-op) when raise_on_no_idempotency_key is False ([#2477](https://github.com/aws-powertools/powertools-lambda-python/issues/2477))

## Documentation

- **event_handler:** improve compress example using Response class ([#2426](https://github.com/aws-powertools/powertools-lambda-python/issues/2426))
- **event_sources:** fix DynamoDB stream event docstring ([#2468](https://github.com/aws-powertools/powertools-lambda-python/issues/2468))
- **idempotency:** new sequence flow when idempotency key is optional ([#2480](https://github.com/aws-powertools/powertools-lambda-python/issues/2480))
- **idempotency:** add CDK example ([#2434](https://github.com/aws-powertools/powertools-lambda-python/issues/2434))
- **maintainers:** visual representation of release process ([#2399](https://github.com/aws-powertools/powertools-lambda-python/issues/2399))
- **navigation:** standardize link targets to enhance customer experience ([#2420](https://github.com/aws-powertools/powertools-lambda-python/issues/2420))
- **we-made-this:** new article about idempotency design ([#2425](https://github.com/aws-powertools/powertools-lambda-python/issues/2425))

## Features

- **event_sources:** add AWS Config Rule event data class ([#2175](https://github.com/aws-powertools/powertools-lambda-python/issues/2175))
- **event_sources:** add support for VPC Lattice events ([#2358](https://github.com/aws-powertools/powertools-lambda-python/issues/2358))
- **logger:** type log record in LambdaPowertoolsFormatter with TypedDict ([#2419](https://github.com/aws-powertools/powertools-lambda-python/issues/2419))
- **parser:** support for CloudFormation Custom Resources ([#2335](https://github.com/aws-powertools/powertools-lambda-python/issues/2335))

## Maintenance

- version bump
- **ci:** document all github action workflows and enforce least-privilege ([#2395](https://github.com/aws-powertools/powertools-lambda-python/issues/2395))
- **ci:** fix PR labeling permission scope ([#2396](https://github.com/aws-powertools/powertools-lambda-python/issues/2396))
- **deps:** bump aws-actions/configure-aws-credentials from 2.1.0 to 2.2.0 ([#2469](https://github.com/aws-powertools/powertools-lambda-python/issues/2469))
- **deps:** bump docker/setup-buildx-action from 2.5.0 to 2.6.0 ([#2403](https://github.com/aws-powertools/powertools-lambda-python/issues/2403))
- **deps:** bump docker/setup-qemu-action from 2.1.0 to 2.2.0 ([#2404](https://github.com/aws-powertools/powertools-lambda-python/issues/2404))
- **deps:** bump docker/setup-buildx-action from 2.6.0 to 2.7.0 ([#2450](https://github.com/aws-powertools/powertools-lambda-python/issues/2450))
- **deps:** bump pydantic from 1.10.8 to 1.10.9 ([#2405](https://github.com/aws-powertools/powertools-lambda-python/issues/2405))
- **deps:** bump actions/checkout from 3.5.2 to 3.5.3 ([#2431](https://github.com/aws-powertools/powertools-lambda-python/issues/2431))
- **deps-dev:** bump ijson from 3.2.0.post0 to 3.2.1 ([#2441](https://github.com/aws-powertools/powertools-lambda-python/issues/2441))
- **deps-dev:** bump mypy-boto3-dynamodb from 1.26.115 to 1.26.152 ([#2444](https://github.com/aws-powertools/powertools-lambda-python/issues/2444))
- **deps-dev:** bump filelock from 3.12.0 to 3.12.2 ([#2446](https://github.com/aws-powertools/powertools-lambda-python/issues/2446))
- **deps-dev:** bump aws-cdk from 2.83.0 to 2.83.1 ([#2432](https://github.com/aws-powertools/powertools-lambda-python/issues/2432))
- **deps-dev:** bump cfn-lint from 0.77.6 to 0.77.7 ([#2414](https://github.com/aws-powertools/powertools-lambda-python/issues/2414))
- **deps-dev:** bump pytest from 7.3.1 to 7.3.2 ([#2443](https://github.com/aws-powertools/powertools-lambda-python/issues/2443))
- **deps-dev:** bump sentry-sdk from 1.25.0 to 1.25.1 ([#2408](https://github.com/aws-powertools/powertools-lambda-python/issues/2408))
- **deps-dev:** bump mypy-boto3-cloudformation from 1.26.147 to 1.26.149 ([#2410](https://github.com/aws-powertools/powertools-lambda-python/issues/2410))
- **deps-dev:** bump aws-cdk from 2.82.0 to 2.83.0 ([#2406](https://github.com/aws-powertools/powertools-lambda-python/issues/2406))
- **deps-dev:** bump mypy-boto3-logs from 1.26.53 to 1.26.149 ([#2409](https://github.com/aws-powertools/powertools-lambda-python/issues/2409))
- **deps-dev:** bump cfn-lint from 0.77.7 to 0.77.8 ([#2451](https://github.com/aws-powertools/powertools-lambda-python/issues/2451))
- **deps-dev:** bump mypy-boto3-s3 from 1.26.127 to 1.26.153 ([#2452](https://github.com/aws-powertools/powertools-lambda-python/issues/2452))
- **deps-dev:** bump cfn-lint from 0.77.8 to 0.77.9 ([#2472](https://github.com/aws-powertools/powertools-lambda-python/issues/2472))
- **deps-dev:** bump flake8-comprehensions from 3.12.0 to 3.13.0 ([#2471](https://github.com/aws-powertools/powertools-lambda-python/issues/2471))
- **deps-dev:** bump mkdocs-material from 9.1.15 to 9.1.16 ([#2470](https://github.com/aws-powertools/powertools-lambda-python/issues/2470))
- **deps-dev:** bump aws-cdk from 2.83.1 to 2.84.0 ([#2460](https://github.com/aws-powertools/powertools-lambda-python/issues/2460))

## [v2.16.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.16.1...v2.16.2) - 2023-06-06

## Bug Fixes

- **parameters:** AppConfigProvider when retrieving multiple unique configuration names ([#2378](https://github.com/aws-powertools/powertools-lambda-python/issues/2378))
- **shared:** move to static version bumping to prevent issues with customers custom builds ([#2386](https://github.com/aws-powertools/powertools-lambda-python/issues/2386))

## Maintenance

- version bump
- **deps-dev:** bump mypy-boto3-cloudformation from 1.26.108 to 1.26.147 ([#2383](https://github.com/aws-powertools/powertools-lambda-python/issues/2383))
- **deps-dev:** bump mypy-boto3-lambda from 1.26.122 to 1.26.147 ([#2382](https://github.com/aws-powertools/powertools-lambda-python/issues/2382))
- **deps-dev:** bump sentry-sdk from 1.24.0 to 1.25.0 ([#2374](https://github.com/aws-powertools/powertools-lambda-python/issues/2374))
- **deps-dev:** bump aws-cdk from 2.81.0 to 2.82.0 ([#2373](https://github.com/aws-powertools/powertools-lambda-python/issues/2373))
- **typing:** add setLevel and addHandler to Logger for mypy/pyright ([#2388](https://github.com/aws-powertools/powertools-lambda-python/issues/2388))

## [v2.16.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.16.0...v2.16.1) - 2023-06-02

## Bug Fixes

- **shared:** skip user agent on much older botocore versions ([#2366](https://github.com/aws-powertools/powertools-lambda-python/issues/2366))

## Maintenance

- version bump

## [v2.16.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.15.0...v2.16.0) - 2023-06-02

## Bug Fixes

- **docs:** use concrete secrets from settings ([#2322](https://github.com/aws-powertools/powertools-lambda-python/issues/2322))
- **event_source:** change the import location of boto3 in CodePipelineJobEvent data class ([#2353](https://github.com/aws-powertools/powertools-lambda-python/issues/2353))
- **logger:** add setLevel function to set level programmatically ([#2320](https://github.com/aws-powertools/powertools-lambda-python/issues/2320))

## Code Refactoring

- **logger:** remove subclassing and move unnecessary APIs ([#2334](https://github.com/aws-powertools/powertools-lambda-python/issues/2334))

## Documentation

- **batch:** add encryption at rest for SQS ([#2290](https://github.com/aws-powertools/powertools-lambda-python/issues/2290))
- **batch_processing:** snippets split, improved, and lint ([#2231](https://github.com/aws-powertools/powertools-lambda-python/issues/2231))
- **feature_flags:** snippets split, improved, and lint ([#2222](https://github.com/aws-powertools/powertools-lambda-python/issues/2222))
- **project:** rename project to Powertools for AWS Lambda (Python) ([#2313](https://github.com/aws-powertools/powertools-lambda-python/issues/2313))

## Features

- **docs:** Move docs to S3 ([#2277](https://github.com/aws-powertools/powertools-lambda-python/issues/2277))
- **event_source:** allow multiple CORS origins ([#2279](https://github.com/aws-powertools/powertools-lambda-python/issues/2279))
- **parser:** add support for parsing SQS events wrapped in Kinesis Firehose ([#2294](https://github.com/aws-powertools/powertools-lambda-python/issues/2294))
- **user-agent:** add custom header User-Agent to AWS SDK requests ([#2267](https://github.com/aws-powertools/powertools-lambda-python/issues/2267))

## Maintenance

- version bump
- **ci:** remove auto-merge workflow ([#2214](https://github.com/aws-powertools/powertools-lambda-python/issues/2214))
- **ci:** schedule changelog to rebuild daily at 8am, and on release only ([#2216](https://github.com/aws-powertools/powertools-lambda-python/issues/2216))
- **ci:** create pull request on changelog update ([#2224](https://github.com/aws-powertools/powertools-lambda-python/issues/2224))
- **ci:** skip analytics on forks ([#2225](https://github.com/aws-powertools/powertools-lambda-python/issues/2225))
- **ci:** enforce zero trust for third party workflows ([#2215](https://github.com/aws-powertools/powertools-lambda-python/issues/2215))
- **ci:** convert create-pr steps into composite action ([#2238](https://github.com/aws-powertools/powertools-lambda-python/issues/2238))
- **ci:** bump package version after release via pull request ([#2239](https://github.com/aws-powertools/powertools-lambda-python/issues/2239))
- **ci:** update layer ARN docs and create PR during release ([#2240](https://github.com/aws-powertools/powertools-lambda-python/issues/2240))
- **ci:** fail create-pr when branch cannot be created or behind tip
- **ci:** filter out bot commits from CHANGELOG
- **ci:** add more permissions to analytics
- **ci:** source code tampering protection for release ([#2301](https://github.com/aws-powertools/powertools-lambda-python/issues/2301))
- **deps:** bump fastjsonschema from 2.16.3 to 2.17.1 ([#2307](https://github.com/aws-powertools/powertools-lambda-python/issues/2307))
- **deps:** bump aws-actions/configure-aws-credentials from 2.0.0 to 2.1.0 ([#2350](https://github.com/aws-powertools/powertools-lambda-python/issues/2350))
- **deps:** bump typing-extensions from 4.5.0 to 4.6.2 ([#2345](https://github.com/aws-powertools/powertools-lambda-python/issues/2345))
- **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.1.2 to 2.1.3 ([#2227](https://github.com/aws-powertools/powertools-lambda-python/issues/2227))
- **deps:** bump actions/setup-python from 4.6.0 to 4.6.1 ([#2325](https://github.com/aws-powertools/powertools-lambda-python/issues/2325))
- **deps:** update mkdocs configuration to support pymdown-extensions 10.0 ([#2271](https://github.com/aws-powertools/powertools-lambda-python/issues/2271))
- **deps:** bump pymdown-extensions from 9.11 to 10.0 ([#2262](https://github.com/aws-powertools/powertools-lambda-python/issues/2262))
- **deps:** bump pydantic from 1.10.7 to 1.10.8 ([#2316](https://github.com/aws-powertools/powertools-lambda-python/issues/2316))
- **deps:** bump codecov/codecov-action from 3.1.3 to 3.1.4 ([#2263](https://github.com/aws-powertools/powertools-lambda-python/issues/2263))
- **deps:** bump requests from 2.28.2 to 2.31.0 ([#2308](https://github.com/aws-powertools/powertools-lambda-python/issues/2308))
- **deps-dev:** bump mypy-boto3-secretsmanager from 1.26.116 to 1.26.135 ([#2282](https://github.com/aws-powertools/powertools-lambda-python/issues/2282))
- **deps-dev:** bump pytest-xdist from 3.2.1 to 3.3.0 ([#2251](https://github.com/aws-powertools/powertools-lambda-python/issues/2251))
- **deps-dev:** bump aws-cdk from 2.79.0 to 2.79.1 ([#2252](https://github.com/aws-powertools/powertools-lambda-python/issues/2252))
- **deps-dev:** bump mkdocs-material from 9.1.11 to 9.1.12 ([#2253](https://github.com/aws-powertools/powertools-lambda-python/issues/2253))
- **deps-dev:** bump aws-cdk from 2.79.1 to 2.80.0 ([#2305](https://github.com/aws-powertools/powertools-lambda-python/issues/2305))
- **deps-dev:** bump mkdocs-material from 9.1.13 to 9.1.14 ([#2304](https://github.com/aws-powertools/powertools-lambda-python/issues/2304))
- **deps-dev:** bump mkdocs-material from 9.1.12 to 9.1.13 ([#2280](https://github.com/aws-powertools/powertools-lambda-python/issues/2280))
- **deps-dev:** bump aws-cdk from 2.80.0 to 2.81.0 ([#2332](https://github.com/aws-powertools/powertools-lambda-python/issues/2332))
- **deps-dev:** bump sentry-sdk from 1.22.2 to 1.23.0 ([#2264](https://github.com/aws-powertools/powertools-lambda-python/issues/2264))
- **deps-dev:** bump sentry-sdk from 1.23.1 to 1.24.0 ([#2314](https://github.com/aws-powertools/powertools-lambda-python/issues/2314))
- **deps-dev:** bump types-requests from 2.30.0.0 to 2.31.0.0 ([#2315](https://github.com/aws-powertools/powertools-lambda-python/issues/2315))
- **deps-dev:** bump httpx from 0.24.0 to 0.24.1 ([#2298](https://github.com/aws-powertools/powertools-lambda-python/issues/2298))
- **deps-dev:** bump aws-cdk from 2.78.0 to 2.79.0 ([#2235](https://github.com/aws-powertools/powertools-lambda-python/issues/2235))
- **deps-dev:** bump mypy from 1.2.0 to 1.3.0 ([#2233](https://github.com/aws-powertools/powertools-lambda-python/issues/2233))
- **deps-dev:** bump pytest-cov from 4.0.0 to 4.1.0 ([#2327](https://github.com/aws-powertools/powertools-lambda-python/issues/2327))
- **deps-dev:** bump types-python-dateutil from 2.8.19.12 to 2.8.19.13 ([#2234](https://github.com/aws-powertools/powertools-lambda-python/issues/2234))
- **deps-dev:** bump coverage from 7.2.5 to 7.2.6 ([#2326](https://github.com/aws-powertools/powertools-lambda-python/issues/2326))
- **deps-dev:** bump mkdocs-material from 9.1.14 to 9.1.15 ([#2337](https://github.com/aws-powertools/powertools-lambda-python/issues/2337))
- **deps-dev:** bump mkdocs-material from 9.1.9 to 9.1.11 ([#2229](https://github.com/aws-powertools/powertools-lambda-python/issues/2229))
- **deps-dev:** bump cfn-lint from 0.77.4 to 0.77.5 ([#2228](https://github.com/aws-powertools/powertools-lambda-python/issues/2228))
- **deps-dev:** bump cfn-lint from 0.77.5 to 0.77.6 ([#2360](https://github.com/aws-powertools/powertools-lambda-python/issues/2360))
- **deps-dev:** bump coverage from 7.2.6 to 7.2.7 ([#2338](https://github.com/aws-powertools/powertools-lambda-python/issues/2338))
- **deps-dev:** bump types-requests from 2.31.0.0 to 2.31.0.1 ([#2339](https://github.com/aws-powertools/powertools-lambda-python/issues/2339))
- **deps-dev:** bump mypy-boto3-cloudwatch from 1.26.99 to 1.26.127 ([#2219](https://github.com/aws-powertools/powertools-lambda-python/issues/2219))
- **deps-dev:** bump types-requests from 2.29.0.0 to 2.30.0.0 ([#2220](https://github.com/aws-powertools/powertools-lambda-python/issues/2220))
- **deps-dev:** bump mypy-boto3-s3 from 1.26.116 to 1.26.127 ([#2218](https://github.com/aws-powertools/powertools-lambda-python/issues/2218))
- **deps-dev:** bump pytest-xdist from 3.3.0 to 3.3.1 ([#2297](https://github.com/aws-powertools/powertools-lambda-python/issues/2297))
- **deps-dev:** bump sentry-sdk from 1.23.0 to 1.23.1 ([#2283](https://github.com/aws-powertools/powertools-lambda-python/issues/2283))
- **deps-dev:** bump aws-cdk from 2.77.0 to 2.78.0 ([#2202](https://github.com/aws-powertools/powertools-lambda-python/issues/2202))
- **governance:** Fix python version in issue templates ([#2275](https://github.com/aws-powertools/powertools-lambda-python/issues/2275))

## [v2.15.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.14.1...v2.15.0) - 2023-05-04

## Bug Fixes

- typo
- **ci:** pypi publishing was targeting test endpoint

## Documentation

- **batch:** fixed typo in DynamoDB Streams section ([#2189](https://github.com/aws-powertools/powertools-lambda-python/issues/2189))
- **examples:** standardize lambda handler function name ([#2192](https://github.com/aws-powertools/powertools-lambda-python/issues/2192))
- **homepage:** add customer references section ([#2159](https://github.com/aws-powertools/powertools-lambda-python/issues/2159))
- **jmespath:** fix MD037/no-space-in-emphasis
- **tutorial:** use newer sam cli template; update to py3.10 ([#2167](https://github.com/aws-powertools/powertools-lambda-python/issues/2167))
- **we-made-this:** add serverless transactional message app ([#2182](https://github.com/aws-powertools/powertools-lambda-python/issues/2182))

## Features

- **ci:** dispatch GitHub analytics action ([#2161](https://github.com/aws-powertools/powertools-lambda-python/issues/2161))
- **event_source:** support custom json_deserializer; add json_body in SQSEvent ([#2200](https://github.com/aws-powertools/powertools-lambda-python/issues/2200))
- **event_source:** add support for dynamic partitions in the Api Gateway Authorizer event ([#2176](https://github.com/aws-powertools/powertools-lambda-python/issues/2176))
- **event_sources:** Add `__str__` to Data Classes base DictWrapper ([#2129](https://github.com/aws-powertools/powertools-lambda-python/issues/2129))
- **jmespath:** new built-in envelopes to unwrap S3 events ([#2169](https://github.com/aws-powertools/powertools-lambda-python/issues/2169))
- **logger:** add DatadogLogFormatter and observability provider ([#2183](https://github.com/aws-powertools/powertools-lambda-python/issues/2183))
- **metrics:** add flush_metrics() method to allow manual flushing of metrics ([#2171](https://github.com/aws-powertools/powertools-lambda-python/issues/2171))
- **parser:** add support for SQS-wrapped S3 event notifications ([#2108](https://github.com/aws-powertools/powertools-lambda-python/issues/2108))

## Maintenance

- update v2 layer ARN on documentation
- add dummy reusable dispatch analytics job
- **ci:** remove build step from release env; no more secrets needed
- **ci:** use new pypi trusted publisher for increased security ([#2198](https://github.com/aws-powertools/powertools-lambda-python/issues/2198))
- **deps:** bump pypa/gh-action-pypi-publish from 1.8.5 to 1.8.6 ([#2201](https://github.com/aws-powertools/powertools-lambda-python/issues/2201))
- **deps-dev:** bump cfn-lint from 0.77.3 to 0.77.4 ([#2178](https://github.com/aws-powertools/powertools-lambda-python/issues/2178))
- **deps-dev:** bump types-requests from 2.28.11.17 to 2.29.0.0 ([#2187](https://github.com/aws-powertools/powertools-lambda-python/issues/2187))
- **deps-dev:** bump coverage from 7.2.4 to 7.2.5 ([#2186](https://github.com/aws-powertools/powertools-lambda-python/issues/2186))
- **deps-dev:** bump mkdocs-material from 9.1.8 to 9.1.9 ([#2190](https://github.com/aws-powertools/powertools-lambda-python/issues/2190))
- **deps-dev:** bump importlib-metadata from 6.5.0 to 6.6.0 ([#2163](https://github.com/aws-powertools/powertools-lambda-python/issues/2163))
- **deps-dev:** bump mypy-boto3-xray from 1.26.11.post1 to 1.26.122 ([#2173](https://github.com/aws-powertools/powertools-lambda-python/issues/2173))
- **deps-dev:** bump aws-cdk from 2.76.0 to 2.77.0 ([#2174](https://github.com/aws-powertools/powertools-lambda-python/issues/2174))
- **deps-dev:** bump mypy-boto3-lambda from 1.26.115 to 1.26.122 ([#2172](https://github.com/aws-powertools/powertools-lambda-python/issues/2172))
- **deps-dev:** bump cfn-lint from 0.77.2 to 0.77.3 ([#2165](https://github.com/aws-powertools/powertools-lambda-python/issues/2165))
- **deps-dev:** bump mkdocs-material from 9.1.6 to 9.1.8 ([#2162](https://github.com/aws-powertools/powertools-lambda-python/issues/2162))
- **deps-dev:** bump coverage from 7.2.3 to 7.2.4 ([#2179](https://github.com/aws-powertools/powertools-lambda-python/issues/2179))
- **governance:** add Lambda Powertools for .NET in issue templates ([#2196](https://github.com/aws-powertools/powertools-lambda-python/issues/2196))

## [v2.14.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.14.0...v2.14.1) - 2023-04-21

## Bug Fixes

- **batch:** resolve use of ValidationError in batch ([#2157](https://github.com/aws-powertools/powertools-lambda-python/issues/2157))
- **e2e:** fix test brittleness ([#2152](https://github.com/aws-powertools/powertools-lambda-python/issues/2152))

## Documentation

- **readme:** update python version badge to 3.10

## Features

- **event_sources:** add queue_url field in SQS EventSource DataClass ([#2146](https://github.com/aws-powertools/powertools-lambda-python/issues/2146))

## Maintenance

- update v2 layer ARN on documentation
- add Python 3.10 PyPi language classifier ([#2144](https://github.com/aws-powertools/powertools-lambda-python/issues/2144))
- update v2 layer ARN on documentation
- **batch:** safeguard custom use of BatchProcessingError exception ([#2155](https://github.com/aws-powertools/powertools-lambda-python/issues/2155))
- **deps:** bump codecov/codecov-action from 3.1.2 to 3.1.3 ([#2153](https://github.com/aws-powertools/powertools-lambda-python/issues/2153))
- **deps:** bump dependabot/fetch-metadata from 1.3.6 to 1.4.0 ([#2140](https://github.com/aws-powertools/powertools-lambda-python/issues/2140))
- **deps-dev:** bump aws-cdk from 2.75.0 to 2.75.1 ([#2150](https://github.com/aws-powertools/powertools-lambda-python/issues/2150))
- **deps-dev:** bump aws-cdk from 2.75.1 to 2.76.0 ([#2154](https://github.com/aws-powertools/powertools-lambda-python/issues/2154))
- **deps-dev:** bump mypy-boto3-secretsmanager from 1.26.89 to 1.26.116 ([#2147](https://github.com/aws-powertools/powertools-lambda-python/issues/2147))
- **deps-dev:** bump importlib-metadata from 6.4.1 to 6.5.0 ([#2141](https://github.com/aws-powertools/powertools-lambda-python/issues/2141))
- **deps-dev:** bump mypy-boto3-s3 from 1.26.104 to 1.26.116 ([#2149](https://github.com/aws-powertools/powertools-lambda-python/issues/2149))
- **deps-dev:** bump filelock from 3.11.0 to 3.12.0 ([#2142](https://github.com/aws-powertools/powertools-lambda-python/issues/2142))
- **deps-dev:** bump cfn-lint from 0.77.1 to 0.77.2 ([#2148](https://github.com/aws-powertools/powertools-lambda-python/issues/2148))

## [v2.14.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.13.0...v2.14.0) - 2023-04-18

## Bug Fixes

- enable python 3.10 on SAR template
- **ci:** fix layer version in tracer, logger and metrics
- **ci:** typo
- **docs:** add Layer ARN for new 5 regions
- **layers:** add debug to update layer arn script

## Features

- **runtime:** add support for python 3.10 ([#2137](https://github.com/aws-powertools/powertools-lambda-python/issues/2137))

## Maintenance

- update v2 layer ARN on documentation
- update v2 layer ARN on documentation
- update v2 layer ARN on documentation
- **ci:** add support for x86-64 regions only ([#2122](https://github.com/aws-powertools/powertools-lambda-python/issues/2122))
- **deps-dev:** bump importlib-metadata from 6.3.0 to 6.4.1 ([#2134](https://github.com/aws-powertools/powertools-lambda-python/issues/2134))
- **deps-dev:** bump cfn-lint from 0.77.0 to 0.77.1 ([#2133](https://github.com/aws-powertools/powertools-lambda-python/issues/2133))
- **deps-dev:** bump pytest from 7.3.0 to 7.3.1 ([#2127](https://github.com/aws-powertools/powertools-lambda-python/issues/2127))
- **deps-dev:** bump mypy-boto3-lambda from 1.26.109 to 1.26.114 ([#2126](https://github.com/aws-powertools/powertools-lambda-python/issues/2126))
- **deps-dev:** bump mypy-boto3-lambda from 1.26.114 to 1.26.115 ([#2135](https://github.com/aws-powertools/powertools-lambda-python/issues/2135))
- **deps-dev:** bump mypy-boto3-dynamodb from 1.26.97.post1 to 1.26.115 ([#2132](https://github.com/aws-powertools/powertools-lambda-python/issues/2132))
- **github:** new tech debt issue form ([#2131](https://github.com/aws-powertools/powertools-lambda-python/issues/2131))
- **layer:** change layer-balance script to support new regions

## Reverts

- chore: update v2 layer ARN on documentation

## [v2.13.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.12.0...v2.13.0) - 2023-04-14

## Bug Fixes

- **ci:** replace the correct files for Layer ARN
- **ci:** fix working directory
- **ci:** add debug log to NPM install
- **ci:** use project's CDK version when building layers
- **ci:** add the rest of the changed docs
- **ci:** update layer version on logger, tracer and metrics docs ([#2120](https://github.com/aws-powertools/powertools-lambda-python/issues/2120))
- **event_sources:** Update CodePipeline event source to include optional encryption_key field and make user_parameters field optional ([#2113](https://github.com/aws-powertools/powertools-lambda-python/issues/2113))

## Features

- **parameters:** Configure max_age and decrypt parameters via environment variables ([#2088](https://github.com/aws-powertools/powertools-lambda-python/issues/2088))

## Maintenance

- update v2 layer ARN on documentation
- **ci:** bump the cdk-aws-lambda-powertools-layer version ([#2121](https://github.com/aws-powertools/powertools-lambda-python/issues/2121))
- **deps:** bump codecov/codecov-action from 3.1.1 to 3.1.2 ([#2110](https://github.com/aws-powertools/powertools-lambda-python/issues/2110))
- **deps-dev:** bump httpx from 0.23.3 to 0.24.0 ([#2111](https://github.com/aws-powertools/powertools-lambda-python/issues/2111))
- **deps-dev:** bump aws-cdk-lib from 2.73.0 to 2.74.0 ([#2123](https://github.com/aws-powertools/powertools-lambda-python/issues/2123))
- **deps-dev:** bump mkdocs-material from 9.1.5 to 9.1.6 ([#2104](https://github.com/aws-powertools/powertools-lambda-python/issues/2104))
- **deps-dev:** bump aws-cdk from 2.73.0 to 2.74.0 ([#2125](https://github.com/aws-powertools/powertools-lambda-python/issues/2125))
- **deps-dev:** bump flake8-comprehensions from 3.11.1 to 3.12.0 ([#2124](https://github.com/aws-powertools/powertools-lambda-python/issues/2124))
- **deps-dev:** bump mypy from 1.1.1 to 1.2.0 ([#2096](https://github.com/aws-powertools/powertools-lambda-python/issues/2096))
- **deps-dev:** bump cfn-lint from 0.76.2 to 0.77.0 ([#2107](https://github.com/aws-powertools/powertools-lambda-python/issues/2107))
- **deps-dev:** bump pytest from 7.2.2 to 7.3.0 ([#2106](https://github.com/aws-powertools/powertools-lambda-python/issues/2106))
- **deps-dev:** bump importlib-metadata from 6.1.0 to 6.3.0 ([#2105](https://github.com/aws-powertools/powertools-lambda-python/issues/2105))
- **deps-dev:** bump mypy-boto3-lambda from 1.26.80 to 1.26.109 ([#2103](https://github.com/aws-powertools/powertools-lambda-python/issues/2103))
- **maintenance:** validate acknowledgement section is present ([#2112](https://github.com/aws-powertools/powertools-lambda-python/issues/2112))

## [v2.12.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.11.0...v2.12.0) - 2023-04-07

## Bug Fixes

- **batch:** handle early validation errors for pydantic models (poison pill) [#2091](https://github.com/aws-powertools/powertools-lambda-python/issues/2091) ([#2099](https://github.com/aws-powertools/powertools-lambda-python/issues/2099))

## Documentation

- **batch:** use newly supported Json model ([#2100](https://github.com/aws-powertools/powertools-lambda-python/issues/2100))
- **homepage:** remove banner for end-of-support v1 ([#2098](https://github.com/aws-powertools/powertools-lambda-python/issues/2098))
- **idempotency:** fixes to testing your code section ([#2073](https://github.com/aws-powertools/powertools-lambda-python/issues/2073))
- **idempotency:** new sequence diagrams, fix idempotency record vs DynamoDB TTL confusion ([#2074](https://github.com/aws-powertools/powertools-lambda-python/issues/2074))
- **parser:** fix highlighted line ([#2064](https://github.com/aws-powertools/powertools-lambda-python/issues/2064))

## Features

- **batch:** reduce boilerplate with process_partial_response ([#2090](https://github.com/aws-powertools/powertools-lambda-python/issues/2090))
- **idempotency:** allow custom sdk clients in DynamoDBPersistenceLayer ([#2087](https://github.com/aws-powertools/powertools-lambda-python/issues/2087))

## Maintenance

- update v2 layer ARN on documentation
- **deps:** bump peaceiris/actions-gh-pages from 3.9.2 to 3.9.3 ([#2069](https://github.com/aws-powertools/powertools-lambda-python/issues/2069))
- **deps:** bump aws-xray-sdk from 2.11.0 to 2.12.0 ([#2080](https://github.com/aws-powertools/powertools-lambda-python/issues/2080))
- **deps-dev:** bump coverage from 7.2.2 to 7.2.3 ([#2092](https://github.com/aws-powertools/powertools-lambda-python/issues/2092))
- **deps-dev:** bump aws-cdk from 2.72.1 to 2.73.0 ([#2093](https://github.com/aws-powertools/powertools-lambda-python/issues/2093))
- **deps-dev:** bump mypy-boto3-cloudformation from 1.26.60 to 1.26.108 ([#2095](https://github.com/aws-powertools/powertools-lambda-python/issues/2095))
- **deps-dev:** bump types-python-dateutil from 2.8.19.11 to 2.8.19.12 ([#2085](https://github.com/aws-powertools/powertools-lambda-python/issues/2085))
- **deps-dev:** bump cfn-lint from 0.76.1 to 0.76.2 ([#2084](https://github.com/aws-powertools/powertools-lambda-python/issues/2084))
- **deps-dev:** bump aws-cdk from 2.72.0 to 2.72.1 ([#2081](https://github.com/aws-powertools/powertools-lambda-python/issues/2081))
- **deps-dev:** bump filelock from 3.10.7 to 3.11.0 ([#2094](https://github.com/aws-powertools/powertools-lambda-python/issues/2094))
- **deps-dev:** bump mkdocs-material from 9.1.4 to 9.1.5 ([#2077](https://github.com/aws-powertools/powertools-lambda-python/issues/2077))
- **deps-dev:** bump aws-cdk-lib from 2.72.0 to 2.72.1 ([#2076](https://github.com/aws-powertools/powertools-lambda-python/issues/2076))
- **deps-dev:** bump mypy-boto3-s3 from 1.26.99 to 1.26.104 ([#2075](https://github.com/aws-powertools/powertools-lambda-python/issues/2075))
- **deps-dev:** bump aws-cdk from 2.71.0 to 2.72.0 ([#2071](https://github.com/aws-powertools/powertools-lambda-python/issues/2071))
- **deps-dev:** bump aws-cdk-lib from 2.72.1 to 2.73.0 ([#2097](https://github.com/aws-powertools/powertools-lambda-python/issues/2097))
- **deps-dev:** bump aws-cdk-lib from 2.71.0 to 2.72.0 ([#2070](https://github.com/aws-powertools/powertools-lambda-python/issues/2070))
- **deps-dev:** bump black from 23.1.0 to 23.3.0 ([#2066](https://github.com/aws-powertools/powertools-lambda-python/issues/2066))
- **deps-dev:** bump aws-cdk from 2.70.0 to 2.71.0 ([#2067](https://github.com/aws-powertools/powertools-lambda-python/issues/2067))
- **deps-dev:** bump aws-cdk-lib from 2.70.0 to 2.71.0 ([#2065](https://github.com/aws-powertools/powertools-lambda-python/issues/2065))

## [v2.11.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.10.0...v2.11.0) - 2023-03-29

## Bug Fixes

- **feature_flags:** make test conditions deterministic ([#2059](https://github.com/aws-powertools/powertools-lambda-python/issues/2059))
- **feature_flags:** handle expected falsy values in conditions ([#2052](https://github.com/aws-powertools/powertools-lambda-python/issues/2052))

## Documentation

- **logger:** warn append_keys on not being thread-safe ([#2046](https://github.com/aws-powertools/powertools-lambda-python/issues/2046))

## Features

- **event_sources:** support for S3 Event Notifications through EventBridge ([#2024](https://github.com/aws-powertools/powertools-lambda-python/issues/2024))

## Maintenance

- update v2 layer ARN on documentation
- **deps:** bump pydantic from 1.10.6 to 1.10.7 ([#2034](https://github.com/aws-powertools/powertools-lambda-python/issues/2034))
- **deps-dev:** bump mypy-boto3-s3 from 1.26.97 to 1.26.97.post2 ([#2043](https://github.com/aws-powertools/powertools-lambda-python/issues/2043))
- **deps-dev:** bump cfn-lint from 0.75.1 to 0.76.1 ([#2056](https://github.com/aws-powertools/powertools-lambda-python/issues/2056))
- **deps-dev:** bump types-python-dateutil from 2.8.19.10 to 2.8.19.11 ([#2057](https://github.com/aws-powertools/powertools-lambda-python/issues/2057))
- **deps-dev:** bump mypy-boto3-s3 from 1.26.97.post2 to 1.26.99 ([#2054](https://github.com/aws-powertools/powertools-lambda-python/issues/2054))
- **deps-dev:** bump mkdocs-material from 9.1.3 to 9.1.4 ([#2050](https://github.com/aws-powertools/powertools-lambda-python/issues/2050))
- **deps-dev:** bump filelock from 3.10.2 to 3.10.4 ([#2048](https://github.com/aws-powertools/powertools-lambda-python/issues/2048))
- **deps-dev:** bump mypy-boto3-cloudwatch from 1.26.52 to 1.26.99 ([#2049](https://github.com/aws-powertools/powertools-lambda-python/issues/2049))
- **deps-dev:** bump filelock from 3.10.1 to 3.10.2 ([#2045](https://github.com/aws-powertools/powertools-lambda-python/issues/2045))
- **deps-dev:** bump types-requests from 2.28.11.15 to 2.28.11.16 ([#2044](https://github.com/aws-powertools/powertools-lambda-python/issues/2044))
- **deps-dev:** bump filelock from 3.10.4 to 3.10.7 ([#2055](https://github.com/aws-powertools/powertools-lambda-python/issues/2055))
- **deps-dev:** bump mypy-boto3-dynamodb from 1.26.97 to 1.26.97.post1 ([#2042](https://github.com/aws-powertools/powertools-lambda-python/issues/2042))
- **deps-dev:** bump filelock from 3.10.0 to 3.10.1 ([#2036](https://github.com/aws-powertools/powertools-lambda-python/issues/2036))
- **deps-dev:** bump aws-cdk from 2.69.0 to 2.70.0 ([#2039](https://github.com/aws-powertools/powertools-lambda-python/issues/2039))
- **deps-dev:** bump mypy-boto3-dynamodb from 1.26.87 to 1.26.97 ([#2035](https://github.com/aws-powertools/powertools-lambda-python/issues/2035))
- **deps-dev:** bump mypy-boto3-s3 from 1.26.62 to 1.26.97 ([#2037](https://github.com/aws-powertools/powertools-lambda-python/issues/2037))
- **deps-dev:** bump aws-cdk-lib from 2.69.0 to 2.70.0 ([#2038](https://github.com/aws-powertools/powertools-lambda-python/issues/2038))
- **deps-dev:** bump types-requests from 2.28.11.16 to 2.28.11.17 ([#2061](https://github.com/aws-powertools/powertools-lambda-python/issues/2061))
- **deps-dev:** bump mypy-boto3-ssm from 1.26.77 to 1.26.97 ([#2033](https://github.com/aws-powertools/powertools-lambda-python/issues/2033))
- **deps-dev:** bump flake8-comprehensions from 3.11.0 to 3.11.1 ([#2029](https://github.com/aws-powertools/powertools-lambda-python/issues/2029))
- **deps-dev:** bump cfn-lint from 0.75.0 to 0.75.1 ([#2027](https://github.com/aws-powertools/powertools-lambda-python/issues/2027))
- **deps-dev:** bump pytest-asyncio from 0.20.3 to 0.21.0 ([#2026](https://github.com/aws-powertools/powertools-lambda-python/issues/2026))

## [v2.10.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.9.1...v2.10.0) - 2023-03-17

## Bug Fixes

- only allow one e2e test at a time
- **build:** auto-generate setup.py for legacy build tools ([#2013](https://github.com/aws-powertools/powertools-lambda-python/issues/2013))
- **ci:** bump CDK version
- **typing:** swap NoReturn with None for methods with no return value ([#2004](https://github.com/aws-powertools/powertools-lambda-python/issues/2004))

## Documentation

- **homepage:** revamp install UX & share how we build Lambda Layer ([#1978](https://github.com/aws-powertools/powertools-lambda-python/issues/1978))
- **metrics:** fix high-resolution metrics announcement link ([#2017](https://github.com/aws-powertools/powertools-lambda-python/issues/2017))

## Features

- **event_sources:** support for custom properties in ActiveMQEvent ([#1999](https://github.com/aws-powertools/powertools-lambda-python/issues/1999))
- **parser:** support for S3 Event Notifications via EventBridge ([#1982](https://github.com/aws-powertools/powertools-lambda-python/issues/1982))

## Maintenance

- update v2 layer ARN on documentation
- **ci:** allow dependabot to upgrade CDK for JS
- **deps:** bump docker/setup-buildx-action from 2.4.1 to 2.5.0 ([#1995](https://github.com/aws-powertools/powertools-lambda-python/issues/1995))
- **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.1.1 to 2.1.2 ([#1979](https://github.com/aws-powertools/powertools-lambda-python/issues/1979))
- **deps:** bump aws-actions/configure-aws-credentials from 1 to 2 ([#1987](https://github.com/aws-powertools/powertools-lambda-python/issues/1987))
- **deps:** bump pydantic from 1.10.5 to 1.10.6 ([#1991](https://github.com/aws-powertools/powertools-lambda-python/issues/1991))
- **deps-dev:** bump mypy-boto3-secretsmanager from 1.26.49 to 1.26.89 ([#1996](https://github.com/aws-powertools/powertools-lambda-python/issues/1996))
- **deps-dev:** bump cfn-lint from 0.74.2 to 0.74.3 ([#2008](https://github.com/aws-powertools/powertools-lambda-python/issues/2008))
- **deps-dev:** bump filelock from 3.9.0 to 3.9.1 ([#2006](https://github.com/aws-powertools/powertools-lambda-python/issues/2006))
- **deps-dev:** bump aws-cdk-lib from 2.68.0 to 2.69.0 ([#2007](https://github.com/aws-powertools/powertools-lambda-python/issues/2007))
- **deps-dev:** bump cfn-lint from 0.74.1 to 0.74.2 ([#2005](https://github.com/aws-powertools/powertools-lambda-python/issues/2005))
- **deps-dev:** bump mypy from 0.982 to 1.1.1 ([#1985](https://github.com/aws-powertools/powertools-lambda-python/issues/1985))
- **deps-dev:** bump pytest-xdist from 3.2.0 to 3.2.1 ([#2000](https://github.com/aws-powertools/powertools-lambda-python/issues/2000))
- **deps-dev:** bump flake8-bugbear from 23.2.13 to 23.3.12 ([#2001](https://github.com/aws-powertools/powertools-lambda-python/issues/2001))
- **deps-dev:** bump bandit from 1.7.4 to 1.7.5 ([#1997](https://github.com/aws-powertools/powertools-lambda-python/issues/1997))
- **deps-dev:** bump mkdocs-material from 9.1.2 to 9.1.3 ([#2009](https://github.com/aws-powertools/powertools-lambda-python/issues/2009))
- **deps-dev:** bump aws-cdk from 2.67.0 to 2.69.0 ([#2010](https://github.com/aws-powertools/powertools-lambda-python/issues/2010))
- **deps-dev:** bump mkdocs-material from 9.1.1 to 9.1.2 ([#1994](https://github.com/aws-powertools/powertools-lambda-python/issues/1994))
- **deps-dev:** bump mypy-boto3-dynamodb from 1.26.84 to 1.26.87 ([#1993](https://github.com/aws-powertools/powertools-lambda-python/issues/1993))
- **deps-dev:** bump filelock from 3.9.1 to 3.10.0 ([#2019](https://github.com/aws-powertools/powertools-lambda-python/issues/2019))
- **deps-dev:** bump aws-cdk-lib from 2.67.0 to 2.68.0 ([#1992](https://github.com/aws-powertools/powertools-lambda-python/issues/1992))
- **deps-dev:** bump cfn-lint from 0.74.0 to 0.74.1 ([#1988](https://github.com/aws-powertools/powertools-lambda-python/issues/1988))
- **deps-dev:** bump coverage from 7.2.1 to 7.2.2 ([#2021](https://github.com/aws-powertools/powertools-lambda-python/issues/2021))
- **deps-dev:** bump pytest from 7.2.1 to 7.2.2 ([#1980](https://github.com/aws-powertools/powertools-lambda-python/issues/1980))
- **deps-dev:** bump cfn-lint from 0.74.3 to 0.75.0 ([#2020](https://github.com/aws-powertools/powertools-lambda-python/issues/2020))
- **deps-dev:** bump types-python-dateutil from 2.8.19.9 to 2.8.19.10 ([#1973](https://github.com/aws-powertools/powertools-lambda-python/issues/1973))
- **deps-dev:** bump hvac from 1.0.2 to 1.1.0 ([#1983](https://github.com/aws-powertools/powertools-lambda-python/issues/1983))
- **deps-dev:** bump mkdocs-material from 9.1.0 to 9.1.1 ([#1984](https://github.com/aws-powertools/powertools-lambda-python/issues/1984))
- **deps-dev:** bump mypy-boto3-dynamodb from 1.26.24 to 1.26.84 ([#1981](https://github.com/aws-powertools/powertools-lambda-python/issues/1981))
- **deps-dev:** bump mkdocs-material from 9.0.15 to 9.1.0 ([#1976](https://github.com/aws-powertools/powertools-lambda-python/issues/1976))
- **deps-dev:** bump cfn-lint from 0.67.0 to 0.74.0 ([#1974](https://github.com/aws-powertools/powertools-lambda-python/issues/1974))
- **deps-dev:** bump aws-cdk-lib from 2.66.1 to 2.67.0 ([#1977](https://github.com/aws-powertools/powertools-lambda-python/issues/1977))

## [v2.9.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.9.0...v2.9.1) - 2023-03-01

## Bug Fixes

- **idempotency:** revert dict mutation that impacted static_pk_value feature ([#1970](https://github.com/aws-powertools/powertools-lambda-python/issues/1970))

## Documentation

- **appsync:** add mutation example and infrastructure fix ([#1964](https://github.com/aws-powertools/powertools-lambda-python/issues/1964))
- **parameters:** fix typos and inconsistencies ([#1966](https://github.com/aws-powertools/powertools-lambda-python/issues/1966))

## Maintenance

- update project description
- update v2 layer ARN on documentation
- **ci:** disable pypi test due to maintenance mode
- **ci:** replace deprecated set-output commands ([#1957](https://github.com/aws-powertools/powertools-lambda-python/issues/1957))
- **deps:** bump fastjsonschema from 2.16.2 to 2.16.3 ([#1961](https://github.com/aws-powertools/powertools-lambda-python/issues/1961))
- **deps:** bump release-drafter/release-drafter from 5.22.0 to 5.23.0 ([#1947](https://github.com/aws-powertools/powertools-lambda-python/issues/1947))
- **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.1.0 to 2.1.1 ([#1958](https://github.com/aws-powertools/powertools-lambda-python/issues/1958))
- **deps-dev:** bump coverage from 7.2.0 to 7.2.1 ([#1963](https://github.com/aws-powertools/powertools-lambda-python/issues/1963))
- **deps-dev:** bump types-python-dateutil from 2.8.19.8 to 2.8.19.9 ([#1960](https://github.com/aws-powertools/powertools-lambda-python/issues/1960))
- **deps-dev:** bump mkdocs-material from 9.0.14 to 9.0.15 ([#1959](https://github.com/aws-powertools/powertools-lambda-python/issues/1959))
- **deps-dev:** bump mypy-boto3-lambda from 1.26.55 to 1.26.80 ([#1967](https://github.com/aws-powertools/powertools-lambda-python/issues/1967))
- **deps-dev:** bump types-requests from 2.28.11.14 to 2.28.11.15 ([#1962](https://github.com/aws-powertools/powertools-lambda-python/issues/1962))
- **deps-dev:** bump aws-cdk-lib from 2.66.0 to 2.66.1 ([#1954](https://github.com/aws-powertools/powertools-lambda-python/issues/1954))
- **deps-dev:** bump coverage from 7.1.0 to 7.2.0 ([#1951](https://github.com/aws-powertools/powertools-lambda-python/issues/1951))
- **deps-dev:** bump mkdocs-material from 9.0.13 to 9.0.14 ([#1952](https://github.com/aws-powertools/powertools-lambda-python/issues/1952))
- **deps-dev:** bump mypy-boto3-ssm from 1.26.43 to 1.26.77 ([#1949](https://github.com/aws-powertools/powertools-lambda-python/issues/1949))
- **deps-dev:** bump types-requests from 2.28.11.13 to 2.28.11.14 ([#1946](https://github.com/aws-powertools/powertools-lambda-python/issues/1946))
- **deps-dev:** bump aws-cdk-lib from 2.65.0 to 2.66.0 ([#1948](https://github.com/aws-powertools/powertools-lambda-python/issues/1948))
- **deps-dev:** bump types-python-dateutil from 2.8.19.7 to 2.8.19.8 ([#1945](https://github.com/aws-powertools/powertools-lambda-python/issues/1945))
- **parser:** add workaround to make API GW test button work ([#1971](https://github.com/aws-powertools/powertools-lambda-python/issues/1971))

## [v2.9.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.8.0...v2.9.0) - 2023-02-21

## Bug Fixes

- **ci:** upgraded cdk to match the version used on e2e tests
- **feature-flags:** revert RuleAction Enum inheritance on str ([#1910](https://github.com/aws-powertools/powertools-lambda-python/issues/1910))
- **logger:** support exception and exception_name fields at any log level ([#1930](https://github.com/aws-powertools/powertools-lambda-python/issues/1930))
- **metrics:** clarify no-metrics user warning ([#1935](https://github.com/aws-powertools/powertools-lambda-python/issues/1935))

## Documentation

- **event_handlers:** Fix REST API - HTTP Methods documentation ([#1936](https://github.com/aws-powertools/powertools-lambda-python/issues/1936))
- **home:** update powertools definition
- **we-made-this:** add CI/CD using Feature Flags video ([#1940](https://github.com/aws-powertools/powertools-lambda-python/issues/1940))
- **we-made-this:** add Feature Flags post ([#1939](https://github.com/aws-powertools/powertools-lambda-python/issues/1939))

## Features

- **batch:** add support to SQS FIFO queues (SqsFifoPartialProcessor) ([#1934](https://github.com/aws-powertools/powertools-lambda-python/issues/1934))

## Maintenance

- update v2 layer ARN on documentation
- **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.0.5 to 2.1.0 ([#1943](https://github.com/aws-powertools/powertools-lambda-python/issues/1943))
- **deps:** bump pydantic from 1.10.4 to 1.10.5 ([#1931](https://github.com/aws-powertools/powertools-lambda-python/issues/1931))
- **deps-dev:** bump mkdocs-material from 9.0.12 to 9.0.13 ([#1944](https://github.com/aws-powertools/powertools-lambda-python/issues/1944))
- **deps-dev:** bump aws-cdk-lib from 2.64.0 to 2.65.0 ([#1938](https://github.com/aws-powertools/powertools-lambda-python/issues/1938))
- **deps-dev:** bump types-python-dateutil from 2.8.19.6 to 2.8.19.7 ([#1932](https://github.com/aws-powertools/powertools-lambda-python/issues/1932))
- **deps-dev:** bump types-requests from 2.28.11.12 to 2.28.11.13 ([#1933](https://github.com/aws-powertools/powertools-lambda-python/issues/1933))
- **deps-dev:** bump mypy-boto3-appconfig from 1.26.63 to 1.26.71 ([#1928](https://github.com/aws-powertools/powertools-lambda-python/issues/1928))
- **deps-dev:** bump flake8-bugbear from 23.1.20 to 23.2.13 ([#1924](https://github.com/aws-powertools/powertools-lambda-python/issues/1924))
- **deps-dev:** bump mypy-boto3-appconfigdata from 1.26.0.post1 to 1.26.70 ([#1925](https://github.com/aws-powertools/powertools-lambda-python/issues/1925))

## [v2.8.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.7.1...v2.8.0) - 2023-02-10

## Bug Fixes

- **idempotency:** make idempotent_function decorator thread safe ([#1899](https://github.com/aws-powertools/powertools-lambda-python/issues/1899))

## Documentation

- **engine:** re-enable clipboard button for code snippets
- **homepage:** Replace poetry command to add group parameter ([#1917](https://github.com/aws-powertools/powertools-lambda-python/issues/1917))
- **homepage:** set url for end-of-support in announce block ([#1893](https://github.com/aws-powertools/powertools-lambda-python/issues/1893))
- **idempotency:** add IAM permissions section ([#1902](https://github.com/aws-powertools/powertools-lambda-python/issues/1902))
- **metrics:** remove redundant wording before release
- **metrics:** fix syntax highlighting for new default_dimensions

## Features

- **batch:** add async_batch_processor for concurrent processing ([#1724](https://github.com/aws-powertools/powertools-lambda-python/issues/1724))
- **metrics:** add default_dimensions to single_metric ([#1880](https://github.com/aws-powertools/powertools-lambda-python/issues/1880))

## Maintenance

- update v2 layer ARN on documentation
- **deps:** bump docker/setup-buildx-action from 2.4.0 to 2.4.1 ([#1903](https://github.com/aws-powertools/powertools-lambda-python/issues/1903))
- **deps-dev:** bump aws-cdk-lib from 2.63.0 to 2.63.2 ([#1904](https://github.com/aws-powertools/powertools-lambda-python/issues/1904))
- **deps-dev:** bump black from 22.12.0 to 23.1.0 ([#1886](https://github.com/aws-powertools/powertools-lambda-python/issues/1886))
- **deps-dev:** bump types-requests from 2.28.11.8 to 2.28.11.12 ([#1906](https://github.com/aws-powertools/powertools-lambda-python/issues/1906))
- **deps-dev:** bump pytest-xdist from 3.1.0 to 3.2.0 ([#1905](https://github.com/aws-powertools/powertools-lambda-python/issues/1905))
- **deps-dev:** bump aws-cdk-lib from 2.63.2 to 2.64.0 ([#1918](https://github.com/aws-powertools/powertools-lambda-python/issues/1918))
- **deps-dev:** bump mkdocs-material from 9.0.11 to 9.0.12 ([#1919](https://github.com/aws-powertools/powertools-lambda-python/issues/1919))
- **deps-dev:** bump mkdocs-material from 9.0.10 to 9.0.11 ([#1896](https://github.com/aws-powertools/powertools-lambda-python/issues/1896))
- **deps-dev:** bump mypy-boto3-appconfig from 1.26.0.post1 to 1.26.63 ([#1895](https://github.com/aws-powertools/powertools-lambda-python/issues/1895))
- **deps-dev:** bump mypy-boto3-s3 from 1.26.58 to 1.26.62 ([#1889](https://github.com/aws-powertools/powertools-lambda-python/issues/1889))
- **deps-dev:** bump mkdocs-material from 9.0.9 to 9.0.10 ([#1888](https://github.com/aws-powertools/powertools-lambda-python/issues/1888))
- **deps-dev:** bump aws-cdk-lib from 2.62.2 to 2.63.0 ([#1887](https://github.com/aws-powertools/powertools-lambda-python/issues/1887))
- **maintainers:** fix release workflow rename
- **pypi:** add new links to Pypi package homepage ([#1912](https://github.com/aws-powertools/powertools-lambda-python/issues/1912))

## [v2.7.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.7.0...v2.7.1) - 2023-02-01

## Bug Fixes

- parallel_run should fail when e2e tests fail
- bump aws-cdk version
- **ci:** scope e2e tests by python version
- **ci:** add auth to API HTTP Gateway and Lambda Function Url ([#1882](https://github.com/aws-powertools/powertools-lambda-python/issues/1882))
- **license:** correction to MIT + MIT-0 (no proprietary anymore) ([#1883](https://github.com/aws-powertools/powertools-lambda-python/issues/1883))
- **license:** add MIT-0 license header ([#1871](https://github.com/aws-powertools/powertools-lambda-python/issues/1871))
- **tests:** make logs fetching more robust ([#1878](https://github.com/aws-powertools/powertools-lambda-python/issues/1878))
- **tests:** remove custom workers
- **tests:** make sure multiple e2e tests run concurrently ([#1861](https://github.com/aws-powertools/powertools-lambda-python/issues/1861))

## Documentation

- **event-source:** fix incorrect method in example CloudWatch Logs ([#1857](https://github.com/aws-powertools/powertools-lambda-python/issues/1857))
- **homepage:** add banner for end-of-support v1 ([#1879](https://github.com/aws-powertools/powertools-lambda-python/issues/1879))
- **parameters:** snippets split, improved, and lint ([#1564](https://github.com/aws-powertools/powertools-lambda-python/issues/1564))

## Maintenance

- update v2 layer ARN on documentation
- **deps:** bump docker/setup-buildx-action from 2.0.0 to 2.4.0 ([#1873](https://github.com/aws-powertools/powertools-lambda-python/issues/1873))
([#1873](https://github.com/aws-powertools/powertools-lambda-python/issues/1873)) - **deps:** bump dependabot/fetch-metadata from 1.3.5 to 1.3.6 ([#1855](https://github.com/aws-powertools/powertools-lambda-python/issues/1855)) - **deps-dev:** bump mypy-boto3-s3 from 1.26.0.post1 to 1.26.58 ([#1868](https://github.com/aws-powertools/powertools-lambda-python/issues/1868)) - **deps-dev:** bump isort from 5.11.4 to 5.11.5 ([#1875](https://github.com/aws-powertools/powertools-lambda-python/issues/1875)) - **deps-dev:** bump aws-cdk-lib from 2.62.1 to 2.62.2 ([#1869](https://github.com/aws-powertools/powertools-lambda-python/issues/1869)) - **deps-dev:** bump mkdocs-material from 9.0.6 to 9.0.8 ([#1874](https://github.com/aws-powertools/powertools-lambda-python/issues/1874)) - **deps-dev:** bump aws-cdk-lib from 2.62.0 to 2.62.1 ([#1866](https://github.com/aws-powertools/powertools-lambda-python/issues/1866)) - **deps-dev:** bump mypy-boto3-cloudformation from 1.26.35.post1 to 1.26.57 ([#1865](https://github.com/aws-powertools/powertools-lambda-python/issues/1865)) - **deps-dev:** bump coverage from 7.0.5 to 7.1.0 ([#1862](https://github.com/aws-powertools/powertools-lambda-python/issues/1862)) - **deps-dev:** bump aws-cdk-lib from 2.61.1 to 2.62.0 ([#1863](https://github.com/aws-powertools/powertools-lambda-python/issues/1863)) - **deps-dev:** bump flake8-bugbear from 22.12.6 to 23.1.20 ([#1854](https://github.com/aws-powertools/powertools-lambda-python/issues/1854)) - **deps-dev:** bump mypy-boto3-lambda from 1.26.49 to 1.26.55 ([#1856](https://github.com/aws-powertools/powertools-lambda-python/issues/1856)) ## Reverts - fix(tests): remove custom workers ## [v2.7.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.6.0...v2.7.0) - 2023-01-24 ## Bug Fixes - git-chlg docker image is broken ## Features - **feature_flags:** Add Time based feature flags actions ([#1846](https://github.com/aws-powertools/powertools-lambda-python/issues/1846)) ## Maintenance - update v2 layer ARN on documentation - **deps:** bump peaceiris/actions-gh-pages from 3.9.1 to 3.9.2 ([#1841](https://github.com/aws-powertools/powertools-lambda-python/issues/1841)) - **deps:** bump future from 0.18.2 to 0.18.3 ([#1836](https://github.com/aws-powertools/powertools-lambda-python/issues/1836)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.0.4 to 2.0.5 ([#1837](https://github.com/aws-powertools/powertools-lambda-python/issues/1837)) - **deps-dev:** bump mkdocs-material from 9.0.4 to 9.0.5 ([#1840](https://github.com/aws-powertools/powertools-lambda-python/issues/1840)) - **deps-dev:** bump types-requests from 2.28.11.7 to 2.28.11.8 ([#1843](https://github.com/aws-powertools/powertools-lambda-python/issues/1843)) - **deps-dev:** bump mypy-boto3-cloudwatch from 1.26.30 to 1.26.52 ([#1847](https://github.com/aws-powertools/powertools-lambda-python/issues/1847)) - **deps-dev:** bump pytest from 7.2.0 to 7.2.1 ([#1838](https://github.com/aws-powertools/powertools-lambda-python/issues/1838)) - **deps-dev:** bump aws-cdk-lib from 2.60.0 to 2.61.1 ([#1849](https://github.com/aws-powertools/powertools-lambda-python/issues/1849)) - **deps-dev:** bump mypy-boto3-logs from 1.26.49 to 1.26.53 ([#1850](https://github.com/aws-powertools/powertools-lambda-python/issues/1850)) - **deps-dev:** bump mkdocs-material from 9.0.5 to 9.0.6 ([#1851](https://github.com/aws-powertools/powertools-lambda-python/issues/1851)) - **deps-dev:** bump mkdocs-material from 9.0.3 to 9.0.4 
([#1833](https://github.com/aws-powertools/powertools-lambda-python/issues/1833)) - **deps-dev:** bump mypy-boto3-logs from 1.26.43 to 1.26.49 ([#1834](https://github.com/aws-powertools/powertools-lambda-python/issues/1834)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.26.40 to 1.26.49 ([#1835](https://github.com/aws-powertools/powertools-lambda-python/issues/1835)) - **deps-dev:** bump mypy-boto3-lambda from 1.26.18 to 1.26.49 ([#1832](https://github.com/aws-powertools/powertools-lambda-python/issues/1832)) - **deps-dev:** bump aws-cdk-lib from 2.59.0 to 2.60.0 ([#1831](https://github.com/aws-powertools/powertools-lambda-python/issues/1831)) ## [v2.6.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.5.0...v2.6.0) - 2023-01-12 ## Bug Fixes - **api_gateway:** fixed custom metrics issue when using debug mode ([#1827](https://github.com/aws-powertools/powertools-lambda-python/issues/1827)) ## Documentation - **logger:** fix incorrect field names in example structured logs ([#1830](https://github.com/aws-powertools/powertools-lambda-python/issues/1830)) - **logger:** Add warning of uncaught exceptions ([#1826](https://github.com/aws-powertools/powertools-lambda-python/issues/1826)) ## Maintenance - update v2 layer ARN on documentation - **deps:** bump pydantic from 1.10.2 to 1.10.4 ([#1817](https://github.com/aws-powertools/powertools-lambda-python/issues/1817)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.0.1 to 2.0.3 ([#1801](https://github.com/aws-powertools/powertools-lambda-python/issues/1801)) - **deps:** bump release-drafter/release-drafter from 5.21.1 to 5.22.0 ([#1802](https://github.com/aws-powertools/powertools-lambda-python/issues/1802)) - **deps:** bump gitpython from 3.1.29 to 3.1.30 ([#1812](https://github.com/aws-powertools/powertools-lambda-python/issues/1812)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.0.3 to 2.0.4 ([#1821](https://github.com/aws-powertools/powertools-lambda-python/issues/1821)) - **deps:** bump peaceiris/actions-gh-pages from 3.9.0 to 3.9.1 ([#1814](https://github.com/aws-powertools/powertools-lambda-python/issues/1814)) - **deps-dev:** bump mkdocs-material from 8.5.11 to 9.0.2 ([#1808](https://github.com/aws-powertools/powertools-lambda-python/issues/1808)) - **deps-dev:** bump mypy-boto3-ssm from 1.26.11.post1 to 1.26.43 ([#1819](https://github.com/aws-powertools/powertools-lambda-python/issues/1819)) - **deps-dev:** bump mypy-boto3-logs from 1.26.27 to 1.26.43 ([#1820](https://github.com/aws-powertools/powertools-lambda-python/issues/1820)) - **deps-dev:** bump filelock from 3.8.2 to 3.9.0 ([#1816](https://github.com/aws-powertools/powertools-lambda-python/issues/1816)) - **deps-dev:** bump mypy-boto3-cloudformation from 1.26.11.post1 to 1.26.35.post1 ([#1818](https://github.com/aws-powertools/powertools-lambda-python/issues/1818)) - **deps-dev:** bump ijson from 3.1.4 to 3.2.0.post0 ([#1815](https://github.com/aws-powertools/powertools-lambda-python/issues/1815)) - **deps-dev:** bump coverage from 6.5.0 to 7.0.3 ([#1806](https://github.com/aws-powertools/powertools-lambda-python/issues/1806)) - **deps-dev:** bump flake8-builtins from 2.0.1 to 2.1.0 ([#1799](https://github.com/aws-powertools/powertools-lambda-python/issues/1799)) - **deps-dev:** bump coverage from 7.0.3 to 7.0.4 ([#1822](https://github.com/aws-powertools/powertools-lambda-python/issues/1822)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.26.12 to 1.26.40 
([#1811](https://github.com/aws-powertools/powertools-lambda-python/issues/1811)) - **deps-dev:** bump isort from 5.11.3 to 5.11.4 ([#1809](https://github.com/aws-powertools/powertools-lambda-python/issues/1809)) - **deps-dev:** bump aws-cdk-lib from 2.55.1 to 2.59.0 ([#1810](https://github.com/aws-powertools/powertools-lambda-python/issues/1810)) - **deps-dev:** bump importlib-metadata from 5.1.0 to 6.0.0 ([#1804](https://github.com/aws-powertools/powertools-lambda-python/issues/1804)) - **deps-dev:** bump mkdocs-material from 9.0.2 to 9.0.3 ([#1823](https://github.com/aws-powertools/powertools-lambda-python/issues/1823)) - **deps-dev:** bump black from 22.10.0 to 22.12.0 ([#1770](https://github.com/aws-powertools/powertools-lambda-python/issues/1770)) - **deps-dev:** bump flake8-black from 0.3.5 to 0.3.6 ([#1792](https://github.com/aws-powertools/powertools-lambda-python/issues/1792)) - **deps-dev:** bump coverage from 7.0.4 to 7.0.5 ([#1829](https://github.com/aws-powertools/powertools-lambda-python/issues/1829)) - **deps-dev:** bump types-requests from 2.28.11.5 to 2.28.11.7 ([#1795](https://github.com/aws-powertools/powertools-lambda-python/issues/1795)) ## [v2.5.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.4.0...v2.5.0) - 2022-12-21 ## Bug Fixes - **event_handlers:** omit explicit None HTTP header values ([#1793](https://github.com/aws-powertools/powertools-lambda-python/issues/1793)) ## Documentation - **idempotency:** fix, improve, and increase visibility for batch integration ([#1776](https://github.com/aws-powertools/powertools-lambda-python/issues/1776)) - **validation:** fix broken link; enrich built-in jmespath links ([#1777](https://github.com/aws-powertools/powertools-lambda-python/issues/1777)) ## Features - **logger:** unwrap event from common models if asked to log ([#1778](https://github.com/aws-powertools/powertools-lambda-python/issues/1778)) ## Maintenance - update v2 layer ARN on documentation - **common:** reusable function to extract event from models - **deps:** bump certifi from 2022.9.24 to 2022.12.7 ([#1768](https://github.com/aws-powertools/powertools-lambda-python/issues/1768)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 1.4.0 to 2.0.1 ([#1752](https://github.com/aws-powertools/powertools-lambda-python/issues/1752)) - **deps:** bump zgosalvez/github-actions-ensure-sha-pinned-actions from 1.3.0 to 1.4.0 ([#1749](https://github.com/aws-powertools/powertools-lambda-python/issues/1749)) - **deps-dev:** bump pytest-asyncio from 0.20.2 to 0.20.3 ([#1767](https://github.com/aws-powertools/powertools-lambda-python/issues/1767)) - **deps-dev:** bump mypy-boto3-cloudwatch from 1.26.0.post1 to 1.26.17 ([#1753](https://github.com/aws-powertools/powertools-lambda-python/issues/1753)) - **deps-dev:** bump isort from 5.10.1 to 5.11.2 ([#1782](https://github.com/aws-powertools/powertools-lambda-python/issues/1782)) - **deps-dev:** bump mypy-boto3-cloudwatch from 1.26.17 to 1.26.30 ([#1785](https://github.com/aws-powertools/powertools-lambda-python/issues/1785)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.26.13.post16 to 1.26.24 ([#1765](https://github.com/aws-powertools/powertools-lambda-python/issues/1765)) - **deps-dev:** bump aws-cdk-lib from 2.54.0 to 2.55.1 ([#1787](https://github.com/aws-powertools/powertools-lambda-python/issues/1787)) - **deps-dev:** bump aws-cdk-lib from 2.53.0 to 2.54.0 ([#1764](https://github.com/aws-powertools/powertools-lambda-python/issues/1764)) - **deps-dev:** bump flake8-bugbear 
from 22.10.27 to 22.12.6 ([#1760](https://github.com/aws-powertools/powertools-lambda-python/issues/1760)) - **deps-dev:** bump filelock from 3.8.0 to 3.8.2 ([#1759](https://github.com/aws-powertools/powertools-lambda-python/issues/1759)) - **deps-dev:** bump pytest-xdist from 3.0.2 to 3.1.0 ([#1758](https://github.com/aws-powertools/powertools-lambda-python/issues/1758)) - **deps-dev:** bump mkdocs-material from 8.5.10 to 8.5.11 ([#1756](https://github.com/aws-powertools/powertools-lambda-python/issues/1756)) - **deps-dev:** bump importlib-metadata from 4.13.0 to 5.1.0 ([#1750](https://github.com/aws-powertools/powertools-lambda-python/issues/1750)) - **deps-dev:** bump isort from 5.11.2 to 5.11.3 ([#1788](https://github.com/aws-powertools/powertools-lambda-python/issues/1788)) - **deps-dev:** bump flake8-black from 0.3.3 to 0.3.5 ([#1738](https://github.com/aws-powertools/powertools-lambda-python/issues/1738)) - **deps-dev:** bump mypy-boto3-logs from 1.26.17 to 1.26.27 ([#1775](https://github.com/aws-powertools/powertools-lambda-python/issues/1775)) - **tests:** move shared_functions to unit tests ## [v2.4.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.3.1...v2.4.0) - 2022-11-24 ## Bug Fixes - **ci:** use gh-pages env as official docs are wrong - **ci:** api docs path ## Documentation - **idempotency:** fix register_lambda_context order ([#1747](https://github.com/aws-powertools/powertools-lambda-python/issues/1747)) - **streaming:** fix leftover newline ## Features - **streaming:** add new s3 streaming utility ([#1719](https://github.com/aws-powertools/powertools-lambda-python/issues/1719)) ## Maintenance - update v2 layer ARN on documentation - **ci:** re-create versioned API docs for new pages deployment - **ci:** re-create versioned API docs for new pages deployment - **ci:** increase permission in parent job for docs publishing - **ci:** attempt gh-pages deployment via beta route - **deps:** bump aws-xray-sdk from 2.10.0 to 2.11.0 ([#1730](https://github.com/aws-powertools/powertools-lambda-python/issues/1730)) - **deps-dev:** bump mypy-boto3-lambda from 1.26.0.post1 to 1.26.12 ([#1742](https://github.com/aws-powertools/powertools-lambda-python/issues/1742)) - **deps-dev:** bump mypy-boto3-cloudformation from 1.26.0.post1 to 1.26.11.post1 ([#1746](https://github.com/aws-powertools/powertools-lambda-python/issues/1746)) - **deps-dev:** bump aws-cdk-lib from 2.50.0 to 2.51.1 ([#1741](https://github.com/aws-powertools/powertools-lambda-python/issues/1741)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.26.0.post1 to 1.26.13.post16 ([#1743](https://github.com/aws-powertools/powertools-lambda-python/issues/1743)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.26.0.post1 to 1.26.12 ([#1744](https://github.com/aws-powertools/powertools-lambda-python/issues/1744)) - **deps-dev:** bump mypy-boto3-ssm from 1.26.4 to 1.26.11.post1 ([#1740](https://github.com/aws-powertools/powertools-lambda-python/issues/1740)) - **deps-dev:** bump types-requests from 2.28.11.4 to 2.28.11.5 ([#1729](https://github.com/aws-powertools/powertools-lambda-python/issues/1729)) - **deps-dev:** bump mkdocs-material from 8.5.9 to 8.5.10 ([#1731](https://github.com/aws-powertools/powertools-lambda-python/issues/1731)) - **governance:** remove markdown rendering from docs issue template ## Regression - **ci:** new gh-pages beta doesn't work either; reverting as gh-pages is disrupted ## [v2.3.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.3.0...v2.3.1) - 
2022-11-21 ## Bug Fixes - **apigateway:** support dynamic routes with equal sign (RFC3986) ([#1737](https://github.com/aws-powertools/powertools-lambda-python/issues/1737)) ## Maintenance - update v2 layer ARN on documentation - test build layer hardware to 8 core - **deps-dev:** bump mypy-boto3-xray from 1.26.9 to 1.26.11.post1 ([#1734](https://github.com/aws-powertools/powertools-lambda-python/issues/1734)) ## [v2.3.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.2.0...v2.3.0) - 2022-11-17 ## Bug Fixes - **apigateway:** support nested router decorators ([#1709](https://github.com/aws-powertools/powertools-lambda-python/issues/1709)) - **ci:** increase permission to allow version sync back to repo - **ci:** disable pre-commit hook download from version bump - **ci:** setup git client earlier to prevent dirty stash error - **parameters:** get_secret correctly return SecretBinary value ([#1717](https://github.com/aws-powertools/powertools-lambda-python/issues/1717)) ## Documentation - project name consistency - **apigateway:** add all resolvers in testing your code section for accuracy ([#1688](https://github.com/aws-powertools/powertools-lambda-python/issues/1688)) - **examples:** linting unnecessary whitespace - **homepage:** update default value for `POWERTOOLS_DEV` ([#1695](https://github.com/aws-powertools/powertools-lambda-python/issues/1695)) - **idempotency:** add missing Lambda Context; note on thread-safe ([#1732](https://github.com/aws-powertools/powertools-lambda-python/issues/1732)) - **logger:** update uncaught exception message value ## Features - **apigateway:** multiple exceptions in exception_handler ([#1707](https://github.com/aws-powertools/powertools-lambda-python/issues/1707)) - **event_sources:** extract CloudWatch Logs in Kinesis streams ([#1710](https://github.com/aws-powertools/powertools-lambda-python/issues/1710)) - **logger:** log uncaught exceptions via system's exception hook ([#1727](https://github.com/aws-powertools/powertools-lambda-python/issues/1727)) - **parser:** export Pydantic.errors through escape hatch ([#1728](https://github.com/aws-powertools/powertools-lambda-python/issues/1728)) - **parser:** extract CloudWatch Logs in Kinesis streams ([#1726](https://github.com/aws-powertools/powertools-lambda-python/issues/1726)) ## Maintenance - apigw test event wrongly set with base64 - update v2 layer ARN on documentation - **ci:** revert custom hw for E2E due to lack of hw - **ci:** try bigger hardware for e2e test - **ci:** uncomment test pypi, fix version bump sync - **ci:** limit to src only to prevent dependabot failures - **ci:** use new custom hw for E2E - **ci:** prevent dependabot updates to trigger E2E - **ci:** bump hardware for build steps - **deps:** bump dependabot/fetch-metadata from 1.3.4 to 1.3.5 ([#1689](https://github.com/aws-powertools/powertools-lambda-python/issues/1689)) - **deps-dev:** bump types-requests from 2.28.11.3 to 2.28.11.4 ([#1701](https://github.com/aws-powertools/powertools-lambda-python/issues/1701)) - **deps-dev:** bump mypy-boto3-s3 from 1.25.0 to 1.26.0.post1 ([#1716](https://github.com/aws-powertools/powertools-lambda-python/issues/1716)) - **deps-dev:** bump mypy-boto3-appconfigdata from 1.25.0 to 1.26.0.post1 ([#1704](https://github.com/aws-powertools/powertools-lambda-python/issues/1704)) - **deps-dev:** bump mypy-boto3-xray from 1.25.0 to 1.26.0.post1 ([#1703](https://github.com/aws-powertools/powertools-lambda-python/issues/1703)) - **deps-dev:** bump mypy-boto3-cloudwatch from 1.25.0 to 
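As a hedged sketch of the `exception_handler` change above (#1707), which allows registering one handler for several exception types: the route, exception choice, and message below are illustrative.

```
from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response, content_types

app = APIGatewayRestResolver()

# A single handler registered for a list of exception types (#1707)
@app.exception_handler([ValueError, KeyError])
def handle_malformed_request(ex: Exception):
    return Response(
        status_code=400,
        content_type=content_types.TEXT_PLAIN,
        body=f"Invalid request: {ex}",
    )

@app.get("/orders")
def get_orders():
    raise ValueError("missing order id")  # routed to handle_malformed_request

def lambda_handler(event, context):
    return app.resolve(event, context)
```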
1.26.0.post1 ([#1714](https://github.com/aws-powertools/powertools-lambda-python/issues/1714)) - **deps-dev:** bump flake8-bugbear from 22.10.25 to 22.10.27 ([#1665](https://github.com/aws-powertools/powertools-lambda-python/issues/1665)) - **deps-dev:** bump mypy-boto3-lambda from 1.25.0 to 1.26.0.post1 ([#1705](https://github.com/aws-powertools/powertools-lambda-python/issues/1705)) - **deps-dev:** bump mypy-boto3-xray from 1.26.0.post1 to 1.26.9 ([#1720](https://github.com/aws-powertools/powertools-lambda-python/issues/1720)) - **deps-dev:** bump mypy-boto3-logs from 1.25.0 to 1.26.3 ([#1702](https://github.com/aws-powertools/powertools-lambda-python/issues/1702)) - **deps-dev:** bump mypy-boto3-ssm from 1.26.0.post1 to 1.26.4 ([#1721](https://github.com/aws-powertools/powertools-lambda-python/issues/1721)) - **deps-dev:** bump mypy-boto3-appconfig from 1.25.0 to 1.26.0.post1 ([#1722](https://github.com/aws-powertools/powertools-lambda-python/issues/1722)) - **deps-dev:** bump pytest-asyncio from 0.20.1 to 0.20.2 ([#1723](https://github.com/aws-powertools/powertools-lambda-python/issues/1723)) - **deps-dev:** bump flake8-builtins from 2.0.0 to 2.0.1 ([#1715](https://github.com/aws-powertools/powertools-lambda-python/issues/1715)) - **deps-dev:** bump pytest-xdist from 2.5.0 to 3.0.2 ([#1655](https://github.com/aws-powertools/powertools-lambda-python/issues/1655)) - **deps-dev:** bump mkdocs-material from 8.5.7 to 8.5.9 ([#1697](https://github.com/aws-powertools/powertools-lambda-python/issues/1697)) - **deps-dev:** bump flake8-comprehensions from 3.10.0 to 3.10.1 ([#1699](https://github.com/aws-powertools/powertools-lambda-python/issues/1699)) - **deps-dev:** bump types-requests from 2.28.11.2 to 2.28.11.3 ([#1698](https://github.com/aws-powertools/powertools-lambda-python/issues/1698)) - **deps-dev:** bump pytest-benchmark from 3.4.1 to 4.0.0 ([#1659](https://github.com/aws-powertools/powertools-lambda-python/issues/1659)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.25.0 to 1.26.0.post1 ([#1691](https://github.com/aws-powertools/powertools-lambda-python/issues/1691)) - **deps-dev:** bump mypy-boto3-ssm from 1.25.0 to 1.26.0.post1 ([#1690](https://github.com/aws-powertools/powertools-lambda-python/issues/1690)) - **logger:** uncaught exception to use exception value as message - **logger:** overload inject_lambda_context with generics ([#1583](https://github.com/aws-powertools/powertools-lambda-python/issues/1583)) ## [v2.2.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.1.0...v2.2.0) - 2022-11-07 ## Documentation - **homepage:** remove v1 layer limitation on pydantic not being included - **tracer:** add note on why X-Ray SDK over ADOT closes [#1675](https://github.com/aws-powertools/powertools-lambda-python/issues/1675) ## Features - **metrics:** add EphemeralMetrics as a non-singleton option ([#1676](https://github.com/aws-powertools/powertools-lambda-python/issues/1676)) - **parameters:** add get_parameters_by_name for SSM params in distinct paths ([#1678](https://github.com/aws-powertools/powertools-lambda-python/issues/1678)) ## Maintenance - update v2 layer ARN on documentation - **deps:** bump package to 2.2.0 - **deps-dev:** bump aws-cdk-lib from 2.49.0 to 2.50.0 ([#1683](https://github.com/aws-powertools/powertools-lambda-python/issues/1683)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.25.0 to 1.26.0.post1 ([#1682](https://github.com/aws-powertools/powertools-lambda-python/issues/1682)) - **deps-dev:** bump mypy-boto3-cloudformation from 
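To illustrate the `get_parameters_by_name` entry above (#1678), a minimal sketch assuming the documented parameters utility API; the parameter names and per-parameter options are placeholders.

```
from aws_lambda_powertools.utilities import parameters

def lambda_handler(event, context):
    # Fetch SSM parameters living under distinct paths in a single call;
    # each entry may override options such as decrypt, max_age or transform
    values = parameters.get_parameters_by_name(
        parameters={
            "/my-app/database/url": {},                      # defaults
            "/my-app/database/password": {"decrypt": True},  # decrypt SecureString
            "/my-app/feature-flags": {"transform": "json"},  # parse JSON payload
        },
        max_age=60,  # cache the batch for 60 seconds
    )
    return {"database_url": values["/my-app/database/url"]}
```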
1.25.0 to 1.26.0.post1 ([#1679](https://github.com/aws-powertools/powertools-lambda-python/issues/1679)) - **package:** correct pyproject version manually ## [v2.1.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v2.0.0...v2.1.0) - 2022-10-31 ## Bug Fixes - **ci:** linting issues after flake8-blackbear, mypy upgrades - **deps:** update build system to poetry-core ([#1651](https://github.com/aws-powertools/powertools-lambda-python/issues/1651)) - **idempotency:** idempotent_function should support standalone falsy values ([#1669](https://github.com/aws-powertools/powertools-lambda-python/issues/1669)) - **logger:** fix unknown attributes being ignored by mypy ([#1670](https://github.com/aws-powertools/powertools-lambda-python/issues/1670)) ## Documentation - **community:** fix social handlers for Ran ([#1654](https://github.com/aws-powertools/powertools-lambda-python/issues/1654)) - **community:** fix twitch parent domain for embedded video - **homepage:** remove 3.6 and add hero image - **homepage:** add Pulumi code example ([#1652](https://github.com/aws-powertools/powertools-lambda-python/issues/1652)) - **index:** fold support us banner - **index:** add quotes to pip for zsh customers - **install:** address early v2 feedback on installation and project support - **we-made-this:** new community content section ([#1650](https://github.com/aws-powertools/powertools-lambda-python/issues/1650)) ## Features - **layers:** add layer balancer script ([#1643](https://github.com/aws-powertools/powertools-lambda-python/issues/1643)) - **logger:** add use_rfc3339 and auto-complete formatter opts in Logger ([#1662](https://github.com/aws-powertools/powertools-lambda-python/issues/1662)) - **logger:** accept arbitrary keyword=value for ephemeral metadata ([#1658](https://github.com/aws-powertools/powertools-lambda-python/issues/1658)) ## Maintenance - update v2 layer ARN on documentation - **ci:** fix typo on version description - **deps:** bump peaceiris/actions-gh-pages from 3.8.0 to 3.9.0 ([#1649](https://github.com/aws-powertools/powertools-lambda-python/issues/1649)) - **deps:** bump docker/setup-qemu-action from 2.0.0 to 2.1.0 ([#1627](https://github.com/aws-powertools/powertools-lambda-python/issues/1627)) - **deps-dev:** bump aws-cdk-lib from 2.47.0 to 2.48.0 ([#1664](https://github.com/aws-powertools/powertools-lambda-python/issues/1664)) - **deps-dev:** bump flake8-variables-names from 0.0.4 to 0.0.5 ([#1628](https://github.com/aws-powertools/powertools-lambda-python/issues/1628)) - **deps-dev:** bump pytest-asyncio from 0.16.0 to 0.20.1 ([#1635](https://github.com/aws-powertools/powertools-lambda-python/issues/1635)) - **deps-dev:** bump aws-cdk-lib from 2.48.0 to 2.49.0 ([#1671](https://github.com/aws-powertools/powertools-lambda-python/issues/1671)) - **docs:** remove v2 banner on top of the docs - **governance:** remove 'area/' from PR labels ## [v2.0.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.31.1...v2.0.0) - 2022-10-24 ## Bug Fixes - lock dependencies - mypy errors - lint files - **ci:** temporarily remove pypi test deployment - **ci:** use docker driver on buildx - **ci:** new artifact path, sed gnu/linux syntax, and pypi test - **ci:** secret and OIDC inheritance in nested children workflow - **ci:** build without buildkit - **ci:** fix arm64 layer builds - **ci:** remove v2 suffix from SAR apps ([#1633](https://github.com/aws-powertools/powertools-lambda-python/issues/1633)) - **ci:** workflow should use npx for CDK CLI - **parser:**
S3Model Object Deleted omits size and eTag attr ([#1638](https://github.com/aws-powertools/powertools-lambda-python/issues/1638)) ## Code Refactoring - **apigateway:** remove POWERTOOLS_EVENT_HANDLER_DEBUG env var ([#1620](https://github.com/aws-powertools/powertools-lambda-python/issues/1620)) - **batch:** remove legacy sqs_batch_processor ([#1492](https://github.com/aws-powertools/powertools-lambda-python/issues/1492)) - **e2e:** make table name dynamic - **e2e:** fix idempotency typing ## Documentation - **batch:** remove legacy reference to sqs processor - **homepage:** note about v2 version - **homepage:** auto-update Layer ARN on every release ([#1610](https://github.com/aws-powertools/powertools-lambda-python/issues/1610)) - **roadmap:** refresh roadmap post-v2 launch - **roadmap:** include observability provider and lambda layer themes before v2 - **upgrade_guide:** add latest changes and quick summary ([#1623](https://github.com/aws-powertools/powertools-lambda-python/issues/1623)) - **v2:** document optional dependencies and local dev ([#1574](https://github.com/aws-powertools/powertools-lambda-python/issues/1574)) ## Features - **apigateway:** ignore trailing slashes in routes (APIGatewayRestResolver) ([#1609](https://github.com/aws-powertools/powertools-lambda-python/issues/1609)) - **ci:** release docs as alpha when doing a pre-release ([#1624](https://github.com/aws-powertools/powertools-lambda-python/issues/1624)) - **data-classes:** replace AttributeValue in DynamoDBStreamEvent with deserialized Python values ([#1619](https://github.com/aws-powertools/powertools-lambda-python/issues/1619)) - **data_classes:** add KinesisFirehoseEvent ([#1540](https://github.com/aws-powertools/powertools-lambda-python/issues/1540)) - **event_handler:** improved support for headers and cookies in v2 ([#1455](https://github.com/aws-powertools/powertools-lambda-python/issues/1455)) - **event_handler:** add cookies as 1st class citizen in v2 ([#1487](https://github.com/aws-powertools/powertools-lambda-python/issues/1487)) - **idempotency:** support methods with the same name (ABCs) by including fully qualified name in v2 ([#1535](https://github.com/aws-powertools/powertools-lambda-python/issues/1535)) - **layer:** publish SAR v2 via Github actions ([#1585](https://github.com/aws-powertools/powertools-lambda-python/issues/1585)) - **layers:** add support for publishing v2 layer ([#1558](https://github.com/aws-powertools/powertools-lambda-python/issues/1558)) - **parameters:** migrate AppConfig to new APIs due to API deprecation ([#1553](https://github.com/aws-powertools/powertools-lambda-python/issues/1553)) - **tracer:** support methods with the same name (ABCs) by including fully qualified name in v2 ([#1486](https://github.com/aws-powertools/powertools-lambda-python/issues/1486)) ## Maintenance - update v2 layer ARN on documentation - update v2 layer ARN on documentation - update v2 layer ARN on documentation - update v2 layer ARN on documentation - merge v2 branch - bump pyproject version to 2.0 - **ci:** make release process manual - **ci:** migrate E2E tests to CDK CLI and off Docker ([#1501](https://github.com/aws-powertools/powertools-lambda-python/issues/1501)) - **ci:** remove v1 workflows ([#1617](https://github.com/aws-powertools/powertools-lambda-python/issues/1617)) - **core:** expose modules in the Top-level package ([#1517](https://github.com/aws-powertools/powertools-lambda-python/issues/1517)) - **dep:** add cfn-lint as a dev dependency; pre-commit 
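The `DynamoDBStreamEvent` change listed above (#1619) replaces low-level `AttributeValue` objects with plain, deserialized Python values in v2. A rough sketch, with the `order_id` attribute purely illustrative:

```
from aws_lambda_powertools.utilities.data_classes import event_source
from aws_lambda_powertools.utilities.data_classes.dynamo_db_stream_event import (
    DynamoDBRecordEventName,
    DynamoDBStreamEvent,
)

@event_source(data_class=DynamoDBStreamEvent)
def lambda_handler(event: DynamoDBStreamEvent, context):
    inserted_ids = []
    for record in event.records:
        if record.event_name == DynamoDBRecordEventName.INSERT and record.dynamodb:
            # In v2, new_image is a dict of Python values (str, Decimal, bool, ...),
            # so no manual AttributeValue handling is needed
            new_image = record.dynamodb.new_image or {}
            inserted_ids.append(new_image.get("order_id"))
    return {"inserted": inserted_ids}
```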
([#1612](https://github.com/aws-powertools/powertools-lambda-python/issues/1612)) - **deps:** remove email-validator; use Str over EmailStr in SES model ([#1608](https://github.com/aws-powertools/powertools-lambda-python/issues/1608)) - **deps:** bump release-drafter/release-drafter from 5.21.0 to 5.21.1 ([#1611](https://github.com/aws-powertools/powertools-lambda-python/issues/1611)) - **deps:** lock importlib to 4.x - **deps-dev:** bump mypy-boto3-s3 from 1.24.76 to 1.24.94 ([#1622](https://github.com/aws-powertools/powertools-lambda-python/issues/1622)) - **deps-dev:** bump aws-cdk-lib from 2.46.0 to 2.47.0 ([#1629](https://github.com/aws-powertools/powertools-lambda-python/issues/1629)) - **layer:** bump to 1.31.1 (v39) ## [v1.31.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.31.0...v1.31.1) - 2022-10-14 ## Bug Fixes - **parser:** loose validation on SNS fields to support FIFO ([#1606](https://github.com/aws-powertools/powertools-lambda-python/issues/1606)) ## Documentation - **governance:** allow community to suggest feature content ([#1593](https://github.com/aws-powertools/powertools-lambda-python/issues/1593)) - **governance:** new form to allow customers self-nominate as public reference ([#1589](https://github.com/aws-powertools/powertools-lambda-python/issues/1589)) - **homepage:** include .NET powertools - **idempotency:** "persisntence" typo ([#1596](https://github.com/aws-powertools/powertools-lambda-python/issues/1596)) - **logger:** fix typo. ([#1587](https://github.com/aws-powertools/powertools-lambda-python/issues/1587)) ## Maintenance - add dummy v2 sar deploy job - bump layer version to 38 - **deps-dev:** bump mypy-boto3-ssm from 1.24.81 to 1.24.90 ([#1594](https://github.com/aws-powertools/powertools-lambda-python/issues/1594)) - **deps-dev:** bump flake8-builtins from 1.5.3 to 2.0.0 ([#1582](https://github.com/aws-powertools/powertools-lambda-python/issues/1582)) ## [v1.31.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.30.0...v1.31.0) - 2022-10-10 ## Bug Fixes - **metrics:** ensure dimension_set is reused across instances (pointer) ([#1581](https://github.com/aws-powertools/powertools-lambda-python/issues/1581)) ## Documentation - **readme:** add lambda layer latest version badge ## Features - **parser:** add KinesisFirehoseModel ([#1556](https://github.com/aws-powertools/powertools-lambda-python/issues/1556)) ## Maintenance - **deps-dev:** bump types-requests from 2.28.11.1 to 2.28.11.2 ([#1576](https://github.com/aws-powertools/powertools-lambda-python/issues/1576)) - **deps-dev:** bump typing-extensions from 4.3.0 to 4.4.0 ([#1575](https://github.com/aws-powertools/powertools-lambda-python/issues/1575)) - **layer:** remove unused GetFunction permission for the canary - **layer:** bump to latest version 37 ## [v1.30.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.29.2...v1.30.0) - 2022-10-05 ## Bug Fixes - **apigateway:** update Response class to require status_code only ([#1560](https://github.com/aws-powertools/powertools-lambda-python/issues/1560)) - **ci:** integrate isort 5.0 with black to resolve conflicts - **event_sources:** implement Mapping protocol on DictWrapper for better interop with existing middlewares ([#1516](https://github.com/aws-powertools/powertools-lambda-python/issues/1516)) - **typing:** fix mypy error - **typing:** level arg in copy_config_to_registered_loggers ([#1534](https://github.com/aws-powertools/powertools-lambda-python/issues/1534)) ## Documentation -
**batch:** document the new lambda context feature - **homepage:** introduce POWERTOOLS_DEV env var ([#1569](https://github.com/aws-powertools/powertools-lambda-python/issues/1569)) - **multiple:** fix highlighting after new isort/black integration - **parser:** add JSON string field extension example ([#1526](https://github.com/aws-powertools/powertools-lambda-python/issues/1526)) ## Features - **batch:** inject lambda_context if record handler signature accepts it ([#1561](https://github.com/aws-powertools/powertools-lambda-python/issues/1561)) - **event-handler:** context support to share data between routers ([#1567](https://github.com/aws-powertools/powertools-lambda-python/issues/1567)) - **logger:** introduce POWERTOOLS_DEBUG for internal debugging ([#1572](https://github.com/aws-powertools/powertools-lambda-python/issues/1572)) - **logger:** include logger name attribute when copy_config_to_registered_logger is used ([#1568](https://github.com/aws-powertools/powertools-lambda-python/issues/1568)) - **logger:** pretty-print JSON when POWERTOOLS_DEV is set ([#1548](https://github.com/aws-powertools/powertools-lambda-python/issues/1548)) ## Maintenance - **dep:** bump pyproject to pypi sync - **deps:** bump fastjsonschema from 2.16.1 to 2.16.2 ([#1530](https://github.com/aws-powertools/powertools-lambda-python/issues/1530)) - **deps:** bump actions/setup-python from 3 to 4 ([#1528](https://github.com/aws-powertools/powertools-lambda-python/issues/1528)) - **deps:** bump codecov/codecov-action from 3.1.0 to 3.1.1 ([#1529](https://github.com/aws-powertools/powertools-lambda-python/issues/1529)) - **deps:** bump dependabot/fetch-metadata from 1.3.3 to 1.3.4 ([#1565](https://github.com/aws-powertools/powertools-lambda-python/issues/1565)) - **deps:** bump email-validator from 1.2.1 to 1.3.0 ([#1533](https://github.com/aws-powertools/powertools-lambda-python/issues/1533)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.24.54 to 1.24.83 ([#1557](https://github.com/aws-powertools/powertools-lambda-python/issues/1557)) - **deps-dev:** bump mkdocs-material from 8.5.3 to 8.5.4 ([#1563](https://github.com/aws-powertools/powertools-lambda-python/issues/1563)) - **deps-dev:** bump pytest-cov from 3.0.0 to 4.0.0 ([#1551](https://github.com/aws-powertools/powertools-lambda-python/issues/1551)) - **deps-dev:** bump flake8-bugbear from 22.9.11 to 22.9.23 ([#1541](https://github.com/aws-powertools/powertools-lambda-python/issues/1541)) - **deps-dev:** bump types-requests from 2.28.11 to 2.28.11.1 ([#1571](https://github.com/aws-powertools/powertools-lambda-python/issues/1571)) - **deps-dev:** bump mypy-boto3-ssm from 1.24.69 to 1.24.80 ([#1542](https://github.com/aws-powertools/powertools-lambda-python/issues/1542)) - **deps-dev:** bump mako from 1.2.2 to 1.2.3 ([#1537](https://github.com/aws-powertools/powertools-lambda-python/issues/1537)) - **deps-dev:** bump types-requests from 2.28.10 to 2.28.11 ([#1538](https://github.com/aws-powertools/powertools-lambda-python/issues/1538)) - **deps-dev:** bump mkdocs-material from 8.5.1 to 8.5.3 ([#1532](https://github.com/aws-powertools/powertools-lambda-python/issues/1532)) - **deps-dev:** bump mypy-boto3-ssm from 1.24.80 to 1.24.81 ([#1544](https://github.com/aws-powertools/powertools-lambda-python/issues/1544)) - **deps-dev:** bump mypy-boto3-s3 from 1.24.36.post1 to 1.24.76 ([#1531](https://github.com/aws-powertools/powertools-lambda-python/issues/1531)) - **docs:** bump layer version to 36 (1.29.2) - **layers:** add dummy v2 layer automation - 
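A small sketch of the router context feature above (#1567), assuming the documented `append_context`/`include_router` API; the header-based admin flag is illustrative.

```
from aws_lambda_powertools.event_handler import APIGatewayRestResolver
from aws_lambda_powertools.event_handler.api_gateway import Router

router = Router()

@router.get("/me")
def get_me():
    # Values shared by the app are available on router.context
    return {"is_admin": router.context.get("is_admin", False)}

app = APIGatewayRestResolver()
app.include_router(router)

def lambda_handler(event, context):
    headers = event.get("headers") or {}
    # Share data with all routers for this invocation; context is cleared after resolve()
    app.append_context(is_admin=headers.get("x-admin") == "true")
    return app.resolve(event, context)
```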
**lint:** use new isort black integration - **multiple:** localize powertools_dev env logic and warning ([#1570](https://github.com/aws-powertools/powertools-lambda-python/issues/1570)) ## [v1.29.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.29.1...v1.29.2) - 2022-09-19 ## Bug Fixes - **deps:** bump dev dep mako version to address CVE-2022-40023 ([#1524](https://github.com/aws-powertools/powertools-lambda-python/issues/1524)) ## Maintenance - **deps:** bump release-drafter/release-drafter from 5.20.1 to 5.21.0 ([#1520](https://github.com/aws-powertools/powertools-lambda-python/issues/1520)) - **deps-dev:** bump mkdocs-material from 8.5.0 to 8.5.1 ([#1521](https://github.com/aws-powertools/powertools-lambda-python/issues/1521)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.24.60 to 1.24.74 ([#1522](https://github.com/aws-powertools/powertools-lambda-python/issues/1522)) ## [v1.29.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.29.0...v1.29.1) - 2022-09-13 ## [v1.29.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.28.0...v1.29.0) - 2022-09-13 ## Bug Fixes - **ci:** ignore v2 action for now - **ci:** only run e2e tests on py 3.7 - **ci:** pass core fns to large pr workflow script - **ci:** on_label permissioning model & workflow execution - **ci:** ensure PR_AUTHOR is present for large_pr_split workflow - **ci:** gracefully and successful exit changelog upon no changes - **ci:** event resolution for on_label_added workflow - **core:** fixes leftovers from rebase ## Documentation - **layer:** upgrade to 1.28.0 (v33) ## Features - **ci:** add actionlint in pre-commit hook - **data-classes:** add KafkaEvent and KafkaEventRecord ([#1485](https://github.com/aws-powertools/powertools-lambda-python/issues/1485)) - **event_sources:** add CloudWatch dashboard custom widget event ([#1474](https://github.com/aws-powertools/powertools-lambda-python/issues/1474)) - **parser:** add KafkaMskEventModel and KafkaSelfManagedEventModel ([#1499](https://github.com/aws-powertools/powertools-lambda-python/issues/1499)) ## Maintenance - **ci:** add workflow to suggest splitting large PRs ([#1480](https://github.com/aws-powertools/powertools-lambda-python/issues/1480)) - **ci:** remove unused and undeclared OS matrix env - **ci:** disable v2 docs - **ci:** limit E2E workflow run for source code change - **ci:** add missing description fields - **ci:** sync package version with pypi - **ci:** fix invalid dependency leftover - **ci:** create adhoc docs workflow for v2 - **ci:** create adhoc docs workflow for v2 - **ci:** remove dangling debug step - **ci:** create docs workflow for v2 - **ci:** create reusable docs publishing workflow ([#1482](https://github.com/aws-powertools/powertools-lambda-python/issues/1482)) - **ci:** format comment on comment_large_pr script - **ci:** add note for state persistence on comment_large_pr - **ci:** destructure assignment on comment_large_pr - **ci:** record pr details upon labeling - **ci:** add linter for GitHub Actions as pre-commit hook ([#1479](https://github.com/aws-powertools/powertools-lambda-python/issues/1479)) - **ci:** enable ci checks for v2 - **deps-dev:** bump black from 21.12b0 to 22.8.0 ([#1515](https://github.com/aws-powertools/powertools-lambda-python/issues/1515)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.24.55.post1 to 1.24.60 ([#1481](https://github.com/aws-powertools/powertools-lambda-python/issues/1481)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.24.55.post1 to 1.24.60 
([#306](https://github.com/aws-powertools/powertools-lambda-python/issues/306)) - **deps-dev:** bump mkdocs-material from 8.4.1 to 8.4.2 ([#1483](https://github.com/aws-powertools/powertools-lambda-python/issues/1483)) - **deps-dev:** revert to v1.28.0 dependencies - **deps-dev:** bump mkdocs-material from 8.4.4 to 8.5.0 ([#1514](https://github.com/aws-powertools/powertools-lambda-python/issues/1514)) - **maintainers:** update release workflow link - **maintenance:** add discord link to first PR and first issue ([#1493](https://github.com/aws-powertools/powertools-lambda-python/issues/1493)) ## [v1.28.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.27.0...v1.28.0) - 2022-08-25 ## Bug Fixes - **ci:** calculate parallel jobs based on infrastructure needs ([#1475](https://github.com/aws-powertools/powertools-lambda-python/issues/1475)) - **ci:** del flake8 direct dep over py3.6 conflicts and docs failure - **ci:** move from pip-tools to poetry on layers reusable workflow - **ci:** move from pip-tools to poetry on layers to fix conflicts - **ci:** typo and bust gh actions cache - **ci:** use poetry to resolve layer deps; pip for CDK - **ci:** disable poetry venv for layer workflow as cdk ignores venv - **ci:** add cdk v2 dep for layers workflow - **ci:** move from pip-tools to poetry on layers - **ci:** temporarily disable changelog upon release - **ci:** add explicit origin to fix release detached head - **jmespath_util:** snappy as dev dep and typing example ([#1446](https://github.com/aws-powertools/powertools-lambda-python/issues/1446)) ## Documentation - **apigateway:** removes duplicate admonition ([#1426](https://github.com/aws-powertools/powertools-lambda-python/issues/1426)) - **home:** fix discord syntax and add Discord badge - **home:** add discord invitation link ([#1471](https://github.com/aws-powertools/powertools-lambda-python/issues/1471)) - **jmespath_util:** snippets split, improved, and lint ([#1419](https://github.com/aws-powertools/powertools-lambda-python/issues/1419)) - **layer:** upgrade to 1.27.0 - **layer:** upgrade to 1.27.0 - **middleware-factory:** snippets split, improved, and lint ([#1451](https://github.com/aws-powertools/powertools-lambda-python/issues/1451)) - **parser:** minor grammar fix ([#1427](https://github.com/aws-powertools/powertools-lambda-python/issues/1427)) - **typing:** snippets split, improved, and lint ([#1465](https://github.com/aws-powertools/powertools-lambda-python/issues/1465)) - **validation:** snippets split, improved, and lint ([#1449](https://github.com/aws-powertools/powertools-lambda-python/issues/1449)) ## Features - **parser:** add support for Lambda Function URL ([#1442](https://github.com/aws-powertools/powertools-lambda-python/issues/1442)) ## Maintenance - **batch:** deprecate sqs_batch_processor ([#1463](https://github.com/aws-powertools/powertools-lambda-python/issues/1463)) - **ci:** prevent concurrent git update in critical workflows ([#1478](https://github.com/aws-powertools/powertools-lambda-python/issues/1478)) - **ci:** disable e2e py version matrix due to concurrent locking - **ci:** revert e2e py version matrix - **ci:** temp disable e2e matrix - **ci:** update changelog with latest changes - **ci:** update changelog with latest changes - **ci:** reduce payload and only send prod notification - **ci:** remove area/utilities conflicting label - **ci:** include py version in stack and cache lock - **ci:** remove conventional changelog commit to reduce noise - **ci:** update changelog with latest 
changes - **deps:** bump release-drafter/release-drafter from 5.20.0 to 5.20.1 ([#1458](https://github.com/aws-powertools/powertools-lambda-python/issues/1458)) - **deps:** bump pydantic from 1.9.1 to 1.9.2 ([#1448](https://github.com/aws-powertools/powertools-lambda-python/issues/1448)) - **deps-dev:** bump flake8-bugbear from 22.8.22 to 22.8.23 ([#1473](https://github.com/aws-powertools/powertools-lambda-python/issues/1473)) - **deps-dev:** bump types-requests from 2.28.7 to 2.28.8 ([#1423](https://github.com/aws-powertools/powertools-lambda-python/issues/1423)) - **maintainer:** add Leandro as maintainer ([#1468](https://github.com/aws-powertools/powertools-lambda-python/issues/1468)) - **tests:** build and deploy Lambda Layer stack once ([#1466](https://github.com/aws-powertools/powertools-lambda-python/issues/1466)) - **tests:** refactor E2E test mechanics to ease maintenance, writing tests and parallelization ([#1444](https://github.com/aws-powertools/powertools-lambda-python/issues/1444)) - **tests:** enable end-to-end test workflow ([#1470](https://github.com/aws-powertools/powertools-lambda-python/issues/1470)) - **tests:** refactor E2E logger to ease maintenance, writing tests and parallelization ([#1460](https://github.com/aws-powertools/powertools-lambda-python/issues/1460)) - **tests:** refactor E2E tracer to ease maintenance, writing tests and parallelization ([#1457](https://github.com/aws-powertools/powertools-lambda-python/issues/1457)) ## Reverts - fix(ci): add explicit origin to fix release detached head ## [v1.27.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.26.7...v1.27.0) - 2022-08-05 ## Bug Fixes - **ci:** changelog workflow must receive git tags too - **ci:** add additional input to accurately describe intent on skip - **ci:** job permissions - **event_sources:** add test for Function URL AuthZ ([#1421](https://github.com/aws-powertools/powertools-lambda-python/issues/1421)) ## Documentation - **layer:** upgrade to 1.26.7 ## Features - **ci:** create reusable changelog generation ([#1418](https://github.com/aws-powertools/powertools-lambda-python/issues/1418)) - **ci:** include changelog generation on docs build - **ci:** create reusable changelog generation - **event_handlers:** Add support for Lambda Function URLs ([#1408](https://github.com/aws-powertools/powertools-lambda-python/issues/1408)) - **metrics:** update max user-defined dimensions from 9 to 29 ([#1417](https://github.com/aws-powertools/powertools-lambda-python/issues/1417)) ## Maintenance - **ci:** sync area labels to prevent dedup - **ci:** update changelog with latest changes - **ci:** update changelog with latest changes - **ci:** add manual trigger for docs - **ci:** update changelog with latest changes - **ci:** temporarily disable changelog push on release - **ci:** update changelog with latest changes - **ci:** move changelog generation to rebuild_latest_doc workflow - **ci:** update project with version - **ci:** update release automated activities - **ci:** readd changelog step on release - **ci:** move changelog generation to rebuild_latest_doc workflow - **ci:** drop 3.6 from workflows - **deps:** bump constructs from 10.1.1 to 10.1.60 ([#1399](https://github.com/aws-powertools/powertools-lambda-python/issues/1399)) - **deps:** bump constructs from 10.1.1 to 10.1.66 ([#1414](https://github.com/aws-powertools/powertools-lambda-python/issues/1414)) - **deps:** bump jsii from 1.57.0 to 1.63.2 
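For the Lambda Function URL support listed above (#1408), a minimal sketch using the resolver as named in the documentation; the route and response body are illustrative.

```
from aws_lambda_powertools.event_handler import LambdaFunctionUrlResolver

app = LambdaFunctionUrlResolver()

@app.get("/ping")
def ping():
    return {"status": "ok"}

def lambda_handler(event, context):
    # Pass the Function URL payload straight to the resolver
    return app.resolve(event, context)
```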
([#1400](https://github.com/aws-powertools/powertools-lambda-python/issues/1400)) - **deps:** bump constructs from 10.1.1 to 10.1.64 ([#1405](https://github.com/aws-powertools/powertools-lambda-python/issues/1405)) - **deps:** bump attrs from 21.4.0 to 22.1.0 ([#1397](https://github.com/aws-powertools/powertools-lambda-python/issues/1397)) - **deps:** bump constructs from 10.1.1 to 10.1.63 ([#1402](https://github.com/aws-powertools/powertools-lambda-python/issues/1402)) - **deps:** bump constructs from 10.1.1 to 10.1.65 ([#1407](https://github.com/aws-powertools/powertools-lambda-python/issues/1407)) - **deps-dev:** bump types-requests from 2.28.5 to 2.28.6 ([#1401](https://github.com/aws-powertools/powertools-lambda-python/issues/1401)) - **deps-dev:** bump types-requests from 2.28.6 to 2.28.7 ([#1406](https://github.com/aws-powertools/powertools-lambda-python/issues/1406)) - **docs:** remove pause sentence from roadmap ([#1409](https://github.com/aws-powertools/powertools-lambda-python/issues/1409)) - **docs:** update site name to test ci changelog - **docs:** update CHANGELOG for v1.26.7 - **docs:** update description to trigger changelog generation - **governance:** remove devcontainer in favour of gitpod.io ([#1411](https://github.com/aws-powertools/powertools-lambda-python/issues/1411)) - **governance:** add pre-configured dev environment with GitPod.io to ease contributions ([#1403](https://github.com/aws-powertools/powertools-lambda-python/issues/1403)) - **layers:** upgrade cdk dep hashes to prevent ci fail ## [v1.26.7](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.26.6...v1.26.7) - 2022-07-29 ## Bug Fixes - **ci:** add missing oidc token generation permission - **event_handlers:** ImportError when importing Response from top-level event_handler ([#1388](https://github.com/aws-powertools/powertools-lambda-python/issues/1388)) ## Documentation - **examples:** enforce and fix all mypy errors ([#1393](https://github.com/aws-powertools/powertools-lambda-python/issues/1393)) ## Features - **idempotency:** handle lambda timeout scenarios for INPROGRESS records ([#1387](https://github.com/aws-powertools/powertools-lambda-python/issues/1387)) ## Maintenance - **ci:** increase skip_pypi logic to cover tests/changelog on re-run failures - **ci:** update project with version 1.26.6 - **ci:** drop 3.6 from workflows ([#1395](https://github.com/aws-powertools/powertools-lambda-python/issues/1395)) - **ci:** add conditional to skip pypi release ([#1366](https://github.com/aws-powertools/powertools-lambda-python/issues/1366)) - **ci:** remove leftover logic from on_merged_pr workflow - **ci:** update project with version 1.26.6 - **ci:** update project with version 1.26.6 - **deps:** bump jsii from 1.57.0 to 1.63.1 ([#1390](https://github.com/aws-powertools/powertools-lambda-python/issues/1390)) - **deps:** bump constructs from 10.1.1 to 10.1.59 ([#1396](https://github.com/aws-powertools/powertools-lambda-python/issues/1396)) - **deps-dev:** bump flake8-isort from 4.1.1 to 4.1.2.post0 ([#1384](https://github.com/aws-powertools/powertools-lambda-python/issues/1384)) - **layers:** bump to 1.26.6 using layer v26 - **maintainers:** add Ruben as a maintainer ([#1392](https://github.com/aws-powertools/powertools-lambda-python/issues/1392)) ## [v1.26.6](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.26.5...v1.26.6) - 2022-07-25 ## Bug Fixes - **ci:** remove unsupported env in workflow_call - **ci:** allow inherit secrets for reusable workflow - **ci:** 
remove unused secret - **ci:** label_related_issue unresolved var from history mixup - **ci:** cond doesn't support two expr w/ env - **ci:** only event is resolved in cond - **ci:** unexpected symbol due to double quotes... - **event_handlers:** handle lack of headers when using auto-compression feature ([#1325](https://github.com/aws-powertools/powertools-lambda-python/issues/1325)) ## Maintenance - dummy for PR test - print full event depth - print full workflow event depth - debug full event - remove leftover from fork one more time - **ci:** test env expr - **ci:** test upstream job skip - **ci:** lockdown workflow_run by origin ([#1350](https://github.com/aws-powertools/powertools-lambda-python/issues/1350)) - **ci:** test default env - **ci:** experiment hardening origin - **ci:** experiment hardening origin - **ci:** introduce codeowners ([#1352](https://github.com/aws-powertools/powertools-lambda-python/issues/1352)) - **ci:** use OIDC and encrypt release secrets ([#1355](https://github.com/aws-powertools/powertools-lambda-python/issues/1355)) - **ci:** remove core group from codeowners ([#1358](https://github.com/aws-powertools/powertools-lambda-python/issues/1358)) - **ci:** confirm workflow_run event - **ci:** use gh environment for beta and prod layer deploy ([#1356](https://github.com/aws-powertools/powertools-lambda-python/issues/1356)) - **ci:** update project with version 1.26.5 - **deps:** bump constructs from 10.1.1 to 10.1.52 ([#1343](https://github.com/aws-powertools/powertools-lambda-python/issues/1343)) - **deps-dev:** bump mypy-boto3-cloudwatch from 1.24.0 to 1.24.35 ([#1342](https://github.com/aws-powertools/powertools-lambda-python/issues/1342)) - **governance:** update wording tech debt to summary in maintenance template - **governance:** add new maintenance issue template for tech debt ([#1326](https://github.com/aws-powertools/powertools-lambda-python/issues/1326)) - **layers:** layer canary stack should not hardcode resource name - **layers:** replace layers account secret ([#1329](https://github.com/aws-powertools/powertools-lambda-python/issues/1329)) - **layers:** expand to all aws commercial regions ([#1324](https://github.com/aws-powertools/powertools-lambda-python/issues/1324)) - **layers:** bump to 1.26.5 ## Pull Requests - Merge pull request [#285](https://github.com/aws-powertools/powertools-lambda-python/issues/285) from heitorlessa/chore/skip-dep-workflow - Merge pull request [#284](https://github.com/aws-powertools/powertools-lambda-python/issues/284) from heitorlessa/chore/dummy ## [v1.26.5](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.26.4...v1.26.5) - 2022-07-20 ## Bug Fixes - match the name of the cdk synth from the build phase - typo in input for layer workflow - no need to cache npm since we only install cdk cli and don't have .lock files - add entire ARN role instead of account and role name - path to artefact - unzip the right artifact name - download artefact into the layer dir - sigh, yes a whitespace character breaks the build - **ci:** checkout project before validating related issue workflow - **ci:** install poetry before calling setup/python with cache ([#1315](https://github.com/aws-powertools/powertools-lambda-python/issues/1315)) - **ci:** remove additional quotes in PR action ([#1317](https://github.com/aws-powertools/powertools-lambda-python/issues/1317)) - **ci:** lambda layer workflow release version and conditionals ([#1316](https://github.com/aws-powertools/powertools-lambda-python/issues/1316)) -
**ci:** fetch all git info so we can check tags - **ci:** lambda layer workflow release version and conditionals ([#1316](https://github.com/aws-powertools/powertools-lambda-python/issues/1316)) - **ci:** keep layer version permission ([#1318](https://github.com/aws-powertools/powertools-lambda-python/issues/1318)) - **ci:** regex to catch combination of related issues workflow - **deps:** correct mypy types as dev dependency ([#1322](https://github.com/aws-powertools/powertools-lambda-python/issues/1322)) - **logger:** preserve std keys when using custom formatters ([#1264](https://github.com/aws-powertools/powertools-lambda-python/issues/1264)) ## Documentation - **event-handler:** snippets split, improved, and lint ([#1279](https://github.com/aws-powertools/powertools-lambda-python/issues/1279)) - **governance:** typos on PR template fixes [#1314](https://github.com/aws-powertools/powertools-lambda-python/issues/1314) - **governance:** add security doc to the root ## Maintenance - **ci:** limits concurrency for docs workflow - **ci:** adds caching when installing python dependencies ([#1311](https://github.com/aws-powertools/powertools-lambda-python/issues/1311)) - **ci:** update project with version 1.26.4 - **ci:** fix reference error in related_issue - **deps:** bump constructs from 10.1.1 to 10.1.51 ([#1323](https://github.com/aws-powertools/powertools-lambda-python/issues/1323)) - **deps-dev:** bump mypy from 0.961 to 0.971 ([#1320](https://github.com/aws-powertools/powertools-lambda-python/issues/1320)) - **governance:** fix typo on semantic commit link introduced in [#1](https://github.com/aws-powertools/powertools-lambda-python/issues/1)aef4 - **layers:** add release pipeline in GitHub Actions ([#1278](https://github.com/aws-powertools/powertools-lambda-python/issues/1278)) - **layers:** bump to 22 for 1.26.3 ## [v1.26.4](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.26.3...v1.26.4) - 2022-07-18 ## Bug Fixes - **ci:** checkout project before validating related issue workflow - **ci:** fixes typos and small issues on github scripts ([#1302](https://github.com/aws-powertools/powertools-lambda-python/issues/1302)) - **ci:** address conditional type on_merge - **ci:** address pr title semantic not found logic - **ci:** address gh-actions additional quotes; remove debug - **ci:** regex group name for on_merge workflow - **ci:** escape outputs as certain PRs can break GH Actions expressions - **ci:** move conditionals from yaml to code; leftover - **ci:** move conditionals from yaml to code - **ci:** accept core arg in label related issue workflow - **ci:** match the name of the cdk synth from the build phase - **ci:** regex to catch combination of related issues workflow - **logger:** preserve std keys when using custom formatters ([#1264](https://github.com/aws-powertools/powertools-lambda-python/issues/1264)) - **parser:** raise ValidationError when SNS->SQS keys are intentionally missing ([#1299](https://github.com/aws-powertools/powertools-lambda-python/issues/1299)) ## Documentation - **event-handler:** snippets split, improved, and lint ([#1279](https://github.com/aws-powertools/powertools-lambda-python/issues/1279)) - **graphql:** snippets split, improved, and lint ([#1287](https://github.com/aws-powertools/powertools-lambda-python/issues/1287)) - **homepage:** emphasize additional powertools languages ([#1292](https://github.com/aws-powertools/powertools-lambda-python/issues/1292)) - **metrics:** snippets split, improved, and lint ## Maintenance - 
**ci:** increase release automation and limit to one manual step ([#1297](https://github.com/aws-powertools/powertools-lambda-python/issues/1297)) - **ci:** make export PR reusable - **ci:** auto-merge cdk lib and lambda layer construct - **ci:** convert inline gh-script to file - **ci:** lockdown 3rd party workflows to pin sha ([#1301](https://github.com/aws-powertools/powertools-lambda-python/issues/1301)) - **ci:** automatically add area label based on title ([#1300](https://github.com/aws-powertools/powertools-lambda-python/issues/1300)) - **ci:** disable output debugging as pr body isnt accepted - **ci:** experiment with conditional on outputs - **ci:** improve error handling for non-issue numbers - **ci:** add end to end testing mechanism ([#1247](https://github.com/aws-powertools/powertools-lambda-python/issues/1247)) - **ci:** limits concurrency for docs workflow - **ci:** fix reference error in related_issue - **ci:** move error prone env to code as constants - **ci:** move all scripts under .github/scripts - **deps:** bump cdk-lambda-powertools-python-layer ([#1284](https://github.com/aws-powertools/powertools-lambda-python/issues/1284)) - **deps:** bump jsii from 1.61.0 to 1.62.0 ([#1294](https://github.com/aws-powertools/powertools-lambda-python/issues/1294)) - **deps:** bump constructs from 10.1.1 to 10.1.46 ([#1306](https://github.com/aws-powertools/powertools-lambda-python/issues/1306)) - **deps:** bump actions/setup-node from 2 to 3 ([#1281](https://github.com/aws-powertools/powertools-lambda-python/issues/1281)) - **deps:** bump fastjsonschema from 2.15.3 to 2.16.1 ([#1309](https://github.com/aws-powertools/powertools-lambda-python/issues/1309)) - **deps:** bump constructs from 10.1.1 to 10.1.49 ([#1308](https://github.com/aws-powertools/powertools-lambda-python/issues/1308)) - **deps:** bump attrs from 21.2.0 to 21.4.0 ([#1282](https://github.com/aws-powertools/powertools-lambda-python/issues/1282)) - **deps:** bump aws-cdk-lib from 2.29.0 to 2.31.1 ([#1290](https://github.com/aws-powertools/powertools-lambda-python/issues/1290)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.24.12 to 1.24.27 ([#1293](https://github.com/aws-powertools/powertools-lambda-python/issues/1293)) - **deps-dev:** bump mypy-boto3-appconfig from 1.24.0 to 1.24.29 ([#1295](https://github.com/aws-powertools/powertools-lambda-python/issues/1295)) - **governance:** remove any step relying on master branch - **governance:** update emeritus affiliation - **layers:** add release pipeline in GitHub Actions ([#1278](https://github.com/aws-powertools/powertools-lambda-python/issues/1278)) - **layers:** bump to 22 for 1.26.3 ## [v1.26.3](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.26.2...v1.26.3) - 2022-07-04 ## Bug Fixes - **ci:** remove utf-8 body in octokit body req - **ci:** improve msg visibility on closed issues - **ci:** disable merged_pr workflow - **ci:** merged_pr add issues write access - **ci:** quote prBody GH expr on_opened_pr - **ci:** reusable workflow secrets param - **logger:** support additional args for handlers when injecting lambda context ([#1276](https://github.com/aws-powertools/powertools-lambda-python/issues/1276)) - **logger:** preserve std keys when using custom formatters ([#1264](https://github.com/aws-powertools/powertools-lambda-python/issues/1264)) ## Documentation - **lint:** add markdownlint rules and automation ([#1256](https://github.com/aws-powertools/powertools-lambda-python/issues/1256)) - **logger:** document enriching logs with logrecord 
attributes ([#1271](https://github.com/aws-powertools/powertools-lambda-python/issues/1271)) - **logger:** snippets split, improved, and lint ([#1262](https://github.com/aws-powertools/powertools-lambda-python/issues/1262)) - **metrics:** snippets split, improved, and lint ([#1272](https://github.com/aws-powertools/powertools-lambda-python/issues/1272)) - **tracer:** snippets split, improved, and lint ([#1261](https://github.com/aws-powertools/powertools-lambda-python/issues/1261)) - **tracer:** split and lint code snippets ([#1260](https://github.com/aws-powertools/powertools-lambda-python/issues/1260)) ## Maintenance - move to approach B for multiple IaC - add sam build gitignore - bump to version 1.26.3 - **ci:** reactivate on_merged_pr workflow - **ci:** improve wording on closed issues action - **ci:** deactivate on_merged_pr workflow - **deps:** bump aws-xray-sdk from 2.9.0 to 2.10.0 ([#1270](https://github.com/aws-powertools/powertools-lambda-python/issues/1270)) - **deps:** bump dependabot/fetch-metadata from 1.1.1 to 1.3.2 ([#1269](https://github.com/aws-powertools/powertools-lambda-python/issues/1269)) - **deps:** bump dependabot/fetch-metadata from 1.3.2 to 1.3.3 ([#1273](https://github.com/aws-powertools/powertools-lambda-python/issues/1273)) - **deps-dev:** bump flake8-bugbear from 22.6.22 to 22.7.1 ([#1274](https://github.com/aws-powertools/powertools-lambda-python/issues/1274)) - **deps-dev:** bump flake8-bugbear from 22.4.25 to 22.6.22 ([#1258](https://github.com/aws-powertools/powertools-lambda-python/issues/1258)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.24.0 to 1.24.12 ([#1255](https://github.com/aws-powertools/powertools-lambda-python/issues/1255)) - **deps-dev:** bump mypy-boto3-secretsmanager ([#1252](https://github.com/aws-powertools/powertools-lambda-python/issues/1252)) - **governance:** fix on_merged_pr workflow syntax - **governance:** warn message on closed issues - **layers:** bump to 21 for 1.26.2 - **test-perf:** use pytest-benchmark to improve reliability ([#1250](https://github.com/aws-powertools/powertools-lambda-python/issues/1250)) ## [v1.26.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.26.1...v1.26.2) - 2022-06-16 ## Bug Fixes - **event-handler:** body to empty string in CORS preflight (ALB non-compliant) ([#1249](https://github.com/aws-powertools/powertools-lambda-python/issues/1249)) ## Code Refactoring - rename to clear_state - rename to remove_custom_keys ## Documentation - fix anchor ## Features - **logger:** add option to clear state per invocation ## Maintenance - bump to 1.26.2 - **deps:** bump actions/setup-python from 3 to 4 ([#1244](https://github.com/aws-powertools/powertools-lambda-python/issues/1244)) - **deps-dev:** bump mypy from 0.960 to 0.961 ([#1241](https://github.com/aws-powertools/powertools-lambda-python/issues/1241)) - **deps-dev:** bump mypy-boto3-ssm from 1.23.0.post1 to 1.24.0 ([#1231](https://github.com/aws-powertools/powertools-lambda-python/issues/1231)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.23.8 to 1.24.0 ([#1232](https://github.com/aws-powertools/powertools-lambda-python/issues/1232)) - **deps-dev:** bump mypy-boto3-dynamodb from 1.23.0.post1 to 1.24.0 ([#1234](https://github.com/aws-powertools/powertools-lambda-python/issues/1234)) - **deps-dev:** bump mypy-boto3-appconfig from 1.23.0.post1 to 1.24.0 ([#1233](https://github.com/aws-powertools/powertools-lambda-python/issues/1233)) - **governance:** auto-merge on all PR events - **governance:** add release label on pr 
merge - **governance:** enforce safe scope on pr merge labelling - **governance:** limit build workflow to code changes only - **governance:** auto-merge workflow_dispatch off - **governance:** auto-merge to use squash - **governance:** check for related issue in new PRs - **governance:** auto-merge mypy-stub dependabot - **governance:** address gh reusable workflow limitation - **governance:** fix workflow action requirements & syntax - **governance:** warn message on closed issues - **metrics:** revert dimensions test before splitting ([#1243](https://github.com/aws-powertools/powertools-lambda-python/issues/1243)) ## [v1.26.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.26.0...v1.26.1) - 2022-06-07 ## Bug Fixes - **metrics:** raise SchemaValidationError for >8 metric dimensions ([#1240](https://github.com/aws-powertools/powertools-lambda-python/issues/1240)) ## Documentation - **governance:** link roadmap and maintainers doc - **maintainers:** initial maintainers playbook ([#1222](https://github.com/aws-powertools/powertools-lambda-python/issues/1222)) - **roadmap:** use pinned pause issue instead ## Maintenance - bump version 1.26.1 - **deps-dev:** bump mypy from 0.950 to 0.960 ([#1224](https://github.com/aws-powertools/powertools-lambda-python/issues/1224)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.23.0.post1 to 1.23.8 ([#1225](https://github.com/aws-powertools/powertools-lambda-python/issues/1225)) ## [v1.26.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.10...v1.26.0) - 2022-05-20 ## Bug Fixes - **batch:** missing space in BatchProcessingError message ([#1201](https://github.com/aws-powertools/powertools-lambda-python/issues/1201)) - **batch:** docstring fix for success_handler() record parameter ([#1202](https://github.com/aws-powertools/powertools-lambda-python/issues/1202)) - **docs:** remove Slack link ([#1210](https://github.com/aws-powertools/powertools-lambda-python/issues/1210)) ## Documentation - **layer:** upgrade to 1.25.10 - **roadmap:** add new roadmap section ([#1204](https://github.com/aws-powertools/powertools-lambda-python/issues/1204)) ## Features - **parameters:** accept boto3_client to support private endpoints and ease testing ([#1096](https://github.com/aws-powertools/powertools-lambda-python/issues/1096)) ## Maintenance - bump to 1.26.0 - **deps:** bump pydantic from 1.9.0 to 1.9.1 ([#1221](https://github.com/aws-powertools/powertools-lambda-python/issues/1221)) - **deps:** bump email-validator from 1.1.3 to 1.2.1 ([#1199](https://github.com/aws-powertools/powertools-lambda-python/issues/1199)) - **deps-dev:** bump mypy-boto3-secretsmanager from 1.21.34 to 1.23.0.post1 ([#1218](https://github.com/aws-powertools/powertools-lambda-python/issues/1218)) - **deps-dev:** bump mypy-boto3-appconfig from 1.21.34 to 1.23.0.post1 ([#1219](https://github.com/aws-powertools/powertools-lambda-python/issues/1219)) - **deps-dev:** bump mypy-boto3-ssm from 1.21.34 to 1.23.0.post1 ([#1220](https://github.com/aws-powertools/powertools-lambda-python/issues/1220)) ## [v1.25.10](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.9...v1.25.10) - 2022-04-29 ## Bug Fixes - **data-classes:** Add missing SES fields and ([#1045](https://github.com/aws-powertools/powertools-lambda-python/issues/1045)) - **deps:** Ignore boto3 changes until needed ([#1151](https://github.com/aws-powertools/powertools-lambda-python/issues/1151)) - **deps-dev:** remove jmespath due to dev deps conflict 
([#1148](https://github.com/aws-powertools/powertools-lambda-python/issues/1148)) - **event_handler:** exception_handler to handle ServiceError exceptions ([#1160](https://github.com/aws-powertools/powertools-lambda-python/issues/1160)) - **event_handler:** Allow for event_source support ([#1159](https://github.com/aws-powertools/powertools-lambda-python/issues/1159)) - **parser:** Add missing fields for SESEvent ([#1027](https://github.com/aws-powertools/powertools-lambda-python/issues/1027)) ## Documentation - **layer:** upgrade to 1.25.9 ## Features - **parameters:** add clear_cache method for providers ([#1194](https://github.com/aws-powertools/powertools-lambda-python/issues/1194)) ## Maintenance - include regression in changelog - bump to 1.25.10 - **ci:** changelog pre-generation to fetch tags from origin - **ci:** disable mergify configuration after breaking changes ([#1188](https://github.com/aws-powertools/powertools-lambda-python/issues/1188)) - **ci:** post release on tagged issues too - **deps:** bump codecov/codecov-action from 3.0.0 to 3.1.0 ([#1143](https://github.com/aws-powertools/powertools-lambda-python/issues/1143)) - **deps:** bump github/codeql-action from 1 to 2 ([#1154](https://github.com/aws-powertools/powertools-lambda-python/issues/1154)) - **deps-dev:** bump flake8-eradicate from 1.2.0 to 1.2.1 ([#1158](https://github.com/aws-powertools/powertools-lambda-python/issues/1158)) - **deps-dev:** bump mypy from 0.942 to 0.950 ([#1162](https://github.com/aws-powertools/powertools-lambda-python/issues/1162)) - **deps-dev:** bump mkdocs-git-revision-date-plugin ([#1146](https://github.com/aws-powertools/powertools-lambda-python/issues/1146)) - **deps-dev:** bump flake8-bugbear from 22.1.11 to 22.4.25 ([#1156](https://github.com/aws-powertools/powertools-lambda-python/issues/1156)) - **deps-dev:** bump xenon from 0.8.0 to 0.9.0 ([#1145](https://github.com/aws-powertools/powertools-lambda-python/issues/1145)) - **deps-dev:** bump mypy from 0.931 to 0.942 ([#1133](https://github.com/aws-powertools/powertools-lambda-python/issues/1133)) ## Regression - **parser:** Add missing fields for SESEvent ([#1027](https://github.com/aws-powertools/powertools-lambda-python/issues/1027)) ([#1190](https://github.com/aws-powertools/powertools-lambda-python/issues/1190)) ## [v1.25.9](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.8...v1.25.9) - 2022-04-21 ## Bug Fixes - **deps:** correct py36 marker for jmespath ## Maintenance - bump to 1.25.9 ## [v1.25.8](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.7...v1.25.8) - 2022-04-21 ## Bug Fixes - removed ambiguous quotes from labels. 
- **deps:** update jmespath marker to support 1.0 and py3.6 ([#1139](https://github.com/aws-powertools/powertools-lambda-python/issues/1139)) - **governance:** update label in names in issues ## Documentation - **install:** instructions to reduce pydantic package size ([#1077](https://github.com/aws-powertools/powertools-lambda-python/issues/1077)) - **layer:** remove link from clipboard button ([#1135](https://github.com/aws-powertools/powertools-lambda-python/issues/1135)) - **layer:** update to 1.25.7 ## Maintenance - bump to 1.25.8 - **deps:** bump codecov/codecov-action from 2.1.0 to 3.0.0 ([#1102](https://github.com/aws-powertools/powertools-lambda-python/issues/1102)) - **deps:** bump actions/upload-artifact from 2 to 3 ([#1103](https://github.com/aws-powertools/powertools-lambda-python/issues/1103)) - **deps-dev:** bump mkdocs-material from 8.2.4 to 8.2.7 ([#1131](https://github.com/aws-powertools/powertools-lambda-python/issues/1131)) - **deps-dev:** bump pytest from 6.2.5 to 7.0.1 ([#1063](https://github.com/aws-powertools/powertools-lambda-python/issues/1063)) ## [v1.25.7](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.6...v1.25.7) - 2022-04-08 ## Bug Fixes - **api_gateway:** allow whitespace in routes' path parameter ([#1099](https://github.com/aws-powertools/powertools-lambda-python/issues/1099)) - **api_gateway:** allow whitespace in routes' path parameter ([#1099](https://github.com/aws-powertools/powertools-lambda-python/issues/1099)) - **idempotency:** pass by value on idem key to guard inadvert mutations ([#1090](https://github.com/aws-powertools/powertools-lambda-python/issues/1090)) - **logger:** clear_state should keep custom key formats ([#1095](https://github.com/aws-powertools/powertools-lambda-python/issues/1095)) - **middleware_factory:** ret type annotation for handler dec ([#1066](https://github.com/aws-powertools/powertools-lambda-python/issues/1066)) ## Documentation - **layer:** update to 1.25.6; cosmetic changes ## Maintenance - bump to 1.25.7 - **governance:** refresh pull request template sections - **governance:** update external non-triage effort disclaimer - **governance:** update static typing to a form - **governance:** update rfc to a form - **governance:** update feat request to a form - **governance:** bug report form typo - **governance:** update docs report to a form - **governance:** update bug report to a form - **governance:** new ask a question - **governance:** new static typing report ## [v1.25.6](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.5...v1.25.6) - 2022-04-01 ## Bug Fixes - **logger:** clear_state regression on absent standard keys ([#1088](https://github.com/aws-powertools/powertools-lambda-python/issues/1088)) ## Documentation - **layer:** bump to 1.25.5 ## Maintenance - bump to 1.25.6 ## [v1.25.5](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.4...v1.25.5) - 2022-03-18 ## Bug Fixes - **logger-utils:** regression on exclude set leading to no formatter ([#1080](https://github.com/aws-powertools/powertools-lambda-python/issues/1080)) ## Maintenance - bump to 1.25.5 ## [v1.25.4](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.3...v1.25.4) - 2022-03-17 ## Bug Fixes - package_logger as const over logger instance - repurpose test to cover parent loggers case - use addHandler over monkeypatch ## Documentation - **appsync:** fix typo - **contributing:** operational excellence pause - **layer:** update to 1.25.3 ## Maintenance - bump to 
1.25.4 - remove duplicate test - comment reason for change - remove unnecessary test - lint unused import ## Regression - service_name fixture ## Pull Requests - Merge pull request [#1075](https://github.com/aws-powertools/powertools-lambda-python/issues/1075) from mploski/fix/existing-loggers-duplicated-logs ## [v1.25.3](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.2...v1.25.3) - 2022-03-09 ## Bug Fixes - **logger:** ensure state is cleared for custom formatters ([#1072](https://github.com/aws-powertools/powertools-lambda-python/issues/1072)) ## Documentation - **plugin:** add mermaid to create diagram as code ([#1070](https://github.com/aws-powertools/powertools-lambda-python/issues/1070)) ## Maintenance - bump to 1.25.3 ## [v1.25.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.1...v1.25.2) - 2022-03-07 ## Bug Fixes - **event_handler:** docs snippets, high-level import CorsConfig ([#1019](https://github.com/aws-powertools/powertools-lambda-python/issues/1019)) - **lambda-authorizer:** allow proxy resources path in arn ([#1051](https://github.com/aws-powertools/powertools-lambda-python/issues/1051)) - **metrics:** flush upon a single metric 100th data point ([#1046](https://github.com/aws-powertools/powertools-lambda-python/issues/1046)) ## Documentation - **layer:** update to 1.25.1 - **parser:** APIGatewayProxyEvent to APIGatewayProxyEventModel ([#1061](https://github.com/aws-powertools/powertools-lambda-python/issues/1061)) ## Maintenance - bump to 1.25.2 - **deps:** bump actions/setup-python from 2.3.1 to 3 ([#1048](https://github.com/aws-powertools/powertools-lambda-python/issues/1048)) - **deps:** bump actions/checkout from 2 to 3 ([#1052](https://github.com/aws-powertools/powertools-lambda-python/issues/1052)) - **deps:** bump actions/github-script from 5 to 6 ([#1023](https://github.com/aws-powertools/powertools-lambda-python/issues/1023)) - **deps:** bump fastjsonschema from 2.15.2 to 2.15.3 ([#949](https://github.com/aws-powertools/powertools-lambda-python/issues/949)) - **deps-dev:** bump mkdocs-material from 8.1.9 to 8.2.4 ([#1054](https://github.com/aws-powertools/powertools-lambda-python/issues/1054)) ## [v1.25.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.25.0...v1.25.1) - 2022-02-14 ## Bug Fixes - **batch:** bugfix to clear exceptions between executions ([#1022](https://github.com/aws-powertools/powertools-lambda-python/issues/1022)) ## Maintenance - bump to 1.25.1 - **layers:** bump to 10 for 1.25.0 ## [v1.25.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.24.2...v1.25.0) - 2022-02-09 ## Bug Fixes - **apigateway:** remove indentation in debug_mode ([#987](https://github.com/aws-powertools/powertools-lambda-python/issues/987)) - **batch:** delete >10 messages in legacy sqs processor ([#818](https://github.com/aws-powertools/powertools-lambda-python/issues/818)) - **ci:** pr label regex for special chars in title - **logger:** exclude source_logger in copy_config_to_registered_loggers ([#1001](https://github.com/aws-powertools/powertools-lambda-python/issues/1001)) - **logger:** test generates logfile ## Documentation - fix syntax errors and line highlights ([#1004](https://github.com/aws-powertools/powertools-lambda-python/issues/1004)) - add better BDD coments - **event-handler:** improve testing section for graphql ([#996](https://github.com/aws-powertools/powertools-lambda-python/issues/996)) - **layer:** update to 1.24.2 - **parameters:** add testing your code 
section ([#1017](https://github.com/aws-powertools/powertools-lambda-python/issues/1017)) - **theme:** upgrade mkdocs-material to 8.x ([#1002](https://github.com/aws-powertools/powertools-lambda-python/issues/1002)) - **tutorial:** fix broken internal links ([#1000](https://github.com/aws-powertools/powertools-lambda-python/issues/1000)) ## Features - **event-handler:** new resolvers to fix current_event typing ([#978](https://github.com/aws-powertools/powertools-lambda-python/issues/978)) - **logger:** log_event support event data classes (e.g. S3Event) ([#984](https://github.com/aws-powertools/powertools-lambda-python/issues/984)) - **mypy:** complete mypy support for the entire codebase ([#943](https://github.com/aws-powertools/powertools-lambda-python/issues/943)) ## Maintenance - bump to 1.25.0 - correct docs - correct docs - use isinstance over type - **deps-dev:** bump flake8-bugbear from 21.11.29 to 22.1.11 ([#955](https://github.com/aws-powertools/powertools-lambda-python/issues/955)) - **metrics:** fix tests when warnings are disabled ([#994](https://github.com/aws-powertools/powertools-lambda-python/issues/994)) ## Pull Requests - Merge pull request [#971](https://github.com/aws-powertools/powertools-lambda-python/issues/971) from gyft/fix-logger-util-tests ## [v1.24.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.24.1...v1.24.2) - 2022-01-21 ## Bug Fixes - **data-classes:** underscore support in api gateway authorizer resource name ([#969](https://github.com/aws-powertools/powertools-lambda-python/issues/969)) ## Documentation - **layer:** update to 1.24.1 ## Maintenance - bump to 1.24.2 ## [v1.24.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.24.0...v1.24.1) - 2022-01-20 ## Bug Fixes - remove unused json import - remove apigw contract when using event-handler, apigw tracing - use decorators, split cold start to ease reading - incorrect log keys, indentation, snippet consistency - remove f-strings that doesn't evaluate expr - **batch:** report multiple failures ([#967](https://github.com/aws-powertools/powertools-lambda-python/issues/967)) - **data-classes:** docstring typos and clean up ([#937](https://github.com/aws-powertools/powertools-lambda-python/issues/937)) - **parameters:** appconfig internal \_get docstrings ([#934](https://github.com/aws-powertools/powertools-lambda-python/issues/934)) ## Documentation - rename quickstart to tutorial in readme - rename to tutorial given the size - add final consideration section - **batch:** snippet typo on batch processed messages iteration ([#951](https://github.com/aws-powertools/powertools-lambda-python/issues/951)) - **batch:** fix typo in context manager keyword ([#938](https://github.com/aws-powertools/powertools-lambda-python/issues/938)) - **homepage:** link to typescript version ([#950](https://github.com/aws-powertools/powertools-lambda-python/issues/950)) - **install:** new lambda layer for 1.24.0 release - **metrics:** keep it consistent with other sections, update metric names - **nav:** make REST and GraphQL event handlers more explicit ([#959](https://github.com/aws-powertools/powertools-lambda-python/issues/959)) - **quickstart:** expand on intro line - **quickstart:** tidy requirements up - **quickstart:** make section agnostic to json lib - **quickstart:** same process for Logger - **quickstart:** add sub-sections, fix highlight & code - **quickstart:** sentence fragmentation, tidy up - **tenets:** make core, non-core more explicit - **tracer:** warning to note on 
local traces - **tracer:** add initial image, requirements - **tracer:** add annotation, metadata, and image - **tracer:** update ServiceLens image w/ API GW, copywriting - **tutorial:** fix path to images ([#963](https://github.com/aws-powertools/powertools-lambda-python/issues/963)) ## Features - **ci:** auto-notify & close issues on release - **logger:** clone powertools logger config to any Python logger ([#927](https://github.com/aws-powertools/powertools-lambda-python/issues/927)) ## Maintenance - bump to 1.24.1 - bump to 1.24.1 - **ci:** run codeql analysis on push only - **ci:** fix mergify dependabot queue - **ci:** add queue name in mergify - **ci:** remove mergify legacy key - **ci:** update mergify bot breaking change - **ci:** safely label PR based on title - **deps:** bump pydantic from 1.8.2 to 1.9.0 ([#933](https://github.com/aws-powertools/powertools-lambda-python/issues/933)) - **deps-dev:** bump mypy from 0.930 to 0.931 ([#941](https://github.com/aws-powertools/powertools-lambda-python/issues/941)) ## Regression - order to APP logger/service name due to screenshots ## Pull Requests - Merge pull request [#769](https://github.com/aws-powertools/powertools-lambda-python/issues/769) from mploski/docs/quick-start ## [v1.24.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.23.0...v1.24.0) - 2021-12-31 ## Bug Fixes - **apigateway:** support [@app](https://github.com/app).not_found() syntax & housekeeping ([#926](https://github.com/aws-powertools/powertools-lambda-python/issues/926)) - **event-sources:** handle dynamodb null type as none, not bool ([#929](https://github.com/aws-powertools/powertools-lambda-python/issues/929)) - **warning:** future distutils deprecation ([#921](https://github.com/aws-powertools/powertools-lambda-python/issues/921)) ## Documentation - consistency around admonitions and snippets ([#919](https://github.com/aws-powertools/powertools-lambda-python/issues/919)) - Added GraphQL Sample API to Examples section of README.md ([#930](https://github.com/aws-powertools/powertools-lambda-python/issues/930)) - **batch:** remove leftover from legacy - **layer:** bump Lambda Layer to version 6 - **tracer:** new ignore_endpoint feature ([#931](https://github.com/aws-powertools/powertools-lambda-python/issues/931)) ## Features - **event-sources:** cache parsed json in data class ([#909](https://github.com/aws-powertools/powertools-lambda-python/issues/909)) - **feature_flags:** support beyond boolean values (JSON values) ([#804](https://github.com/aws-powertools/powertools-lambda-python/issues/804)) - **idempotency:** support dataclasses & pydantic models payloads ([#908](https://github.com/aws-powertools/powertools-lambda-python/issues/908)) - **logger:** support use_datetime_directive for timestamps ([#920](https://github.com/aws-powertools/powertools-lambda-python/issues/920)) - **tracer:** ignore tracing for certain hostname(s) or url(s) ([#910](https://github.com/aws-powertools/powertools-lambda-python/issues/910)) ## Maintenance - bump to 1.24.0 - **deps-dev:** bump mypy from 0.920 to 0.930 ([#925](https://github.com/aws-powertools/powertools-lambda-python/issues/925)) ## [v1.23.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.22.0...v1.23.0) - 2021-12-20 ## Bug Fixes - **apigateway:** allow list of HTTP methods in route method ([#838](https://github.com/aws-powertools/powertools-lambda-python/issues/838)) - **event-sources:** Pass authorizer data to APIGatewayEventAuthorizer 
([#897](https://github.com/aws-powertools/powertools-lambda-python/issues/897)) - **event-sources:** handle claimsOverrideDetails set to null ([#878](https://github.com/aws-powertools/powertools-lambda-python/issues/878)) - **idempotency:** include decorated fn name in hash ([#869](https://github.com/aws-powertools/powertools-lambda-python/issues/869)) - **metrics:** explicit type to single_metric ctx manager ([#865](https://github.com/aws-powertools/powertools-lambda-python/issues/865)) - **parameters:** appconfig transform and return types ([#877](https://github.com/aws-powertools/powertools-lambda-python/issues/877)) - **parser:** overload parse when using envelope ([#885](https://github.com/aws-powertools/powertools-lambda-python/issues/885)) - **parser:** kinesis sequence number is str, not int ([#907](https://github.com/aws-powertools/powertools-lambda-python/issues/907)) - **parser:** mypy support for payload type override as models ([#883](https://github.com/aws-powertools/powertools-lambda-python/issues/883)) - **tracer:** add warm start annotation (ColdStart=False) ([#851](https://github.com/aws-powertools/powertools-lambda-python/issues/851)) ## Documentation - external reference to cloudformation custom resource helper ([#914](https://github.com/aws-powertools/powertools-lambda-python/issues/914)) - add new public Slack invite - disable search blur in non-prod env - update Lambda Layers version - **apigateway:** add new not_found feature ([#915](https://github.com/aws-powertools/powertools-lambda-python/issues/915)) - **apigateway:** fix sample layout provided ([#864](https://github.com/aws-powertools/powertools-lambda-python/issues/864)) - **appsync:** fix users.py typo to locations [#830](https://github.com/aws-powertools/powertools-lambda-python/issues/830) - **lambda_layer:** fix CDK layer syntax ## Features - **apigateway:** add exception_handler support ([#898](https://github.com/aws-powertools/powertools-lambda-python/issues/898)) - **apigateway:** access parent api resolver from router ([#842](https://github.com/aws-powertools/powertools-lambda-python/issues/842)) - **batch:** new BatchProcessor for SQS, DynamoDB, Kinesis ([#886](https://github.com/aws-powertools/powertools-lambda-python/issues/886)) - **logger:** allow handler with custom kwargs signature ([#913](https://github.com/aws-powertools/powertools-lambda-python/issues/913)) - **tracer:** add service annotation when service is set ([#861](https://github.com/aws-powertools/powertools-lambda-python/issues/861)) ## Maintenance - correct pr label order - minor housekeeping before release ([#912](https://github.com/aws-powertools/powertools-lambda-python/issues/912)) - bump to 1.23.0 - **ci:** split latest docs workflow - **deps:** bump fastjsonschema from 2.15.1 to 2.15.2 ([#891](https://github.com/aws-powertools/powertools-lambda-python/issues/891)) - **deps:** bump actions/setup-python from 2.2.2 to 2.3.0 ([#831](https://github.com/aws-powertools/powertools-lambda-python/issues/831)) - **deps:** bump aws-xray-sdk from 2.8.0 to 2.9.0 ([#876](https://github.com/aws-powertools/powertools-lambda-python/issues/876)) - **deps:** support arm64 when developing locally ([#862](https://github.com/aws-powertools/powertools-lambda-python/issues/862)) - **deps:** bump actions/setup-python from 2.3.0 to 2.3.1 ([#852](https://github.com/aws-powertools/powertools-lambda-python/issues/852)) - **deps-dev:** bump flake8 from 3.9.2 to 4.0.1 ([#789](https://github.com/aws-powertools/powertools-lambda-python/issues/789)) - 
**deps-dev:** bump black from 21.10b0 to 21.11b1 ([#839](https://github.com/aws-powertools/powertools-lambda-python/issues/839)) - **deps-dev:** bump black from 21.11b1 to 21.12b0 ([#872](https://github.com/aws-powertools/powertools-lambda-python/issues/872)) - **deps-dev:** bump mypy from 0.910 to 0.920 ([#903](https://github.com/aws-powertools/powertools-lambda-python/issues/903)) ## [v1.22.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.21.1...v1.22.0) - 2021-11-17 ## Bug Fixes - change supported python version from 3.6.1 to 3.6.2, bump black ([#807](https://github.com/aws-powertools/powertools-lambda-python/issues/807)) - **ci:** comment custom publish version checker - **ci:** skip sync master on docs hotfix - **parser:** body/QS can be null or omitted in apigw v1/v2 ([#820](https://github.com/aws-powertools/powertools-lambda-python/issues/820)) ## Code Refactoring - **apigateway:** Add BaseRouter and duplicate route check ([#757](https://github.com/aws-powertools/powertools-lambda-python/issues/757)) ## Documentation - updated Lambda Layers definition & limitations. ([#775](https://github.com/aws-powertools/powertools-lambda-python/issues/775)) - Idiomatic tenet updated to Progressive - use higher contrast font ([#822](https://github.com/aws-powertools/powertools-lambda-python/issues/822)) - use higher contrast font - fix indentation of SAM snippets in install section ([#778](https://github.com/aws-powertools/powertools-lambda-python/issues/778)) - improve public lambda layer wording, clipboard buttons ([#762](https://github.com/aws-powertools/powertools-lambda-python/issues/762)) - add amplify-cli instructions for public layer ([#754](https://github.com/aws-powertools/powertools-lambda-python/issues/754)) - **api-gateway:** add support for new router feature ([#767](https://github.com/aws-powertools/powertools-lambda-python/issues/767)) - **apigateway:** re-add sample layout, add considerations ([#826](https://github.com/aws-powertools/powertools-lambda-python/issues/826)) - **appsync:** add new router feature ([#821](https://github.com/aws-powertools/powertools-lambda-python/issues/821)) - **idempotency:** add support for DynamoDB composite keys ([#808](https://github.com/aws-powertools/powertools-lambda-python/issues/808)) - **tenets:** update Idiomatic tenet to Progressive ([#823](https://github.com/aws-powertools/powertools-lambda-python/issues/823)) ## Features - **apigateway:** add Router to allow large routing composition ([#645](https://github.com/aws-powertools/powertools-lambda-python/issues/645)) - **appsync:** add Router to allow large resolver composition ([#776](https://github.com/aws-powertools/powertools-lambda-python/issues/776)) - **data-classes:** ActiveMQ and RabbitMQ support ([#770](https://github.com/aws-powertools/powertools-lambda-python/issues/770)) - **logger:** add ALB correlation ID support ([#816](https://github.com/aws-powertools/powertools-lambda-python/issues/816)) ## Maintenance - fix var expr - remove Lambda Layer version tag - bump to 1.22.0 - conditional to publish docs only attempt 3 - conditional to publish docs only attempt 2 - conditional to publish docs only - **deps:** bump boto3 from 1.18.58 to 1.18.59 ([#760](https://github.com/aws-powertools/powertools-lambda-python/issues/760)) - **deps:** bump boto3 from 1.18.56 to 1.18.58 ([#755](https://github.com/aws-powertools/powertools-lambda-python/issues/755)) - **deps:** bump urllib3 from 1.26.4 to 1.26.5 
([#787](https://github.com/aws-powertools/powertools-lambda-python/issues/787)) - **deps:** bump boto3 from 1.19.6 to 1.20.3 ([#809](https://github.com/aws-powertools/powertools-lambda-python/issues/809)) - **deps:** bump boto3 from 1.18.61 to 1.19.6 ([#783](https://github.com/aws-powertools/powertools-lambda-python/issues/783)) - **deps:** bump boto3 from 1.20.3 to 1.20.5 ([#817](https://github.com/aws-powertools/powertools-lambda-python/issues/817)) - **deps:** bump boto3 from 1.18.59 to 1.18.61 ([#766](https://github.com/aws-powertools/powertools-lambda-python/issues/766)) - **deps-dev:** bump coverage from 6.0.1 to 6.0.2 ([#764](https://github.com/aws-powertools/powertools-lambda-python/issues/764)) - **deps-dev:** bump pytest-asyncio from 0.15.1 to 0.16.0 ([#782](https://github.com/aws-powertools/powertools-lambda-python/issues/782)) - **deps-dev:** bump flake8-eradicate from 1.1.0 to 1.2.0 ([#784](https://github.com/aws-powertools/powertools-lambda-python/issues/784)) - **deps-dev:** bump flake8-isort from 4.0.0 to 4.1.1 ([#785](https://github.com/aws-powertools/powertools-lambda-python/issues/785)) - **deps-dev:** bump mkdocs-material from 7.3.2 to 7.3.3 ([#758](https://github.com/aws-powertools/powertools-lambda-python/issues/758)) - **deps-dev:** bump flake8-comprehensions from 3.6.1 to 3.7.0 ([#759](https://github.com/aws-powertools/powertools-lambda-python/issues/759)) - **deps-dev:** bump mkdocs-material from 7.3.3 to 7.3.5 ([#781](https://github.com/aws-powertools/powertools-lambda-python/issues/781)) - **deps-dev:** bump coverage from 6.0 to 6.0.1 ([#751](https://github.com/aws-powertools/powertools-lambda-python/issues/751)) - **deps-dev:** bump mkdocs-material from 7.3.5 to 7.3.6 ([#791](https://github.com/aws-powertools/powertools-lambda-python/issues/791)) - **deps-dev:** bump coverage from 6.0.2 to 6.1.2 ([#810](https://github.com/aws-powertools/powertools-lambda-python/issues/810)) - **deps-dev:** bump isort from 5.9.3 to 5.10.1 ([#811](https://github.com/aws-powertools/powertools-lambda-python/issues/811)) ## [v1.21.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.21.0...v1.21.1) - 2021-10-07 ## Documentation - add new public layer ARNs ([#746](https://github.com/aws-powertools/powertools-lambda-python/issues/746)) ## Maintenance - include public layers changelog - bump to 1.21.1 - include regression in changelog - ignore constants in test cov ([#745](https://github.com/aws-powertools/powertools-lambda-python/issues/745)) - ignore constants in tests cov - add support for publishing fallback - **deps:** bump boto3 from 1.18.54 to 1.18.56 ([#742](https://github.com/aws-powertools/powertools-lambda-python/issues/742)) - **deps-dev:** bump mkdocs-material from 7.3.1 to 7.3.2 ([#741](https://github.com/aws-powertools/powertools-lambda-python/issues/741)) ## Regression - **metrics:** typing regression on log_metrics callable ([#744](https://github.com/aws-powertools/powertools-lambda-python/issues/744)) ## [v1.21.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.20.2...v1.21.0) - 2021-10-05 ## Bug Fixes - **data-classes:** use correct asdict funciton ([#666](https://github.com/aws-powertools/powertools-lambda-python/issues/666)) - **feature-flags:** rules should evaluate with an AND op ([#724](https://github.com/aws-powertools/powertools-lambda-python/issues/724)) - **idempotency:** sorting keys before hashing ([#722](https://github.com/aws-powertools/powertools-lambda-python/issues/722)) - **idempotency:** sorting keys before 
hashing - **logger:** push extra keys to the end ([#722](https://github.com/aws-powertools/powertools-lambda-python/issues/722)) - **mypy:** a few return types, type signatures, and untyped areas ([#718](https://github.com/aws-powertools/powertools-lambda-python/issues/718)) ## Code Refactoring - **data-classes:** clean up internal logic for APIGatewayAuthorizerResponse ([#643](https://github.com/aws-powertools/powertools-lambda-python/issues/643)) ## Documentation - Terraform reference for SAR Lambda Layer ([#716](https://github.com/aws-powertools/powertools-lambda-python/issues/716)) - add team behind it and email - **event-handler:** document catch-all routes ([#705](https://github.com/aws-powertools/powertools-lambda-python/issues/705)) - **idempotency:** fix misleading idempotent examples ([#661](https://github.com/aws-powertools/powertools-lambda-python/issues/661)) - **jmespath:** clarify envelope terminology - **parser:** fix incorrect import in root_validator example ([#735](https://github.com/aws-powertools/powertools-lambda-python/issues/735)) ## Features - expose jmespath powertools functions ([#736](https://github.com/aws-powertools/powertools-lambda-python/issues/736)) - add get_raw_configuration property in store; expose store - boto3 sessions in batch, parameters & idempotency ([#717](https://github.com/aws-powertools/powertools-lambda-python/issues/717)) - **feature-flags:** Bring your own logger for debug ([#709](https://github.com/aws-powertools/powertools-lambda-python/issues/709)) - **feature-flags:** improve "IN/NOT_IN"; new rule actions ([#710](https://github.com/aws-powertools/powertools-lambda-python/issues/710)) - **feature-flags:** get_raw_configuration property in Store ([#720](https://github.com/aws-powertools/powertools-lambda-python/issues/720)) - **feature_flags:** Added inequality conditions ([#721](https://github.com/aws-powertools/powertools-lambda-python/issues/721)) - **idempotency:** makes customers unit testing easier ([#719](https://github.com/aws-powertools/powertools-lambda-python/issues/719)) - **validator:** include missing data elements from a validation error ([#686](https://github.com/aws-powertools/powertools-lambda-python/issues/686)) ## Maintenance - add python 3.9 support - bump to 1.21.0 - **deps:** bump boto3 from 1.18.41 to 1.18.49 ([#703](https://github.com/aws-powertools/powertools-lambda-python/issues/703)) - **deps:** bump boto3 from 1.18.32 to 1.18.38 ([#671](https://github.com/aws-powertools/powertools-lambda-python/issues/671)) - **deps:** bump boto3 from 1.18.38 to 1.18.41 ([#677](https://github.com/aws-powertools/powertools-lambda-python/issues/677)) - **deps:** bump boto3 from 1.18.51 to 1.18.54 ([#733](https://github.com/aws-powertools/powertools-lambda-python/issues/733)) - **deps:** bump boto3 from 1.18.49 to 1.18.51 ([#713](https://github.com/aws-powertools/powertools-lambda-python/issues/713)) - **deps:** bump codecov/codecov-action from 2.0.2 to 2.1.0 ([#675](https://github.com/aws-powertools/powertools-lambda-python/issues/675)) - **deps-dev:** bump flake8-bugbear from 21.9.1 to 21.9.2 ([#712](https://github.com/aws-powertools/powertools-lambda-python/issues/712)) - **deps-dev:** bump mkdocs-material from 7.3.0 to 7.3.1 ([#731](https://github.com/aws-powertools/powertools-lambda-python/issues/731)) - **deps-dev:** bump mkdocs-material from 7.2.8 to 7.3.0 ([#695](https://github.com/aws-powertools/powertools-lambda-python/issues/695)) - **deps-dev:** bump mkdocs-material from 7.2.6 to 7.2.8 
([#682](https://github.com/aws-powertools/powertools-lambda-python/issues/682)) - **deps-dev:** bump flake8-bugbear from 21.4.3 to 21.9.1 ([#676](https://github.com/aws-powertools/powertools-lambda-python/issues/676)) - **deps-dev:** bump coverage from 5.5 to 6.0 ([#732](https://github.com/aws-powertools/powertools-lambda-python/issues/732)) - **deps-dev:** bump radon from 4.5.2 to 5.1.0 ([#673](https://github.com/aws-powertools/powertools-lambda-python/issues/673)) - **deps-dev:** bump pytest-cov from 2.12.1 to 3.0.0 ([#730](https://github.com/aws-powertools/powertools-lambda-python/issues/730)) - **deps-dev:** bump xenon from 0.7.3 to 0.8.0 ([#669](https://github.com/aws-powertools/powertools-lambda-python/issues/669)) ## [v1.20.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.20.1...v1.20.2) - 2021-09-02 ## Bug Fixes - Fix issue with strip_prefixes ([#647](https://github.com/aws-powertools/powertools-lambda-python/issues/647)) ## Maintenance - bump to 1.20.2 - **deps:** bump boto3 from 1.18.26 to 1.18.32 ([#663](https://github.com/aws-powertools/powertools-lambda-python/issues/663)) - **deps-dev:** bump mkdocs-material from 7.2.4 to 7.2.6 ([#665](https://github.com/aws-powertools/powertools-lambda-python/issues/665)) - **deps-dev:** bump pytest from 6.2.4 to 6.2.5 ([#662](https://github.com/aws-powertools/powertools-lambda-python/issues/662)) - **license:** Add THIRD-PARTY-LICENSES ([#641](https://github.com/aws-powertools/powertools-lambda-python/issues/641)) ## [v1.20.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.20.0...v1.20.1) - 2021-08-22 ## Bug Fixes - **idempotency:** sorting keys before hashing ([#639](https://github.com/aws-powertools/powertools-lambda-python/issues/639)) ## Maintenance - bump to 1.20.1 - markdown linter fixes ([#636](https://github.com/aws-powertools/powertools-lambda-python/issues/636)) - setup codespaces ([#637](https://github.com/aws-powertools/powertools-lambda-python/issues/637)) - **license:** add third party license ([#635](https://github.com/aws-powertools/powertools-lambda-python/issues/635)) ## [v1.20.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.19.0...v1.20.0) - 2021-08-21 ## Bug Fixes - **api-gateway:** HTTP API strip stage name from request path ([#622](https://github.com/aws-powertools/powertools-lambda-python/issues/622)) - **docs:** correct feature_flags link and json exmaples ([#605](https://github.com/aws-powertools/powertools-lambda-python/issues/605)) ## Code Refactoring - **event_handler:** match to match_results; 3.10 new keyword ([#616](https://github.com/aws-powertools/powertools-lambda-python/issues/616)) ## Documentation - **api-gateway:** add new API mapping support - **data-class:** fix invalid syntax in new AppSync Authorizer - **data-classes:** make authorizer concise; use enum ([#630](https://github.com/aws-powertools/powertools-lambda-python/issues/630)) ## Features - **data-classes:** authorizer for http api and rest api ([#620](https://github.com/aws-powertools/powertools-lambda-python/issues/620)) - **data-classes:** data_as_bytes prop KinesisStreamRecordPayload ([#628](https://github.com/aws-powertools/powertools-lambda-python/issues/628)) - **data-classes:** AppSync Lambda authorizer event ([#610](https://github.com/aws-powertools/powertools-lambda-python/issues/610)) - **event-handler:** prefixes to strip for custom mappings ([#579](https://github.com/aws-powertools/powertools-lambda-python/issues/579)) - **general:** support for Python 3.9 
([#626](https://github.com/aws-powertools/powertools-lambda-python/issues/626)) - **idempotency:** support for any synchronous function ([#625](https://github.com/aws-powertools/powertools-lambda-python/issues/625)) ## Maintenance - update changelog to reflect out-of-band commits - bump to 1.20.0 - update new changelog version tag - **actions:** include new labels - **api-docs:** enable allow_reuse to fix the docs ([#612](https://github.com/aws-powertools/powertools-lambda-python/issues/612)) - **deps:** bump boto3 from 1.18.25 to 1.18.26 ([#627](https://github.com/aws-powertools/powertools-lambda-python/issues/627)) - **deps:** bump boto3 from 1.18.24 to 1.18.25 ([#623](https://github.com/aws-powertools/powertools-lambda-python/issues/623)) - **deps:** bump boto3 from 1.18.22 to 1.18.24 ([#619](https://github.com/aws-powertools/powertools-lambda-python/issues/619)) - **deps:** bump boto3 from 1.18.21 to 1.18.22 ([#614](https://github.com/aws-powertools/powertools-lambda-python/issues/614)) - **deps:** bump boto3 from 1.18.17 to 1.18.21 ([#608](https://github.com/aws-powertools/powertools-lambda-python/issues/608)) - **deps-dev:** bump flake8-comprehensions from 3.6.0 to 3.6.1 ([#615](https://github.com/aws-powertools/powertools-lambda-python/issues/615)) - **deps-dev:** bump flake8-comprehensions from 3.5.0 to 3.6.0 ([#609](https://github.com/aws-powertools/powertools-lambda-python/issues/609)) - **deps-dev:** bump mkdocs-material from 7.2.3 to 7.2.4 ([#607](https://github.com/aws-powertools/powertools-lambda-python/issues/607)) - **docs:** correct markdown based on markdown lint ([#603](https://github.com/aws-powertools/powertools-lambda-python/issues/603)) - **shared:** fix cyclic import & refactor data extraction fn ([#613](https://github.com/aws-powertools/powertools-lambda-python/issues/613)) ## [v1.19.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.18.1...v1.19.0) - 2021-08-11 ## Bug Fixes - **deps:** bump poetry to latest ([#592](https://github.com/aws-powertools/powertools-lambda-python/issues/592)) - **feature-flags:** bug handling multiple conditions ([#599](https://github.com/aws-powertools/powertools-lambda-python/issues/599)) - **feature-toggles:** correct cdk example ([#601](https://github.com/aws-powertools/powertools-lambda-python/issues/601)) - **parser:** apigw wss validation check_message_id; housekeeping ([#553](https://github.com/aws-powertools/powertools-lambda-python/issues/553)) ## Code Refactoring - **feature-flags:** add debug for all features evaluation" ([#590](https://github.com/aws-powertools/powertools-lambda-python/issues/590)) - **feature_flags:** optimize UX and maintenance ([#563](https://github.com/aws-powertools/powertools-lambda-python/issues/563)) ## Documentation - **event-handler:** new custom serializer option - **feature-flags:** add guidance when to use vs env vars vs parameters - **feature-flags:** fix sample feature name in evaluate - **feature-flags:** create concrete documentation ([#594](https://github.com/aws-powertools/powertools-lambda-python/issues/594)) - **feature-toggles:** correct docs and typing ([#588](https://github.com/aws-powertools/powertools-lambda-python/issues/588)) - **feature_flags:** fix SAM infra, convert CDK to Python - **parameters:** auto-transforming values based on suffix ([#573](https://github.com/aws-powertools/powertools-lambda-python/issues/573)) - **readme:** add code coverage badge ([#577](https://github.com/aws-powertools/powertools-lambda-python/issues/577)) - **tracer:** update 
wording that it auto-disables on non-Lambda env ## Features - **api-gateway:** add support for custom serializer ([#568](https://github.com/aws-powertools/powertools-lambda-python/issues/568)) - **data-classes:** decode json_body if based64 encoded ([#560](https://github.com/aws-powertools/powertools-lambda-python/issues/560)) - **feature flags:** Add not_in action and rename contains to in ([#589](https://github.com/aws-powertools/powertools-lambda-python/issues/589)) - **params:** expose high level max_age, raise_on_transform_error ([#567](https://github.com/aws-powertools/powertools-lambda-python/issues/567)) - **tracer:** disable tracer when for non-Lambda envs ([#598](https://github.com/aws-powertools/powertools-lambda-python/issues/598)) ## Maintenance - only build docs on docs path - update pypi description, keywords - bump to 1.19.0 - enable autolabel based on PR title - include feature-flags docs hotfix - **deps:** bump boto3 from 1.18.15 to 1.18.17 ([#597](https://github.com/aws-powertools/powertools-lambda-python/issues/597)) - **deps:** bump boto3 from 1.18.1 to 1.18.15 ([#591](https://github.com/aws-powertools/powertools-lambda-python/issues/591)) - **deps:** bump codecov/codecov-action from 2.0.1 to 2.0.2 ([#558](https://github.com/aws-powertools/powertools-lambda-python/issues/558)) - **deps-dev:** bump mkdocs-material from 7.2.1 to 7.2.2 ([#582](https://github.com/aws-powertools/powertools-lambda-python/issues/582)) - **deps-dev:** bump mkdocs-material from 7.2.2 to 7.2.3 ([#596](https://github.com/aws-powertools/powertools-lambda-python/issues/596)) - **deps-dev:** bump pdoc3 from 0.9.2 to 0.10.0 ([#584](https://github.com/aws-powertools/powertools-lambda-python/issues/584)) - **deps-dev:** bump isort from 5.9.2 to 5.9.3 ([#574](https://github.com/aws-powertools/powertools-lambda-python/issues/574)) - **deps-dev:** bump mkdocs-material from 7.2.0 to 7.2.1 ([#566](https://github.com/aws-powertools/powertools-lambda-python/issues/566)) - **deps-dev:** bump mkdocs-material from 7.1.11 to 7.2.0 ([#551](https://github.com/aws-powertools/powertools-lambda-python/issues/551)) - **deps-dev:** bump flake8-black from 0.2.1 to 0.2.3 ([#541](https://github.com/aws-powertools/powertools-lambda-python/issues/541)) ## [v1.18.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.18.0...v1.18.1) - 2021-07-23 ## Bug Fixes - **api-gateway:** route regression non-word and unsafe URI chars ([#556](https://github.com/aws-powertools/powertools-lambda-python/issues/556)) ## Maintenance - bump 1.18.1 ## [v1.18.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.17.1...v1.18.0) - 2021-07-20 ## Bug Fixes - **api-gateway:** non-greedy route pattern regex ([#533](https://github.com/aws-powertools/powertools-lambda-python/issues/533)) - **api-gateway:** incorrect plain text mimetype [#506](https://github.com/aws-powertools/powertools-lambda-python/issues/506) - **data-classes:** include milliseconds in scalar types ([#504](https://github.com/aws-powertools/powertools-lambda-python/issues/504)) - **mypy:** fixes to resolve no implicit optional errors ([#521](https://github.com/aws-powertools/powertools-lambda-python/issues/521)) - **parser:** Make ApiGateway version, authorizer fields optional ([#532](https://github.com/aws-powertools/powertools-lambda-python/issues/532)) - **tracer:** mypy generic to preserve decorated method signature ([#529](https://github.com/aws-powertools/powertools-lambda-python/issues/529)) ## Code Refactoring - **feature-toggles:** Code 
coverage and housekeeping ([#530](https://github.com/aws-powertools/powertools-lambda-python/issues/530)) ## Documentation - **api-gateway:** document new HTTP service error exceptions ([#546](https://github.com/aws-powertools/powertools-lambda-python/issues/546)) - **logger:** document new get_correlation_id method ([#545](https://github.com/aws-powertools/powertools-lambda-python/issues/545)) ## Features - **api-gateway:** add debug mode ([#507](https://github.com/aws-powertools/powertools-lambda-python/issues/507)) - **api-gateway:** add common service errors ([#506](https://github.com/aws-powertools/powertools-lambda-python/issues/506)) - **event-handler:** Support AppSyncResolverEvent subclassing ([#526](https://github.com/aws-powertools/powertools-lambda-python/issues/526)) - **feat-toggle:** New simple feature toggles rule engine (WIP) ([#494](https://github.com/aws-powertools/powertools-lambda-python/issues/494)) - **logger:** add get_correlation_id method ([#516](https://github.com/aws-powertools/powertools-lambda-python/issues/516)) - **mypy:** add mypy support to makefile ([#508](https://github.com/aws-powertools/powertools-lambda-python/issues/508)) ## Maintenance - bump 1.18.0 ([#547](https://github.com/aws-powertools/powertools-lambda-python/issues/547)) - **deps:** bump codecov/codecov-action from 1 to 2.0.1 ([#539](https://github.com/aws-powertools/powertools-lambda-python/issues/539)) - **deps:** bump boto3 from 1.18.0 to 1.18.1 ([#528](https://github.com/aws-powertools/powertools-lambda-python/issues/528)) - **deps:** bump boto3 from 1.17.110 to 1.18.0 ([#527](https://github.com/aws-powertools/powertools-lambda-python/issues/527)) - **deps:** bump boto3 from 1.17.102 to 1.17.110 ([#523](https://github.com/aws-powertools/powertools-lambda-python/issues/523)) - **deps-dev:** bump mkdocs-material from 7.1.10 to 7.1.11 ([#542](https://github.com/aws-powertools/powertools-lambda-python/issues/542)) - **deps-dev:** bump mkdocs-material from 7.1.9 to 7.1.10 ([#522](https://github.com/aws-powertools/powertools-lambda-python/issues/522)) - **deps-dev:** bump isort from 5.9.1 to 5.9.2 ([#514](https://github.com/aws-powertools/powertools-lambda-python/issues/514)) - **event-handler:** adjusts exception docstrings to not confuse AppSync customers ## [v1.17.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.17.0...v1.17.1) - 2021-07-02 ## Bug Fixes - **validator:** handle built-in custom formats correctly ([#498](https://github.com/aws-powertools/powertools-lambda-python/issues/498)) ## Documentation - add Layers example for Serverless framework & CDK ([#500](https://github.com/aws-powertools/powertools-lambda-python/issues/500)) - enable dark mode switch ([#471](https://github.com/aws-powertools/powertools-lambda-python/issues/471)) - **logger:** add FAQ for cross-account searches ([#501](https://github.com/aws-powertools/powertools-lambda-python/issues/501)) - **tracer:** additional scenario when to disable auto-capture ([#499](https://github.com/aws-powertools/powertools-lambda-python/issues/499)) ## Maintenance - bump 1.17.1 ([#502](https://github.com/aws-powertools/powertools-lambda-python/issues/502)) - **deps:** bump boto3 from 1.17.101 to 1.17.102 ([#493](https://github.com/aws-powertools/powertools-lambda-python/issues/493)) - **deps:** bump boto3 from 1.17.91 to 1.17.101 ([#490](https://github.com/aws-powertools/powertools-lambda-python/issues/490)) - **deps:** bump email-validator from 1.1.2 to 1.1.3 
([#478](https://github.com/aws-powertools/powertools-lambda-python/issues/478)) - **deps:** bump boto3 from 1.17.89 to 1.17.91 ([#473](https://github.com/aws-powertools/powertools-lambda-python/issues/473)) - **deps-dev:** bump flake8-eradicate from 1.0.0 to 1.1.0 ([#492](https://github.com/aws-powertools/powertools-lambda-python/issues/492)) - **deps-dev:** bump isort from 5.8.0 to 5.9.1 ([#487](https://github.com/aws-powertools/powertools-lambda-python/issues/487)) - **deps-dev:** bump mkdocs-material from 7.1.7 to 7.1.9 ([#491](https://github.com/aws-powertools/powertools-lambda-python/issues/491)) ## [v1.17.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.16.1...v1.17.0) - 2021-06-08 ## Documentation - include new public roadmap ([#452](https://github.com/aws-powertools/powertools-lambda-python/issues/452)) - **data_classes:** fix missing dynamodb stream get_type/value - **idempotency:** remove old todo ## Features - **data-classes:** add AttributeValueType to DynamoDBStreamEvent ([#462](https://github.com/aws-powertools/powertools-lambda-python/issues/462)) - **data-classes:** decorator to instantiate data_classes and docs updates ([#442](https://github.com/aws-powertools/powertools-lambda-python/issues/442)) - **logger:** add option to clear state per invocation ([#467](https://github.com/aws-powertools/powertools-lambda-python/issues/467)) - **parser:** add support for API Gateway HTTP API [#434](https://github.com/aws-powertools/powertools-lambda-python/issues/434) ([#441](https://github.com/aws-powertools/powertools-lambda-python/issues/441)) ## Maintenance - bump xenon from 0.7.1 to 0.7.3 ([#446](https://github.com/aws-powertools/powertools-lambda-python/issues/446)) - fix changelog file redirection - include dependencies label under maintenance - ignore codecov upload - reintroduce codecov token - fix path for PR auto-labelling - assited changelog pre-generation, auto-label PR ([#443](https://github.com/aws-powertools/powertools-lambda-python/issues/443)) - enable dependabot for dep upgrades ([#444](https://github.com/aws-powertools/powertools-lambda-python/issues/444)) - enable mergify ([#450](https://github.com/aws-powertools/powertools-lambda-python/issues/450)) - dependabot/mergify guardrail for major versions - fix dependabot commit messages prefix - fix dependabot unique set config - bump mkdocs-material from 7.1.5 to 7.1.6 ([#451](https://github.com/aws-powertools/powertools-lambda-python/issues/451)) - bump boto3 from 1.17.78 to 1.17.84 ([#449](https://github.com/aws-powertools/powertools-lambda-python/issues/449)) - update mergify to require approval on dependabot ([#456](https://github.com/aws-powertools/powertools-lambda-python/issues/456)) - bump actions/setup-python from 1 to 2.2.2 ([#445](https://github.com/aws-powertools/powertools-lambda-python/issues/445)) - trial boring cyborg automation - **deps:** bump boto3 from 1.17.87 to 1.17.88 ([#463](https://github.com/aws-powertools/powertools-lambda-python/issues/463)) - **deps:** bump boto3 from 1.17.88 to 1.17.89 ([#466](https://github.com/aws-powertools/powertools-lambda-python/issues/466)) - **deps:** bump boto3 from 1.17.84 to 1.17.85 ([#455](https://github.com/aws-powertools/powertools-lambda-python/issues/455)) - **deps:** bump boto3 from 1.17.85 to 1.17.86 ([#458](https://github.com/aws-powertools/powertools-lambda-python/issues/458)) - **deps:** bump boto3 from 1.17.86 to 1.17.87 ([#459](https://github.com/aws-powertools/powertools-lambda-python/issues/459)) - **deps-dev:** bump 
mkdocs-material from 7.1.6 to 7.1.7 ([#464](https://github.com/aws-powertools/powertools-lambda-python/issues/464)) - **deps-dev:** bump pytest-cov from 2.12.0 to 2.12.1 ([#454](https://github.com/aws-powertools/powertools-lambda-python/issues/454)) - **mergify:** use job name to match GH Actions - **mergify:** disable check for matrix jobs ## [v1.16.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.16.0...v1.16.1) - 2021-05-23 ## Features - **parser:** security issue in Pydantic [#436](https://github.com/aws-powertools/powertools-lambda-python/issues/436) ([#437](https://github.com/aws-powertools/powertools-lambda-python/issues/437)) ## Maintenance - bump to 1.16.1 ## [v1.16.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.15.1...v1.16.0) - 2021-05-17 ## Features - **data-classes:** decode base64 encoded body ([#425](https://github.com/aws-powertools/powertools-lambda-python/issues/425)) - **data-classes:** support for code pipeline job event ([#416](https://github.com/aws-powertools/powertools-lambda-python/issues/416)) ## Maintenance - bump to 1.16.0 ## [v1.15.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.15.0...v1.15.1) - 2021-05-13 ## Bug Fixes - **docs:** Use updated names for ProxyEventType ([#424](https://github.com/aws-powertools/powertools-lambda-python/issues/424)) ## Documentation - update list of features - **event_handler:** add missing note on trimmed responses ## Maintenance - bump to 1.15.1 ## [v1.15.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.14.0...v1.15.0) - 2021-05-06 ## Bug Fixes - **deps:** Bump aws-xray-sdk from 2.6.0 to 2.8.0 ([#413](https://github.com/aws-powertools/powertools-lambda-python/issues/413)) - **docs:** workflow to include api ref in latest alias ([#408](https://github.com/aws-powertools/powertools-lambda-python/issues/408)) - **parser:** Improve types for parser.py ([#419](https://github.com/aws-powertools/powertools-lambda-python/issues/419)) - **validator:** event type annotation as any in validate fn ([#405](https://github.com/aws-powertools/powertools-lambda-python/issues/405)) ## Code Refactoring - simplify custom formatter for minor changes ([#417](https://github.com/aws-powertools/powertools-lambda-python/issues/417)) - **event-handler:** api gateway handler review changes ([#420](https://github.com/aws-powertools/powertools-lambda-python/issues/420)) - **event-handler:** Add ResponseBuilder and more docs ([#412](https://github.com/aws-powertools/powertools-lambda-python/issues/412)) - **logger:** BYOFormatter and Handler, UTC support, and more ([#404](https://github.com/aws-powertools/powertools-lambda-python/issues/404)) ## Documentation - **api_gateway:** new event handler for API Gateway and ALB ([#418](https://github.com/aws-powertools/powertools-lambda-python/issues/418)) - **event_handler:** fix closing brackets in CORS sample - **event_handler:** remove beta flag from new HTTP utility - **idempotency:** remove beta flag - **logger:** improvements extensibility & new features ([#415](https://github.com/aws-powertools/powertools-lambda-python/issues/415)) - **parser:** fix table and heading syntax - **tracer:** Fix line highlighting ([#395](https://github.com/aws-powertools/powertools-lambda-python/issues/395)) ## Features - add support to persist default dimensions ([#410](https://github.com/aws-powertools/powertools-lambda-python/issues/410)) - **event-handle:** allow for cors=None setting 
([#421](https://github.com/aws-powertools/powertools-lambda-python/issues/421)) - **event-handler:** add http ProxyEvent handler ([#369](https://github.com/aws-powertools/powertools-lambda-python/issues/369)) - **parser:** Support for API GW v1 proxy schema & envelope ([#403](https://github.com/aws-powertools/powertools-lambda-python/issues/403)) ## Maintenance - bump to 1.15.0 ([#422](https://github.com/aws-powertools/powertools-lambda-python/issues/422)) ## [v1.14.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.13.0...v1.14.0) - 2021-04-09 ## Bug Fixes - perf tests for Logger and fail str msgs - downgrade poetry to 1.1.4 ([#385](https://github.com/aws-powertools/powertools-lambda-python/issues/385)) - lock X-Ray SDK to 2.6.0 ([#384](https://github.com/aws-powertools/powertools-lambda-python/issues/384)) - **data-classes:** Add missing operationName ([#373](https://github.com/aws-powertools/powertools-lambda-python/issues/373)) - **idempotent:** Correctly raise IdempotencyKeyError ([#378](https://github.com/aws-powertools/powertools-lambda-python/issues/378)) - **metrics:** AttributeError raised by MediaManager and Typing and docs ([#357](https://github.com/aws-powertools/powertools-lambda-python/issues/357)) - **parser:** S3Model support empty keys ([#375](https://github.com/aws-powertools/powertools-lambda-python/issues/375)) - **tracer:** Correct type hint for MyPy ([#365](https://github.com/aws-powertools/powertools-lambda-python/issues/365)) - **workflow:** github actions depends on for release ## Documentation - Fix doc links and line highlights ([#380](https://github.com/aws-powertools/powertools-lambda-python/issues/380)) - fix extra key for versioning - update mkdocs-material to 7.1.0 - Correct link targets and line highlights ([#390](https://github.com/aws-powertools/powertools-lambda-python/issues/390)) - introduce event handlers utility section ([#388](https://github.com/aws-powertools/powertools-lambda-python/issues/388)) - enable versioning feature ([#374](https://github.com/aws-powertools/powertools-lambda-python/issues/374)) - **idempotency:** add default configuration for those not using CFN ([#391](https://github.com/aws-powertools/powertools-lambda-python/issues/391)) - **index:** fix link to event handler - **logger:** add example on how to set UTC timestamp ([#392](https://github.com/aws-powertools/powertools-lambda-python/issues/392)) - **validator:** include more complete examples & intro to JSON Schema ([#389](https://github.com/aws-powertools/powertools-lambda-python/issues/389)) ## Features - **event-handler:** Add AppSync handler decorator ([#363](https://github.com/aws-powertools/powertools-lambda-python/issues/363)) - **parameter:** add dynamodb_endpoint_url for local_testing ([#376](https://github.com/aws-powertools/powertools-lambda-python/issues/376)) - **parser:** Add S3 Object Lambda Event ([#362](https://github.com/aws-powertools/powertools-lambda-python/issues/362)) ## Maintenance - bump to 1.14.0 - add approved by field in RFC template - make RFC proposal more explicit - update automated steps in release process ## [v1.13.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.12.0...v1.13.0) - 2021-03-23 ## Bug Fixes - **deps:** Bump dependencies and fix some of the dev tooling ([#354](https://github.com/aws-powertools/powertools-lambda-python/issues/354)) - **lint:** Move `tests/THIRD-PARTY-LICENSES` to root ([#352](https://github.com/aws-powertools/powertools-lambda-python/issues/352)) ## Features - 
**data-classes:** Add S3 Object Lambda Event ([#353](https://github.com/aws-powertools/powertools-lambda-python/issues/353)) ## Maintenance - include internals in release template - bump to 1.13.0 - correct 3rd party license ## [v1.12.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.11.0...v1.12.0) - 2021-03-17 ## Bug Fixes - **idempotency:** TypeError when calling is_missing_idempotency_key with an int ([#315](https://github.com/aws-powertools/powertools-lambda-python/issues/315)) - **idempotency:** Correctly handle save_inprogress errors ([#313](https://github.com/aws-powertools/powertools-lambda-python/issues/313)) ## Code Refactoring - **parameters:** Consistently reference env ([#319](https://github.com/aws-powertools/powertools-lambda-python/issues/319)) ## Documentation - surface new 1.12.0 features and enhancements ([#344](https://github.com/aws-powertools/powertools-lambda-python/issues/344)) - Correct code examples ([#317](https://github.com/aws-powertools/powertools-lambda-python/issues/317)) - **data-classes:** Add more cognito code examples ([#340](https://github.com/aws-powertools/powertools-lambda-python/issues/340)) - **idempotency:** Correct examples and line highlights ([#312](https://github.com/aws-powertools/powertools-lambda-python/issues/312)) - **metrics:** Corrections to the code examples ([#314](https://github.com/aws-powertools/powertools-lambda-python/issues/314)) - **metrics:** remove minimum dimensions - **metrics:** Correct code examples in markdown ([#316](https://github.com/aws-powertools/powertools-lambda-python/issues/316)) - **tracer:** Fix Tracer typing hinting for Pycharm ([#345](https://github.com/aws-powertools/powertools-lambda-python/issues/345)) ## Features - **data-classes:** Add appsync scalar_types_utils ([#339](https://github.com/aws-powertools/powertools-lambda-python/issues/339)) - **data-classes:** AppSync Resolver Event ([#323](https://github.com/aws-powertools/powertools-lambda-python/issues/323)) - **idempotent:** Include function name in the idempotent key ([#326](https://github.com/aws-powertools/powertools-lambda-python/issues/326)) - **logging:** Add correlation_id support ([#321](https://github.com/aws-powertools/powertools-lambda-python/issues/321)) - **logging:** Include exception_name ([#320](https://github.com/aws-powertools/powertools-lambda-python/issues/320)) - **parameters:** Add force_fetch option ([#341](https://github.com/aws-powertools/powertools-lambda-python/issues/341)) ## Maintenance - bump to 1.12.0 - remove auto-label as restrictions prevent it from working - increase perf SLA due to slow GitHub Actions machine - add PR size labelling action # 2 - add PR size labelling action - add PR auto-label action - remove gatsby mention as migrated completed ## [v1.11.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.10.5...v1.11.0) - 2021-03-05 ## Bug Fixes - import time latency by lazily loading high level modules ([#301](https://github.com/aws-powertools/powertools-lambda-python/issues/301)) - correct behaviour to avoid caching "INPROGRESS" records ([#295](https://github.com/aws-powertools/powertools-lambda-python/issues/295)) - **idempotency:** PR feedback on config and kwargs ## Code Refactoring - **idempotent:** Change UX to use a config class for non-persistence related features ([#306](https://github.com/aws-powertools/powertools-lambda-python/issues/306)) - **metrics:** optimize validation and serialization 
([#307](https://github.com/aws-powertools/powertools-lambda-python/issues/307)) ## Documentation - **batch:** add example on how to integrate with sentry.io ([#308](https://github.com/aws-powertools/powertools-lambda-python/issues/308)) - **data-classes:** Correct import for DynamoDBRecordEventName ([#299](https://github.com/aws-powertools/powertools-lambda-python/issues/299)) - **dataclasses:** new Connect Contact Flow ([#310](https://github.com/aws-powertools/powertools-lambda-python/issues/310)) - **idempotency:** tidy up doc before release ([#309](https://github.com/aws-powertools/powertools-lambda-python/issues/309)) - **idempotent:** Fix typos and code formatting ([#305](https://github.com/aws-powertools/powertools-lambda-python/issues/305)) ## Features - Idempotency helper utility ([#245](https://github.com/aws-powertools/powertools-lambda-python/issues/245)) - **data-classes:** Add connect contact flow event ([#304](https://github.com/aws-powertools/powertools-lambda-python/issues/304)) - **idempotency:** Add raise_on_no_idempotency_key flag ([#297](https://github.com/aws-powertools/powertools-lambda-python/issues/297)) - **idempotency:** Fix KeyError when local_cache is True and an error is raised in the lambda handler ([#300](https://github.com/aws-powertools/powertools-lambda-python/issues/300)) - **idempotent:** Add support for jmespath_options ([#302](https://github.com/aws-powertools/powertools-lambda-python/issues/302)) ## Maintenance - update changelog ([#311](https://github.com/aws-powertools/powertools-lambda-python/issues/311)) - adjusts Metrics SLA for slow py36 interpreters - remove unsuccessful labeler bot - update labeler bot to sync upon PR changes - attempt 1 to fix PR labeler ## [v1.10.5](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.10.4...v1.10.5) - 2021-02-17 ## Maintenance - version bump to 1.10.5 ([#292](https://github.com/aws-powertools/powertools-lambda-python/issues/292)) ## [v1.10.4](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.10.3...v1.10.4) - 2021-02-17 ## Bug Fixes - sync features in main page - meta tags, and ext link to open in new tab ## Documentation - **data-classes:** Fix anchor tags to be lower case ([#288](https://github.com/aws-powertools/powertools-lambda-python/issues/288)) ## Maintenance - version bump to 1.10.4 ([#291](https://github.com/aws-powertools/powertools-lambda-python/issues/291)) - add default runtime key - Correct the docs location ([#289](https://github.com/aws-powertools/powertools-lambda-python/issues/289)) - enable PR labeler workflow - add auto-label for known files ## Regression - search input size ## [v1.10.3](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.10.2...v1.10.3) - 2021-02-12 ## Bug Fixes - sfix typing hit for envelope parse model ([#286](https://github.com/aws-powertools/powertools-lambda-python/issues/286)) - disable batching of X-Ray subsegments ([#284](https://github.com/aws-powertools/powertools-lambda-python/issues/284)) ## Documentation - migrate documentation from Gatsby to MkDocs material ([#279](https://github.com/aws-powertools/powertools-lambda-python/issues/279)) ## Maintenance - bump to 1.10.3 ([#287](https://github.com/aws-powertools/powertools-lambda-python/issues/287)) ## [v1.10.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.10.1...v1.10.2) - 2021-02-04 ## Bug Fixes - remove unnecessary typing-extensions for py3.8 ([#281](https://github.com/aws-powertools/powertools-lambda-python/issues/281)) - 
batch processing exceptions ([#276](https://github.com/aws-powertools/powertools-lambda-python/issues/276)) ## Documentation - **appconfig:** Use correct import for docstring ([#271](https://github.com/aws-powertools/powertools-lambda-python/issues/271)) ## Maintenance - bump to 1.10.2 ([#282](https://github.com/aws-powertools/powertools-lambda-python/issues/282)) - fix immer and socket.io CVEs ([#278](https://github.com/aws-powertools/powertools-lambda-python/issues/278)) - typo in parser docs ## [v1.10.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.10.0...v1.10.1) - 2021-01-19 ## Features - add support for SNS->SQS protocol ([#272](https://github.com/aws-powertools/powertools-lambda-python/issues/272)) ## Maintenance - bump to 1.10.1 ([#273](https://github.com/aws-powertools/powertools-lambda-python/issues/273)) ## [v1.10.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.9.1...v1.10.0) - 2021-01-18 ## Documentation - fix import ([#267](https://github.com/aws-powertools/powertools-lambda-python/issues/267)) - add info about extras layer ([#260](https://github.com/aws-powertools/powertools-lambda-python/issues/260)) - fix note whitespace - add missing parser models ([#254](https://github.com/aws-powertools/powertools-lambda-python/issues/254)) ## Features - toggle to disable log deduplication locally for pytest live log [#262](https://github.com/aws-powertools/powertools-lambda-python/issues/262) ([#268](https://github.com/aws-powertools/powertools-lambda-python/issues/268)) - Add AppConfig parameter provider ([#236](https://github.com/aws-powertools/powertools-lambda-python/issues/236)) - support extra parameter in Logger messages ([#257](https://github.com/aws-powertools/powertools-lambda-python/issues/257)) - support custom formats in JSON Schema validation ([#247](https://github.com/aws-powertools/powertools-lambda-python/issues/247)) ## Maintenance - bump to 1.10.0 ([#270](https://github.com/aws-powertools/powertools-lambda-python/issues/270)) - move env names to constant file ([#264](https://github.com/aws-powertools/powertools-lambda-python/issues/264)) - update stale bot - general simplifications and cleanup ([#255](https://github.com/aws-powertools/powertools-lambda-python/issues/255)) - hardcode axios transitive resolution ([#256](https://github.com/aws-powertools/powertools-lambda-python/issues/256)) ## [v1.9.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.9.0...v1.9.1) - 2020-12-21 ## Bug Fixes - ensures all Loggers have unique service names ## Code Refactoring - convert dict into a literal dict object and re-use it ## Documentation - add clarification to Tracer docs for how `capture_method` decorator can cause function responses to be read and serialized. ## Features - **pep-561:** Create py.typed file and include into pyproject. 
## Maintenance - bump to 1.9.1 ([#252](https://github.com/aws-powertools/powertools-lambda-python/issues/252)) - add changelog - implement phony targets correctly - **deps:** bump ini from 1.3.5 to 1.3.8 in /docs ## Pull Requests - Merge pull request [#250](https://github.com/aws-powertools/powertools-lambda-python/issues/250) from heitorlessa/fix/[#249](https://github.com/aws-powertools/powertools-lambda-python/issues/249) - Merge pull request [#235](https://github.com/aws-powertools/powertools-lambda-python/issues/235) from Nr18/phony - Merge pull request [#244](https://github.com/aws-powertools/powertools-lambda-python/issues/244) from awslabs/docs/capture_method_clarification - Merge pull request [#241](https://github.com/aws-powertools/powertools-lambda-python/issues/241) from awslabs/dependabot/npm_and_yarn/docs/ini-1.3.8 - Merge pull request [#237](https://github.com/aws-powertools/powertools-lambda-python/issues/237) from gmcrocetti/pep-561 - Merge pull request [#234](https://github.com/aws-powertools/powertools-lambda-python/issues/234) from Nr18/test-equal - Merge pull request [#233](https://github.com/aws-powertools/powertools-lambda-python/issues/233) from GroovyDan/improv/add_equality_check_to_dict_wrapper - Merge pull request [#232](https://github.com/aws-powertools/powertools-lambda-python/issues/232) from gyft/add-missing-tests ## [v1.9.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.8.0...v1.9.0) - 2020-12-04 ## Bug Fixes - s3 model import - cloudwatch logs envelope typo ## Documentation - add Kinesis Streams as a supported model & envelope - add S3 as a supported model - add CW Logs as a supported envelope - add CW Logs as a supported model - add Alb as a supported model - shadow sidebar to remain expanded - add source code link in nav bar - fix broken link for github ## Features - Add Kinesis lambda event support to Parser utility - Add cloudwatch lambda event support to Parser utility - Add alb lambda event support to Parser utility [#228](https://github.com/aws-powertools/powertools-lambda-python/issues/228) - Add Kinesis lambda event support to Parser utility - Add S3 lambda event support to Parser utility [#224](https://github.com/aws-powertools/powertools-lambda-python/issues/224) - Add Ses lambda event support to Parser utility [#213](https://github.com/aws-powertools/powertools-lambda-python/issues/213) ## Maintenance ## Pull Requests - Merge pull request [#227](https://github.com/aws-powertools/powertools-lambda-python/issues/227) from risenberg-cyberark/kinesis - Merge pull request [#225](https://github.com/aws-powertools/powertools-lambda-python/issues/225) from risenberg-cyberark/s3 - Merge pull request [#231](https://github.com/aws-powertools/powertools-lambda-python/issues/231) from risenberg-cyberark/cloudwatch - Merge pull request [#229](https://github.com/aws-powertools/powertools-lambda-python/issues/229) from risenberg-cyberark/alb - Merge pull request [#223](https://github.com/aws-powertools/powertools-lambda-python/issues/223) from heitorlessa/docs/add-source-code-link - Merge pull request [#222](https://github.com/aws-powertools/powertools-lambda-python/issues/222) from awslabs/docs-fix-broken-link - Merge pull request [#219](https://github.com/aws-powertools/powertools-lambda-python/issues/219) from igorlg/docs/logger-supress-clarify - Merge pull request [#214](https://github.com/aws-powertools/powertools-lambda-python/issues/214) from risenberg-cyberark/ses ## 
[v1.8.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.7.0...v1.8.0) - 2020-11-20 ## Bug Fixes - replace now deprecated set-env with new GitHub Env file - remove dummy heading to prevent htmlAst bug ## Documentation - correct example usage of SES data class - add faq section - add minimal permission set for using layer ## Features - include new replay-name field in parser and data_classes - **data_classes:** API Gateway V2 IAM and Lambda ## Maintenance - bump to 1.8.0 - bump dependencies - **docs:** Add some of the missing docstrings ## Pull Requests - Merge pull request [#212](https://github.com/aws-powertools/powertools-lambda-python/issues/212) from heitorlessa/chore/bump-1.8.0 - Merge pull request [#211](https://github.com/aws-powertools/powertools-lambda-python/issues/211) from heitorlessa/feat/eventbridge-replay-support - Merge pull request [#209](https://github.com/aws-powertools/powertools-lambda-python/issues/209) from awslabs/docs/correct_ses_dataclass_example - Merge pull request [#207](https://github.com/aws-powertools/powertools-lambda-python/issues/207) from risenberg-cyberark/sns - Merge pull request [#205](https://github.com/aws-powertools/powertools-lambda-python/issues/205) from heitorlessa/chore/update-docs-dep - Merge pull request [#202](https://github.com/aws-powertools/powertools-lambda-python/issues/202) from Nr18/logger-faq - Merge pull request [#204](https://github.com/aws-powertools/powertools-lambda-python/issues/204) from am29d/docs/add-iam-permissions-for-layer - Merge pull request [#201](https://github.com/aws-powertools/powertools-lambda-python/issues/201) from gyft/feat-data-classes-event-updates ## [v1.7.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.6.1...v1.7.0) - 2020-10-26 ## Bug Fixes - \_parse return type - high and security peer dependency vulnerabilities - change to Yarn to support manual resolutions - generic type to match ABC bound class - debug logging in envelopes before each parsing - remove malformed 3.1. 
sentence - ensures parser can take json strings as input - parse high level import - code inspect issues - unnecessary return; better error handling - snake_case - comment out validators [#118](https://github.com/aws-powertools/powertools-lambda-python/issues/118) - CR fixes Merge branch 'develop' of https://github.com/awslabs/aws-lambda-powertools-python into pydantic - reduce complexity of dynamo envelope - poetry update + pydantic, typing_extensions as optional - add only pydantic (+1 squashed commit) Squashed commits: [804f251] fix poetry.lock, revert changes - Correct typo - remove only dev extras - remove jmespath extras in Make ## Code Refactoring - pydantic as optional dependancy, remove lambdaContext - change to advanced parser ## Documentation - reorder parser's payload sample position - add more info on conditional keys [#195](https://github.com/aws-powertools/powertools-lambda-python/issues/195) - add a note that decorator will replace the event - address Ran's feedback - reorder data validation; improve envelopes section - reorder extending models as parse fn wasn't introduced - use yarn's resolution to fix incompatible dependency - add cold start data - add a FAQ section - ensure examples can be copied/pasted as-is - add extending built-in models - add envelope section - add data model validation section - use non-hello world model to better exemplify parsing - add 101 parsing events content - initial structure for parser docs - initial sketch of parser docs - update examples in README ## Features - experiment with codeQL over LGTM - add standalone parse function - Advanced parser utility (pydantic) - RFC: Validate incoming and outgoing events utility [#95](https://github.com/aws-powertools/powertools-lambda-python/issues/95) - **data_classes:** case insensitive header lookup - **data_classes:** Cognito custom auth triggers ## Maintenance - fix repository URL - spacing - typo in list - typo on code generation tool - remove flake8 polyfill as explicit dep - explicit DynamoDB Stream schema naming - lint - kwarg over arg to ease refactoring - remove test for commented code - fix make build syntax for internal build whl - upgrade docs dep - remove dev deps from example project - remove kitchen sink example - upgrade gatsby - upgrade amplify, antd, and gatsby plugins - upgrade apollo-docs theme - remove dev deps from example project - remove kitchen sink example ## Reverts - fix: remove jmespath extras in Make - fix: remove jmespath extras in Make ## Pull Requests - Merge pull request [#200](https://github.com/aws-powertools/powertools-lambda-python/issues/200) from heitorlessa/chore/bump-1.7.0 - Merge pull request [#199](https://github.com/aws-powertools/powertools-lambda-python/issues/199) from heitorlessa/docs/clarify-dynamic-log-keys - Merge pull request [#198](https://github.com/aws-powertools/powertools-lambda-python/issues/198) from awslabs/improv/suppress-logger-propagation - Merge pull request [#192](https://github.com/aws-powertools/powertools-lambda-python/issues/192) from heitorlessa/docs/parser - Merge pull request [#196](https://github.com/aws-powertools/powertools-lambda-python/issues/196) from awslabs/dependabot/npm_and_yarn/docs/object-path-0.11.5 - Merge pull request [#189](https://github.com/aws-powertools/powertools-lambda-python/issues/189) from heitorlessa/improv/parser[#118](https://github.com/aws-powertools/powertools-lambda-python/issues/118) - Merge pull request [#186](https://github.com/aws-powertools/powertools-lambda-python/issues/186) from 
gyft/feat-case-insensitive-dict - Merge pull request [#188](https://github.com/aws-powertools/powertools-lambda-python/issues/188) from gyft/tests-pydantic - Merge pull request [#178](https://github.com/aws-powertools/powertools-lambda-python/issues/178) from gyft/cognito-custom-auth - Merge pull request [#118](https://github.com/aws-powertools/powertools-lambda-python/issues/118) from risenberg-cyberark/pydantic - Merge pull request [#181](https://github.com/aws-powertools/powertools-lambda-python/issues/181) from awslabs/fix/docs-sec-vuln - Merge pull request [#180](https://github.com/aws-powertools/powertools-lambda-python/issues/180) from heitorlessa/chore/remove-example ## [v1.6.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.6.0...v1.6.1) - 2020-09-23 ## Maintenance - bump to 1.6.1 ([#177](https://github.com/aws-powertools/powertools-lambda-python/issues/177)) ## [v1.6.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.5.0...v1.6.0) - 2020-09-22 ## Bug Fixes - apply Tom's suggestion - branding - Correct description for data classes util - duplicate features content - navigation, branding - remove DeleteMessageBatch call to SQS api if there are no messages to delete ([#170](https://github.com/aws-powertools/powertools-lambda-python/issues/170)) - correct type hint Dict instead of dict ## Code Refactoring - correct type hint ## Documentation - fixed more typos, correct index reference to new util - fix typo in DynamoDB example - add docs for data classes utility - improve wording on jmespath fns - document validator utility ## Features - add custom jmespath functions support - emf multiple metric values ([#167](https://github.com/aws-powertools/powertools-lambda-python/issues/167)) - add initial validator tests - add cloudwatch_logs based on Bryan's feedback - add powertools_base64 custom fn - add built-in envelopes - add jmespath as optional dependency - add initial draft simple validator - **trigger:** data class and helper functions for lambda trigger events ([#159](https://github.com/aws-powertools/powertools-lambda-python/issues/159)) ## Maintenance - typo - bump to 1.6.0 - better type hinting - update changelog - fix docstring; import order ## Pull Requests - Merge pull request [#175](https://github.com/aws-powertools/powertools-lambda-python/issues/175) from heitorlessa/chore/bump-1.6.0 - Merge pull request [#171](https://github.com/aws-powertools/powertools-lambda-python/issues/171) from awslabs/docs/data_classes - Merge pull request [#174](https://github.com/aws-powertools/powertools-lambda-python/issues/174) from heitorlessa/improv/docs-logger-metrics-testing - Merge pull request [#168](https://github.com/aws-powertools/powertools-lambda-python/issues/168) from gyft/tests-missing - Merge pull request [#153](https://github.com/aws-powertools/powertools-lambda-python/issues/153) from heitorlessa/feat/validator-utility ## [v1.5.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.4.0...v1.5.0) - 2020-09-04 ## Bug Fixes - throw exception by default if messages processing fails - add sqs_batch_processor as its own method - ensure debug log event has latest ctx - update image with correct sample - ensures xray_trace_id is refreshed - typo in example - include proposed suggestions - **base-partial:** append record instead of entry - **logging:** Don't include `json_default` in logs ([#132](https://github.com/aws-powertools/powertools-lambda-python/issues/132)) ## Code Refactoring - changes partial_sqs middleware in 
favor of a generic interface always expecting a BatchProcessor - replace LambdaEvent with Dict[str, Any] - remove initial reference - fix import issues and provide context in docblocks - split properties and add docblocks - split the objects into seperate files - make requested changes - use None instead of - batch middleware - remove references to BaseProcessor. Left BasePartialProcessor - change return for failure/success handlers - **sqs:** add module middlewares - **sqs:** change methods to protected - **tests:** update tests to new batch processor middleware - **tests:** processor using default config ## Documentation - address readability feedbacks - add detail to batch processing - simplify documentation more SQS specific focus Update for sqs_batch_processor interface - rephrase the wording to make it more clear - refactor example; improve docs about creating your own processor - add newly created Slack Channel - describe the typing utility - add troubleshooting section - add xray_trace_id key - fix suggestions made by [@heitorlessa](https://github.com/heitorlessa) - add description where to find the layer arn ([#145](https://github.com/aws-powertools/powertools-lambda-python/issues/145)) - new section "Migrating from other Loggers" ([#148](https://github.com/aws-powertools/powertools-lambda-python/issues/148)) - minor edit to letter case part 2 - user specific documentation - Fix doc for log sampling ([#135](https://github.com/aws-powertools/powertools-lambda-python/issues/135)) - **partial-processor:** add simple docstrings to success/failure handlers - **sqs:** docstrings for PartialSQS - **sqs-base:** docstring for base class ## Features - add xray_trace_id key when tracing is active [#137](https://github.com/aws-powertools/powertools-lambda-python/issues/137) - initial implementation as the proposed gist is - add sqs failure processors - include base processors - add batch module - add package level import for batch utility - **logger:** readable log_dict seq - **logging:** suppress some log keys - **logging:** allow for custom json order - **parameters:** transform = "auto" ([#133](https://github.com/aws-powertools/powertools-lambda-python/issues/133)) - **sqs:** add optional config parameter - **sqs:** improve validation for queue_url ## Maintenance - tiny changes for readability - add debug logging for sqs batch processing - remove middlewares module, moving decorator functionality to base and sqs - add test for sqs_batch_processor interface - add sqs_batch_processor decorator to simplify interface - fix typos, docstrings and type hints ([#154](https://github.com/aws-powertools/powertools-lambda-python/issues/154)) - doc typo - **batch:** Housekeeping for recent changes ([#157](https://github.com/aws-powertools/powertools-lambda-python/issues/157)) ## Pull Requests - Merge pull request [#149](https://github.com/aws-powertools/powertools-lambda-python/issues/149) from Nr18/static-types - Merge pull request [#155](https://github.com/aws-powertools/powertools-lambda-python/issues/155) from awslabs/docs/batch_processing_util - Merge pull request [#100](https://github.com/aws-powertools/powertools-lambda-python/issues/100) from gmcrocetti/partial-sqs-batch - Merge pull request [#151](https://github.com/aws-powertools/powertools-lambda-python/issues/151) from Nr18/troubleshooting - Merge pull request [#150](https://github.com/aws-powertools/powertools-lambda-python/issues/150) from heitorlessa/feat/logger-add-xray-trace-id - Merge pull request 
[#140](https://github.com/aws-powertools/powertools-lambda-python/issues/140) from gyft/fix-log-key-order - Merge pull request [#142](https://github.com/aws-powertools/powertools-lambda-python/issues/142) from gyft/fix-letter-case ## [v1.4.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.3.1...v1.4.0) - 2020-08-25 ## Bug Fixes - upgrade dot-prop, serialize-javascript - remove actual response from debug logs - naming and staticmethod consistency - correct in_subsegment assertion - update cold_start doc to reflect [#125](https://github.com/aws-powertools/powertools-lambda-python/issues/125) - split ColdStart metric to its own EMF blob [#125](https://github.com/aws-powertools/powertools-lambda-python/issues/125) - **ssm:** Make decrypt an explicit option and refactoring ([#123](https://github.com/aws-powertools/powertools-lambda-python/issues/123)) ## Documentation - add Lambda Layer SAR App url and ARN - move tenets; remove extra space - use table for clarity - add blog post, and quick example - subtle rewording for better clarity - fix typos, log_event & sampling wording - make sensitive info more explicit with an example - create Patching modules section; cleanup response wording - move concurrent asynchronous under escape hatch - grammar - bring new feature upfront when returning sensitive info ## Features - capture_response as metadata option [#127](https://github.com/aws-powertools/powertools-lambda-python/issues/127) ## Maintenance - bump to 1.4.0 - update internal docstrings for consistency - update changelog to reflect new feature - clarify changelog bugfix vs breaking change - remove/correct unnecessary debug logs - fix debug log adding unused obj - grammar - add metrics fix description - correct typos ## Pull Requests - Merge pull request [#129](https://github.com/aws-powertools/powertools-lambda-python/issues/129) from am29d/feat/lambda-layers - Merge pull request [#130](https://github.com/aws-powertools/powertools-lambda-python/issues/130) from heitorlessa/docs/readability-improvements - Merge pull request [#128](https://github.com/aws-powertools/powertools-lambda-python/issues/128) from heitorlessa/feat/tracer-disallow-response-metadata - Merge pull request [#126](https://github.com/aws-powertools/powertools-lambda-python/issues/126) from heitorlessa/fix/metrics-cold-start-split ## [v1.3.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.3.0...v1.3.1) - 2020-08-22 ## Bug Fixes - **capture_method:** should yield inside with ([#124](https://github.com/aws-powertools/powertools-lambda-python/issues/124)) ## Maintenance - version bump to 1.3.1 - **deps:** bump prismjs from 1.20.0 to 1.21.0 in /docs - **deps:** bump elliptic from 6.5.2 to 6.5.3 in /docs ## Pull Requests - Merge pull request [#120](https://github.com/aws-powertools/powertools-lambda-python/issues/120) from awslabs/dependabot/npm_and_yarn/docs/elliptic-6.5.3 - Merge pull request [#121](https://github.com/aws-powertools/powertools-lambda-python/issues/121) from awslabs/dependabot/npm_and_yarn/docs/prismjs-1.21.0 ## [v1.3.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.2.0...v1.3.0) - 2020-08-21 ## Features - add parameter utility ([#96](https://github.com/aws-powertools/powertools-lambda-python/issues/96)) ## Maintenance ## [v1.2.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.1.3...v1.2.0) - 2020-08-20 ## Features - add support for tracing of generators using capture_method decorator 
([#113](https://github.com/aws-powertools/powertools-lambda-python/issues/113)) ## Maintenance ## [v1.1.3](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.1.2...v1.1.3) - 2020-08-18 ## Bug Fixes - remove root logger handler set by Lambda [#115](https://github.com/aws-powertools/powertools-lambda-python/issues/115) ## Maintenance - bump to 1.1.3 ## Pull Requests - Merge pull request [#117](https://github.com/aws-powertools/powertools-lambda-python/issues/117) from heitorlessa/chore/bump-1.1.3 - Merge pull request [#116](https://github.com/aws-powertools/powertools-lambda-python/issues/116) from heitorlessa/fix/remove-root-logger-handler ## [v1.1.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.1.1...v1.1.2) - 2020-08-16 ## Bug Fixes - return subclass [#107](https://github.com/aws-powertools/powertools-lambda-python/issues/107) ## Documentation - clarify auto_patch as per [#108](https://github.com/aws-powertools/powertools-lambda-python/issues/108) ## Maintenance - suppress LGTM alert - add autocomplete as unreleased - remove unused stdout fixture - update Tracer docs as per [#108](https://github.com/aws-powertools/powertools-lambda-python/issues/108) ## Pull Requests - Merge pull request [#111](https://github.com/aws-powertools/powertools-lambda-python/issues/111) from heitorlessa/chore/bump-1.1.2 - Merge pull request [#110](https://github.com/aws-powertools/powertools-lambda-python/issues/110) from heitorlessa/improv/logger-auto-complete - Merge pull request [#109](https://github.com/aws-powertools/powertools-lambda-python/issues/109) from heitorlessa/docs/tracer-reuse ## [v1.1.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.1.0...v1.1.1) - 2020-08-14 ## Bug Fixes - regression 104 ([#105](https://github.com/aws-powertools/powertools-lambda-python/issues/105)) - return log level int immediately - add test covering logging constant ## Maintenance - bump patch version - fix unused fixture - fix docstring on level [str,int] consistency - fix test level typo - trigger docs on new release ([#102](https://github.com/aws-powertools/powertools-lambda-python/issues/102)) ([#103](https://github.com/aws-powertools/powertools-lambda-python/issues/103)) - trigger docs on new release ([#102](https://github.com/aws-powertools/powertools-lambda-python/issues/102)) - trigger docs on new release ## Regression - log level docstring as str ## Pull Requests - Merge pull request [#106](https://github.com/aws-powertools/powertools-lambda-python/issues/106) from heitorlessa/fix/regression-104 ## [v1.1.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.0.2...v1.1.0) - 2020-08-14 ## Bug Fixes - auto-assigner filename as per docs ## Features - add support for logger inheritance ([#99](https://github.com/aws-powertools/powertools-lambda-python/issues/99)) - enable issue auto-assigner to core team ## Maintenance - bump to 1.1.0 ([#101](https://github.com/aws-powertools/powertools-lambda-python/issues/101)) - **deps:** bump lodash from 4.17.15 to 4.17.19 in /docs ([#93](https://github.com/aws-powertools/powertools-lambda-python/issues/93)) ## [v1.0.2](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.0.1...v1.0.2) - 2020-07-16 ## Maintenance - bump to 1.0.2 ([#90](https://github.com/aws-powertools/powertools-lambda-python/issues/90)) - support aws-xray-sdk >=2.5.0 till \<3.0.0 ([#89](https://github.com/aws-powertools/powertools-lambda-python/issues/89)) ## 
[v1.0.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v1.0.0...v1.0.1) - 2020-07-05 ## Bug Fixes - append structured logs when injecting lambda context ([#86](https://github.com/aws-powertools/powertools-lambda-python/issues/86)) ## Documentation - add blog post in the readme ## [v1.0.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v0.11.0...v1.0.0) - 2020-06-18 ## Documentation - customize contributing guide ([#77](https://github.com/aws-powertools/powertools-lambda-python/issues/77)) ## Features - docs anonymized page view ([#82](https://github.com/aws-powertools/powertools-lambda-python/issues/82)) - add metrics metadata ([#81](https://github.com/aws-powertools/powertools-lambda-python/issues/81)) ## Maintenance - bump to 1.0.0 GA ([#83](https://github.com/aws-powertools/powertools-lambda-python/issues/83)) - add missing ':' and identation in examples - cleanup tests ([#79](https://github.com/aws-powertools/powertools-lambda-python/issues/79)) - remove deprecated code before GA ([#78](https://github.com/aws-powertools/powertools-lambda-python/issues/78)) - move blockquotes as hidden comments ## [v0.11.0](https://github.com/aws-powertools/powertools-lambda-python/compare/v0.10.1...v0.11.0) - 2020-06-10 ## Bug Fixes - default dimension creation now happens when metrics are serialized instead of on metrics constructor ([#74](https://github.com/aws-powertools/powertools-lambda-python/issues/74)) ## Maintenance - update CHANGELOG ## [v0.10.1](https://github.com/aws-powertools/powertools-lambda-python/compare/v0.10.0...v0.10.1) - 2020-06-10 ## Bug Fixes - default dimension creation now happens when metrics are serialized instead of on metrics constructor ([#74](https://github.com/aws-powertools/powertools-lambda-python/issues/74)) ## Documentation - fix contrast on highlighted code text ([#73](https://github.com/aws-powertools/powertools-lambda-python/issues/73)) ## Features - improve error handling for log_metrics decorator ([#71](https://github.com/aws-powertools/powertools-lambda-python/issues/71)) - add high level imports ([#70](https://github.com/aws-powertools/powertools-lambda-python/issues/70)) ## Maintenance - version bump 0.10.1 - **deps:** bump graphql-playground-html from 1.6.19 to 1.6.25 in /docs ## Pull Requests - Merge pull request [#72](https://github.com/aws-powertools/powertools-lambda-python/issues/72) from awslabs/dependabot/npm_and_yarn/docs/graphql-playground-html-1.6.25 ## v0.10.0 - 2020-06-08 ## Bug Fixes - correct env var name for publish to pypi test ([#69](https://github.com/aws-powertools/powertools-lambda-python/issues/69)) - release-drafter action syntax - release-drafter label for new feature/major non-breaking changes - cast dimension value to str to avoid issue where EMF silently fails ([#52](https://github.com/aws-powertools/powertools-lambda-python/issues/52)) - ignore path that might seem a broken link [#49](https://github.com/aws-powertools/powertools-lambda-python/issues/49) - open api ref in a new tab [#48](https://github.com/aws-powertools/powertools-lambda-python/issues/48) - metrics not being flushed on every invocation ([#45](https://github.com/aws-powertools/powertools-lambda-python/issues/45)) - [#35](https://github.com/aws-powertools/powertools-lambda-python/issues/35) duplicate changelog to project root - [#24](https://github.com/aws-powertools/powertools-lambda-python/issues/24) correct example test and docs - CI attempt 4 - CI attempt 3 - CI attempt 3 - CI attempt 2 - add missing single_metric 
example; test var name - fix import of aws_lambda_logging to relative import - **Makefile:** format before linting - **make:** add twine as a dev dep - **setup:** correct invalid license classifier - **setup:** correct license to MIT-0 in meta ## Documentation - build on master only - clarify logger debug sampling message - clean up readme in favour of docs website - add install in main docs website - add pypi badge ## Features - add capture_cold_start_metric for log_metrics ([#67](https://github.com/aws-powertools/powertools-lambda-python/issues/67)) - automate publishing to pypi ([#58](https://github.com/aws-powertools/powertools-lambda-python/issues/58)) - add pre-commit hooks ([#64](https://github.com/aws-powertools/powertools-lambda-python/issues/64)) - update Metrics interface to resemble tracer & logger: use "service" as its namespace. - add codecov service ([#59](https://github.com/aws-powertools/powertools-lambda-python/issues/59)) - add security and complexity baseline [#33](https://github.com/aws-powertools/powertools-lambda-python/issues/33) ([#57](https://github.com/aws-powertools/powertools-lambda-python/issues/57)) - add pull request template [#33](https://github.com/aws-powertools/powertools-lambda-python/issues/33) - add RFC template for proposals - create issue templates - readd release drafter action [#33](https://github.com/aws-powertools/powertools-lambda-python/issues/33) - add release drafter ([#56](https://github.com/aws-powertools/powertools-lambda-python/issues/56)) - add stale issues config [#33](https://github.com/aws-powertools/powertools-lambda-python/issues/33) ([#55](https://github.com/aws-powertools/powertools-lambda-python/issues/55)) - enforce semantic PR titles ([#54](https://github.com/aws-powertools/powertools-lambda-python/issues/54)) - add algolia search for docs and api ref ([#39](https://github.com/aws-powertools/powertools-lambda-python/issues/39)) - add documentation website ([#37](https://github.com/aws-powertools/powertools-lambda-python/issues/37)) - add docs to CI - Add Python3.8 support - **logger:** add log sampling - **pypi:** add bumpversion, public release pypi - **pyproject.toml:** move to poetry ## Maintenance - version bump ([#68](https://github.com/aws-powertools/powertools-lambda-python/issues/68)) - public beta version - rename Makefile target docs-dev to docs-local ([#65](https://github.com/aws-powertools/powertools-lambda-python/issues/65)) - correct docstring for log_metrics - fix typo in metrics doc - Correct test comment - remove unused import - formatting - plat wheels are not needed - reformat changelog to follow KeepAChangelog standard ([#50](https://github.com/aws-powertools/powertools-lambda-python/issues/50)) - bump to release candidate - renamed history to changelog dependabot - grammar issues - bump example to use 0.8.0 features - clean up CI workflows - fix github badge typo - pypi monthly download badge - lint - bump 0.3.1 with logging patch - bump history - lint - add Python 3.8 in badge as it's supported - CI badge - public beta version - **deps:** bump bleach from 3.1.0 to 3.1.1 in /python - **deps:** bump websocket-extensions from 0.1.3 to 0.1.4 in /docs ([#66](https://github.com/aws-powertools/powertools-lambda-python/issues/66)) ## Pull Requests - Merge pull request [#60](https://github.com/aws-powertools/powertools-lambda-python/issues/60) from awslabs/improv/metrics_interface - Merge pull request [#8](https://github.com/aws-powertools/powertools-lambda-python/issues/8) from 
awslabs/dependabot/pip/python/bleach-3.1.1 - Merge pull request [#7](https://github.com/aws-powertools/powertools-lambda-python/issues/7) from danilohgds/sampling_feature - Merge pull request [#5](https://github.com/aws-powertools/powertools-lambda-python/issues/5) from jfuss/feat/python38 ## Overview Our public roadmap outlines the high-level direction we are working towards. We update this document when our priorities change: security and stability are our top priority. For the most up-to-date information, see our [board of activities](https://github.com/orgs/aws-powertools/projects/3?query=sort%3Aupdated-desc+is%3Aopen). ### Key areas Security and operational excellence take precedence above all else. This means bug fixing, stability, customer support, and internal compliance may delay one or more key areas below. **Missing something or want us to prioritize an existing area?** You can help us prioritize by [upvoting existing feature requests](https://github.com/aws-powertools/powertools-lambda-python/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc+label%3Afeature-request), leaving a comment on what use cases it could unblock for you, and by joining our discussions on Discord. #### New features and utilities (p0) We will create new features and utilities to solve practical problems developers face when building serverless applications. - [Ability to buffer logs](https://github.com/aws-powertools/powertools-lambda-typescript/discussions/3410) - Async event handlers to streamline complex event-driven workflows across SQS, EventBridge #### Powertools toolchain (p1) To improve Lambda development workflows and tooling capabilities, we aim to demonstrate how to simplify complex packaging methods, enable OpenAPI code generation for multiple Lambda functions, and introduce profiling tools to evaluate Powertools for AWS Lambda (Python) code implementation, tracking memory consumption and computational performance. - Create a comprehensive "Recipes" section with Lambda packaging tutorials for tools like uv, poetry, pants, providing clear, practical build strategies. - Enable OpenAPI generation capabilities to create specifications across multiple Lambda functions, eliminating LambdaLith architectural constraints. #### Support for async (p2) Python's serverless ecosystem is increasingly adopting asynchronous programming to deliver more efficient, non-blocking applications. - Add support for aioboto3 or another tool, enabling efficient, non-blocking AWS service interactions in Lambda functions. - Write a PoC with Event Handler support for async. ## Roadmap status definition ``` graph LR Ideas --> Backlog --> Work["Working on it"] --> Merged["Coming soon"] --> Shipped ``` *Visual representation* Within our [public board](https://github.com/orgs/aws-powertools/projects/3/views/1?query=is%3Aopen+sort%3Aupdated-desc), you'll see the following values in the `Status` column: - **Ideas**. Incoming and existing feature requests that are not being actively considered yet. These will be reviewed when bandwidth permits. - **Backlog**. Accepted feature requests or enhancements that we want to work on. - **Working on it**. Features or enhancements we're currently either researching or implementing. - **Coming soon**. Any feature, enhancement, or bug fix that has been merged and is coming in the next release. - **Shipped**. Features or enhancements that are now available in the most recent release. > Tasks or issues with empty `Status` will be categorized in upcoming review cycles. 
## Process ``` graph LR PFR[Feature request] --> Triage{Need RFC?} Triage --> |Complex/major change or new utility?| RFC[Ask or write RFC] --> Approval{Approved?} Triage --> |Minor feature or enhancement?| NoRFC[No RFC required] --> Approval Approval --> |Yes| Backlog Approval --> |No | Reject["Inform next steps"] Backlog --> |Prioritized| Implementation Backlog --> |Defer| WelcomeContributions["help-wanted label"] ``` *Visual representation* Our end-to-end mechanism follows four major steps: - **Feature Request**. Ideas start with a [feature request](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=feature-request%2Ctriage&template=feature_request.yml&title=Feature+request%3A+TITLE) to outline their use case at a high level. For complex use cases, maintainers might ask for or write an RFC. - Maintainers review requests based on [project tenets](../#tenets), customer reactions (👍), and use cases. - **Request-for-comments (RFC)**. Design proposals use our [RFC issue template](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=RFC%2Ctriage&template=rfc.yml&title=RFC%3A+TITLE) to describe their implementation, challenges, developer experience, dependencies, and alternative solutions. - This helps refine the initial idea with community feedback before a decision is made. - **Decision**. After carefully reviewing and discussing them, maintainers make a final decision on whether to start implementation, defer or reject it, and update everyone with the next steps. - **Implementation**. For approved features, maintainers give priority to the original authors for implementation unless it is a sensitive task that is best handled by maintainers. See the [Maintainers](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/MAINTAINERS.md) document to understand how we triage issues and pull requests, labels, and governance. ## Disclaimer The Powertools for AWS Lambda (Python) team values feedback and guidance from its community of users, although final decisions on inclusion into the project will be made by AWS. We determine the high-level direction for our open roadmap based on customer feedback and popularity (👍🏽 and comments), security and operational impacts, and business value. Where features don’t meet our goals and longer-term strategy, we will communicate that clearly and openly as quickly as possible with an explanation of why the decision was made. ## FAQs **Q: Why did you build this?** A: We know that our customers are making decisions and plans based on what we are developing, and we want to provide our customers with the insights they need to plan. **Q: Why are there no dates on your roadmap?** A: Because job zero is security and operational stability, we can't provide specific target dates for features. The roadmap is subject to change at any time, and roadmap issues in this repository do not guarantee a feature will be launched as proposed. **Q: How can I provide feedback or ask for more information?** A: For existing features, you can directly comment on issues. For anything else, please open an issue. ## End of support v2 On March 25th, 2025, Powertools for AWS Lambda (Python) v2 reached end of support and will no longer receive updates or releases. If you are still using v2, we strongly recommend you read our upgrade guide and update to the latest version. 
Given our commitment to all of our customers using Powertools for AWS Lambda (Python), we will keep [PyPI](https://pypi.org/project/aws-lambda-powertools/) v2 releases and the 2.x documentation versions available to prevent any disruption. ## Migrate to v3 from v2 We strongly encourage you to migrate to v3. However, if you still need to upgrade from v1 to v2, you can find the [upgrade guide](/lambda/python/2.43.1/upgrade/). We've made minimal breaking changes to make your transition to v3 as smooth as possible. ### Quick summary | Area | Change | Code change required | | --- | --- | --- | | **Pydantic** | We have removed support for [Pydantic v1](#drop-support-for-pydantic-v1) | No | | **Parser** | We have replaced [DynamoDBStreamModel](#dynamodbstreammodel-in-parser) `AttributeValue` with native Python types | Yes | | **Parser** | We no longer export [Pydantic objects](#importing-pydantic-objects) from `parser.pydantic`. | Yes | | **Lambda layer** | [Lambda layers](#new-aws-lambda-layer-arns) are now compiled according to the specific Python version and architecture | No | | **Event Handler** | We [have deprecated](#event-handler-headers-are-case-insensitive) the `get_header_value` function. | Yes | | **Batch Processor** | `@batch_processor` and `@async_batch_processor` decorators [are now deprecated](#deprecated-batch-processing-decorators) | Yes | | **Event Source Data Classes** | We have updated [default values](#event-source-default-values) for optional fields. | Yes | | **Parameters** | The [default cache TTL](#parameters-default-cache-ttl-updated-to-5-minutes) is now set to **5 minutes** | No | | **Parameters** | The `config` parameter [is deprecated](#parameters-using-the-new-boto_config-parameter) in favor of `boto_config` | Yes | | **JMESPath Functions** | The `extract_data_from_envelope` function is [deprecated in favor](#utilizing-the-new-query-function-in-jmespath-functions) of `query` | Yes | | **Types file** | We have removed the [type imports](#importing-types-from-typing-and-typing_annotations) from the `shared/types.py` file | Yes | ### First Steps Before you start, we suggest making a copy of your current working project or creating a new branch with git. 1. **Upgrade** Python to at least v3.9. 1. **Ensure** you have the latest version via [Lambda Layer or PyPI](../#install). 1. **Review** the following sections to confirm if you need to make changes to your code. ## Drop support for Pydantic v1 No code changes required As of June 30, 2024, Pydantic v1 has reached its end-of-life, and we have discontinued support for this version. We now exclusively support Pydantic v2. Use the [Pydantic v2 Migration Guide](https://docs.pydantic.dev/latest/migration/) to migrate your custom Pydantic models to v2. ## DynamoDBStreamModel in parser This also applies if you're using [**DynamoDB BatchProcessor**](https://docs.powertools.aws.dev/lambda/python/latest/utilities/batch/#processing-messages-from-dynamodb). `DynamoDBStreamModel` now returns native Python types when you access DynamoDB records through `Keys`, `NewImage`, and `OldImage` attributes. Previously, you'd receive raw JSON and need to deserialize each item into the type you wanted for convenience, or into the type DynamoDB stored via the `get` method. With this change, you can access data deserialized as stored in DynamoDB, and no longer need to recursively deserialize nested objects (Maps) if you had them. Note For a lossless conversion of the DynamoDB `Number` type, we follow the AWS Python SDK (boto3) approach and convert to `Decimal`. 
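To illustrate the note above, a DynamoDB `Number` attribute therefore arrives as `decimal.Decimal` rather than `int` or `float`. The sketch below uses a hypothetical `total` attribute to show an explicit conversion you can apply when a plain float is acceptable:

```
from decimal import Decimal

# Shape of a v3-deserialized NewImage (hypothetical attribute names)
new_image = {"eventType": "PENDING", "total": Decimal("10.5")}

# Convert explicitly only when you don't need the lossless Decimal downstream
total = float(new_image["total"])
```

The before/after example below shows the broader change in how you access `NewImage`.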
``` from __future__ import annotations import json from typing import Any from aws_lambda_powertools.utilities.parser import event_parser from aws_lambda_powertools.utilities.parser.models import DynamoDBStreamModel from aws_lambda_powertools.utilities.typing import LambdaContext def send_to_sqs(data: dict): body = json.dumps(data) ... @event_parser def lambda_handler(event: DynamoDBStreamModel, context: LambdaContext): for record in event.Records: - # BEFORE - v2 - new_image: dict[str, Any] = record.dynamodb.NewImage - event_type = new_image["eventType"]["S"] - if event_type == "PENDING": - # deserialize attribute value into Python native type - # NOTE: nested objects would need additional logic - data = dict(new_image) - send_to_sqs(data) + # NOW - v3 + new_image: dict[str, Any] = record.dynamodb.NewImage + if new_image.get("eventType") == "PENDING": + send_to_sqs(new_image) # Here new_image is just a Python Dict type ``` ## Importing Pydantic objects We have stopped exporting Pydantic objects directly from `aws_lambda_powertools.utilities.parser.pydantic`. This change prevents customers from accidentally importing all of Pydantic, which could significantly slow down function startup times. ``` - # BEFORE - v2 - from aws_lambda_powertools.utilities.parser.pydantic import EmailStr + # NOW - v3 + from pydantic import EmailStr ``` ## New AWS Lambda Layer ARNs No code changes required To give you a better experience, we're now building Powertools for AWS Lambda (Python)'s Lambda layers for specific Python versions (`3.9-3.13`) and architectures (`x86_64` & `arm64`). This also allows us to include architecture-specific versions of both Pydantic v2 and the AWS Encryption SDK, and give you a more streamlined setup. To take advantage of the new layers, you need to update your functions or deployment setup to include one of the new Lambda layer ARNs from the table below: | Architecture | Python version | Layer ARN | | --- | --- | --- | | x86_64 | 3.9 | arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-x86_64:{version} | | x86_64 | 3.10 | arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-x86_64:{version} | | x86_64 | 3.11 | arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-x86_64:{version} | | x86_64 | 3.12 | arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:{version} | | x86_64 | 3.13 | arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-x86_64:{version} | | arm64 | 3.9 | arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python39-arm64:{version} | | arm64 | 3.10 | arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python310-arm64:{version} | | arm64 | 3.11 | arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python311-arm64:{version} | | arm64 | 3.12 | arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-arm64:{version} | | arm64 | 3.13 | arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python313-arm64:{version} | ## Event Handler: headers are case-insensitive According to the [HTTP RFC](https://datatracker.ietf.org/doc/html/rfc9110#section-5.1), HTTP headers are case-insensitive. As a result, we have deprecated the `get_header_value` function to align with this standard. 
Instead, we recommend using `app.current_event.headers.get` to access header values directly. Consequently, the `case_sensitive` parameter in this function no longer has any effect, as we now ensure consistent casing by normalizing headers for you. This function will be removed in a future release, and we encourage users to adopt the new method to access header values. ``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get("/todos") @tracer.capture_method def get_todos(): endpoint = "https://jsonplaceholder.typicode.com/todos" # BEFORE - v2 - api_key: str = app.current_event.get_header_value(name="X-Api-Key", case_sensitive=True, default_value="") # NOW - v3 + api_key: str = app.current_event.headers.get("X-Api-Key", "") todos: Response = requests.get(endpoint, headers={"X-Api-Key": api_key}) todos.raise_for_status() return {"todos": todos.json()} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ## Deprecated Batch Processing decorators In v2, we designated `@batch_processor` and `@async_batch_processor` as legacy modes for using the Batch Processing utility. In v3, these have been marked as deprecated. Continuing to use them will result in warnings in your IDE and during Lambda execution. ``` import json from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import BatchProcessor, EventType, batch_processor, process_partial_response from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() tracer = Tracer() processor = BatchProcessor(event_type=EventType.SQS) @tracer.capture_method def record_handler(record: SQSRecord): payload: str = record.body if payload: item: dict = json.loads(payload) logger.info(item) -# BEFORE - v2 -@batch_processor(record_handler=record_handler, processor=processor) -def lambda_handler(event, context: LambdaContext): - return processor.response() +# NOW - v3 +def lambda_handler(event, context: LambdaContext): + return process_partial_response( + event=event, + record_handler=record_handler, + processor=processor, + context=context, + ) ``` ## Event source default values We've modified the **Event Source Data Classes** so that optional fields now return empty strings, dictionaries, or lists instead of `None`. This improvement simplifies your code by eliminating the need for conditional checks when accessing these fields, while maintaining backward compatibility with previous implementations. We've applied this change broadly across various event source data classes, ensuring a more consistent and streamlined coding experience for you.
``` from aws_lambda_powertools.utilities.data_classes import DynamoDBStreamEvent, event_source from aws_lambda_powertools.utilities.typing import LambdaContext @event_source(data_class=DynamoDBStreamEvent) def lambda_handler(event: DynamoDBStreamEvent, context: LambdaContext): for record in event.records: - # BEFORE - v2 - old_image_type_return_v2 = type(record.dynamodb.old_image) - # Output is <class 'NoneType'> when OldImage is missing + # NOW - v3 + old_image_type_return_v3 = type(record.dynamodb.old_image) + # Output is <class 'dict'>, an empty dict when OldImage is missing ``` ## Parameters: default cache TTL updated to 5 minutes No code changes required We have updated the cache TTL from 5 seconds to 5 minutes to reduce the number of API calls to AWS, leading to improved performance and lower costs. No code changes are necessary for this update; however, if you prefer the previous behavior, you can set the `max_age` parameter back to 5 seconds. ## Parameters: using the new boto_config parameter In v2, you could use the `config` parameter to modify the **botocore Config** session settings. In v3, we renamed this parameter to `boto_config` to standardize the name with other features, such as Idempotency, and introduced deprecation warnings for users still using `config`. ``` from botocore.config import Config from aws_lambda_powertools.utilities import parameters -# BEFORE - v2 -ssm_provider = parameters.SSMProvider(config=Config(region_name="us-west-1")) +# NOW - v3 +ssm_provider = parameters.SSMProvider(boto_config=Config(region_name="us-west-1")) def handler(event, context): value = ssm_provider.get("/my/parameter") return {"message": value} ``` ## Utilizing the new query function in JMESPath Functions In v2, you could use the `extract_data_from_envelope` function to search and extract data from dictionaries with JMESPath. This name was too generic and customers told us it was confusing. In v3, we renamed this function to `query` to align with similar frameworks in the ecosystem, and introduced deprecation warnings for users still using `extract_data_from_envelope`. ``` from aws_lambda_powertools.utilities.jmespath_utils import extract_data_from_envelope, query from aws_lambda_powertools.utilities.typing import LambdaContext def handler(event: dict, context: LambdaContext) -> dict: - # BEFORE - v2 - some_data = extract_data_from_envelope(data=event, envelope="powertools_json(body)") + # NOW - v3 + some_data = query(data=event, envelope="powertools_json(body)") return {"data": some_data} ``` ## Importing types from typing and typing_annotations We refactored our codebase to align with Python guidelines and eliminated the use of `aws_lambda_powertools.shared.types` imports. Instead, we now utilize types from the standard `typing` library, which are compatible with Python versions 3.9 and above, or from `typing_extensions` (included as a required dependency) for additional type support. ``` -# BEFORE - v2 -from aws_lambda_powertools.shared.types import Annotated +# NOW - v3 +from typing_extensions import Annotated from aws_lambda_powertools.utilities.typing import LambdaContext def handler(event: dict, context: LambdaContext) -> dict: ... ``` # Core Utilities Logger provides an opinionated logger with output structured as JSON.
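To make that concrete before diving into the features, here is a minimal sketch; the service name and message are illustrative, and the commented output is an approximation of the real log line:

```python
from aws_lambda_powertools import Logger

# The service name can also come from the POWERTOOLS_SERVICE_NAME environment variable
logger = Logger(service="payment")

logger.info("Collecting payment")
# Emits a single JSON line similar to:
# {"level": "INFO", "location": "<module>:6", "message": "Collecting payment",
#  "timestamp": "2021-05-03 10:20:19,650+0000", "service": "payment"}
```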
## Key features - Capture key fields from Lambda context and cold start, and structure logging output as JSON - Log Lambda event when instructed (disabled by default) - Log sampling enables DEBUG log level for a percentage of requests (disabled by default) - Append additional keys to structured log at any point in time - Buffer logs for a specific request or invocation, and flush them automatically on error or manually as needed ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). Logger requires two settings: | Setting | Description | Environment variable | Constructor parameter | | --- | --- | --- | --- | | **Logging level** | Sets how verbose Logger should be (INFO, by default) | `POWERTOOLS_LOG_LEVEL` | `level` | | **Service** | Sets **service** key that will be present across all log statements | `POWERTOOLS_SERVICE_NAME` | `service` | There are some [other environment variables](#environment-variables) which can be set to modify Logger's settings at a global scope. ``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: Powertools for AWS Lambda (Python) version Globals: Function: Timeout: 5 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_SERVICE_NAME: payment POWERTOOLS_LOG_LEVEL: INFO Layers: # Find the latest Layer version in the official documentation # https://docs.powertools.aws.dev/lambda/python/latest/#lambda-layer - !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15 Resources: LoggerLambdaHandlerExample: Type: AWS::Serverless::Function Properties: CodeUri: ../src Handler: inject_lambda_context.handler ``` ### Standard structured keys Logger adds the following keys to your structured logs: | Key | Example | Note | | --- | --- | --- | | **level**: `str` | `INFO` | Logging level | | **location**: `str` | `collect.handler:1` | Source code location where statement was executed | | **message**: `Any` | `Collecting payment` | Unserializable JSON values are cast as `str` | | **timestamp**: `str` | `2021-05-03 10:20:19,650+0000` | Timestamp with milliseconds; uses the default AWS Lambda timezone (UTC) unless configured otherwise | | **service**: `str` | `payment` | Service name defined, by default `service_undefined` | | **xray_trace_id**: `str` | `1-5759e988-bd862e3fe1be46a994272793` | When [tracing is enabled](https://docs.aws.amazon.com/lambda/latest/dg/services-xray.html), it shows X-Ray Trace ID | | **sampling_rate**: `float` | `0.1` | When enabled, it shows sampling rate in percentage e.g. 10% | | **exception_name**: `str` | `ValueError` | When `logger.exception` is used and there is an exception | | **exception**: `str` | `Traceback (most recent call last)..` | When `logger.exception` is used and there is an exception | ### Capturing Lambda context info You can enrich your structured logs with key Lambda context information via `inject_lambda_context`.
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> str: logger.info("Collecting payment") # You can log entire objects too logger.info({"operation": "collect_payment", "charge_id": event["charge_id"]}) return "hello world" ``` ``` [ { "level": "INFO", "location": "collect.handler:9", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "cold_start": true, "function_name": "test", "function_memory_size": 128, "function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test", "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72" }, { "level": "INFO", "location": "collect.handler:12", "message": { "operation": "collect_payment", "charge_id": "ch_AZFlk2345C0" }, "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "cold_start": true, "function_name": "test", "function_memory_size": 128, "function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test", "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72" } ] ``` When used, this will include the following keys: | Key | Example | | --- | --- | | **cold_start**: `bool` | `false` | | **function_name** `str` | `example-powertools-HelloWorldFunction-1P1Z6B39FLU73` | | **function_memory_size**: `int` | `128` | | **function_arn**: `str` | `arn:aws:lambda:eu-west-1:012345678910:function:example-powertools-HelloWorldFunction-1P1Z6B39FLU73` | | **function_request_id**: `str` | `899856cb-83d1-40d7-8611-9e78f15f32f4` | ### Logging incoming event When debugging in non-production environments, you can instruct Logger to log the incoming event with `log_event` param or via `POWERTOOLS_LOGGER_LOG_EVENT` env var. Warning This is disabled by default to prevent sensitive info being logged ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @logger.inject_lambda_context(log_event=True) def lambda_handler(event: dict, context: LambdaContext) -> str: return "hello world" ``` ### Setting a Correlation ID You can set a Correlation ID using `correlation_id_path` param by passing a [JMESPath expression](https://jmespath.org/tutorial.html), including [our custom JMESPath Functions](../../utilities/jmespath_functions/#powertools_json-function). Tip You can retrieve correlation IDs via `get_correlation_id` method. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @logger.inject_lambda_context(correlation_id_path="headers.my_request_id_header") def lambda_handler(event: dict, context: LambdaContext) -> str: logger.debug(f"Correlation ID => {logger.get_correlation_id()}") logger.info("Collecting payment") return "hello world" ``` ``` { "headers": { "my_request_id_header": "correlation_id_value" } } ``` ``` { "level": "INFO", "location": "collect.handler:10", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "cold_start": true, "function_name": "test", "function_memory_size": 128, "function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test", "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72", "correlation_id": "correlation_id_value" } ``` #### set_correlation_id method You can also use `set_correlation_id` method to inject it anywhere else in your code. 
The example below uses the [Event Source Data Classes utility](../../utilities/data_classes/) to easily access event properties. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import APIGatewayProxyEvent from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def lambda_handler(event: dict, context: LambdaContext) -> str: request = APIGatewayProxyEvent(event) logger.set_correlation_id(request.request_context.request_id) logger.info("Collecting payment") return "hello world" ``` ``` { "requestContext": { "requestId": "correlation_id_value" } } ``` ``` { "level": "INFO", "location": "collect.handler:13", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "correlation_id": "correlation_id_value" } ``` #### Known correlation IDs To ease routine tasks like extracting correlation ID from popular event sources, we provide [built-in JMESPath expressions](#built-in-correlation-id-expressions). ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) def lambda_handler(event: dict, context: LambdaContext) -> str: logger.debug(f"Correlation ID => {logger.get_correlation_id()}") logger.info("Collecting payment") return "hello world" ``` ``` { "requestContext": { "requestId": "correlation_id_value" } } ``` ``` { "level": "INFO", "location": "collect.handler:11", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "cold_start": true, "function_name": "test", "function_memory_size": 128, "function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test", "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72", "correlation_id": "correlation_id_value" } ``` ### Appending additional keys Info: Custom keys are persisted across warm invocations Always set additional keys as part of your handler to ensure they have the latest value, or explicitly clear them with [`clear_state=True`](#clearing-all-state). You can append additional keys using any of the following mechanisms: - New keys persist across all future log messages via `append_keys` method - Add additional keys on a per log message basis as a keyword=value, or via `extra` parameter - New keys persist across all future logs in a specific thread via `thread_safe_append_keys` method. Check [Working with thread-safe keys](#working-with-thread-safe-keys) section. #### append_keys method Warning `append_keys` is not thread-safe, use [thread_safe_append_keys](#appending-thread-safe-additional-keys) instead You can append your own keys to your existing Logger via the `append_keys(**additional_key_values)` method.
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def lambda_handler(event: dict, context: LambdaContext) -> str: order_id = event.get("order_id") # this will ensure order_id key always has the latest value before logging # alternative, you can use `clear_state=True` parameter in @inject_lambda_context logger.append_keys(order_id=order_id) logger.info("Collecting payment") return "hello world" ``` ``` { "level": "INFO", "location": "collect.handler:11", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "order_id": "order_id_value" } ``` Tip: Logger will automatically reject any key with a None value If you conditionally add keys depending on the payload, you can follow the example above. This example will add `order_id` if its value is not empty, and in subsequent invocations where `order_id` might not be present it'll remove it from the Logger. #### append_context_keys method Warning `append_context_keys` is not thread-safe. The append_context_keys method allows temporary modification of a Logger instance's context without creating a new logger. It's useful for adding context keys to specific workflows while maintaining the logger's overall state and simplicity. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger(service="example_service") def lambda_handler(event: dict, context: LambdaContext) -> str: with logger.append_context_keys(user_id="123", operation="process"): logger.info("Log with context") logger.info("Log without context") return "hello world" ``` ``` [ { "level": "INFO", "location": "lambda_handler:8", "message": "Log with context", "timestamp": "2024-03-21T10:30:00.123Z", "service": "example_service", "user_id": "123", "operation": "process" }, { "level": "INFO", "location": "lambda_handler:10", "message": "Log without context", "timestamp": "2024-03-21T10:30:00.124Z", "service": "example_service" } ] ``` #### ephemeral metadata You can pass an arbitrary number of keyword arguments (kwargs) to all log level's methods, e.g. `logger.info, logger.warning`. Two common use cases for this feature is to enrich log statements with additional metadata, or only add certain keys conditionally. Any keyword argument added will not be persisted in subsequent messages. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def lambda_handler(event: dict, context: LambdaContext) -> str: logger.info("Collecting payment", request_id="1123") return "hello world" ``` ``` { "level": "INFO", "location": "collect.handler:8", "message": "Collecting payment", "timestamp": "2022-11-26 11:47:12,494+0000", "service": "payment", "request_id": "1123" } ``` #### extra parameter Extra parameter is available for all log levels' methods, as implemented in the standard logging library - e.g. `logger.info, logger.warning`. It accepts any dictionary, and all keyword arguments will be added as part of the root structure of the logs for that log statement. Any keyword argument added using `extra` will not be persisted in subsequent messages. 
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def lambda_handler(event: dict, context: LambdaContext) -> str: fields = {"request_id": "1123"} logger.info("Collecting payment", extra=fields) return "hello world" ``` ``` { "level": "INFO", "location": "collect.handler:9", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "request_id": "1123" } ``` ### Removing additional keys You can remove additional keys using either mechanism: - Remove new keys across all future log messages via `remove_keys` method - Remove keys persist across all future logs in a specific thread via `thread_safe_remove_keys` method. Check [Working with thread-safe keys](#working-with-thread-safe-keys) section. Danger Keys added by `append_keys` can only be removed by `remove_keys` and thread-local keys added by `thread_safe_append_keys` can only be removed by `thread_safe_remove_keys` or `thread_safe_clear_keys`. Thread-local and normal logger keys are distinct values and can't be manipulated interchangeably. #### remove_keys method You can remove any additional key from Logger state using `remove_keys`. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def lambda_handler(event: dict, context: LambdaContext) -> str: logger.append_keys(sample_key="value") logger.info("Collecting payment") logger.remove_keys(["sample_key"]) logger.info("Collecting payment without sample key") return "hello world" ``` ``` [ { "level": "INFO", "location": "collect.handler:9", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "sample_key": "value" }, { "level": "INFO", "location": "collect.handler:12", "message": "Collecting payment without sample key", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment" } ] ``` #### Clearing all state ##### Decorator with clear_state Logger is commonly initialized in the global scope. Due to [Lambda Execution Context reuse](https://docs.aws.amazon.com/lambda/latest/dg/runtimes-context.html), this means that custom keys can be persisted across invocations. If you want all custom keys to be deleted, you can use `clear_state=True` param in `inject_lambda_context` decorator. Tip: When is this useful? It is useful when you add multiple custom keys conditionally, instead of setting a default `None` value if not present. Any key with `None` value is automatically removed by Logger. Danger: This can have unintended side effects if you use Layers Lambda Layers code is imported before the Lambda handler. When a Lambda function starts, it first imports and executes all code in the Layers (including any global scope code) before proceeding to the function's own code. This means that `clear_state=True` will instruct Logger to remove any keys previously added before Lambda handler execution proceeds. You can either avoid running any code as part of Lambda Layers global scope, or override keys with their latest value as part of handler's execution. 
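For the second mitigation, here is a minimal sketch (the `tenant_id` key is hypothetical) of re-appending a key inside the handler, so any value appended during Layer import cannot leak into the current invocation:

```python
from aws_lambda_powertools import Logger
from aws_lambda_powertools.utilities.typing import LambdaContext

logger = Logger(service="payment")

@logger.inject_lambda_context(clear_state=True)
def lambda_handler(event: dict, context: LambdaContext) -> str:
    # clear_state=True wipes anything appended at import time (including by Layers);
    # re-appending here guarantees the key reflects this invocation's value
    logger.append_keys(tenant_id=event.get("tenant_id"))
    logger.info("Collecting payment")
    return "hello world"
```

The example below also shows how conditionally appended keys behave across two invocations when `clear_state=True` is set.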
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @logger.inject_lambda_context(clear_state=True) def lambda_handler(event: dict, context: LambdaContext) -> str: if event.get("special_key"): # Should only be available in the first request log # as the second request doesn't contain `special_key` logger.append_keys(debugging_key="value") logger.info("Collecting payment") return "hello world" ``` ``` { "level": "INFO", "location": "collect.handler:10", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "special_key": "debug_key", "cold_start": true, "function_name": "test", "function_memory_size": 128, "function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test", "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72" } ``` ``` { "level": "INFO", "location": "collect.handler:10", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "cold_start": false, "function_name": "test", "function_memory_size": 128, "function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test", "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72" } ``` ##### clear_state method You can call `clear_state()` as a method explicitly within your code to clear appended keys at any point during the execution of your Lambda invocation. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger(service="payment", level="DEBUG") def lambda_handler(event: dict, context: LambdaContext) -> str: try: logger.append_keys(order_id="12345") logger.info("Starting order processing") finally: logger.info("Final state before clearing") logger.clear_state() logger.info("State after clearing - only show default keys") return "Completed" ``` ``` { "logs": [ { "level": "INFO", "location": "lambda_handler:122", "message": "Starting order processing", "timestamp": "2025-01-30 13:56:03,157-0300", "service": "payment", "order_id": "12345" }, { "level": "INFO", "location": "lambda_handler:124", "message": "Final state before clearing", "timestamp": "2025-01-30 13:56:03,157-0300", "service": "payment", "order_id": "12345" } ] } ``` ``` { "level": "INFO", "location": "lambda_handler:126", "message": "State after clearing - only show default keys", "timestamp": "2025-01-30 13:56:03,158-0300", "service": "payment" } ``` ### Accessing currently configured keys You can view all currently configured keys from the Logger state using the `get_current_keys()` method. This method is useful when you need to avoid overwriting keys that are already configured. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> str: logger.info("Collecting payment") if "order" not in logger.get_current_keys(): logger.append_keys(order=event.get("order")) return "hello world" ``` Info For thread-local additional logging keys, use `get_current_thread_keys` instead ### Log levels The default log level is `INFO`. It can be set using the `level` constructor option, `setLevel()` method or by using the `POWERTOOLS_LOG_LEVEL` environment variable. 
We support the following log levels: | Level | Numeric value | Standard logging | | --- | --- | --- | | `DEBUG` | 10 | `logging.DEBUG` | | `INFO` | 20 | `logging.INFO` | | `WARNING` | 30 | `logging.WARNING` | | `ERROR` | 40 | `logging.ERROR` | | `CRITICAL` | 50 | `logging.CRITICAL` | If you want to access the numeric value of the current log level, you can use the `log_level` property. For example, if the current log level is `INFO`, `logger.log_level` property will return `20`. ``` from aws_lambda_powertools import Logger logger = Logger(level="ERROR") print(logger.log_level) # returns 40 (ERROR) ``` ``` from aws_lambda_powertools import Logger logger = Logger() # print default log level print(logger.log_level) # returns 20 (INFO) # Setting programmatic log level logger.setLevel("DEBUG") # print new log level print(logger.log_level) # returns 10 (DEBUG) ``` #### AWS Lambda Advanced Logging Controls (ALC) When is it useful? When you want to set a logging policy to drop informational or verbose logs for one or all AWS Lambda functions, regardless of runtime and logger used. With [AWS Lambda Advanced Logging Controls (ALC)](https://docs.aws.amazon.com/lambda/latest/dg/monitoring-cloudwatchlogs.html#monitoring-cloudwatchlogs-advanced), you can enforce a minimum log level that Lambda will accept from your application code. When enabled, you should keep `Logger` and ALC log level in sync to avoid data loss. Here's a sequence diagram to demonstrate how ALC will drop both `INFO` and `DEBUG` logs emitted from `Logger`, when ALC log level is stricter than `Logger`. ``` sequenceDiagram title Lambda ALC allows WARN logs only participant Lambda service participant Lambda function participant Application Logger Note over Lambda service: AWS_LAMBDA_LOG_LEVEL="WARN" Note over Application Logger: POWERTOOLS_LOG_LEVEL="DEBUG" Lambda service->>Lambda function: Invoke (event) Lambda function->>Lambda function: Calls handler Lambda function->>Application Logger: logger.error("Something happened") Lambda function-->>Application Logger: logger.debug("Something happened") Lambda function-->>Application Logger: logger.info("Something happened") Lambda service--xLambda service: DROP INFO and DEBUG logs Lambda service->>CloudWatch Logs: Ingest error logs ``` **Priority of log level settings in Powertools for AWS Lambda** We prioritise log level settings in this order: 1. `AWS_LAMBDA_LOG_LEVEL` environment variable 1. Explicit log level in `Logger` constructor, or by calling the `logger.setLevel()` method 1. `POWERTOOLS_LOG_LEVEL` environment variable If you set `Logger` level lower than ALC, we will emit a warning informing you that your messages will be discarded by Lambda. > **NOTE** > > With ALC enabled, we are unable to increase the minimum log level below the `AWS_LAMBDA_LOG_LEVEL` environment variable value, see [AWS Lambda service documentation](https://docs.aws.amazon.com/lambda/latest/dg/monitoring-cloudwatchlogs.html#monitoring-cloudwatchlogs-log-level) for more details. ### Logging exceptions Use `logger.exception` method to log contextual information about exceptions. Logger will include `exception_name` and `exception` keys to aid troubleshooting and error enumeration. Tip You can use your preferred Log Analytics tool to enumerate and visualize exceptions across all your services using `exception_name` key. 
``` import requests from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext ENDPOINT = "https://httpbin.org/status/500" logger = Logger(serialize_stacktrace=False) def lambda_handler(event: dict, context: LambdaContext) -> str: try: ret = requests.get(ENDPOINT) ret.raise_for_status() except requests.HTTPError as e: logger.exception("Received a HTTP 5xx error") raise RuntimeError("Unable to fullfil request") from e return "hello world" ``` ``` { "level": "ERROR", "location": "collect.handler:15", "message": "Received a HTTP 5xx error", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "exception_name": "RuntimeError", "exception": "Traceback (most recent call last):\n File \"\", line 2, in RuntimeError: Unable to fullfil request" } ``` #### Uncaught exceptions CAUTION: some users reported a problem that causes this functionality not to work in the Lambda runtime. We recommend that you don't use this feature for the time being. Logger can optionally log uncaught exceptions by setting `log_uncaught_exceptions=True` at initialization. Logger will replace any exception hook previously registered via [sys.excepthook](https://docs.python.org/3/library/sys.html#sys.excepthook). What are uncaught exceptions? It's any raised exception that wasn't handled by the [`except` statement](https://docs.python.org/3.9/tutorial/errors.html#handling-exceptions), leading a Python program to a non-successful exit. They are typically raised intentionally to signal a problem (`raise ValueError`), or a propagated exception from elsewhere in your code that you didn't handle it willingly or not (`KeyError`, `jsonDecoderError`, etc.). ``` import requests from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext ENDPOINT = "http://httpbin.org/status/500" logger = Logger(log_uncaught_exceptions=True) def lambda_handler(event: dict, context: LambdaContext) -> str: ret = requests.get(ENDPOINT) # HTTP 4xx/5xx status will lead to requests.HTTPError # Logger will log this exception before this program exits non-successfully ret.raise_for_status() return "hello world" ``` ``` { "level": "ERROR", "location": "log_uncaught_exception_hook:756", "message": "500 Server Error: INTERNAL SERVER ERROR for url: http://httpbin.org/status/500", "timestamp": "2022-11-16 13:51:29,198+0000", "service": "payment", "exception": "Traceback (most recent call last):\n File \"\", line 52, in \n handler({}, {})\n File \"\", line 17, in handler\n ret.raise_for_status()\n File \"/lib/python3.9/site-packages/requests/models.py\", line 1021, in raise_for_status\n raise HTTPError(http_error_msg, response=self)\nrequests.exceptions.HTTPError: 500 Server Error: INTERNAL SERVER ERROR for url: http://httpbin.org/status/500", "exception_name": "HTTPError" } ``` #### Stack trace logging By default, the Logger will automatically include the full stack trace in JSON format when using `logger.exception`. If you want to disable this feature, set `serialize_stacktrace=False` during initialization." 
``` import requests from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext ENDPOINT = "https://httpbin.org/status/500" logger = Logger(serialize_stacktrace=True) def lambda_handler(event: dict, context: LambdaContext) -> str: try: ret = requests.get(ENDPOINT) ret.raise_for_status() except requests.HTTPError as e: logger.exception(e) raise RuntimeError("Unable to fullfil request") from e return "hello world" ``` ``` { "level":"ERROR", "location":"lambda_handler:16", "message":"500 Server Error: INTERNAL SERVER ERROR for url: http://httpbin.org/status/500", "timestamp":"2023-10-09 17:47:50,191+0000", "service":"service_undefined", "exception":"Traceback (most recent call last):\n File \"/var/task/app.py\", line 14, in lambda_handler\n ret.raise_for_status()\n File \"/var/task/requests/models.py\", line 1021, in raise_for_status\n raise HTTPError(http_error_msg, response=self)\nrequests.exceptions.HTTPError: 500 Server Error: INTERNAL SERVER ERROR for url: http://httpbin.org/status/500", "exception_name":"HTTPError", "stack_trace":{ "type":"HTTPError", "value":"500 Server Error: INTERNAL SERVER ERROR for url: http://httpbin.org/status/500", "module":"requests.exceptions", "frames":[ { "file":"/var/task/app.py", "line":14, "function":"lambda_handler", "statement":"ret.raise_for_status()" }, { "file":"/var/task/requests/models.py", "line":1021, "function":"raise_for_status", "statement":"raise HTTPError(http_error_msg, response=self)" } ] } } ``` #### Adding exception notes You can add notes to exceptions, which `logger.exception` propagates via a new `exception_notes` key in the log line. This works only in [Python 3.11 and later](https://peps.python.org/pep-0678/). ``` import requests from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext ENDPOINT = "https://httpbin.org/status/500" logger = Logger(serialize_stacktrace=False) def lambda_handler(event: dict, context: LambdaContext) -> str: try: ret = requests.get(ENDPOINT) ret.raise_for_status() except requests.HTTPError as e: e.add_note("Can't connect to the endpoint") # type: ignore[attr-defined] logger.exception(e) raise RuntimeError("Unable to fullfil request") from e return "hello world" ``` ``` { "level": "ERROR", "location": "collect.handler:15", "message": "Received a HTTP 5xx error", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "exception_name": "RuntimeError", "exception": "Traceback (most recent call last):\n File \"\", line 2, in RuntimeError: Unable to fullfil request", "exception_notes":[ "Can't connect to the endpoint" ] } ``` ### Date formatting Logger uses Python's standard logging date format with the addition of timezone: `2021-05-03 11:47:12,494+0000`. You can easily change the date format using one of the following parameters: - **`datefmt`**. You can pass any [strftime format codes](https://strftime.org/). Use `%F` if you need milliseconds. - **`use_rfc3339`**. This flag will use a format compliant with both RFC3339 and ISO8601: `2022-10-27T16:27:43.738+00:00` Prefer using [datetime string formats](https://docs.python.org/3/library/datetime.html#strftime-and-strptime-format-codes)? Use `use_datetime_directive` flag along with `datefmt` to instruct Logger to use `datetime` instead of `time.strftime`. 
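As a sketch of that last option (the format string is illustrative; `%f` is the standard `datetime` microseconds directive, hence the flag):

```python
from aws_lambda_powertools import Logger

# %f (microseconds) only exists in datetime's strftime, so we opt into the datetime path
date_format = "%Y-%m-%dT%H:%M:%S.%f%z"

logger = Logger(service="payment", datefmt=date_format, use_datetime_directive=True)
logger.info("Collecting payment")
# timestamp would look similar to "2022-10-28T14:35:03.210843+0000"
```

The example below shows the other two options, `use_rfc3339` and a plain `datefmt`, side by side.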
``` from aws_lambda_powertools import Logger date_format = "%m/%d/%Y %I:%M:%S %p" logger = Logger(service="payment", use_rfc3339=True) logger.info("Collecting payment") logger_custom_format = Logger(service="loyalty", datefmt=date_format) logger_custom_format.info("Calculating points") ``` ``` [ { "level": "INFO", "location": ":6", "message": "Collecting payment", "timestamp": "2022-10-28T14:35:03.210+00:00", "service": "payment" }, { "level": "INFO", "location": ":9", "message": "Calculating points", "timestamp": "10/28/2022 02:35:03 PM", "service": "loyalty" } ] ``` ### Environment variables The following environment variables are available to configure Logger at a global scope: | Setting | Description | Environment variable | Default | | --- | --- | --- | --- | | **Event Logging** | Whether to log the incoming event. | `POWERTOOLS_LOGGER_LOG_EVENT` | `false` | | **Debug Sample Rate** | Sets the debug log sampling. | `POWERTOOLS_LOGGER_SAMPLE_RATE` | `0` | | **Disable Deduplication** | Disables log deduplication filter protection to use Pytest Live Log feature. | `POWERTOOLS_LOG_DEDUPLICATION_DISABLED` | `false` | | **TZ** | Sets timezone when using Logger, e.g., `US/Eastern`. Timezone defaults to UTC when `TZ` is not set | `TZ` | `None` (UTC) | [`POWERTOOLS_LOGGER_LOG_EVENT`](#logging-incoming-event) can also be set on a per-method basis, and [`POWERTOOLS_LOGGER_SAMPLE_RATE`](#sampling-debug-logs) on a per-instance basis. These parameter values will override the environment variable value. ## Advanced ### Buffering logs Log buffering enables you to buffer logs for a specific request or invocation. Enable log buffering by passing a `LoggerBufferConfig` via the `buffer_config` parameter when initializing a Logger instance. You can buffer logs at the `WARNING`, `INFO` or `DEBUG` level, and flush them automatically on error or manually as needed. This is useful when you want to reduce the number of log messages emitted while still having detailed logs when needed, such as when troubleshooting issues. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.logging.buffer import LoggerBufferConfig from aws_lambda_powertools.utilities.typing import LambdaContext logger_buffer_config = LoggerBufferConfig(max_bytes=20480, flush_on_error_log=True) logger = Logger(level="INFO", buffer_config=logger_buffer_config) def lambda_handler(event: dict, context: LambdaContext): logger.debug("a debug log") # this is buffered logger.info("an info log") # this is not buffered # do stuff logger.flush_buffer() ``` #### Configuring the buffer When configuring log buffering, you have options to fine-tune how logs are captured, stored, and emitted. You can configure the following parameters in the `LoggerBufferConfig` constructor: | Parameter | Description | Configuration | | --- | --- | --- | | `max_bytes` | Maximum size of the log buffer in bytes | `int` (default: 20480 bytes) | | `buffer_at_verbosity` | Minimum log level to buffer | `DEBUG`, `INFO`, `WARNING` | | `flush_on_error_log` | Automatically flush buffer when an error occurs | `True` (default), `False` | When `flush_on_error_log` is enabled, it automatically flushes for `logger.exception()`, `logger.error()`, and `logger.critical()` statements. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.logging.buffer import LoggerBufferConfig from aws_lambda_powertools.utilities.typing import LambdaContext logger_buffer_config = LoggerBufferConfig(buffer_at_verbosity="WARNING") # (1)!
logger = Logger(level="INFO", buffer_config=logger_buffer_config) def lambda_handler(event: dict, context: LambdaContext): logger.warning("a warning log") # this is buffered logger.info("an info log") # this is buffered logger.debug("a debug log") # this is buffered # do stuff logger.flush_buffer() ``` 1. Setting `buffer_at_verbosity="WARNING"` configures log buffering for `WARNING` and lower severity levels (`INFO`, `DEBUG`). ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.logging.buffer import LoggerBufferConfig from aws_lambda_powertools.utilities.typing import LambdaContext logger_buffer_config = LoggerBufferConfig(flush_on_error_log=False) # (1)! logger = Logger(level="INFO", buffer_config=logger_buffer_config) class MyException(Exception): pass def lambda_handler(event: dict, context: LambdaContext): logger.debug("a debug log") # this is buffered # do stuff try: raise MyException except MyException as error: logger.error("An error occurred", exc_info=error) # Logs won't be flushed here # Need to flush logs manually logger.flush_buffer() ``` 1. Disabling `flush_on_error_log` will not flush the buffer when logging an error. This is useful when you want to control when the buffer is flushed by calling the `logger.flush_buffer()` method. #### Flushing on exceptions Use the `@logger.inject_lambda_context` decorator to automatically flush buffered logs when an exception is raised in your Lambda function. This is done by setting the `flush_buffer_on_uncaught_error` option to `True` in the decorator. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.logging.buffer import LoggerBufferConfig from aws_lambda_powertools.utilities.typing import LambdaContext logger_buffer_config = LoggerBufferConfig(max_bytes=20480, flush_on_error_log=False) logger = Logger(level="INFO", buffer_config=logger_buffer_config) class MyException(Exception): pass @logger.inject_lambda_context(flush_buffer_on_uncaught_error=True) def lambda_handler(event: dict, context: LambdaContext): logger.debug("a debug log") # this is buffered # do stuff raise MyException # Logs will be flushed here ``` #### Reutilizing same logger instance If you are using log buffering, we recommend sharing the same Logger instance across your code/modules, so that the same buffer is also shared. This way, you can centralize Logger instance creation and prevent buffer configuration drift. Buffer Inheritance Loggers created with the same `service_name` automatically inherit the buffer configuration from the first initialized logger with a buffer configuration. Child Logger instances inherit their parent's buffer configuration but maintain a separate buffer.
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.logging.buffer import LoggerBufferConfig logger_buffer_config = LoggerBufferConfig(max_bytes=20480, buffer_at_verbosity="WARNING") logger = Logger(level="INFO", buffer_config=logger_buffer_config) ``` ``` from working_with_buffering_logs_creating_instance import logger # reusing same instance from working_with_buffering_logs_reusing_function import my_function from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): logger.debug("a debug log") # this is buffered my_function() logger.flush_buffer() ``` ``` from working_with_buffering_logs_creating_instance import logger # reusing same instance def my_function(): logger.debug("This will be buffered") # do stuff ``` #### Buffering workflows ##### Manual flush ``` sequenceDiagram participant Client participant Lambda participant Logger participant CloudWatch Client->>Lambda: Invoke Lambda Lambda->>Logger: Initialize with DEBUG level buffering Logger-->>Lambda: Logger buffer ready Lambda->>Logger: logger.debug("First debug log") Logger-->>Logger: Buffer first debug log Lambda->>Logger: logger.info("Info log") Logger->>CloudWatch: Directly log info message Lambda->>Logger: logger.debug("Second debug log") Logger-->>Logger: Buffer second debug log Lambda->>Logger: logger.flush_buffer() Logger->>CloudWatch: Emit buffered logs to stdout Lambda->>Client: Return execution result ``` *Flushing buffer manually* ##### Flushing when logging an error ``` sequenceDiagram participant Client participant Lambda participant Logger participant CloudWatch Client->>Lambda: Invoke Lambda Lambda->>Logger: Initialize with DEBUG level buffering Logger-->>Lambda: Logger buffer ready Lambda->>Logger: logger.debug("First log") Logger-->>Logger: Buffer first debug log Lambda->>Logger: logger.debug("Second log") Logger-->>Logger: Buffer second debug log Lambda->>Logger: logger.debug("Third log") Logger-->>Logger: Buffer third debug log Lambda->>Lambda: Exception occurs Lambda->>Logger: logger.error("Error details") Logger->>CloudWatch: Emit buffered debug logs Logger->>CloudWatch: Emit error log Lambda->>Client: Raise exception ``` *Flushing buffer when an error happens* ##### Flushing on exception This works only when decorating your Lambda handler with the decorator `@logger.inject_lambda_context(flush_buffer_on_uncaught_error=True)` ``` sequenceDiagram participant Client participant Lambda participant Logger participant CloudWatch Client->>Lambda: Invoke Lambda Lambda->>Logger: Using decorator Logger-->>Lambda: Logger context injected Lambda->>Logger: logger.debug("First log") Logger-->>Logger: Buffer first debug log Lambda->>Logger: logger.debug("Second log") Logger-->>Logger: Buffer second debug log Lambda->>Lambda: Uncaught Exception Lambda->>CloudWatch: Automatically emit buffered debug logs Lambda->>Client: Raise uncaught exception ``` *Flushing buffer when an uncaught exception happens* #### Buffering FAQs 1. **Does the buffer persist across Lambda invocations?** No, each Lambda invocation has its own buffer. The buffer is initialized when the Lambda function is invoked and is cleared after the function execution completes or when flushed manually. 1. **Are my logs buffered during cold starts?** No, we never buffer logs during cold starts. This is because we want to ensure that logs emitted during this phase are always available for debugging and monitoring purposes. 
The buffer is only used during the execution of the Lambda function. 1. **How can I prevent log buffering from consuming excessive memory?** You can limit the size of the buffer by setting the `max_bytes` option in the `LoggerBufferConfig` constructor parameter. This will ensure that the buffer does not grow indefinitely and consume excessive memory. 1. **What happens if the log buffer reaches its maximum size?** Older logs are removed from the buffer to make room for new logs. This means that if the buffer is full, you may lose some logs if they are not flushed before the buffer reaches its maximum size. When this happens, we emit a warning when flushing the buffer to indicate that some logs have been dropped. 1. **How is the log size of a log line calculated?** The log size is calculated based on the size of the log line in bytes. This includes the size of the log message, any exception (if present), the log line location, additional keys, and the timestamp. 1. **What timestamp is used when I flush the logs?** The timestamp preserves the original time when the log record was created. If you create a log record at 11:00:10 and flush it at 11:00:25, the log line will retain its original timestamp of 11:00:10. 1. **What happens if I try to add a log line that is bigger than max buffer size?** The log will be emitted directly to standard output and not buffered. When this happens, we emit a warning to indicate that the log line was too big to be buffered. 1. **What happens if Lambda times out without flushing the buffer?** Logs that are still in the buffer will be lost. 1. **Do child loggers inherit the buffer?** No, child loggers do not inherit the buffer from their parent logger but only the buffer configuration. This means that if you create a child logger, it will have its own buffer and will not share the buffer with the parent logger. ### Built-in Correlation ID expressions You can use any of the following built-in JMESPath expressions as part of [inject_lambda_context decorator](#setting-a-correlation-id). Note: Any object key named with `-` must be escaped For example, **`request.headers."x-amzn-trace-id"`**. 
| Name | Expression | Description | | --- | --- | --- | | **API_GATEWAY_REST** | `"requestContext.requestId"` | API Gateway REST API request ID | | **API_GATEWAY_HTTP** | `"requestContext.requestId"` | API Gateway HTTP API request ID | | **APPSYNC_RESOLVER** | `'request.headers."x-amzn-trace-id"'` | AppSync X-Ray Trace ID | | **APPLICATION_LOAD_BALANCER** | `'headers."x-amzn-trace-id"'` | ALB X-Ray Trace ID | | **EVENT_BRIDGE** | `"id"` | EventBridge Event ID | ### Working with thread-safe keys #### Appending thread-safe additional keys You can append your own thread-local keys in your existing Logger via the `thread_safe_append_keys` method ``` import threading from typing import List from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def threaded_func(order_id: str): logger.thread_safe_append_keys(order_id=order_id, thread_id=threading.get_ident()) logger.info("Collecting payment") def lambda_handler(event: dict, context: LambdaContext) -> str: order_ids: List[str] = event["order_ids"] threading.Thread(target=threaded_func, args=(order_ids[0],)).start() threading.Thread(target=threaded_func, args=(order_ids[1],)).start() return "hello world" ``` ``` [ { "level": "INFO", "location": "threaded_func:11", "message": "Collecting payment", "timestamp": "2024-09-08 03:04:11,316-0400", "service": "payment", "order_id": "order_id_value_1", "thread_id": "3507187776085958" }, { "level": "INFO", "location": "threaded_func:11", "message": "Collecting payment", "timestamp": "2024-09-08 03:04:11,316-0400", "service": "payment", "order_id": "order_id_value_2", "thread_id": "140718447808512" } ] ``` #### Removing thread-safe additional keys You can remove any additional thread-local keys from Logger using either `thread_safe_remove_keys` or `thread_safe_clear_keys`. Use the `thread_safe_remove_keys` method to remove a list of thread-local keys that were previously added using the `thread_safe_append_keys` method. 
``` import threading from typing import List from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def threaded_func(order_id: str): logger.thread_safe_append_keys(order_id=order_id, thread_id=threading.get_ident()) logger.info("Collecting payment") logger.thread_safe_remove_keys(["order_id"]) logger.info("Exiting thread") def lambda_handler(event: dict, context: LambdaContext) -> str: order_ids: List[str] = event["order_ids"] threading.Thread(target=threaded_func, args=(order_ids[0],)).start() threading.Thread(target=threaded_func, args=(order_ids[1],)).start() return "hello world" ``` ``` [ { "level": "INFO", "location": "threaded_func:11", "message": "Collecting payment", "timestamp": "2024-09-08 12:26:10,648-0400", "service": "payment", "order_id": "order_id_value_1", "thread_id": 140077070292544 }, { "level": "INFO", "location": "threaded_func:11", "message": "Collecting payment", "timestamp": "2024-09-08 12:26:10,649-0400", "service": "payment", "order_id": "order_id_value_2", "thread_id": 140077061899840 }, { "level": "INFO", "location": "threaded_func:13", "message": "Exiting thread", "timestamp": "2024-09-08 12:26:10,649-0400", "service": "payment", "thread_id": 140077070292544 }, { "level": "INFO", "location": "threaded_func:13", "message": "Exiting thread", "timestamp": "2024-09-08 12:26:10,649-0400", "service": "payment", "thread_id": 140077061899840 } ] ``` #### Clearing thread-safe additional keys Use the `thread_safe_clear_keys` method to remove all thread-local keys that were previously added using the `thread_safe_append_keys` method. ``` import threading from typing import List from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def threaded_func(order_id: str): logger.thread_safe_append_keys(order_id=order_id, thread_id=threading.get_ident()) logger.info("Collecting payment") logger.thread_safe_clear_keys() logger.info("Exiting thread") def lambda_handler(event: dict, context: LambdaContext) -> str: order_ids: List[str] = event["order_ids"] threading.Thread(target=threaded_func, args=(order_ids[0],)).start() threading.Thread(target=threaded_func, args=(order_ids[1],)).start() return "hello world" ``` ``` [ { "level": "INFO", "location": "threaded_func:11", "message": "Collecting payment", "timestamp": "2024-09-08 12:26:10,648-0400", "service": "payment", "order_id": "order_id_value_1", "thread_id": 140077070292544 }, { "level": "INFO", "location": "threaded_func:11", "message": "Collecting payment", "timestamp": "2024-09-08 12:26:10,649-0400", "service": "payment", "order_id": "order_id_value_2", "thread_id": 140077061899840 }, { "level": "INFO", "location": "threaded_func:13", "message": "Exiting thread", "timestamp": "2024-09-08 12:26:10,649-0400", "service": "payment" }, { "level": "INFO", "location": "threaded_func:13", "message": "Exiting thread", "timestamp": "2024-09-08 12:26:10,649-0400", "service": "payment" } ] ``` #### Accessing thread-safe currently keys You can view all currently thread-local keys from the Logger state using the `thread_safe_get_current_keys()` method. This method is useful when you need to avoid overwriting keys that are already configured. 
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> str: logger.info("Collecting payment") if "order" not in logger.thread_safe_get_current_keys(): logger.thread_safe_append_keys(order=event.get("order")) return "hello world" ``` ### Reusing Logger across your code Similar to [Tracer](../tracer/#reusing-tracer-across-your-code), a new instance that uses the same `service` name will reuse a previous Logger instance. Notice in the CloudWatch Logs output how `payment_id` appears as expected when logging in `collect.py`. ``` from logger_reuse_payment import inject_payment_id from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> str: inject_payment_id(context=event) logger.info("Collecting payment") return "hello world" ``` ``` from aws_lambda_powertools import Logger logger = Logger() def inject_payment_id(context): logger.append_keys(payment_id=context.get("payment_id")) ``` ``` { "level": "INFO", "location": "collect.handler:12", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "cold_start": true, "function_name": "test", "function_memory_size": 128, "function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test", "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72", "payment_id": "968adaae-a211-47af-bda3-eed3ca2c0ed0" } ``` Note: About Child Loggers Coming from standard library, you might be used to use `logging.getLogger(__name__)`. This will create a new instance of a Logger with a different name. In Powertools, you can have the same effect by using `child=True` parameter: `Logger(child=True)`. This creates a new Logger instance named after `service.`. All state changes will be propagated bi-directionally between Child and Parent. For that reason, there could be side effects depending on the order the Child Logger is instantiated, because Child Loggers don't have a handler. For example, if you instantiated a Child Logger and immediately used `logger.append_keys/remove_keys/set_correlation_id` to update logging state, this might fail if the Parent Logger wasn't instantiated. In this scenario, you can either ensure any calls manipulating state are only called when a Parent Logger is instantiated (example above), or refrain from using `child=True` parameter altogether. ### Sampling debug logs Use sampling when you want to dynamically change your log level to **DEBUG** based on a **percentage of the Lambda function invocations**. You can use values ranging from `0.0` to `1` (100%) when setting `POWERTOOLS_LOGGER_SAMPLE_RATE` env var, or `sampling_rate` parameter in Logger. Tip: When is this useful? Log sampling allows you to capture debug information for a fraction of your requests, helping you diagnose rare or intermittent issues without increasing the overall verbosity of your logs. Example: Imagine an e-commerce checkout process where you want to understand rare payment gateway errors. With 10% sampling, you'll log detailed information for a small subset of transactions, making troubleshooting easier without generating excessive logs. The sampling decision happens automatically with each invocation when using `@logger.inject_lambda_context` decorator. 
When not using the decorator, you're in charge of refreshing it via the `refresh_sample_rate_calculation` method. Skipping both may lead to unexpected sampling results. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext # Sample 10% of debug logs e.g. 0.1 logger = Logger(service="payment", sampling_rate=0.1) @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext): logger.debug("Verifying whether order_id is present") logger.info("Collecting payment") return "hello world" ``` ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext # Sample 10% of debug logs e.g. 0.1 logger = Logger(service="payment", sampling_rate=0.1) def lambda_handler(event: dict, context: LambdaContext): logger.debug("Verifying whether order_id is present") logger.info("Collecting payment") logger.refresh_sample_rate_calculation() return "hello world" ``` ``` [ { "level": "DEBUG", "location": "collect.handler:7", "message": "Verifying whether order_id is present", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "cold_start": true, "function_name": "test", "function_memory_size": 128, "function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test", "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72", "sampling_rate": 0.1 }, { "level": "INFO", "location": "collect.handler:7", "message": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494+0000", "service": "payment", "cold_start": true, "function_name": "test", "function_memory_size": 128, "function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test", "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72", "sampling_rate": 0.1 } ] ``` ### LambdaPowertoolsFormatter Logger propagates a few formatting configurations to the built-in `LambdaPowertoolsFormatter` logging formatter. If you prefer to configure it separately, or want to bring this JSON Formatter to another application, these are the supported settings: | Parameter | Description | Default | | --- | --- | --- | | **`json_serializer`** | function to serialize `obj` to a JSON formatted `str` | `json.dumps` | | **`json_deserializer`** | function to deserialize `str`, `bytes`, `bytearray` containing a JSON document to a Python obj | `json.loads` | | **`json_default`** | function to coerce unserializable values, when no custom serializer/deserializer is set | `str` | | **`datefmt`** | string directives (strftime) to format log timestamp | `%Y-%m-%d %H:%M:%S,%F%z`, where `%F` is a custom ms directive | | **`use_datetime_directive`** | format the `datefmt` timestamps using `datetime`, not `time` (also supports the custom `%F` directive for milliseconds) | `False` | | **`utc`** | enforce logging timestamp to UTC (ignore `TZ` environment variable) | `False` | | **`log_record_order`** | set order of log keys when logging | `["level", "location", "message", "timestamp"]` | | **`kwargs`** | key-value to be included in log messages | `None` | Info When `POWERTOOLS_DEV` env var is present and set to `"true"`, Logger's default serializer (`json.dumps`) will pretty-print log messages for easier readability.
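For instance, here is a minimal sketch (the coercion function is hypothetical) of overriding `json_default` so unserializable values such as datetimes are rendered in a custom way instead of the default `str()`:

```python
from datetime import datetime, timezone
from uuid import uuid4

from aws_lambda_powertools import Logger
from aws_lambda_powertools.logging.formatter import LambdaPowertoolsFormatter

def coerce_unserializable(value):
    # Render datetimes as ISO 8601; anything else falls back to str
    if isinstance(value, datetime):
        return value.isoformat()
    return str(value)

formatter = LambdaPowertoolsFormatter(json_default=coerce_unserializable)
logger = Logger(service="payment", logger_formatter=formatter)

# uuid4() and datetime aren't natively JSON serializable, so json_default kicks in
logger.info({"payment_id": uuid4(), "created_at": datetime.now(timezone.utc)})
```

The next snippet configures the same formatter with `utc` and `log_record_order` instead.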
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.logging.formatter import LambdaPowertoolsFormatter # NOTE: Check docs for all available options # https://docs.powertools.aws.dev/lambda/python/latest/core/logger/#lambdapowertoolsformatter formatter = LambdaPowertoolsFormatter(utc=True, log_record_order=["message"]) logger = Logger(service="example", logger_formatter=formatter) ``` ### Observability providers In this context, an observability provider is an [AWS Lambda Partner](https://go.aws/3HtU6CZ) offering a platform for logging, metrics, traces, etc. You can send logs to the observability provider of your choice via [Lambda Extensions](https://aws.amazon.com/blogs/compute/using-aws-lambda-extensions-to-send-logs-to-custom-destinations/). In most cases, you shouldn't need any custom Logger configuration, and logs will be shipped asynchronously without any performance impact. #### Built-in formatters In rare circumstances where JSON logs are not parsed correctly by your provider, we offer built-in formatters to make this transition easier. | Provider | Formatter | Notes | | --- | --- | --- | | Datadog | `DatadogLogFormatter` | Modifies default timestamp to use RFC3339 by default | You can import and use them as any other Logger formatter via the `logger_formatter` parameter: ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.logging.formatters.datadog import DatadogLogFormatter logger = Logger(service="payment", logger_formatter=DatadogLogFormatter()) logger.info("hello") ``` ### Migrating from other Loggers If you're migrating from other Loggers, there are a few key points to be aware of: [Service parameter](#the-service-parameter), [Child Loggers](#child-loggers), [Overriding Log records](#overriding-log-records), and [Logging exceptions](#logging-exceptions). #### The service parameter Service is what defines the Logger name, including what the Lambda function is responsible for, or part of (e.g., payment service). For Logger, the `service` is the logging key customers can use to search log operations for one or more functions - For example, **search for all errors, or messages like X, where service is payment**. #### Child Loggers ``` stateDiagram-v2 direction LR Parent: Logger() Child: Logger(child=True) Parent --> Child: bi-directional updates Note right of Child Both have the same service end note ``` For inheritance, Logger uses the `child` parameter to ensure we don't compete with its parent's config. We name child Loggers following Python's convention: *`{service}`.`{filename}`*. Changes are bidirectional between parent and child loggers. That is, appending a key in a child or parent will ensure both have it. This means having the same `service` name is important when instantiating them. ``` from logging_inheritance_module import inject_payment_id from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext # NOTE: explicit service name matches any new Logger # because we're using POWERTOOLS_SERVICE_NAME env var # but we could equally use the same string as service value, e.g. "payment" logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> str: inject_payment_id(context=event) return "hello world" ``` ``` from aws_lambda_powertools import Logger logger = Logger(child=True) def inject_payment_id(context): logger.append_keys(payment_id=context.get("payment_id")) ``` There are two important side effects when using child loggers: 1. **Service name mismatch**.
Logging messages will be dropped as child loggers don't have logging handlers. - Solution: use the `POWERTOOLS_SERVICE_NAME` env var. Alternatively, use the same explicit service value. 1. **Changing state before a parent is instantiated**. Using `logger.append_keys` or `logger.remove_keys` without a parent Logger will lead to an `OrphanedChildLoggerError` exception. - Solution: always initialize parent Loggers first. Alternatively, move the child's calls to `append_keys`/`remove_keys` to a later stage. ``` from logging_inheritance_module import inject_payment_id from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext # NOTE: explicit service name differs from Child # meaning we will have two Logger instances with different state # and an orphan child logger who won't be able to manipulate state logger = Logger(service="payment") @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> str: inject_payment_id(context=event) return "hello world" ``` ``` from aws_lambda_powertools import Logger logger = Logger(child=True) def inject_payment_id(context): logger.append_keys(payment_id=context.get("payment_id")) ``` #### Overriding Log records You might want to continue to use the same date formatting style, or override `location` to display the `package.function_name:line_number` as you previously had. Logger allows you to either change the format or suppress the following keys at initialization: `location`, `timestamp`, `xray_trace_id`. ``` from aws_lambda_powertools import Logger location_format = "[%(funcName)s] %(module)s" # override location and timestamp format logger = Logger(service="payment", location=location_format) logger.info("Collecting payment") # suppress keys with a None value logger_two = Logger(service="loyalty", location=None) logger_two.info("Calculating points") ``` ``` [ { "level": "INFO", "location": "[<module>] overriding_log_records", "message": "Collecting payment", "timestamp": "2022-10-28 14:40:43,801+0000", "service": "payment" }, { "level": "INFO", "message": "Calculating points", "timestamp": "2022-10-28 14:40:43,801+0000", "service": "loyalty" } ] ``` #### Reordering log keys position You can change the order of [standard Logger keys](#standard-structured-keys) or any keys that will be appended later at runtime via the `log_record_order` parameter. ``` from aws_lambda_powertools import Logger # make message the first key logger = Logger(service="payment", log_record_order=["message"]) # make request_id, which will be added later, the first key logger_two = Logger(service="order", log_record_order=["request_id"]) logger_two.append_keys(request_id="123") logger.info("hello world") logger_two.info("hello universe") ``` ``` [ { "message": "hello world", "level": "INFO", "location": ":11", "timestamp": "2022-06-24 11:25:40,143+0000", "service": "payment" }, { "request_id": "123", "level": "INFO", "location": ":12", "timestamp": "2022-06-24 11:25:40,144+0000", "service": "order", "message": "hello universe" } ] ``` #### Setting timestamp to custom timezone By default, this Logger and the standard logging library emit records with the default AWS Lambda timestamp in **UTC**. If you prefer to log in a specific timezone, you can configure it by setting the `TZ` environment variable. You can do this either as an AWS Lambda environment variable or directly within your Lambda function settings.
[Click here](https://docs.aws.amazon.com/lambda/latest/dg/configuration-envvars.html#configuration-envvars-runtime) for a comprehensive list of available Lambda environment variables. Tip The `TZ` environment variable will be ignored if `utc` is set to `True` ``` import os import time from aws_lambda_powertools import Logger logger_in_utc = Logger(service="payment") logger_in_utc.info("Logging with default AWS Lambda timezone: UTC time") os.environ["TZ"] = "US/Eastern" time.tzset() # (1)! logger = Logger(service="order") logger.info("Logging with US Eastern timezone") ``` 1. If you set TZ in your Lambda function code, `time.tzset()` needs to be called. You don't need it when setting TZ as an AWS Lambda environment variable ``` [ { "level":"INFO", "location":":7", "message":"Logging with default AWS Lambda timezone: UTC time", "timestamp":"2023-10-09 21:33:55,733+0000", "service":"payment" }, { "level":"INFO", "location":":13", "message":"Logging with US Eastern timezone", "timestamp":"2023-10-09 17:33:55,734-0400", "service":"order" } ] ``` #### Custom function for unserializable values By default, Logger uses `str` to handle values non-serializable by JSON. You can override this behavior via the `json_default` parameter by passing a Callable: ``` from datetime import date, datetime from aws_lambda_powertools import Logger def custom_json_default(value: object) -> str: if isinstance(value, (datetime, date)): return value.isoformat() return f"<non-serializable: {type(value).__name__}>" class Unserializable: pass logger = Logger(service="payment", json_default=custom_json_default) logger.info({"ingestion_time": datetime.utcnow(), "serialize_me": Unserializable()}) ``` ``` { "level": "INFO", "location": ":19", "message": { "ingestion_time": "2022-06-24T10:12:09.526365", "serialize_me": "<non-serializable: Unserializable>" }, "timestamp": "2022-06-24 12:12:09,526+0000", "service": "payment" } ``` #### Bring your own handler By default, Logger uses StreamHandler and logs to standard output. You can override this behavior via the `logger_handler` parameter: ``` import logging from pathlib import Path from aws_lambda_powertools import Logger log_file = Path("/tmp/log.json") log_file_handler = logging.FileHandler(filename=log_file) logger = Logger(service="payment", logger_handler=log_file_handler) logger.info("hello world") ``` #### Bring your own formatter By default, Logger uses [LambdaPowertoolsFormatter](#lambdapowertoolsformatter) that persists its custom structure between non-cold start invocations. There could be scenarios where the existing feature set isn't sufficient for your formatting needs. Info The most common use cases are remapping keys by bringing your existing schema, and redacting sensitive information you know upfront. For these, you can override the `serialize` method from [LambdaPowertoolsFormatter](#lambdapowertoolsformatter).
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.logging.formatter import LambdaPowertoolsFormatter from aws_lambda_powertools.logging.types import LogRecord class CustomFormatter(LambdaPowertoolsFormatter): def serialize(self, log: LogRecord) -> str: """Serialize final structured log dict to JSON str""" # in this example, log["message"] is a required field # but we want to remap to "event" and delete "message", hence mypy ignore checks log["event"] = log.pop("message") # type: ignore[typeddict-unknown-key,misc] return self.json_serializer(log) logger = Logger(service="payment", logger_formatter=CustomFormatter()) logger.info("hello") ``` ``` { "level": "INFO", "location": ":16", "timestamp": "2021-12-30 13:41:53,413+0000", "service": "payment", "event": "hello" } ``` The `log` argument is the final log record containing [our standard keys](#standard-structured-keys), optionally [Lambda context keys](#capturing-lambda-context-info), and any custom key you might have added via [append_keys](#append_keys-method) or the [extra parameter](#extra-parameter). For exceptional cases where you want to completely replace our formatter logic, you can subclass `BasePowertoolsFormatter`. Warning You will need to implement `append_keys`, `clear_state`, override `format`, and optionally `get_current_keys`, and `remove_keys` to keep the same feature set Powertools for AWS Lambda (Python) Logger provides. This also means tracking the added logging keys. ``` import json import logging from typing import Any, Dict, Iterable, List, Optional from aws_lambda_powertools import Logger from aws_lambda_powertools.logging.formatter import BasePowertoolsFormatter class CustomFormatter(BasePowertoolsFormatter): def __init__(self, log_record_order: Optional[List[str]] = None, *args, **kwargs): self.log_record_order = log_record_order or ["level", "location", "message", "timestamp"] self.log_format = dict.fromkeys(self.log_record_order) super().__init__(*args, **kwargs) def append_keys(self, **additional_keys): # also used by `inject_lambda_context` decorator self.log_format.update(additional_keys) def current_keys(self) -> Dict[str, Any]: return self.log_format def remove_keys(self, keys: Iterable[str]): for key in keys: self.log_format.pop(key, None) def clear_state(self): self.log_format = dict.fromkeys(self.log_record_order) def format(self, record: logging.LogRecord) -> str: # noqa: A003 """Format logging record as structured JSON str""" return json.dumps( { "event": super().format(record), "timestamp": self.formatTime(record), "my_default_key": "test", **self.log_format, }, ) logger = Logger(service="payment", logger_formatter=CustomFormatter()) @logger.inject_lambda_context def lambda_handler(event, context): logger.info("Collecting payment") ``` ``` { "event": "Collecting payment", "timestamp": "2021-05-03 11:47:12,494", "my_default_key": "test", "cold_start": true, "function_name": "test", "function_memory_size": 128, "function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test", "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72" } ``` #### Bring your own JSON serializer By default, Logger uses `json.dumps` and `json.loads` as serializer and deserializer respectively. There could be scenarios where you are making use of alternative JSON libraries like [orjson](https://github.com/ijl/orjson). 
As parameters don't always translate well between them, you can pass any callable that receives a `dict` and returns a `str`: ``` import functools import orjson from aws_lambda_powertools import Logger custom_serializer = orjson.dumps custom_deserializer = orjson.loads logger = Logger(service="payment", json_serializer=custom_serializer, json_deserializer=custom_deserializer) # NOTE: when using parameters, you can pass a partial custom_serializer_with_parameters = functools.partial(orjson.dumps, option=orjson.OPT_SERIALIZE_NUMPY) logger_two = Logger( service="payment", json_serializer=custom_serializer_with_parameters, json_deserializer=custom_deserializer, ) ``` ## Testing your code ### Inject Lambda Context When unit testing your code that makes use of the `inject_lambda_context` decorator, you need to pass a dummy Lambda Context, or else Logger will fail. This is a Pytest sample that provides the minimum information necessary for Logger to succeed: ``` from dataclasses import dataclass import fake_lambda_context_for_logger_module # sample module for completeness import pytest @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test" aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72" @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def test_lambda_handler(lambda_context): test_event = {"test": "event"} fake_lambda_context_for_logger_module.lambda_handler(test_event, lambda_context) ``` ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> str: logger.info("Collecting payment") return "hello world" ``` Tip Check out the built-in [Pytest caplog fixture](https://docs.pytest.org/en/latest/how-to/logging.html) to assert plain log messages ### Pytest live log feature Pytest Live Log feature duplicates emitted log messages in order to style log statements according to their levels. For this to work, use the `POWERTOOLS_LOG_DEDUPLICATION_DISABLED` env var. ``` POWERTOOLS_LOG_DEDUPLICATION_DISABLED="1" pytest -o log_cli=1 ``` Warning This feature should be used with care, as it explicitly disables our ability to filter propagated messages to the root logger (if configured). ## FAQ ### How can I enable boto3 and botocore library logging? You can enable `botocore` and `boto3` logs by using the `set_stream_logger` method; this method will add a stream handler for the given name and level to the logging module. By default, this logs all boto3 messages to stdout. ``` from typing import Dict, List import boto3 from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext boto3.set_stream_logger() boto3.set_stream_logger("botocore") logger = Logger() client = boto3.client("s3") def lambda_handler(event: Dict, context: LambdaContext) -> List: response = client.list_buckets() return response.get("Buckets", []) ``` ### How can I enable Powertools for AWS Lambda (Python) logging for imported libraries? You can copy the Logger setup to all or a subset of registered external loggers. Use the `copy_config_to_registered_loggers` method to do this. We include the logger `name` attribute for all loggers we copied configuration to, so you can differentiate them. By default, all registered loggers will be modified.
You can change this behavior by providing `include` and `exclude` attributes. You can also provide an optional `log_level` attribute that external top-level loggers will be configured with; by default, it'll use the source logger's log level. You can opt out by using the `ignore_log_level=True` parameter. ``` import logging from aws_lambda_powertools import Logger from aws_lambda_powertools.logging import utils logger = Logger() external_logger = logging.getLogger() utils.copy_config_to_registered_loggers(source_logger=logger) external_logger.info("test message") ``` ### How can I add standard library logging attributes to a log record? The Python standard library log records contain a [large set of attributes](https://docs.python.org/3/library/logging.html#logrecord-attributes); however, only a few are included in Powertools for AWS Lambda (Python) Logger log records by default. You can include any of these logging attributes as key-value arguments (`kwargs`) when instantiating `Logger` or `LambdaPowertoolsFormatter`. You can also add them later anywhere in your code with `append_keys`, or remove them with the `remove_keys` method. ``` from aws_lambda_powertools import Logger logger = Logger(service="payment", name="%(name)s") logger.info("Name should be equal service value") additional_log_attributes = {"process": "%(process)d", "processName": "%(processName)s"} logger.append_keys(**additional_log_attributes) logger.info("This will include process ID and name") logger.remove_keys(["processName"]) # further messages will not include processName ``` ``` [ { "level": "INFO", "location": ":16", "message": "Name should be equal service value", "name": "payment", "service": "payment", "timestamp": "2022-07-01 07:09:46,330+0000" }, { "level": "INFO", "location": ":23", "message": "This will include process ID and name", "name": "payment", "process": "9", "processName": "MainProcess", "service": "payment", "timestamp": "2022-07-01 07:09:46,330+0000" } ] ``` For log records originating from Powertools for AWS Lambda (Python) Logger, the `name` attribute will be the same as `service`; for log records coming from the standard library logger, it will be the name of the logger (i.e., what was used as the name argument to `logging.getLogger`). ### What's the difference between `append_keys` and `extra`? Keys added with `append_keys` will persist across multiple log messages while keys added via `extra` will only be available in a given log message operation. Here's an example where we persist `payment_id` but not `charge_id`. Note that `payment_id` remains in both log messages while `charge_id` is only available in the first message. ``` import os import requests from aws_lambda_powertools import Logger ENDPOINT = os.getenv("PAYMENT_API", "") logger = Logger(service="payment") class PaymentError(Exception): ...
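# NOTE: in the handler below, `append_keys(payment_id=...)` persists the key for
# every subsequent log statement, whereas `extra={"charge_id": ...}` only applies
# to that single `logger.info` call; that is why only `payment_id` appears in both
# output records.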
def lambda_handler(event, context): logger.append_keys(payment_id="123456789") charge_id = event.get("charge_id", "") try: ret = requests.post(url=f"{ENDPOINT}/collect", data={"charge_id": charge_id}) ret.raise_for_status() logger.info("Charge collected successfully", extra={"charge_id": charge_id}) return ret.json() except requests.HTTPError as e: raise PaymentError(f"Unable to collect payment for charge {charge_id}") from e logger.info("goodbye") ``` ``` [ { "level": "INFO", "location": ":22", "message": "Charge collected successfully", "timestamp": "2021-01-12 14:09:10,859", "service": "payment", "sampling_rate": 0.0, "payment_id": "123456789", "charge_id": "75edbad0-0857-4fc9-b547-6180e2f7959b" }, { "level": "INFO", "location": ":27", "message": "goodbye", "timestamp": "2021-01-12 14:09:10,860", "service": "payment", "sampling_rate": 0.0, "payment_id": "123456789" } ] ``` ### How do I aggregate and search Powertools for AWS Lambda (Python) logs across accounts? As of now, ElasticSearch (ELK) or 3rd party solutions are best suited to this task. Please refer to this [discussion for more details](https://github.com/aws-powertools/powertools-lambda-python/issues/460) Metrics creates custom metrics asynchronously by logging metrics to standard output following [Amazon CloudWatch Embedded Metric Format (EMF)](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Embedded_Metric_Format.html). These metrics can be visualized through [Amazon CloudWatch Console](https://console.aws.amazon.com/cloudwatch/). ## Key features - Aggregate up to 100 metrics using a single CloudWatch EMF object (large JSON blob) - Validate against common metric definitions mistakes (metric unit, values, max dimensions, max metrics, etc) - Metrics are created asynchronously by CloudWatch service, no custom stacks needed - Context manager to create a one off metric with a different dimension ## Terminologies If you're new to Amazon CloudWatch, there are five terminologies you must be aware of before using this utility: - **Namespace**. It's the highest level container that will group multiple metrics from multiple services for a given application, for example `ServerlessEcommerce`. - **Dimensions**. Metrics metadata in key-value format. They help you slice and dice metrics visualization, for example `ColdStart` metric by Payment `service`. - **Metric**. It's the name of the metric, for example: `SuccessfulBooking` or `UpdatedBooking`. - **Unit**. It's a value representing the unit of measure for the corresponding metric, for example: `Count` or `Seconds`. - **Resolution**. It's a value representing the storage resolution for the corresponding metric. Metrics can be either Standard or High resolution. Read more [here](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/publishingMetrics.html#high-resolution-metrics). Metric terminology, visually explained ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). Metric has two global settings that will be used across all metrics emitted: | Setting | Description | Environment variable | Constructor parameter | | --- | --- | --- | --- | | **Metric namespace** | Logical container where all metrics will be placed e.g. `ServerlessAirline` | `POWERTOOLS_METRICS_NAMESPACE` | `namespace` | | **Service** | Optionally, sets **service** metric dimension across all metrics e.g. 
`payment` | `POWERTOOLS_SERVICE_NAME` | `service` | Info `POWERTOOLS_METRICS_DISABLED` will not disable default metrics created by AWS services. Tip Use your application or main service as the metric namespace to easily group all metrics. ``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: Powertools for AWS Lambda (Python) version Globals: Function: Timeout: 5 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_SERVICE_NAME: booking POWERTOOLS_METRICS_NAMESPACE: ServerlessAirline POWERTOOLS_METRICS_FUNCTION_NAME: my-function-name Layers: # Find the latest Layer version in the official documentation # https://docs.powertools.aws.dev/lambda/python/latest/#lambda-layer - !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15 Resources: CaptureLambdaHandlerExample: Type: AWS::Serverless::Function Properties: CodeUri: ../src Handler: capture_lambda_handler.handler ``` Note For brevity, all code snippets in this page will rely on environment variables above being set. This ensures we instantiate `metrics = Metrics()` over `metrics = Metrics(service="booking", namespace="ServerlessAirline")`, etc. ### Creating metrics You can create metrics using `add_metric`, and you can create dimensions for all your aggregate metrics using `add_dimension` method. Tip You can initialize Metrics in any other module too. It'll keep track of your aggregate metrics in memory to optimize costs (one blob instead of multiples). ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) ``` ``` import os from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_dimension(name="environment", value=STAGE) metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) ``` Tip: Autocomplete Metric Units `MetricUnit` enum facilitate finding a supported metric unit by CloudWatch. Alternatively, you can pass the value as a string if you already know them *e.g. `unit="Count"`*. Note: Metrics overflow CloudWatch EMF supports a max of 100 metrics per batch. Metrics utility will flush all metrics when adding the 100th metric. Subsequent metrics (101th+) will be aggregated into a new EMF object, for your convenience. Warning: Do not create metrics or dimensions outside the handler Metrics or dimensions added in the global scope will only be added during cold start. Disregard if that's the intended behavior. ### Adding high-resolution metrics You can create [high-resolution metrics](https://aws.amazon.com/about-aws/whats-new/2023/02/amazon-cloudwatch-high-resolution-metric-extraction-structured-logs/) passing `resolution` parameter to `add_metric`. When is it useful? High-resolution metrics are data with a granularity of one second and are very useful in several situations such as telemetry, time series, real-time incident management, and others. 
``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricResolution, MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1, resolution=MetricResolution.High) ``` Tip: Autocomplete Metric Resolutions `MetricResolution` enum facilitates finding a supported metric resolution by CloudWatch. Alternatively, you can pass the values 1 or 60 (must be one of them) as an integer *e.g. `resolution=1`*. ### Adding multi-value metrics You can call `add_metric()` with the same metric name multiple times. The values will be grouped together in a list. ``` import os from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_dimension(name="environment", value=STAGE) metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=1) metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=8) ``` ``` { "_aws": { "Timestamp": 1656685750622, "CloudWatchMetrics": [ { "Namespace": "ServerlessAirline", "Dimensions": [ [ "environment", "service" ] ], "Metrics": [ { "Name": "TurbineReads", "Unit": "Count" } ] } ] }, "environment": "dev", "service": "booking", "TurbineReads": [ 1.0, 8.0 ] } ``` ### Adding default dimensions You can use `set_default_dimensions` method, or `default_dimensions` parameter in `log_metrics` decorator, to persist dimensions across Lambda invocations. If you'd like to remove them at some point, you can use `clear_default_dimensions` method. ``` import os from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") metrics = Metrics() metrics.set_default_dimensions(environment=STAGE, another="one") @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=1) metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=8) ``` ``` import os from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") metrics = Metrics() DEFAULT_DIMENSIONS = {"environment": STAGE, "another": "one"} # ensures metrics are flushed upon request completion/failure @metrics.log_metrics(default_dimensions=DEFAULT_DIMENSIONS) def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=1) metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=8) ``` **Note:** Dimensions with empty values will not be included. ### Changing default timestamp When creating metrics, we use the current timestamp. If you want to change the timestamp of all the metrics you create, utilize the `set_timestamp` function. You can specify a datetime object or an integer representing an epoch timestamp in milliseconds. 
Note that when specifying the timestamp using an integer, it must adhere to the epoch timezone format in milliseconds. Info If you need to use different timestamps across multiple metrics, opt for [single_metric](#working-with-different-timestamp). ``` import datetime from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) metric_timestamp = int((datetime.datetime.now() - datetime.timedelta(days=2)).timestamp() * 1000) metrics.set_timestamp(metric_timestamp) ``` ### Flushing metrics As you finish adding all your metrics, you need to serialize and flush them to standard output. You can do that automatically with the `log_metrics` decorator. This decorator also **validates**, **serializes**, and **flushes** all your metrics. During metrics validation, if no metrics are provided then a warning will be logged, but no exception will be raised. ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) ``` ``` { "_aws": { "Timestamp": 1656686788803, "CloudWatchMetrics": [ { "Namespace": "ServerlessAirline", "Dimensions": [ [ "service" ] ], "Metrics": [ { "Name": "SuccessfulBooking", "Unit": "Count" } ] } ] }, "service": "booking", "SuccessfulBooking": [ 1.0 ] } ``` Tip: Metric validation If metrics are provided, and any of the following criteria are not met, **`SchemaValidationError`** exception will be raised: - Maximum of 29 user-defined dimensions - Namespace is set, and no more than one - Metric units must be [supported by CloudWatch](https://docs.aws.amazon.com/AmazonCloudWatch/latest/APIReference/API_MetricDatum.html) #### Raising SchemaValidationError on empty metrics If you want to ensure at least one metric is always emitted, you can pass `raise_on_empty_metrics` to the **log_metrics** decorator: ``` from aws_lambda_powertools.metrics import Metrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics(raise_on_empty_metrics=True) def lambda_handler(event: dict, context: LambdaContext): # no metrics being created will now raise SchemaValidationError ... ``` Suppressing warning messages on empty metrics If you expect your function to execute without publishing metrics every time, you can suppress the warning with **`warnings.filterwarnings("ignore", "No application metrics to publish*")`**. ### Capturing cold start metric You can optionally capture cold start metrics with `log_metrics` decorator via `capture_cold_start_metric` param. ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics(capture_cold_start_metric=True) def lambda_handler(event: dict, context: LambdaContext): ... 
``` ``` { "_aws": { "Timestamp": 1656687493142, "CloudWatchMetrics": [ { "Namespace": "ServerlessAirline", "Dimensions": [ [ "function_name", "service" ] ], "Metrics": [ { "Name": "ColdStart", "Unit": "Count" } ] } ] }, "function_name": "test", "service": "booking", "ColdStart": [ 1.0 ] } ``` If it's a cold start invocation, this feature will: - Create a separate EMF blob solely containing a metric named `ColdStart` - Add `function_name` and `service` dimensions This has the advantage of keeping cold start metric separate from your application metrics, where you might have unrelated dimensions. Info We do not emit 0 as a value for ColdStart metric for cost reasons. [Let us know](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=feature-request%2C+triage&template=feature_request.md&title=) if you'd prefer a flag to override it. #### Customizing function name for cold start metrics When emitting cold start metrics, the `function_name` dimension defaults to `context.function_name`. If you want to change the value you can set the `function_name` parameter in the metrics constructor, or define the environment variable `POWERTOOLS_METRICS_FUNCTION_NAME`. The priority of the `function_name` dimension value is defined as: 1. `function_name` constructor option 1. `POWERTOOLS_METRICS_FUNCTION_NAME` environment variable 1. `context.function_name` property ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics(function_name="my-function-name") @metrics.log_metrics(capture_cold_start_metric=True) def lambda_handler(event: dict, context: LambdaContext): ... ``` ### Environment variables The following environment variable is available to configure Metrics at a global scope: | Setting | Description | Environment variable | Default | | --- | --- | --- | --- | | **Namespace Name** | Sets **namespace** used for metrics. | `POWERTOOLS_METRICS_NAMESPACE` | `None` | | **Service** | Sets **service** metric dimension across all metrics e.g. `payment` | `POWERTOOLS_SERVICE_NAME` | `None` | | **Function Name** | Function name used as dimension for the **ColdStart** metric. | `POWERTOOLS_METRICS_FUNCTION_NAME` | `None` | | **Disable Powertools Metrics** | **Disables** all metrics emitted by Powertools. | `POWERTOOLS_METRICS_DISABLED` | `None` | `POWERTOOLS_METRICS_NAMESPACE` is also available on a per-instance basis with the `namespace` parameter, which will consequently override the environment variable value. ## Advanced ### Adding metadata You can add high-cardinality data as part of your Metrics log with `add_metadata` method. This is useful when you want to search highly contextual information along with your metrics in your logs. 
Info **This will not be available during metrics visualization** - Use **dimensions** for this purpose ``` from uuid import uuid4 from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) metrics.add_metadata(key="booking_id", value=f"{uuid4()}") ``` ``` { "_aws": { "Timestamp": 1656688250155, "CloudWatchMetrics": [ { "Namespace": "ServerlessAirline", "Dimensions": [ [ "service" ] ], "Metrics": [ { "Name": "SuccessfulBooking", "Unit": "Count" } ] } ] }, "service": "booking", "booking_id": "00347014-341d-4b8e-8421-a89d3d588ab3", "SuccessfulBooking": [ 1.0 ] } ``` ### Single metric CloudWatch EMF uses the same dimensions and timestamp across all your metrics. Use `single_metric` if you have a metric that should have different dimensions or timestamp. #### Working with different dimensions Generally, using different dimensions would be an edge case since you [pay for unique metric](https://aws.amazon.com/cloudwatch/pricing). Keep the following formula in mind: **unique metric = (metric_name + dimension_name + dimension_value)** ``` import os from aws_lambda_powertools import single_metric from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") def lambda_handler(event: dict, context: LambdaContext): with single_metric(name="MySingleMetric", unit=MetricUnit.Count, value=1) as metric: metric.add_dimension(name="environment", value=STAGE) ``` ``` { "_aws": { "Timestamp": 1656689267834, "CloudWatchMetrics": [ { "Namespace": "ServerlessAirline", "Dimensions": [ [ "environment", "service" ] ], "Metrics": [ { "Name": "MySingleMetric", "Unit": "Count" } ] } ] }, "environment": "dev", "service": "booking", "MySingleMetric": [ 1.0 ] } ``` By default it will skip all previously defined dimensions including default dimensions. Use `default_dimensions` keyword argument if you want to reuse default dimensions or specify custom dimensions from a dictionary. ``` import os from aws_lambda_powertools import single_metric from aws_lambda_powertools.metrics import Metrics, MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") metrics = Metrics() metrics.set_default_dimensions(environment=STAGE) def lambda_handler(event: dict, context: LambdaContext): with single_metric( name="RecordsCount", unit=MetricUnit.Count, value=10, default_dimensions=metrics.default_dimensions, ) as metric: metric.add_dimension(name="TableName", value="Users") ``` ``` import os from aws_lambda_powertools import single_metric from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") def lambda_handler(event: dict, context: LambdaContext): with single_metric( name="RecordsCount", unit=MetricUnit.Count, value=10, default_dimensions={"environment": STAGE}, ) as metric: metric.add_dimension(name="TableName", value="Users") ``` #### Working with different timestamp When working with multiple metrics, customers may need different timestamps between them. In such cases, utilize `single_metric` to flush individual metrics with specific timestamps. 
``` from aws_lambda_powertools import Logger, single_metric from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def lambda_handler(event: dict, context: LambdaContext): for record in event: record_id: str = record.get("record_id") amount: int = record.get("amount") timestamp: int = record.get("timestamp") with single_metric(name="Orders", unit=MetricUnit.Count, value=amount, namespace="Powertools") as metric: logger.info(f"Processing record id {record_id}") metric.set_timestamp(timestamp) ``` ``` [ { "record_id": "6ba7b810-9dad-11d1-80b4-00c04fd430c8", "amount": 10, "timestamp": 1648195200000 }, { "record_id": "6ba7b811-9dad-11d1-80b4-00c04fd430c8", "amount": 30, "timestamp": 1648224000000 }, { "record_id": "6ba7b812-9dad-11d1-80b4-00c04fd430c8", "amount": 25, "timestamp": 1648209600000 }, { "record_id": "6ba7b813-9dad-11d1-80b4-00c04fd430c8", "amount": 40, "timestamp": 1648177200000 }, { "record_id": "6ba7b814-9dad-11d1-80b4-00c04fd430c8", "amount": 32, "timestamp": 1648216800000 } ] ``` ### Flushing metrics manually If you are using the [AWS Lambda Web Adapter](https://github.com/awslabs/aws-lambda-web-adapter) project, or a middleware with custom metric logic, you can use `flush_metrics()`. This method will serialize and print all metrics available to standard output, and clear in-memory metrics data. Warning This does not capture Cold Start metrics, and metric data validation still applies. Contrary to the `log_metrics` decorator, you are now also responsible for flushing metrics in the event of an exception. ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() def book_flight(flight_id: str, **kwargs): # logic to book flight ... metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) def lambda_handler(event: dict, context: LambdaContext): try: book_flight(flight_id=event.get("flight_id", "")) finally: metrics.flush_metrics() ``` ### Metrics isolation You can use the `EphemeralMetrics` class when looking to isolate multiple instances of metrics with distinct namespaces and/or dimensions. A typical use case is multi-tenancy, or emitting the same metrics for distinct applications. ``` from aws_lambda_powertools.metrics import EphemeralMetrics, MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = EphemeralMetrics() @metrics.log_metrics def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) ``` **Differences between `EphemeralMetrics` and `Metrics`** `EphemeralMetrics` has only one difference while keeping nearly the exact same set of features: | Feature | Metrics | EphemeralMetrics | | --- | --- | --- | | **Share data across instances** (metrics, dimensions, metadata, etc.) | Yes | - | Why not change the default `Metrics` behaviour to not share data across instances? This is an intentional design to prevent accidental data deduplication or data loss issues due to the [CloudWatch EMF](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Embedded_Metric_Format_Specification.html) metric dimension constraint.
In CloudWatch, there are two metric ingestion mechanisms: [EMF (async)](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Embedded_Metric_Format_Specification.html) and [`PutMetricData` API (sync)](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/cloudwatch.html#CloudWatch.Client.put_metric_data). The former creates metrics asynchronously via CloudWatch Logs, and the latter uses a synchronous and more flexible ingestion API. Key concept CloudWatch [considers a metric unique](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/cloudwatch_concepts.html#Metric) by a combination of metric **name**, metric **namespace**, and zero or more metric **dimensions**. With EMF, metric dimensions are shared with any metrics you define. With the `PutMetricData` API, you can set a [list](https://docs.aws.amazon.com/AmazonCloudWatch/latest/APIReference/API_MetricDatum.html) defining one or more metrics with distinct dimensions. This is a subtle yet important distinction. Imagine you had the following metrics to emit: | Metric Name | Dimension | Intent | | --- | --- | --- | | **SuccessfulBooking** | service="booking", **tenant_id**="sample" | Application metric | | **IntegrationLatency** | service="booking", function_name="sample" | Operational metric | | **ColdStart** | service="booking", function_name="sample" | Operational metric | The `tenant_id` dimension could vary, leading to two common issues: 1. `ColdStart` metric will be created multiple times (N * number of unique `tenant_id` dimension values), despite the `function_name` being the same 1. `IntegrationLatency` metric will also be created multiple times due to `tenant_id` as well as `function_name` (which may or may not be intentional) These issues are exacerbated when you create **(A)** metric dimensions conditionally, **(B)** multiple metrics instances throughout your code instead of reusing them (globals). Subsequent metrics instances will have (or lack) different metric dimensions, resulting in different metrics and data points with the same name. Intentional design to address these scenarios **On 1**, when you enable the [capture_cold_start_metric feature](#capturing-cold-start-metric), we transparently create and flush an additional EMF JSON Blob that is independent from your application metrics. This prevents data pollution. **On 2**, you can use `EphemeralMetrics` to create an additional EMF JSON Blob from your application metric (`SuccessfulBooking`). This ensures that `IntegrationLatency` operational metric data points aren't tied to any dynamic dimension values like `tenant_id`. That is why `Metrics` shares data across instances by default, as that covers 80% of use cases and different personas using Powertools. This allows them to instantiate `Metrics` in multiple places throughout their code - be it a separate file, a middleware, or an abstraction that sets default dimensions. ### Observability providers > An observability provider is an [AWS Lambda Partner](https://docs.aws.amazon.com/lambda/latest/dg/extensions-api-partners.html) offering a platform for logging, metrics, traces, etc. We provide a thin wrapper on top of the most requested observability providers. We strive to keep the UX as similar as possible while keeping our value-add features. Missing your preferred provider? Please create a [feature request](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=feature-request%2Ctriage&projects=&template=feature_request.yml&title=Feature+request%3A+TITLE).
Current providers: | Provider | Notes | | --- | --- | | [Datadog](./datadog) | Uses Datadog SDK and Datadog Lambda Extension by default | ## Testing your code ### Setting environment variables Tip Ignore this section, if: - You are explicitly setting namespace/default dimension via `namespace` and `service` parameters - You're not instantiating `Metrics` in the global namespace For example, `Metrics(namespace="ServerlessAirline", service="booking")` Make sure to set `POWERTOOLS_METRICS_NAMESPACE` and `POWERTOOLS_SERVICE_NAME` before running your tests to prevent failing on `SchemaValidation` exception. You can set it before you run tests or via pytest plugins like [dotenv](https://pypi.org/project/pytest-dotenv/). ``` POWERTOOLS_SERVICE_NAME="booking" POWERTOOLS_METRICS_NAMESPACE="ServerlessAirline" python -m pytest ``` ### Clearing metrics `Metrics` keep metrics in memory across multiple instances. If you need to test this behavior, you can use the following Pytest fixture to ensure metrics are reset incl. cold start: ``` import pytest from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics.provider import cold_start @pytest.fixture(scope="function", autouse=True) def reset_metric_set(): # Clear out every metric data prior to every test metrics = Metrics() metrics.clear_metrics() cold_start.is_cold_start = True # ensure each test has cold start metrics.clear_default_dimensions() # remove persisted default dimensions, if any yield ``` ### Functional testing You can read standard output and assert whether metrics have been flushed. Here's an example using `pytest` with `capsys` built-in fixture: ``` import json import add_metrics def test_log_metrics(capsys): add_metrics.lambda_handler({}, {}) log = capsys.readouterr().out.strip() # remove any extra line metrics_output = json.loads(log) # deserialize JSON str # THEN we should have no exceptions # and a valid EMF object should be flushed correctly assert "SuccessfulBooking" in log # basic string assertion in JSON str assert "SuccessfulBooking" in metrics_output["_aws"]["CloudWatchMetrics"][0]["Metrics"][0]["Name"] ``` ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) ``` This will be needed when using `capture_cold_start_metric=True`, or when both `Metrics` and `single_metric` are used. 
``` import json from dataclasses import dataclass import assert_multiple_emf_blobs_module import pytest @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test" aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72" @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def capture_metrics_output_multiple_emf_objects(capsys): return [json.loads(line.strip()) for line in capsys.readouterr().out.split("\n") if line] def test_log_metrics(capsys, lambda_context: LambdaContext): assert_multiple_emf_blobs_module.lambda_handler({}, lambda_context) cold_start_blob, custom_metrics_blob = capture_metrics_output_multiple_emf_objects(capsys) # Since `capture_cold_start_metric` is used # we should have one JSON blob for cold start metric and one for the application assert cold_start_blob["ColdStart"] == [1.0] assert cold_start_blob["function_name"] == "test" assert "SuccessfulBooking" in custom_metrics_blob ``` ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics(capture_cold_start_metric=True) def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) ``` Tip For more elaborate assertions and comparisons, check out [our functional testing for Metrics utility.](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/tests/functional/metrics/required_dependencies/test_metrics_cloudwatch_emf.py) Tracer is an opinionated thin wrapper for the [AWS X-Ray Python SDK](https://github.com/aws/aws-xray-sdk-python/). ## Key features - Auto capture cold start as annotation, and responses or full exceptions as metadata - Auto-disable when not running in AWS Lambda environment - Support tracing async methods, generators, and context managers - Auto patch supported modules by AWS X-Ray ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). Tracer relies on AWS X-Ray SDK over [OpenTelemetry Distro (ADOT)](https://aws-otel.github.io/docs/getting-started/lambda) for optimal cold start (lower latency). ### Install This is not necessary if you're installing Powertools for AWS Lambda (Python) via [Lambda Layer/SAR](../../#lambda-layer) Add `aws-lambda-powertools[tracer]` as a dependency in your preferred tool: *e.g.*, *requirements.txt*, *pyproject.toml*. This will ensure you have the required dependencies before using Tracer. ### Permissions Before you use this utility, your AWS Lambda function [must have permissions](https://docs.aws.amazon.com/lambda/latest/dg/services-xray.html#services-xray-permissions) to send traces to AWS X-Ray.
``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: Powertools for AWS Lambda (Python) version Globals: Function: Timeout: 5 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_SERVICE_NAME: payment Layers: # Find the latest Layer version in the official documentation # https://docs.powertools.aws.dev/lambda/python/latest/#lambda-layer - !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15 Resources: CaptureLambdaHandlerExample: Type: AWS::Serverless::Function Properties: CodeUri: ../src Handler: capture_lambda_handler.handler ``` ### Lambda handler You can quickly start by initializing `Tracer` and use `capture_lambda_handler` decorator for your Lambda handler. ``` from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() # Sets service via POWERTOOLS_SERVICE_NAME env var # OR tracer = Tracer(service="example") def collect_payment(charge_id: str) -> str: return f"dummy payment collected for charge: {charge_id}" @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") return collect_payment(charge_id=charge_id) ``` `capture_lambda_handler` performs these additional tasks to ease operations: - Creates a `ColdStart` annotation to easily filter traces that have had an initialization overhead - Creates a `Service` annotation if `service` parameter or `POWERTOOLS_SERVICE_NAME` is set - Captures any response, or full exceptions generated by the handler, and include as tracing metadata ### Annotations & Metadata **Annotations** are key-values associated with traces and indexed by AWS X-Ray. You can use them to filter traces and to create [Trace Groups](https://aws.amazon.com/about-aws/whats-new/2018/11/aws-xray-adds-the-ability-to-group-traces/) to slice and dice your transactions. ``` from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() def collect_payment(charge_id: str) -> str: tracer.put_annotation(key="PaymentId", value=charge_id) return f"dummy payment collected for charge: {charge_id}" @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") return collect_payment(charge_id=charge_id) ``` **Metadata** are key-values also associated with traces but not indexed by AWS X-Ray. You can use them to add additional context for an operation using any native object. ``` from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() def collect_payment(charge_id: str) -> str: return f"dummy payment collected for charge: {charge_id}" @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> str: payment_context = { "charge_id": event.get("charge_id", ""), "merchant_id": event.get("merchant_id", ""), "request_id": context.aws_request_id, } payment_context["receipt_id"] = collect_payment(charge_id=payment_context["charge_id"]) tracer.put_metadata(key="payment_response", value=payment_context) return payment_context["receipt_id"] ``` ### Synchronous functions You can trace synchronous functions using the `capture_method` decorator. 
``` from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() @tracer.capture_method def collect_payment(charge_id: str) -> str: tracer.put_annotation(key="PaymentId", value=charge_id) return f"dummy payment collected for charge: {charge_id}" @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") return collect_payment(charge_id=charge_id) ``` Note: Function responses are auto-captured and stored as JSON, by default. Use [capture_response](#disabling-response-auto-capture) parameter to override this behaviour. The serialization is performed by aws-xray-sdk via `jsonpickle` module. This can cause side effects for file-like objects like boto S3 [`StreamingBody`](https://botocore.amazonaws.com/v1/documentation/api/latest/reference/response.html#botocore.response.StreamingBody), where its response will be read only once during serialization. ### Asynchronous and generator functions Warning We do not support asynchronous Lambda handler You can trace asynchronous functions and generator functions (including context managers) using `capture_method`. ``` import asyncio from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() @tracer.capture_method async def collect_payment(charge_id: str) -> str: tracer.put_annotation(key="PaymentId", value=charge_id) await asyncio.sleep(0.5) return f"dummy payment collected for charge: {charge_id}" @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") return asyncio.run(collect_payment(charge_id=charge_id)) ``` ``` import contextlib from collections.abc import Generator from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() @contextlib.contextmanager @tracer.capture_method def collect_payment(charge_id: str) -> Generator[str, None, None]: try: yield f"dummy payment collected for charge: {charge_id}" finally: tracer.put_annotation(key="PaymentId", value=charge_id) @tracer.capture_lambda_handler @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") with collect_payment(charge_id=charge_id) as receipt_id: logger.info(f"Processing payment collection for charge {charge_id} with receipt {receipt_id}") return receipt_id ``` ``` from collections.abc import Generator from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() @tracer.capture_method def collect_payment(charge_id: str) -> Generator[str, None, None]: yield f"dummy payment collected for charge: {charge_id}" @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") return next(collect_payment(charge_id=charge_id)) ``` ### Environment variables The following environment variables are available to configure Tracer at a global scope: | Setting | Description | Environment variable | Default | | --- | --- | --- | --- | | **Disable Tracing** | Explicitly disables all tracing. | `POWERTOOLS_TRACE_DISABLED` | `false` | | **Response Capture** | Captures Lambda or method return as metadata. | `POWERTOOLS_TRACER_CAPTURE_RESPONSE` | `true` | | **Exception Capture** | Captures Lambda or method exception as metadata. 
| `POWERTOOLS_TRACER_CAPTURE_ERROR` | `true` | Both [`POWERTOOLS_TRACER_CAPTURE_RESPONSE`](#disabling-response-auto-capture) and [`POWERTOOLS_TRACER_CAPTURE_ERROR`](#disabling-exception-auto-capture) can be set on a per-method basis, consequently overriding the environment variable value. ## Advanced ### Patching modules Tracer automatically patches all [supported libraries by X-Ray](https://docs.aws.amazon.com/xray/latest/devguide/xray-sdk-python-patching.html) during initialization, by default. Underneath, AWS X-Ray SDK checks whether a supported library has been imported before patching. If you're looking to shave a few microseconds, or milliseconds depending on your function memory configuration, you can patch specific modules using `patch_modules` param: ``` import requests from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext MODULES = ["requests"] tracer = Tracer(patch_modules=MODULES) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> str: ret = requests.get("https://httpbin.org/get") ret.raise_for_status() return ret.json() ``` ### Disabling response auto-capture Use **`capture_response=False`** parameter in both `capture_lambda_handler` and `capture_method` decorators to instruct Tracer **not** to serialize function responses as metadata. Info: This is useful in three common scenarios 1. You might **return sensitive** information you don't want it to be added to your traces 1. You might manipulate **streaming objects that can be read only once**; this prevents subsequent calls from being empty 1. You might return **more than 64K** of data *e.g., `message too long` error* ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() @tracer.capture_method(capture_response=False) def collect_payment(charge_id: str) -> str: tracer.put_annotation(key="PaymentId", value=charge_id) logger.debug("Returning sensitive information....") return f"dummy payment collected for charge: {charge_id}" @tracer.capture_lambda_handler(capture_response=False) def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") return collect_payment(charge_id=charge_id) ``` ``` import os import boto3 from botocore.response import StreamingBody from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.typing import LambdaContext BUCKET = os.getenv("BUCKET_NAME", "") REPORT_KEY = os.getenv("REPORT_KEY", "") tracer = Tracer() logger = Logger() session = boto3.session.Session() s3 = session.client("s3") @tracer.capture_method(capture_response=False) def fetch_payment_report(payment_id: str) -> StreamingBody: ret = s3.get_object(Bucket=BUCKET, Key=f"{REPORT_KEY}/{payment_id}") logger.debug("Returning streaming body from S3 object....") return ret["Body"] @tracer.capture_lambda_handler(capture_response=False) def lambda_handler(event: dict, context: LambdaContext) -> str: payment_id = event.get("payment_id", "") report = fetch_payment_report(payment_id=payment_id) return report.read().decode() ``` ### Disabling exception auto-capture Use **`capture_error=False`** parameter in both `capture_lambda_handler` and `capture_method` decorators to instruct Tracer **not** to serialize exceptions as metadata. 
Info Useful when returning sensitive information in exceptions/stack traces you don't control ``` import os import requests from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() ENDPOINT = os.getenv("PAYMENT_API", "") class PaymentError(Exception): ... @tracer.capture_method(capture_error=False) def collect_payment(charge_id: str) -> dict: try: ret = requests.post(url=f"{ENDPOINT}/collect", data={"charge_id": charge_id}) ret.raise_for_status() return ret.json() except requests.HTTPError as e: raise PaymentError(f"Unable to collect payment for charge {charge_id}") from e @tracer.capture_lambda_handler(capture_error=False) def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") ret = collect_payment(charge_id=charge_id) return ret.get("receipt_id", "") ``` ### Ignoring certain HTTP endpoints You might have endpoints you don't want requests to be traced, perhaps due to the volume of calls or sensitive URLs. You can use `ignore_endpoint` method with the hostname and/or URLs you'd like it to be ignored - globs (`*`) are allowed. ``` import os import requests from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext ENDPOINT = os.getenv("PAYMENT_API", "") IGNORE_URLS = ["/collect", "/refund"] tracer = Tracer() tracer.ignore_endpoint(hostname=ENDPOINT, urls=IGNORE_URLS) tracer.ignore_endpoint(hostname=f"*.{ENDPOINT}", urls=IGNORE_URLS) # `.ENDPOINT` class PaymentError(Exception): ... @tracer.capture_method(capture_error=False) def collect_payment(charge_id: str) -> dict: try: ret = requests.post(url=f"{ENDPOINT}/collect", data={"charge_id": charge_id}) ret.raise_for_status() return ret.json() except requests.HTTPError as e: raise PaymentError(f"Unable to collect payment for charge {charge_id}") from e @tracer.capture_lambda_handler(capture_error=False) def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") ret = collect_payment(charge_id=charge_id) return ret.get("receipt_id", "") ``` ### Tracing aiohttp requests Info This snippet assumes you have aiohttp as a dependency You can use `aiohttp_trace_config` function to create a valid [aiohttp trace_config object](https://docs.aiohttp.org/en/stable/tracing_reference.html). This is necessary since X-Ray utilizes [aiohttp](https://docs.aiohttp.org/en/stable/) trace hooks to capture requests end-to-end. ``` import asyncio import os import aiohttp from aws_lambda_powertools import Tracer from aws_lambda_powertools.tracing import aiohttp_trace_config from aws_lambda_powertools.utilities.typing import LambdaContext ENDPOINT = os.getenv("PAYMENT_API", "") tracer = Tracer() @tracer.capture_method async def collect_payment(charge_id: str) -> dict: async with aiohttp.ClientSession(trace_configs=[aiohttp_trace_config()]) as session: async with session.get(f"{ENDPOINT}/collect") as resp: return await resp.json() @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: charge_id = event.get("charge_id", "") return asyncio.run(collect_payment(charge_id=charge_id)) ``` ### Escape hatch mechanism You can use `tracer.provider` attribute to access all methods provided by AWS X-Ray `xray_recorder` object. 
This is useful when you need a feature available in X-Ray that is not available in the Tracer utility, for example [thread-safe](https://github.com/aws/aws-xray-sdk-python/#user-content-trace-threadpoolexecutor), or [context managers](https://github.com/aws/aws-xray-sdk-python/#user-content-start-a-custom-segmentsubsegment). ``` from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() def collect_payment(charge_id: str) -> str: return f"dummy payment collected for charge: {charge_id}" @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") with tracer.provider.in_subsegment("## collect_payment") as subsegment: subsegment.put_annotation(key="PaymentId", value=charge_id) ret = collect_payment(charge_id=charge_id) subsegment.put_metadata(key="payment_response", value=ret) return ret ``` ### Concurrent asynchronous functions Warning [X-Ray SDK will raise an exception](https://github.com/aws/aws-xray-sdk-python/issues/164) when async functions are run and traced concurrently A safe workaround mechanism is to use `in_subsegment_async` available via Tracer escape hatch (`tracer.provider`). ``` import asyncio from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() async def another_async_task(): async with tracer.provider.in_subsegment_async("## another_async_task") as subsegment: subsegment.put_annotation(key="key", value="value") subsegment.put_metadata(key="key", value="value", namespace="namespace") ... async def another_async_task_2(): async with tracer.provider.in_subsegment_async("## another_async_task_2") as subsegment: subsegment.put_annotation(key="key", value="value") subsegment.put_metadata(key="key", value="value", namespace="namespace") ... async def collect_payment(charge_id: str) -> str: await asyncio.gather(another_async_task(), another_async_task_2()) return f"dummy payment collected for charge: {charge_id}" @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") return asyncio.run(collect_payment(charge_id=charge_id)) ``` ### Reusing Tracer across your code Tracer keeps a copy of its configuration after the first initialization. This is useful for scenarios where you want to use Tracer in more than one location across your code base. Warning: Import order matters when using Lambda Layers or multiple modules **Do not set `auto_patch=False`** when reusing Tracer in Lambda Layers, or in multiple modules. This can result in the first Tracer config being inherited by new instances, and their modules not being patched. Tracer will automatically ignore imported modules that have been patched. ``` from tracer_reuse_module import collect_payment from aws_lambda_powertools import Tracer from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> str: charge_id = event.get("charge_id", "") return collect_payment(charge_id=charge_id) ``` A new instance of Tracer will be created but will reuse the previous Tracer instance configuration, similar to a Singleton. 
``` from aws_lambda_powertools import Tracer tracer = Tracer() @tracer.capture_method def collect_payment(charge_id: str) -> str: return f"dummy payment collected for charge: {charge_id}" ``` ## Testing your code Tracer is disabled by default when not running in the AWS Lambda environment, including AWS SAM CLI and Chalice environments. This means no code changes or environment variables need to be set. ## Tips - Use annotations on key operations to slice and dice traces, create unique views, and create metrics from them via Trace Groups - Use a namespace when adding metadata to group data more easily - Annotations and metadata are added to the currently open subsegment. If you want them in a specific subsegment, use a [context manager](https://github.com/aws/aws-xray-sdk-python/#start-a-custom-segmentsubsegment) via the escape hatch mechanism # Powertools for AWS Lambda (Python) > Event Handler: REST API Event handler for Amazon API Gateway REST and HTTP APIs, Application Load Balancer (ALB), Lambda Function URLs, and VPC Lattice. ## Key Features - Lightweight routing to reduce boilerplate for API Gateway REST/HTTP API, ALB and Lambda Function URLs - Support for CORS, binary and Gzip compression, Decimals JSON encoding and bringing your own JSON serializer - Built-in integration with [Event Source Data Classes utilities](../../../utilities/data_classes/) for self-documented event schema - Works with micro functions (one or a few routes) and monolithic functions (all routes) - Support for OpenAPI and data validation for requests/responses ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). ### Install This is not necessary if you're installing Powertools for AWS Lambda (Python) via [Lambda Layer/SAR](../../../#lambda-layer). **When using the data validation feature**, you need to add `pydantic` as a dependency in your preferred tool *e.g., requirements.txt, pyproject.toml*. At this time, we only support Pydantic V2. ### Required resources If you're using any API Gateway integration, you must have an existing [API Gateway Proxy integration](https://docs.aws.amazon.com/apigateway/latest/developerguide/set-up-lambda-proxy-integrations.html) or [ALB](https://docs.aws.amazon.com/elasticloadbalancing/latest/application/lambda-functions.html) configured to invoke your Lambda function. If you're using [VPC Lattice](https://docs.aws.amazon.com/lambda/latest/dg/services-vpc-lattice.html), you must have a service network configured to invoke your Lambda function. This is the sample infrastructure for API Gateway and Lambda Function URLs we are using for the examples in this documentation. There are no additional permissions or dependencies required to use this utility.
``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: Hello world event handler API Gateway Globals: Api: TracingEnabled: true Cors: # see CORS section AllowOrigin: "'https://example.com'" AllowHeaders: "'Content-Type,Authorization,X-Amz-Date'" MaxAge: "'300'" BinaryMediaTypes: # see Binary responses section - "*~1*" # converts to */* for any binary type # NOTE: use this stricter version if you're also using CORS; */* doesn't work with CORS # see: https://github.com/aws-powertools/powertools-lambda-python/issues/3373#issuecomment-1821144779 # - "image~1*" # converts to image/* # - "*~1csv" # converts to */csv, eg text/csv, application/csv Function: Timeout: 5 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_LOG_LEVEL: INFO POWERTOOLS_LOGGER_SAMPLE_RATE: 0.1 POWERTOOLS_LOGGER_LOG_EVENT: true POWERTOOLS_SERVICE_NAME: example Resources: ApiFunction: Type: AWS::Serverless::Function Properties: Handler: getting_started_rest_api_resolver.lambda_handler CodeUri: ../src Description: API handler function Events: AnyApiEvent: Type: Api Properties: # NOTE: this is a catch-all rule to simplify the documentation. # explicit routes and methods are recommended for prod instead (see below) Path: /{proxy+} # Send requests on any path to the lambda function Method: ANY # Send requests using any http method to the lambda function # GetAllTodos: # Type: Api # Properties: # Path: /todos # Method: GET # GetTodoById: # Type: Api # Properties: # Path: /todos/{todo_id} # Method: GET # CreateTodo: # Type: Api # Properties: # Path: /todos # Method: POST ## Swagger UI specific routes # SwaggerUI: # Type: Api # Properties: # Path: /swagger # Method: GET ``` ``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: Hello world event handler Lambda Function URL Globals: Function: Timeout: 5 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_LOG_LEVEL: INFO POWERTOOLS_LOGGER_SAMPLE_RATE: 0.1 POWERTOOLS_LOGGER_LOG_EVENT: true POWERTOOLS_SERVICE_NAME: example FunctionUrlConfig: Cors: # see CORS section # Notice that values here are Lists of Strings, vs comma-separated values on API Gateway AllowOrigins: ["https://example.com"] AllowHeaders: ["Content-Type", "Authorization", "X-Amz-Date"] MaxAge: 300 Resources: ApiFunction: Type: AWS::Serverless::Function Properties: Handler: getting_started_lambda_function_url_resolver.lambda_handler CodeUri: ../src Description: API handler function FunctionUrlConfig: AuthType: NONE # AWS_IAM for added security beyond sample documentation ``` ### Event Resolvers Before you decorate your functions to handle a given path and HTTP method(s), you need to initialize a resolver. A resolver will handle request resolution, including [one or more routers](#split-routes-with-router), and give you access to the current event via typed properties. By default, we will use `APIGatewayRestResolver` throughout the documentation. 
You can use any of the following: | Resolver | AWS service | | --- | --- | | **[`APIGatewayRestResolver`](#api-gateway-rest-api)** | Amazon API Gateway REST API | | **[`APIGatewayHttpResolver`](#api-gateway-http-api)** | Amazon API Gateway HTTP API | | **[`ALBResolver`](#application-load-balancer)** | Amazon Application Load Balancer (ALB) | | **[`LambdaFunctionUrlResolver`](#lambda-function-url)** | AWS Lambda Function URL | | **[`VPCLatticeResolver`](#vpc-lattice)** | Amazon VPC Lattice | #### Response auto-serialization > Want full control of the response, headers and status code? [Read about `Response` object here](#fine-grained-responses). For your convenience, we automatically perform these if you return a dictionary response: 1. Auto-serialize `dictionary` responses to JSON and trim them 1. Include the response under each resolver's equivalent of a `body` 1. Set `Content-Type` to `application/json` 1. Set `status_code` to 200 (OK) ``` from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.utilities.typing.lambda_context import LambdaContext app = APIGatewayRestResolver() @app.get("/ping") def ping(): return {"message": "pong"} # (1)! def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. This dictionary will be serialized, trimmed, and included under the `body` key ``` { "statusCode": 200, "multiValueHeaders": { "Content-Type": [ "application/json" ] }, "body": "{'message':'pong'}", "isBase64Encoded": false } ``` Coming from Flask? We also support tuple responses. You can optionally set a different HTTP status code as the second argument of the tuple. ``` import requests from requests import Response from aws_lambda_powertools.event_handler import ALBResolver from aws_lambda_powertools.utilities.typing import LambdaContext app = ALBResolver() @app.post("/todo") def create_todo(): data: dict = app.current_event.json_body todo: Response = requests.post("https://jsonplaceholder.typicode.com/todos", data=data) # Returns the created todo object, with a HTTP 201 Created status return {"todo": todo.json()}, 201 def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` #### API Gateway REST API When using Amazon API Gateway REST API to front your Lambda functions, you can use `APIGatewayRestResolver`. Here's an example of how we can handle the `/todos` path. Trailing slash in routes For `APIGatewayRestResolver`, we seamlessly handle routes with a trailing slash (`/todos/`). ``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` This utility uses `path` and `httpMethod` to route to the right function.
This helps make unit tests and local invocation easier too. ``` { "body": "", "resource": "/todos", "path": "/todos", "httpMethod": "GET", "isBase64Encoded": false, "queryStringParameters": {}, "multiValueQueryStringParameters": {}, "pathParameters": {}, "stageVariables": {}, "headers": { "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8", "Accept-Encoding": "gzip, deflate, sdch", "Accept-Language": "en-US,en;q=0.8", "Cache-Control": "max-age=0", "CloudFront-Forwarded-Proto": "https", "CloudFront-Is-Desktop-Viewer": "true", "CloudFront-Is-Mobile-Viewer": "false", "CloudFront-Is-SmartTV-Viewer": "false", "CloudFront-Is-Tablet-Viewer": "false", "CloudFront-Viewer-Country": "US", "Host": "1234567890.execute-api.us-east-1.amazonaws.com", "Upgrade-Insecure-Requests": "1", "User-Agent": "Custom User Agent String", "Via": "1.1 08f323deadbeefa7af34d5feb414ce27.cloudfront.net (CloudFront)", "X-Amz-Cf-Id": "cDehVQoZnx43VYQb9j2-nvCh-9z396Uhbp027Y2JvkCPNLmGJHqlaA==", "X-Forwarded-For": "127.0.0.1, 127.0.0.2", "X-Forwarded-Port": "443", "X-Forwarded-Proto": "https" }, "multiValueHeaders": {}, "requestContext": { "accountId": "123456789012", "resourceId": "123456", "stage": "Prod", "requestId": "c6af9ac6-7b61-11e6-9a41-93e8deadbeef", "requestTime": "25/Jul/2020:12:34:56 +0000", "requestTimeEpoch": 1428582896000, "identity": { "cognitoIdentityPoolId": null, "accountId": null, "cognitoIdentityId": null, "caller": null, "accessKey": null, "sourceIp": "127.0.0.1", "cognitoAuthenticationType": null, "cognitoAuthenticationProvider": null, "userArn": null, "userAgent": "Custom User Agent String", "user": null }, "path": "/Prod/todos", "resourcePath": "/todos", "httpMethod": "GET", "apiId": "1234567890", "protocol": "HTTP/1.1" } } ``` ``` { "statusCode": 200, "multiValueHeaders": { "Content-Type": ["application/json"] }, "body": "{\"todos\":[{\"userId\":1,\"id\":1,\"title\":\"delectus aut autem\",\"completed\":false},{\"userId\":1,\"id\":2,\"title\":\"quis ut nam facilis et officia qui\",\"completed\":false},{\"userId\":1,\"id\":3,\"title\":\"fugiat veniam minus\",\"completed\":false},{\"userId\":1,\"id\":4,\"title\":\"et porro tempora\",\"completed\":true},{\"userId\":1,\"id\":5,\"title\":\"laboriosam mollitia et enim quasi adipisci quia provident illum\",\"completed\":false},{\"userId\":1,\"id\":6,\"title\":\"qui ullam ratione quibusdam voluptatem quia omnis\",\"completed\":false},{\"userId\":1,\"id\":7,\"title\":\"illo expedita consequatur quia in\",\"completed\":false},{\"userId\":1,\"id\":8,\"title\":\"quo adipisci enim quam ut ab\",\"completed\":true},{\"userId\":1,\"id\":9,\"title\":\"molestiae perspiciatis ipsa\",\"completed\":false},{\"userId\":1,\"id\":10,\"title\":\"illo est ratione doloremque quia maiores aut\",\"completed\":true}]}", "isBase64Encoded": false } ``` #### API Gateway HTTP API When using Amazon API Gateway HTTP API to front your Lambda functions, you can use `APIGatewayHttpResolver`. Note Using HTTP API v1 payload? Use `APIGatewayRestResolver` instead. `APIGatewayHttpResolver` defaults to v2 payload. If you're using Terraform to deploy a HTTP API, note that it defaults the [payload_format_version](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/apigatewayv2_integration#payload_format_version) value to 1.0 if not specified. 
``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayHttpResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayHttpResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` #### Application Load Balancer When using Amazon Application Load Balancer (ALB) to front your Lambda functions, you can use `ALBResolver`. ``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ALBResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = ALBResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPLICATION_LOAD_BALANCER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` #### Lambda Function URL When using [AWS Lambda Function URL](https://docs.aws.amazon.com/lambda/latest/dg/urls-configuration.html), you can use `LambdaFunctionUrlResolver`. 
``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import LambdaFunctionUrlResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = LambdaFunctionUrlResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.LAMBDA_FUNCTION_URL) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "version": "2.0", "routeKey": "$default", "rawPath": "/todos", "rawQueryString": "", "headers": { "x-amz-content-sha256": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855", "x-amzn-tls-version": "TLSv1.2", "x-amz-date": "20220803T092917Z", "x-forwarded-proto": "https", "x-forwarded-port": "443", "x-forwarded-for": "123.123.123.123", "accept": "application/xml", "x-amzn-tls-cipher-suite": "ECDHE-RSA-AES128-GCM-SHA256", "x-amzn-trace-id": "Root=1-63ea3fee-51ba94542feafa3928745ba3", "host": "xxxxxxxxxxxxx.lambda-url.eu-central-1.on.aws", "content-type": "application/json", "accept-encoding": "gzip, deflate", "user-agent": "Custom User Agent" }, "requestContext": { "accountId": "123457890", "apiId": "xxxxxxxxxxxxxxxxxxxx", "authorizer": { "iam": { "accessKey": "AAAAAAAAAAAAAAAAAA", "accountId": "123457890", "callerId": "AAAAAAAAAAAAAAAAAA", "cognitoIdentity": null, "principalOrgId": "o-xxxxxxxxxxxx", "userArn": "arn:aws:iam::AAAAAAAAAAAAAAAAAA:user/user", "userId": "AAAAAAAAAAAAAAAAAA" } }, "domainName": "xxxxxxxxxxxxx.lambda-url.eu-central-1.on.aws", "domainPrefix": "xxxxxxxxxxxxx", "http": { "method": "GET", "path": "/todos", "protocol": "HTTP/1.1", "sourceIp": "123.123.123.123", "userAgent": "Custom User Agent" }, "requestId": "24f9ef37-8eb7-45fe-9dbc-a504169fd2f8", "routeKey": "$default", "stage": "$default", "time": "03/Aug/2022:09:29:18 +0000", "timeEpoch": 1659518958068 }, "isBase64Encoded": false } ``` #### VPC Lattice When using [VPC Lattice with AWS Lambda](https://docs.aws.amazon.com/lambda/latest/dg/services-vpc-lattice.html), you can use `VPCLatticeV2Resolver`. 
``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import VPCLatticeV2Resolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = VPCLatticeV2Resolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPLICATION_LOAD_BALANCER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "version": "2.0", "path": "/todos", "method": "GET", "headers": { "user_agent": "curl/7.64.1", "x-forwarded-for": "10.213.229.10", "host": "test-lambda-service-3908sdf9u3u.dkfjd93.vpc-lattice-svcs.us-east-2.on.aws", "accept": "*/*" }, "queryStringParameters": { "order-id": "1" }, "body": "{\"message\": \"Hello from Lambda!\"}", "requestContext": { "serviceNetworkArn": "arn:aws:vpc-lattice:us-east-2:123456789012:servicenetwork/sn-0bf3f2882e9cc805a", "serviceArn": "arn:aws:vpc-lattice:us-east-2:123456789012:service/svc-0a40eebed65f8d69c", "targetGroupArn": "arn:aws:vpc-lattice:us-east-2:123456789012:targetgroup/tg-6d0ecf831eec9f09", "identity": { "sourceVpcArn": "arn:aws:ec2:region:123456789012:vpc/vpc-0b8276c84697e7339", "type" : "AWS_IAM", "principal": "arn:aws:sts::123456789012:assumed-role/example-role/057d00f8b51257ba3c853a0f248943cf", "sessionName": "057d00f8b51257ba3c853a0f248943cf", "x509SanDns": "example.com" }, "region": "us-east-2", "timeEpoch": "1696331543569073" } } ``` ``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import VPCLatticeResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = VPCLatticeResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPLICATION_LOAD_BALANCER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "raw_path": "/testpath", "method": "GET", "headers": { "user_agent": "curl/7.64.1", "x-forwarded-for": "10.213.229.10", "host": "test-lambda-service-3908sdf9u3u.dkfjd93.vpc-lattice-svcs.us-east-2.on.aws", "accept": "*/*" }, "query_string_parameters": { "order-id": "1" }, "body": "eyJ0ZXN0IjogImV2ZW50In0=", "is_base64_encoded": true } ``` ### Dynamic routes You can use `/todos/<todo_id>` to configure dynamic URL paths, where `<todo_id>` will be resolved at runtime. Each dynamic route you set must be part of your function signature. This allows us to call your function using keyword arguments when matching your dynamic route. Note For brevity, we will only include the necessary keys for each sample request for the example to work.
``` from urllib.parse import quote import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get("/todos/<todo_id>") @tracer.capture_method def get_todo_by_id(todo_id: str): # value comes as str todo_id = quote(todo_id, safe="") todos: Response = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todos.raise_for_status() return {"todos": todos.json()} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "resource": "/todos/{id}", "path": "/todos/1", "httpMethod": "GET" } ``` Tip You can also nest dynamic paths, for example `/todos/<todo_id>/<todo_status>`. #### Catch-all routes Note We recommend having explicit routes whenever possible; use catch-all routes sparingly. You can use a [regex](https://docs.python.org/3/library/re.html#regular-expression-syntax) string to handle an arbitrary number of paths within a request, for example `.+`. You can also combine nested paths with greedy regex to catch in-between routes. Warning We choose the most explicit registered route that matches an incoming event. ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get(".+") @tracer.capture_method def catch_any_route_get_method(): return {"path_received": app.current_event.path} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "resource": "/{proxy+}", "path": "/any/route/should/work", "httpMethod": "GET" } ``` ### HTTP Methods You can use named decorators to specify the HTTP method that should be handled in your functions. That is, `app.<http_method>`, where the HTTP method could be `get`, `post`, `put`, `patch` and `delete`.
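As a quick illustration before the fuller examples below, here is a minimal sketch (the `/report` route and its handlers are hypothetical, not one of the official examples) showing two named method decorators registered on the same resolver:

```python
from aws_lambda_powertools.event_handler import APIGatewayRestResolver
from aws_lambda_powertools.utilities.typing import LambdaContext

app = APIGatewayRestResolver()


@app.get("/report")  # handles GET requests to /report
def get_report():
    return {"status": "ready"}


@app.delete("/report")  # handles DELETE requests to the same path
def delete_report():
    return {"status": "deleted"}


def lambda_handler(event: dict, context: LambdaContext) -> dict:
    return app.resolve(event, context)
```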
``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.post("/todos") @tracer.capture_method def create_todo(): todo_data: dict = app.current_event.json_body # deserialize json str to dict todo: Response = requests.post("https://jsonplaceholder.typicode.com/todos", data=todo_data) todo.raise_for_status() return {"todo": todo.json()} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "resource": "/todos", "path": "/todos", "httpMethod": "POST", "body": "{\"title\": \"foo\", \"userId\": 1, \"completed\": false}" } ``` If you need to accept multiple HTTP methods in a single function, or support an HTTP method for which no decorator exists (e.g. [TRACE](https://developer.mozilla.org/en-US/docs/Web/HTTP/Methods/TRACE)), you can use the `route` method and pass a list of HTTP methods. ``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() # PUT and POST HTTP requests to the path /todos will route to this function @app.route("/todos", method=["PUT", "POST"]) @tracer.capture_method def create_todo(): todo_data: dict = app.current_event.json_body # deserialize json str to dict todo: Response = requests.post("https://jsonplaceholder.typicode.com/todos", data=todo_data) todo.raise_for_status() return {"todo": todo.json()} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` Note It is generally better to have separate functions for each HTTP method, as the functionality tends to differ depending on which method is used. ### Data validation This changes the authoring experience by relying on Python's type annotations. It's inspired by the [FastAPI framework](https://fastapi.tiangolo.com/) for ergonomics and to ease migrations in either direction. We support both Pydantic models and Python's dataclasses. For brevity, we'll focus on Pydantic only. All resolvers can optionally coerce and validate incoming requests by setting `enable_validation=True`. With this feature, we can now express how we expect our incoming data and response to look. This moves data validation responsibilities to Event Handler resolvers, reducing a ton of boilerplate code. Let's rewrite the previous examples to signal to our resolver what shape we expect our data to be in.
``` from typing import Optional import requests from pydantic import BaseModel, Field from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) # (1)! class Todo(BaseModel): # (2)! userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.get("/todos/<todo_id>") # (3)! @tracer.capture_method def get_todo_by_id(todo_id: int) -> Todo: # (4)! todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return todo.json() # (5)! @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. This enforces data validation at runtime. Any validation error will return `HTTP 422: Unprocessable Entity error`. 1. We create a Pydantic model to define what our data looks like. 1. Defining a route remains exactly as before. 1. By default, URL Paths will be `str`. Here, we are telling our resolver it should be `int`, so it converts it for us. Lastly, we're also saying the return should be our `Todo`. This will help us later when we touch OpenAPI auto-documentation. 1. `todo.json()` returns a dictionary. However, Event Handler knows the response should be `Todo` so it converts and validates accordingly. ``` { "version": "1.0", "resource": "/todos/1", "path": "/todos/1", "httpMethod": "GET", "headers": { "Origin": "https://aws.amazon.com" }, "multiValueHeaders": {}, "queryStringParameters": {}, "multiValueQueryStringParameters": {}, "requestContext": { "accountId": "123456789012", "apiId": "id", "authorizer": { "claims": null, "scopes": null }, "domainName": "id.execute-api.us-east-1.amazonaws.com", "domainPrefix": "id", "extendedRequestId": "request-id", "httpMethod": "GET", "path": "/todos/1", "protocol": "HTTP/1.1", "requestId": "id=", "requestTime": "04/Mar/2020:19:15:17 +0000", "requestTimeEpoch": 1583349317135, "resourceId": null, "resourcePath": "/todos/1", "stage": "$default" }, "pathParameters": null, "stageVariables": null, "body": "", "isBase64Encoded": false } ``` ``` { "statusCode": 200, "body": "Hello world", "isBase64Encoded": false, "multiValueHeaders": { "Content-Type": [ "application/json" ] } } ``` #### Handling validation errors By default, we hide extended error details for security reasons *(e.g., pydantic url, Pydantic code)*. Any incoming request or outgoing response that fails validation will lead to an `HTTP 422: Unprocessable Entity error` response that will look similar to this: ``` { "statusCode": 422, "body": "{\"statusCode\": 422, \"detail\": [{\"type\": \"int_parsing\", \"loc\": [\"path\", \"todo_id\"]}]}", "isBase64Encoded": false, "headers": { "Content-Type": "application/json" }, "cookies": [] } ``` You can customize the error message by catching the `RequestValidationError` exception. This is useful when you might have a security policy to return opaque validation errors, or have a company standard for API validation errors. Here's an example where we catch validation errors, log all details for further investigation, and return the same `HTTP 422` with an opaque error.
``` from typing import Optional import requests from pydantic import BaseModel, Field from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response, content_types from aws_lambda_powertools.event_handler.openapi.exceptions import RequestValidationError from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) class Todo(BaseModel): userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.exception_handler(RequestValidationError) # (1)! def handle_validation_error(ex: RequestValidationError): logger.error("Request failed validation", path=app.current_event.path, errors=ex.errors()) return Response( status_code=422, content_type=content_types.APPLICATION_JSON, body="Invalid data", ) @app.post("/todos") def create_todo(todo: Todo) -> int: response = requests.post("https://jsonplaceholder.typicode.com/todos", json=todo.dict(by_alias=True)) response.raise_for_status() return response.json()["id"] @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. We use [exception handler](#exception-handling) decorator to catch **any** request validation errors. Then, we log the detailed reason as to why it failed while returning a custom `Response` object to hide that from them. ``` { "statusCode": 422, "body": "Invalid data", "isBase64Encoded": false, "headers": { "Content-Type": "application/json" }, "cookies": [] } ``` #### Validating payloads We will automatically validate, inject, and convert incoming request payloads based on models via type annotation. Let's improve our previous example by handling the creation of todo items via `HTTP POST`. What we want is for Event Handler to convert the incoming payload as an instance of our `Todo` model. We handle the creation of that `todo`, and then return the `ID` of the newly created `todo`. Even better, we can also let Event Handler validate and convert our response according to type annotations, further reducing boilerplate. ``` from typing import List, Optional import requests from pydantic import BaseModel, Field from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) # (1)! class Todo(BaseModel): # (2)! userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.post("/todos") def create_todo(todo: Todo) -> str: # (3)! response = requests.post("https://jsonplaceholder.typicode.com/todos", json=todo.dict(by_alias=True)) response.raise_for_status() return response.json()["id"] # (4)! @app.get("/todos") @tracer.capture_method def get_todos() -> List[Todo]: todo = requests.get("https://jsonplaceholder.typicode.com/todos") todo.raise_for_status() return todo.json() # (5)! @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. 
This enforces data validation at runtime. Any validation error will return `HTTP 422: Unprocessable Entity error`. 1. We create a Pydantic model to define how our data looks like. 1. We define `Todo` as our type annotation. Event Handler then uses this model to validate and inject the incoming request as `Todo`. 1. Lastly, we return the ID of our newly created `todo` item. Because we specify the return type (`str`), Event Handler will take care of serializing this as a JSON string. 1. Note that the return type is `List[Todo]`. Event Handler will take the return (`todo.json`), and validate each list item against `Todo` model before returning the response accordingly. ``` { "version": "1.0", "body": "{\"title\": \"foo\", \"userId\": \"1\", \"completed\": false}", "resource": "/todos", "path": "/todos", "httpMethod": "POST", "headers": { "Origin": "https://aws.amazon.com" }, "multiValueHeaders": {}, "queryStringParameters": {}, "multiValueQueryStringParameters": {}, "requestContext": { "accountId": "123456789012", "apiId": "id", "authorizer": { "claims": null, "scopes": null }, "domainName": "id.execute-api.us-east-1.amazonaws.com", "domainPrefix": "id", "extendedRequestId": "request-id", "httpMethod": "POST", "path": "/todos", "protocol": "HTTP/1.1", "requestId": "id=", "requestTime": "04/Mar/2020:19:15:17 +0000", "requestTimeEpoch": 1583349317135, "resourceId": null, "resourcePath": "/todos", "stage": "$default" }, "pathParameters": null, "stageVariables": null, "isBase64Encoded": false } ``` ``` { "statusCode": 200, "body": "2008821", "isBase64Encoded": false, "multiValueHeaders": { "Content-Type": [ "application/json" ] } } ``` ##### Validating payload subset With the addition of the [`Annotated` type starting in Python 3.9](https://docs.python.org/3/library/typing.html#typing.Annotated), types can contain additional metadata, allowing us to represent anything we want. We use the `Annotated` and OpenAPI `Body` type to instruct Event Handler that our payload is located in a particular JSON key. Event Handler will match the parameter name with the JSON key to validate and inject what you want. ``` from typing import Optional import requests from pydantic import BaseModel, Field from typing_extensions import Annotated from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.openapi.params import Body # (1)! from aws_lambda_powertools.utilities.typing import LambdaContext app = APIGatewayRestResolver(enable_validation=True) class Todo(BaseModel): userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.post("/todos") def create_todo(todo: Annotated[Todo, Body(embed=True)]) -> int: # (2)! response = requests.post("https://jsonplaceholder.typicode.com/todos", json=todo.dict(by_alias=True)) response.raise_for_status() return response.json()["id"] def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. `Body` is a special OpenAPI type that can add additional constraints to a request payload. 1. `Body(embed=True)` instructs Event Handler to look up inside the payload for a key. This means Event Handler will look up for a key named `todo`, validate the value against `Todo`, and inject it. 
``` { "version": "1.0", "body": "{ \"todo\": {\"title\": \"foo\", \"userId\": \"1\", \"completed\": false } }", "resource": "/todos", "path": "/todos", "httpMethod": "POST", "headers": { "Origin": "https://aws.amazon.com" }, "multiValueHeaders": {}, "queryStringParameters": {}, "multiValueQueryStringParameters": {}, "requestContext": { "accountId": "123456789012", "apiId": "id", "authorizer": { "claims": null, "scopes": null }, "domainName": "id.execute-api.us-east-1.amazonaws.com", "domainPrefix": "id", "extendedRequestId": "request-id", "httpMethod": "POST", "path": "/todos", "protocol": "HTTP/1.1", "requestId": "id=", "requestTime": "04/Mar/2020:19:15:17 +0000", "requestTimeEpoch": 1583349317135, "resourceId": null, "resourcePath": "/todos", "stage": "$default" }, "pathParameters": null, "stageVariables": null, "isBase64Encoded": false } ``` ``` { "statusCode": 200, "body": "2008822", "isBase64Encoded": false, "multiValueHeaders": { "Content-Type": [ "application/json" ] } } ``` #### Validating responses You can use `response_validation_error_http_code` to set a custom HTTP code for failed response validation. When this field is set, we will raise a `ResponseValidationError` instead of a `RequestValidationError`. For more granular control over the failed response validation HTTP code, the `custom_response_validation_http_code` argument can be set per route. This value will override the value of the failed response validation HTTP code set at the constructor level (if any). ``` from http import HTTPStatus from typing import Optional import requests from pydantic import BaseModel, Field from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver( enable_validation=True, response_validation_error_http_code=HTTPStatus.INTERNAL_SERVER_ERROR, # (1)! ) class Todo(BaseModel): userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.get("/todos_bad_response/<todo_id>") @tracer.capture_method def get_todo_by_id(todo_id: int) -> Todo: todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return todo.json()["title"] # (2)! @app.get( "/todos_bad_response_with_custom_http_code/<todo_id>", custom_response_validation_http_code=HTTPStatus.UNPROCESSABLE_ENTITY, # (3)! ) @tracer.capture_method def get_todo_by_id_custom(todo_id: int) -> Todo: todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return todo.json()["title"] @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. A response with the status code set here will be returned if response data is not valid. 1. Operation returns a string as opposed to a `Todo` object. This will lead to a `500` response as set in line 16. 1. Operation will return a `422 Unprocessable Entity` response if response is not a `Todo` object. This overrides the custom HTTP code set in line 16.
``` from http import HTTPStatus from typing import Optional import requests from pydantic import BaseModel, Field from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver( enable_validation=True, response_validation_error_http_code=HTTPStatus.INTERNAL_SERVER_ERROR, # (1)! ) class Todo(BaseModel): userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.get("/todos_bad_response/<todo_id>") @tracer.capture_method def get_todo_by_id(todo_id: int) -> Todo: todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return todo.json()["title"] # (2)! @app.get( "/todos_bad_response_with_custom_http_code/<todo_id>", custom_response_validation_http_code=HTTPStatus.UNPROCESSABLE_ENTITY, # (3)! ) @tracer.capture_method def get_todo_by_id_custom(todo_id: int) -> Todo: todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return todo.json()["title"] @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. A response with the status code set here will be returned if response data is not valid. 1. Operation returns a string as opposed to a `Todo` object. This will lead to a `500` response as set in line 18. ``` from http import HTTPStatus from typing import Optional import requests from pydantic import BaseModel, Field from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver, content_types from aws_lambda_powertools.event_handler.api_gateway import Response from aws_lambda_powertools.event_handler.openapi.exceptions import ResponseValidationError from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver( enable_validation=True, response_validation_error_http_code=HTTPStatus.INTERNAL_SERVER_ERROR, ) class Todo(BaseModel): userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.get("/todos_bad_response/<todo_id>") @tracer.capture_method def get_todo_by_id(todo_id: int) -> Todo: todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return todo.json()["title"] @app.exception_handler(ResponseValidationError) # (1)! def handle_response_validation_error(ex: ResponseValidationError): logger.error("Request failed validation", path=app.current_event.path, errors=ex.errors()) return Response( status_code=500, content_type=content_types.APPLICATION_JSON, body="Unexpected response.", ) @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. The distinct `ResponseValidationError` exception can be caught to customise the response. #### Validating query strings We will automatically validate and inject incoming query strings via type annotation.
We use the `Annotated` type to tell the Event Handler that a particular parameter is not only an optional string, but also a query string with constraints. In the following example, we use a new `Query` OpenAPI type to add [one out of many possible constraints](#customizing-openapi-parameters), which should read as: - `completed` is a query string with a `None` as its default value - `completed`, when set, should have at minimum 4 characters - No match? Event Handler will return a validation error response ``` from typing import List, Optional import requests from pydantic import BaseModel, Field from typing_extensions import Annotated # (1)! from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.openapi.params import Query # (2)! from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) class Todo(BaseModel): userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.get("/todos") @tracer.capture_method def get_todos(completed: Annotated[Optional[str], Query(min_length=4)] = None) -> List[Todo]: # (3)! url = "https://jsonplaceholder.typicode.com/todos" if completed is not None: url = f"{url}/?completed={completed}" todo = requests.get(url) todo.raise_for_status() return todo.json() @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. If you're not using Python 3.9 or higher, you can install and use [`typing_extensions`](https://pypi.org/project/typing-extensions/) to the same effect 1. `Query` is a special OpenAPI type that can add constraints to a query string as well as document them 1. **First time seeing `Annotated`?** This special type uses the first argument as the actual type, and subsequent arguments as metadata. At runtime, static checkers will also see the first argument, but any receiver can inspect it to get the metadata. If you don't want to validate query strings but simply let Event Handler inject them as parameters, you can omit `Query` type annotation. This is merely for your convenience. ``` from typing import List, Optional import requests from pydantic import BaseModel, Field from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) class Todo(BaseModel): userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.get("/todos") @tracer.capture_method def get_todos(completed: Optional[str] = None) -> List[Todo]: # (1)! url = "https://jsonplaceholder.typicode.com/todos" if completed is not None: url = f"{url}/?completed={completed}" todo = requests.get(url) todo.raise_for_status() return todo.json() @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. 
`completed` is still the same query string as before, except we simply state it's a string. No `Query` or `Annotated` to validate it. If you need to handle multi-value query parameters, you can create a list of the desired type. ``` from enum import Enum from typing import List from typing_extensions import Annotated from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.openapi.params import Query from aws_lambda_powertools.utilities.typing import LambdaContext app = APIGatewayRestResolver(enable_validation=True) class ExampleEnum(Enum): """Example of an Enum class.""" ONE = "value_one" TWO = "value_two" THREE = "value_three" @app.get("/todos") def get( example_multi_value_param: Annotated[ List[ExampleEnum], # (1)! Query( description="This is a multi-value query parameter.", ), ], ): """Return validated multi-value param values.""" return example_multi_value_param def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. `example_multi_value_param` is a list containing values from the `ExampleEnum` enumeration. #### Validating path parameters Just like we learned in [query string validation](#validating-query-strings), we can use a new `Path` OpenAPI type to [add constraints](#customizing-openapi-parameters). For example, we could validate that the `todo_id` dynamic path parameter should be no greater than three digits. ``` from typing import Optional import requests from pydantic import BaseModel, Field from typing_extensions import Annotated from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.openapi.params import Path from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) class Todo(BaseModel): userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.get("/todos/<todo_id>") @tracer.capture_method def get_todo_by_id(todo_id: Annotated[int, Path(lt=999)]) -> Todo: # (1)! todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return todo.json() @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. `Path` is a special OpenAPI type that allows us to constrain `todo_id` to be less than 999. #### Validating headers We use the `Annotated` type to tell the Event Handler that a particular parameter is a header that needs to be validated. We adhere to [HTTP RFC standards](https://www.rfc-editor.org/rfc/rfc7540#section-8.1.2), which means we treat HTTP headers as case-insensitive. In the following example, we use a new `Header` OpenAPI type to add [one out of many possible constraints](#customizing-openapi-parameters), which should read as: - `correlation_id` is a header that must be present in the request - `correlation_id` should have 16 characters - No match? Event Handler will return a validation error response ``` from typing import List, Optional import requests from pydantic import BaseModel, Field from typing_extensions import Annotated # (1)! 
from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.openapi.params import Header # (2)! from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) class Todo(BaseModel): userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.get("/todos") @tracer.capture_method def get_todos(correlation_id: Annotated[str, Header(min_length=16, max_length=16)]) -> List[Todo]: # (3)! url = "https://jsonplaceholder.typicode.com/todos" todo = requests.get(url, headers={"correlation_id": correlation_id}) todo.raise_for_status() return todo.json() @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. If you're not using Python 3.9 or higher, you can install and use [`typing_extensions`](https://pypi.org/project/typing-extensions/) to the same effect 1. `Header` is a special OpenAPI type that can add constraints and documentation to a header 1. **First time seeing `Annotated`?** This special type uses the first argument as the actual type, and subsequent arguments as metadata. At runtime, static checkers will also see the first argument, but any receiver can inspect it to get the metadata. You can handle multi-value headers by declaring it as a list of the desired type. ``` from enum import Enum from typing import List from typing_extensions import Annotated from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.openapi.params import Header from aws_lambda_powertools.utilities.typing import LambdaContext app = APIGatewayRestResolver(enable_validation=True) class CountriesAllowed(Enum): """Example of an Enum class.""" US = "US" PT = "PT" BR = "BR" @app.get("/hello") def get( cloudfront_viewer_country: Annotated[ List[CountriesAllowed], # (1)! Header( description="This is multi value header parameter.", ), ], ): """Return validated multi-value header values.""" return cloudfront_viewer_country def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. `cloudfront_viewer_country` is a list that must contain values from the `CountriesAllowed` enumeration. #### Supported types for response serialization With data validation enabled, we natively support serializing the following data types to JSON: | Data type | Serialized type | | --- | --- | | **Pydantic models** | `dict` | | **Python Dataclasses** | `dict` | | **Enum** | Enum values | | **Datetime** | Datetime ISO format string | | **Decimal** | `int` if no exponent, or `float` | | **Path** | `str` | | **UUID** | `str` | | **Set** | `list` | | **Python primitives** *(dict, string, sequences, numbers, booleans)* | [Python's default JSON serializable types](https://docs.python.org/3/library/json.html#encoders-and-decoders) | See [custom serializer section](#custom-serializer) for bringing your own. Otherwise, we will raise `SerializationError` for any unsupported types *e.g., SQLAlchemy models*. 
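As an illustration, here's a minimal sketch of a handler that returns several of the types above at once; the `/todos/summary` route, the `TodoSummary` model, and the `TodoStatus` enum are made up for this example:

```
from datetime import datetime, timezone
from decimal import Decimal
from enum import Enum
from uuid import UUID, uuid4

from pydantic import BaseModel

from aws_lambda_powertools.event_handler import APIGatewayRestResolver
from aws_lambda_powertools.utilities.typing import LambdaContext

app = APIGatewayRestResolver(enable_validation=True)


class TodoStatus(Enum):
    """Hypothetical status values for this sketch."""

    OPEN = "open"
    DONE = "done"


class TodoSummary(BaseModel):
    """Hypothetical response model mixing several supported types."""

    summary_id: UUID      # serialized as str
    status: TodoStatus    # serialized as the Enum value
    created_at: datetime  # serialized as an ISO format string
    score: Decimal        # serialized as int or float


@app.get("/todos/summary")
def get_summary() -> TodoSummary:
    # The Pydantic model becomes a dict, and its fields are converted
    # following the table above before the JSON response is built.
    return TodoSummary(
        summary_id=uuid4(),
        status=TodoStatus.OPEN,
        created_at=datetime.now(timezone.utc),
        score=Decimal("9.5"),
    )


def lambda_handler(event: dict, context: LambdaContext) -> dict:
    return app.resolve(event, context)
```

Returning a type outside the table, such as a SQLAlchemy model, would raise `SerializationError` instead.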
### Accessing request details Event Handler integrates with [Event Source Data Classes utilities](../../../utilities/data_classes/), and it exposes their respective resolver request details and convenient methods under `app.current_event`. That is why you see `app.resolve(event, context)` in every example. This allows Event Handler to resolve requests, and expose data like `app.lambda_context` and `app.current_event`. #### Query strings and payload Within `app.current_event` property, you can access all available query strings as a dictionary via `query_string_parameters`. You can access the raw payload via `body` property, or if it's a JSON string you can quickly deserialize it via `json_body` property - like the earlier example in the [HTTP Methods](#http-methods) section. ``` from typing import List, Optional import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todo_id: str = app.current_event.query_string_parameters["id"] # alternatively _: Optional[str] = app.current_event.query_string_parameters.get("id") # or multi-value query string parameters; ?category="red"&?category="blue" _: List[str] = app.current_event.multi_value_query_string_parameters["category"] # Payload _: Optional[str] = app.current_event.body # raw str | None endpoint = "https://jsonplaceholder.typicode.com/todos" if todo_id: endpoint = f"{endpoint}/{todo_id}" todos: Response = requests.get(endpoint) todos.raise_for_status() return {"todos": todos.json()} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` #### Headers Similarly to [Query strings](#query-strings-and-payload), you can access headers as dictionary via `app.current_event.headers`. Specifically for headers, it's a case-insensitive dictionary, so all lookups are case-insensitive. ``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get("/todos") @tracer.capture_method def get_todos(): endpoint = "https://jsonplaceholder.typicode.com/todos" api_key = app.current_event.headers.get("X-Api-Key") todos: Response = requests.get(endpoint, headers={"X-Api-Key": api_key}) todos.raise_for_status() return {"todos": todos.json()} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ### Handling not found routes By default, we return `404` for any unmatched route. You can use **`not_found`** decorator to override this behavior, and return a custom **`Response`**. 
``` import requests from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ( APIGatewayRestResolver, Response, content_types, ) from aws_lambda_powertools.event_handler.exceptions import NotFoundError from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.not_found @tracer.capture_method def handle_not_found_errors(exc: NotFoundError) -> Response: logger.info(f"Not found route: {app.current_event.path}") return Response(status_code=418, content_type=content_types.TEXT_PLAIN, body="I'm a teapot!") @app.get("/todos") @tracer.capture_method def get_todos(): todos: requests.Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ### Exception handling You can use the **`exception_handler`** decorator with any Python exception. This allows you to handle a common exception outside your route, for example validation errors. ``` import requests from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ( APIGatewayRestResolver, Response, content_types, ) from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.exception_handler(ValueError) def handle_invalid_limit_qs(ex: ValueError): # receives exception raised metadata = {"path": app.current_event.path, "query_strings": app.current_event.query_string_parameters} logger.error(f"Malformed request: {ex}", extra=metadata) return Response( status_code=400, content_type=content_types.TEXT_PLAIN, body="Invalid request parameters.", ) @app.get("/todos") @tracer.capture_method def get_todos(): # educational purpose only: we should receive a `ValueError` # if a query string value for `limit` cannot be coerced to int max_results = int(app.current_event.query_string_parameters.get("limit", 0)) todos: requests.Response = requests.get(f"https://jsonplaceholder.typicode.com/todos?limit={max_results}") todos.raise_for_status() return {"todos": todos.json()} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` Info The `exception_handler` also supports passing a list of exception types you wish to handle with one handler. ### Raising HTTP errors You can easily raise any HTTP Error back to the client using the `ServiceError` exception. This ensures your Lambda function doesn't fail but instead returns the correct HTTP response signalling the error. Info If you need to send custom headers, use the [Response](#fine-grained-responses) class instead. We provide pre-defined errors for the most popular ones based on [AWS Lambda API Reference Common Errors](https://docs.aws.amazon.com/lambda/latest/api/CommonErrors.html). 
``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.exceptions import ( BadRequestError, ForbiddenError, InternalServerError, NotFoundError, RequestEntityTooLargeError, RequestTimeoutError, ServiceError, ServiceUnavailableError, UnauthorizedError, ) from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get(rule="/bad-request-error") def bad_request_error(): raise BadRequestError("Missing required parameter") # HTTP 400 @app.get(rule="/unauthorized-error") def unauthorized_error(): raise UnauthorizedError("Unauthorized") # HTTP 401 @app.get(rule="/forbidden-error") def forbidden_error(): raise ForbiddenError("Access denied") # HTTP 403 @app.get(rule="/not-found-error") def not_found_error(): raise NotFoundError # HTTP 404 @app.get(rule="/request-timeout-error") def request_timeout_error(): raise RequestTimeoutError("Request timed out") # HTTP 408 @app.get(rule="/internal-server-error") def internal_server_error(): raise InternalServerError("Internal server error") # HTTP 500 @app.get(rule="/request-entity-too-large-error") def request_entity_too_large_error(): raise RequestEntityTooLargeError("Request payload too large") # HTTP 413 @app.get(rule="/service-error", cors=True) def service_error(): raise ServiceError(502, "Something went wrong!") @app.get(rule="/service-unavailable-error") def service_unavailable_error(): raise ServiceUnavailableError("Service is temporarily unavailable") # HTTP 503 @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ### Enabling SwaggerUI This feature requires [data validation](#data-validation) feature to be enabled. Behind the scenes, the [data validation](#data-validation) feature auto-generates an OpenAPI specification from your routes and type annotations. You can use [Swagger UI](https://swagger.io/tools/swagger-ui/) to visualize and interact with your newly auto-documented API. There are some important **caveats** that you should know before enabling it: | Caveat | Description | | --- | --- | | Swagger UI is **publicly accessible by default** | When using `enable_swagger` method, you can [protect sensitive API endpoints by implementing a custom middleware](#customizing-swagger-ui) using your preferred authorization mechanism. | | **No micro-functions support** yet | Swagger UI is enabled on a per resolver instance which will limit its accuracy here. | | You need to expose a **new route** | You'll need to expose the following path to Lambda: `/swagger`; ignore if you're routing this path already. | | JS and CSS files are **embedded within Swagger HTML** | If you are not using an external CDN to serve Swagger UI assets, we embed JS and CSS directly into the HTML. To enhance performance, please consider enabling the `compress` option to minimize the size of HTTP requests. 
| | Authorization data is **lost** on browser close/refresh | Use `enable_swagger(persist_authorization=True)` to persist authorization data, like OAuth 2.0 access tokens. | ``` from typing import List import requests from pydantic import BaseModel, Field from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) app.enable_swagger(path="/swagger") # (1)! class Todo(BaseModel): userId: int id_: int = Field(alias="id") title: str completed: bool @app.post("/todos") def create_todo(todo: Todo) -> str: response = requests.post("https://jsonplaceholder.typicode.com/todos", json=todo.dict(by_alias=True)) response.raise_for_status() return response.json()["id"] @app.get("/todos") def get_todos() -> List[Todo]: todo = requests.get("https://jsonplaceholder.typicode.com/todos") todo.raise_for_status() return todo.json() def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. `enable_swagger` creates a route to serve Swagger UI and allows quick customizations. You can also include middlewares to protect or enhance the overall experience. Here's an example of what it looks like by default: ### Custom Domain API Mappings When using the [Custom Domain API Mappings feature](https://docs.aws.amazon.com/apigateway/latest/developerguide/rest-api-mappings.html), you must use the **`strip_prefixes`** param in the `APIGatewayRestResolver` constructor. **Scenario**: You have a custom domain `api.mydomain.dev`. Then you set `/payment` API Mapping to forward any payment requests to your Payments API. **Challenge**: This means your `path` value for any API requests will always contain `/payment/`, leading to HTTP 404 as Event Handler is trying to match what's after `payment/`. This gets further complicated with an [arbitrary level of nesting](https://github.com/aws-powertools/powertools-lambda/issues/34). To address this API Gateway behavior, we use the `strip_prefixes` parameter to account for these prefixes that are now injected into the path regardless of which type of API Gateway you're using. ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(strip_prefixes=["/payment"]) @app.get("/subscriptions/<subscription>") @tracer.capture_method def get_subscription(subscription): return {"subscription_id": subscription} @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "resource": "/subscriptions/{subscription}", "path": "/payment/subscriptions/123", "httpMethod": "GET" } ``` Note After removing a path prefix with `strip_prefixes`, the new root path will automatically be mapped to the path argument of `/`. For example, when using a `strip_prefixes` value of `/pay`, there is no difference between a request path of `/pay` and `/pay/`; and the path argument would be defined as `/`. For added flexibility, you can use regexes to strip a prefix. 
This is helpful when you have many options due to different combinations of prefixes (e.g: multiple environments, multiple versions). ``` import re from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.utilities.typing import LambdaContext # This will support: # /v1/dev/subscriptions/ # /v1/stg/subscriptions/ # /v1/qa/subscriptions/ # /v2/dev/subscriptions/ # ... app = APIGatewayRestResolver(strip_prefixes=[re.compile(r"/v[1-3]+/(dev|stg|qa)")]) @app.get("/subscriptions/") def get_subscription(subscription): return {"subscription_id": subscription} def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ## Advanced ### CORS You can configure CORS at the `APIGatewayRestResolver` constructor via `cors` parameter using the `CORSConfig` class. This will ensure that CORS headers are returned as part of the response when your functions match the path invoked and the `Origin` matches one of the allowed values. Tip Optionally disable CORS on a per path basis with `cors=False` parameter. ``` from urllib.parse import quote import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver, CORSConfig from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() # CORS will match when Origin is only https://www.example.com cors_config = CORSConfig(allow_origin="https://www.example.com", max_age=300) app = APIGatewayRestResolver(cors=cors_config) @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} @app.get("/todos/") @tracer.capture_method def get_todo_by_id(todo_id: str): # value come as str todo_id = quote(todo_id, safe="") todos: Response = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todos.raise_for_status() return {"todos": todos.json()} @app.get("/healthcheck", cors=False) # optionally removes CORS for a given route @tracer.capture_method def am_i_alive(): return {"am_i_alive": "yes"} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "statusCode": 200, "multiValueHeaders": { "Content-Type": ["application/json"], "Access-Control-Allow-Origin": ["https://www.example.com"], "Access-Control-Allow-Headers": ["Authorization,Content-Type,X-Amz-Date,X-Amz-Security-Token,X-Api-Key"] }, "body": "{\"todos\":[{\"userId\":1,\"id\":1,\"title\":\"delectus aut autem\",\"completed\":false},{\"userId\":1,\"id\":2,\"title\":\"quis ut nam facilis et officia qui\",\"completed\":false},{\"userId\":1,\"id\":3,\"title\":\"fugiat veniam minus\",\"completed\":false},{\"userId\":1,\"id\":4,\"title\":\"et porro tempora\",\"completed\":true},{\"userId\":1,\"id\":5,\"title\":\"laboriosam mollitia et enim quasi adipisci quia provident illum\",\"completed\":false},{\"userId\":1,\"id\":6,\"title\":\"qui ullam ratione quibusdam voluptatem quia omnis\",\"completed\":false},{\"userId\":1,\"id\":7,\"title\":\"illo expedita consequatur quia in\",\"completed\":false},{\"userId\":1,\"id\":8,\"title\":\"quo 
adipisci enim quam ut ab\",\"completed\":true},{\"userId\":1,\"id\":9,\"title\":\"molestiae perspiciatis ipsa\",\"completed\":false},{\"userId\":1,\"id\":10,\"title\":\"illo est ratione doloremque quia maiores aut\",\"completed\":true}]}", "isBase64Encoded": false } ``` ``` from urllib.parse import quote import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver, CORSConfig from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() # CORS will match when Origin is https://www.example.com OR https://dev.example.com cors_config = CORSConfig(allow_origin="https://www.example.com", extra_origins=["https://dev.example.com"], max_age=300) app = APIGatewayRestResolver(cors=cors_config) @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} @app.get("/todos/") @tracer.capture_method def get_todo_by_id(todo_id: str): # value come as str todo_id = quote(todo_id, safe="") todos: Response = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todos.raise_for_status() return {"todos": todos.json()} @app.get("/healthcheck", cors=False) # optionally removes CORS for a given route @tracer.capture_method def am_i_alive(): return {"am_i_alive": "yes"} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "statusCode": 200, "multiValueHeaders": { "Content-Type": ["application/json"], "Access-Control-Allow-Origin": ["https://www.example.com","https://dev.example.com"], "Access-Control-Allow-Headers": ["Authorization,Content-Type,X-Amz-Date,X-Amz-Security-Token,X-Api-Key"] }, "body": "{\"todos\":[{\"userId\":1,\"id\":1,\"title\":\"delectus aut autem\",\"completed\":false},{\"userId\":1,\"id\":2,\"title\":\"quis ut nam facilis et officia qui\",\"completed\":false},{\"userId\":1,\"id\":3,\"title\":\"fugiat veniam minus\",\"completed\":false},{\"userId\":1,\"id\":4,\"title\":\"et porro tempora\",\"completed\":true},{\"userId\":1,\"id\":5,\"title\":\"laboriosam mollitia et enim quasi adipisci quia provident illum\",\"completed\":false},{\"userId\":1,\"id\":6,\"title\":\"qui ullam ratione quibusdam voluptatem quia omnis\",\"completed\":false},{\"userId\":1,\"id\":7,\"title\":\"illo expedita consequatur quia in\",\"completed\":false},{\"userId\":1,\"id\":8,\"title\":\"quo adipisci enim quam ut ab\",\"completed\":true},{\"userId\":1,\"id\":9,\"title\":\"molestiae perspiciatis ipsa\",\"completed\":false},{\"userId\":1,\"id\":10,\"title\":\"illo est ratione doloremque quia maiores aut\",\"completed\":true}]}", "isBase64Encoded": false } ``` #### Pre-flight Pre-flight (OPTIONS) calls are typically handled at the API Gateway or Lambda Function URL level as per [our sample infrastructure](#required-resources), no Lambda integration is necessary. However, ALB expects you to handle pre-flight requests. For convenience, we automatically handle that for you as long as you [setup CORS in the constructor level](#cors). 
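For instance, here's a minimal sketch of an ALB-backed function (the `/todos` route is illustrative) where configuring CORS in the constructor is enough for Event Handler to answer `OPTIONS` pre-flight requests without declaring an explicit route for them:

```
from aws_lambda_powertools.event_handler import ALBResolver, CORSConfig
from aws_lambda_powertools.utilities.typing import LambdaContext

# CORS configured at the constructor level; with this in place, pre-flight
# (OPTIONS) requests reaching this ALB target are answered automatically.
cors_config = CORSConfig(allow_origin="https://www.example.com", max_age=300)
app = ALBResolver(cors=cors_config)


@app.get("/todos")
def get_todos():
    return {"todos": []}


def lambda_handler(event: dict, context: LambdaContext) -> dict:
    return app.resolve(event, context)
```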
#### Defaults For convenience, these are the default values when using `CORSConfig` to enable CORS: Warning Always configure `allow_origin` when using in production. Multiple origins? If you need to allow multiple origins, pass the additional origins using the `extra_origins` key. | Key | Value | Note | | --- | --- | --- | | **[allow_origin](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Origin)**: `str` | `*` | Only use the default value for development. **Never use `*` for production** unless your use case requires it | | **[extra_origins](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Origin)**: `List[str]` | `[]` | Additional origins to be allowed, in addition to the one specified in `allow_origin` | | **[allow_headers](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Headers)**: `List[str]` | `[Authorization, Content-Type, X-Amz-Date, X-Api-Key, X-Amz-Security-Token]` | Additional headers will be appended to the default list for your convenience | | **[expose_headers](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Expose-Headers)**: `List[str]` | `[]` | Any additional header beyond the [safe listed by CORS specification](https://developer.mozilla.org/en-US/docs/Glossary/CORS-safelisted_response_header). | | **[max_age](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Max-Age)**: `int` | \`\` | Only for pre-flight requests if you choose to have your function to handle it instead of API Gateway | | **[allow_credentials](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Credentials)**: `bool` | `False` | Only necessary when you need to expose cookies, authorization headers or TLS client certificates. | ### Middleware ``` stateDiagram direction LR EventHandler: GET /todo Before: Before response Next: next_middleware() MiddlewareLoop: Middleware loop AfterResponse: After response MiddlewareFinished: Modified response Response: Final response EventHandler --> Middleware: Has middleware? state MiddlewareLoop { direction LR Middleware --> Before Before --> Next Next --> Middleware: More middlewares? Next --> AfterResponse } AfterResponse --> MiddlewareFinished MiddlewareFinished --> Response EventHandler --> Response: No middleware ``` A middleware is a function you register per route to **intercept** or **enrich** a **request before** or **after** any response. Each middleware function receives the following arguments: 1. **app**. An Event Handler instance so you can access incoming request information, Lambda context, etc. 1. **next_middleware**. A function to get the next middleware or route's response. Here's a sample middleware that extracts and injects correlation ID, using `APIGatewayRestResolver` (works for any [Resolver](#event-resolvers)): ``` import requests from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response from aws_lambda_powertools.event_handler.middlewares import NextMiddleware app = APIGatewayRestResolver() logger = Logger() def inject_correlation_id(app: APIGatewayRestResolver, next_middleware: NextMiddleware) -> Response: request_id = app.current_event.request_context.request_id # (1)! # Use API Gateway REST API request ID if caller didn't include a correlation ID correlation_id = logger.get_correlation_id() or request_id # (2)! 
# Inject correlation ID in shared context and Logger app.append_context(correlation_id=correlation_id) # (3)! logger.set_correlation_id(correlation_id) # Get response from next middleware OR /todos route result = next_middleware(app) # (4)! # Include Correlation ID in the response back to caller result.headers["x-correlation-id"] = correlation_id # (5)! return result @app.get("/todos", middlewares=[inject_correlation_id]) # (6)! def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} @logger.inject_lambda_context(correlation_id_path='headers."x-correlation-id"') # (7)! def lambda_handler(event, context): return app.resolve(event, context) ``` 1. You can access current request like you normally would. 1. Logger extracts it first in the request path, so we can use it. If this was available before, we'd use `app.context.get("correlation_id")`. 1. [Shared context is available](#sharing-contextual-data) to any middleware, Router and App instances. For example, another middleware can now use `app.context.get("correlation_id")` to retrieve it. 1. Get response from the next middleware (if any) or from `/todos` route. 1. You can manipulate headers, body, or status code before returning it. 1. Register one or more middlewares in order of execution. 1. Logger extracts correlation ID from header and makes it available under `correlation_id` key, and `get_correlation_id()` method. ``` { "statusCode": 200, "body": "{\"todos\":[{\"userId\":1,\"id\":1,\"title\":\"delectus aut autem\",\"completed\":false}]}", "isBase64Encoded": false, "multiValueHeaders": { "Content-Type": [ "application/json" ], "x-correlation-id": [ "ccd87d70-7a3f-4aec-b1a8-a5a558c239b2" ] } } ``` #### Global middlewares ![Combining middlewares](../../media/middlewares_normal_processing-light.svg#only-light) ![Combining middlewares](../../media/middlewares_normal_processing-dark.svg#only-dark) _Request flowing through multiple registered middlewares_ You can use `app.use` to register middlewares that should always run regardless of the route, also known as global middlewares. Event Handler **calls global middlewares first**, then middlewares defined at the route level. Here's an example with both middlewares: > Use [debug mode](#debug-mode) if you need to log request/response. ``` import middleware_global_middlewares_module # (1)! import requests from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response app = APIGatewayRestResolver() logger = Logger() app.use(middlewares=[middleware_global_middlewares_module.log_request_response]) # (2)! @app.get("/todos", middlewares=[middleware_global_middlewares_module.inject_correlation_id]) def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} @logger.inject_lambda_context def lambda_handler(event, context): return app.resolve(event, context) ``` 1. A separate file where our middlewares are to keep this example focused. 1. We register `log_request_response` as a global middleware to run before middleware. 
``` stateDiagram direction LR GlobalMiddleware: Log request response RouteMiddleware: Inject correlation ID EventHandler: Event Handler EventHandler --> GlobalMiddleware GlobalMiddleware --> RouteMiddleware ``` ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response from aws_lambda_powertools.event_handler.middlewares import NextMiddleware logger = Logger() def log_request_response(app: APIGatewayRestResolver, next_middleware: NextMiddleware) -> Response: logger.info("Incoming request", path=app.current_event.path, request=app.current_event.raw_event) result = next_middleware(app) logger.info("Response received", response=result.__dict__) return result def inject_correlation_id(app: APIGatewayRestResolver, next_middleware: NextMiddleware) -> Response: request_id = app.current_event.request_context.request_id # Use API Gateway REST API request ID if caller didn't include a correlation ID correlation_id = logger.get_correlation_id() or request_id # elsewhere becomes app.context.get("correlation_id") # Inject correlation ID in shared context and Logger app.append_context(correlation_id=correlation_id) logger.set_correlation_id(correlation_id) # Get response from next middleware OR /todos route result = next_middleware(app) # Include Correlation ID in the response back to caller result.headers["x-correlation-id"] = correlation_id return result def enforce_correlation_id(app: APIGatewayRestResolver, next_middleware: NextMiddleware) -> Response: # If missing mandatory header raise an error if not app.current_event.headers.get("x-correlation-id"): return Response(status_code=400, body="Correlation ID header is now mandatory.") # (1)! # Get the response from the next middleware and return it return next_middleware(app) ``` #### Returning early ![Short-circuiting middleware chain](../../media/middlewares_early_return-light.svg#only-light) ![Short-circuiting middleware chain](../../media/middlewares_early_return-dark.svg#only-dark) _Interrupting request flow by returning early_ Imagine you want to stop processing a request if something is missing, or return immediately if you've seen this request before. In these scenarios, you short-circuit the middleware processing logic by returning a [Response object](#fine-grained-responses), or raising a [HTTP Error](#raising-http-errors). This signals to Event Handler to stop and run each `After` logic left in the chain all the way back. Here's an example where we prevent any request that doesn't include a correlation ID header: ``` import middleware_global_middlewares_module import requests from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response app = APIGatewayRestResolver() logger = Logger() app.use( middlewares=[ middleware_global_middlewares_module.log_request_response, middleware_global_middlewares_module.enforce_correlation_id, # (1)! ], ) @app.get("/todos") def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") # (2)! todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} @logger.inject_lambda_context def lambda_handler(event, context): return app.resolve(event, context) ``` 1. This middleware will raise an exception if correlation ID header is missing. 1. This code section will not run if `enforce_correlation_id` returns early. 
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response from aws_lambda_powertools.event_handler.middlewares import NextMiddleware logger = Logger() def log_request_response(app: APIGatewayRestResolver, next_middleware: NextMiddleware) -> Response: logger.info("Incoming request", path=app.current_event.path, request=app.current_event.raw_event) result = next_middleware(app) logger.info("Response received", response=result.__dict__) return result def inject_correlation_id(app: APIGatewayRestResolver, next_middleware: NextMiddleware) -> Response: request_id = app.current_event.request_context.request_id # Use API Gateway REST API request ID if caller didn't include a correlation ID correlation_id = logger.get_correlation_id() or request_id # elsewhere becomes app.context.get("correlation_id") # Inject correlation ID in shared context and Logger app.append_context(correlation_id=correlation_id) logger.set_correlation_id(correlation_id) # Get response from next middleware OR /todos route result = next_middleware(app) # Include Correlation ID in the response back to caller result.headers["x-correlation-id"] = correlation_id return result def enforce_correlation_id(app: APIGatewayRestResolver, next_middleware: NextMiddleware) -> Response: # If missing mandatory header raise an error if not app.current_event.headers.get("x-correlation-id"): return Response(status_code=400, body="Correlation ID header is now mandatory.") # (1)! # Get the response from the next middleware and return it return next_middleware(app) ``` 1. Raising an exception OR returning a Response object early will short-circuit the middleware chain. ``` { "statusCode": 400, "body": "Correlation ID header is now mandatory", "isBase64Encoded": false, "multiValueHeaders": {} } ``` #### Handling exceptions For catching exceptions more broadly, we recommend you use the [exception_handler](#exception-handling) decorator. By default, any unhandled exception in the middleware chain is eventually propagated as a HTTP 500 back to the client. While there isn't anything special on how to use [`try/catch`](https://docs.python.org/3/tutorial/errors.html#handling-exceptions) for middlewares, it is important to visualize how Event Handler deals with them under the following scenarios: An exception wasn't caught by any middleware during `next_middleware()` block, therefore it propagates all the way back to the client as HTTP 500. *Unhandled route exceptions propagate back to the client* An exception was only caught by the third middleware, resuming the normal execution of each `After` logic for the second and first middleware. *Unhandled route exceptions propagate back to the client* The third middleware short-circuited the chain by raising an exception and completely skipping the fourth middleware. Because we only caught it in the first middleware, it skipped the `After` logic in the second middleware. *Middleware handling short-circuit exceptions* #### Extending middlewares You can implement `BaseMiddlewareHandler` interface to create middlewares that accept configuration, or perform complex operations (*see [being a good citizen section](#being-a-good-citizen)*). As a practical example, let's refactor our correlation ID middleware so it accepts a custom HTTP Header to look for. 
``` import requests from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response from aws_lambda_powertools.event_handler.middlewares import BaseMiddlewareHandler, NextMiddleware app = APIGatewayRestResolver() logger = Logger() class CorrelationIdMiddleware(BaseMiddlewareHandler): def __init__(self, header: str): # (1)! """Extract and inject correlation ID in response Parameters ---------- header : str HTTP Header to extract correlation ID """ super().__init__() self.header = header def handler(self, app: APIGatewayRestResolver, next_middleware: NextMiddleware) -> Response: # (2)! request_id = app.current_event.request_context.request_id correlation_id = app.current_event.headers.get(self.header, request_id) response = next_middleware(app) # (3)! response.headers[self.header] = correlation_id return response @app.get("/todos", middlewares=[CorrelationIdMiddleware(header="x-correlation-id")]) # (4)! def get_todos(): todos: requests.Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} @logger.inject_lambda_context def lambda_handler(event, context): return app.resolve(event, context) ``` 1. You can add any constructor argument like you normally would 1. We implement `handler` just like we [did before](#middleware) with the only exception of the `self` argument, since it's a method. 1. Get response from the next middleware (if any) or from `/todos` route. 1. Register an instance of `CorrelationIdMiddleware`. Class-based **vs** function-based middlewares When registering a middleware, we expect a callable in both cases. For class-based middlewares, `BaseMiddlewareHandler` is doing the work of calling your `handler` method with the correct parameters, hence why we expect an instance of it. #### Native middlewares These are native middlewares that may become native features depending on customer demand. | Middleware | Purpose | | --- | --- | | [SchemaValidationMiddleware](/lambda/python/latest/api/event_handler/middlewares/schema_validation.html) | Validates API request body and response against JSON Schema, using [Validation utility](../../../utilities/validation/) | #### Being a good citizen Middlewares can add subtle improvements to request/response processing, but also add significant complexity if you're not careful. Keep the following in mind when authoring middlewares for Event Handler: 1. **Use built-in features over middlewares**. We include built-in features like [CORS](#cors), [compression](#compress), [binary responses](#binary-responses), [global exception handling](#exception-handling), and [debug mode](#debug-mode) to reduce the need for middlewares. 1. **Call the next middleware**. Return the result of `next_middleware(app)`, or a [Response object](#fine-grained-responses) when you want to [return early](#returning-early). 1. **Keep a lean scope**. Focus on a single task per middleware to ease composability and maintenance. In [debug mode](#debug-mode), we also print out the order middlewares will be triggered to ease operations. 1. **Catch your own exceptions**. Catch and handle known exceptions to your logic. Unless you want to raise [HTTP Errors](#raising-http-errors), or propagate specific exceptions to the client. To catch all and any exceptions, we recommend you use the [exception_handler](#exception-handling) decorator. 1. **Use context to share data**. 
Use `app.append_context` to [share contextual data](#sharing-contextual-data) between middlewares and route handlers, and `app.context.get(key)` to fetch them. We clear all contextual data at the end of every request. ### Fine grained responses You can use the `Response` class to have full control over the response. For example, you might want to add additional headers, cookies, or set a custom Content-type. Info Powertools for AWS Lambda (Python) serializes headers and cookies according to the type of input event. Some event sources require headers and cookies to be encoded as `multiValueHeaders`. Using multiple values for HTTP headers in ALB? Make sure you [enable the multi value headers feature](https://docs.aws.amazon.com/elasticloadbalancing/latest/application/lambda-functions.html#multi-value-headers) to serialize response headers correctly. ``` from http import HTTPStatus from uuid import uuid4 import requests from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ( APIGatewayRestResolver, Response, content_types, ) from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.shared.cookies import Cookie from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: requests.Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() custom_headers = {"X-Transaction-Id": [f"{uuid4()}"]} return Response( status_code=HTTPStatus.OK.value, # 200 content_type=content_types.APPLICATION_JSON, body=todos.json()[:10], headers=custom_headers, cookies=[Cookie(name="session_id", value="12345")], ) # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "statusCode": 200, "multiValueHeaders": { "Content-Type": ["application/json"], "X-Transaction-Id": ["3490eea9-791b-47a0-91a4-326317db61a9"], "Set-Cookie": ["session_id=12345; Secure"] }, "body": "{\"todos\":[{\"userId\":1,\"id\":1,\"title\":\"delectus aut autem\",\"completed\":false},{\"userId\":1,\"id\":2,\"title\":\"quis ut nam facilis et officia qui\",\"completed\":false},{\"userId\":1,\"id\":3,\"title\":\"fugiat veniam minus\",\"completed\":false},{\"userId\":1,\"id\":4,\"title\":\"et porro tempora\",\"completed\":true},{\"userId\":1,\"id\":5,\"title\":\"laboriosam mollitia et enim quasi adipisci quia provident illum\",\"completed\":false},{\"userId\":1,\"id\":6,\"title\":\"qui ullam ratione quibusdam voluptatem quia omnis\",\"completed\":false},{\"userId\":1,\"id\":7,\"title\":\"illo expedita consequatur quia in\",\"completed\":false},{\"userId\":1,\"id\":8,\"title\":\"quo adipisci enim quam ut ab\",\"completed\":true},{\"userId\":1,\"id\":9,\"title\":\"molestiae perspiciatis ipsa\",\"completed\":false},{\"userId\":1,\"id\":10,\"title\":\"illo est ratione doloremque quia maiores aut\",\"completed\":true}]}", "isBase64Encoded": false } ``` Using `Response` with data validation? When using the [data validation](#data-validation) feature with `enable_validation=True`, you must specify the concrete type for the `Response` class. This allows the validation middleware to infer the underlying type and perform validation correctly. 
``` from http import HTTPStatus from typing import Optional import requests from pydantic import BaseModel, Field from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response, content_types from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) class Todo(BaseModel): userId: int id_: Optional[int] = Field(alias="id", default=None) title: str completed: bool @app.get("/todos/") @tracer.capture_method def get_todo_by_id(todo_id: int) -> Response[Todo]: todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return Response( status_code=HTTPStatus.OK.value, content_type=content_types.APPLICATION_JSON, body=todo.json(), ) @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ### Compress You can compress with gzip and base64 encode your responses via `compress` parameter. You have the option to pass the `compress` parameter when working with a specific route or using the Response object. Info The `compress` parameter used in the Response object takes precedence over the one used in the route. Warning The client must send the `Accept-Encoding` header, otherwise a normal response will be sent. ``` from urllib.parse import quote import requests from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ( APIGatewayRestResolver, Response, content_types, ) from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get("/todos", compress=True) @tracer.capture_method def get_todos(): todos: requests.Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} @app.get("/todos/", compress=True) @tracer.capture_method def get_todo_by_id(todo_id: str): # same example using Response class todo_id = quote(todo_id, safe="") todos: requests.Response = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todos.raise_for_status() return Response(status_code=200, content_type=content_types.APPLICATION_JSON, body=todos.json()) # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` import requests from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ( APIGatewayRestResolver, Response, content_types, ) from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: requests.Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return Response(status_code=200, content_type=content_types.APPLICATION_JSON, body=todos.json()[:10], 
compress=True) # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "headers": { "Accept-Encoding": "gzip" }, "resource": "/todos", "path": "/todos", "httpMethod": "GET" } ``` ``` { "statusCode": 200, "multiValueHeaders": { "Content-Type": ["application/json"], "Content-Encoding": ["gzip"] }, "body": "H4sIAAAAAAACE42STU4DMQyFrxJl3QXln96AMyAW7sSDLCVxiJ0Kqerd8TCCUOgii1EmP/783pOPXjmw+N3L0TfB+hz8brvxtC5KGtHvfMCIkzZx0HT5MPmNnziViIr2dIYoeNr8Q1x3xHsjcVadIbkZJoq2RXU8zzQROLseQ9505NzeCNQdMJNBE+UmY4zbzjAJhWtlZ57sB84BWtul+rteH2HPlVgWARwjqXkxpklK5gmEHAQqJBMtFsGVygcKmNVRjG0wxvuzGF2L0dpVUOKMC3bfJNjJgWMrCuZk7cUp02AiD72D6WKHHwUDKbiJs6AZ0VZXKOUx4uNvzdxT+E4mLcMA+6G8nzrLQkaxkNEVrFKW2VGbJCoCY7q2V3+tiv5kGThyxfTecDWbgGz/NfYXhL6ePgF9PnFdPgMAAA==", "isBase64Encoded": true } ``` ### Binary responses Amazon API Gateway does not support `*/*` binary media type [when CORS is also configured](https://github.com/aws-powertools/powertools-lambda-python/issues/3373#issuecomment-1821144779). This feature requires API Gateway to configure binary media types, see [our sample infrastructure](#required-resources) for reference. For convenience, we automatically base64 encode binary responses. You can also use in combination with `compress` parameter if your client supports gzip. Like `compress` feature, the client must send the `Accept` header with the correct media type. Lambda Function URLs handle binary media types automatically. ``` import os from pathlib import Path from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler.api_gateway import ( APIGatewayRestResolver, Response, ) from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() logo_file: bytes = Path(f"{os.getenv('LAMBDA_TASK_ROOT')}/logo.svg").read_bytes() @app.get("/logo") @tracer.capture_method def get_logo(): return Response(status_code=200, content_type="image/svg+xml", body=logo_file) # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` AWS Lambda ``` ``` { "headers": { "Accept": "image/svg+xml" }, "resource": "/logo", "path": "/logo", "httpMethod": "GET" } ``` ``` { "body": 
"PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz4KPHN2ZyB3aWR0aD0iMjU2cHgiIGhlaWdodD0iMjU2cHgiIHZpZXdCb3g9IjAgMCAyNTYgMjU2IiB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIgeG1sbnM6eGxpbms9Imh0dHA6Ly93d3cudzMub3JnLzE5OTkveGxpbmsiIHByZXNlcnZlQXNwZWN0UmF0aW89InhNaWRZTWlkIj4KICAgIDx0aXRsZT5BV1MgTGFtYmRhPC90aXRsZT4KICAgIDxkZWZzPgogICAgICAgIDxsaW5lYXJHcmFkaWVudCB4MT0iMCUiIHkxPSIxMDAlIiB4Mj0iMTAwJSIgeTI9IjAlIiBpZD0ibGluZWFyR3JhZGllbnQtMSI+CiAgICAgICAgICAgIDxzdG9wIHN0b3AtY29sb3I9IiNDODUxMUIiIG9mZnNldD0iMCUiPjwvc3RvcD4KICAgICAgICAgICAgPHN0b3Agc3RvcC1jb2xvcj0iI0ZGOTkwMCIgb2Zmc2V0PSIxMDAlIj48L3N0b3A+CiAgICAgICAgPC9saW5lYXJHcmFkaWVudD4KICAgIDwvZGVmcz4KICAgIDxnPgogICAgICAgIDxyZWN0IGZpbGw9InVybCgjbGluZWFyR3JhZGllbnQtMSkiIHg9IjAiIHk9IjAiIHdpZHRoPSIyNTYiIGhlaWdodD0iMjU2Ij48L3JlY3Q+CiAgICAgICAgPHBhdGggZD0iTTg5LjYyNDExMjYsMjExLjIgTDQ5Ljg5MDMyNzcsMjExLjIgTDkzLjgzNTQ4MzIsMTE5LjM0NzIgTDExMy43NDcyOCwxNjAuMzM5MiBMODkuNjI0MTEyNiwyMTEuMiBaIE05Ni43MDI5MzU3LDExMC41Njk2IEM5Ni4xNjQwODU4LDEwOS40NjU2IDk1LjA0MTQ4MTMsMTA4Ljc2NDggOTMuODE2MjM4NCwxMDguNzY0OCBMOTMuODA2NjE2MywxMDguNzY0OCBDOTIuNTcxNzUxNCwxMDguNzY4IDkxLjQ0OTE0NjYsMTA5LjQ3NTIgOTAuOTE5OTE4NywxMTAuNTg1NiBMNDEuOTEzNDIwOCwyMTMuMDIwOCBDNDEuNDM4NzE5NywyMTQuMDEyOCA0MS41MDYwNzU4LDIxNS4xNzc2IDQyLjA5NjI0NTEsMjE2LjEwODggQzQyLjY3OTk5OTQsMjE3LjAzNjggNDMuNzA2MzgwNSwyMTcuNiA0NC44MDY1MzMxLDIxNy42IEw5MS42NTQ0MjMsMjE3LjYgQzkyLjg5NTcwMjcsMjE3LjYgOTQuMDIxNTE0OSwyMTYuODg2NCA5NC41NTM5NTAxLDIxNS43Njk2IEwxMjAuMjAzODU5LDE2MS42ODk2IEMxMjAuNjE3NjE5LDE2MC44MTI4IDEyMC42MTQ0MTIsMTU5Ljc5ODQgMTIwLjE4NzgyMiwxNTguOTI4IEw5Ni43MDI5MzU3LDExMC41Njk2IFogTTIwNy45ODUxMTcsMjExLjIgTDE2OC41MDc5MjgsMjExLjIgTDEwNS4xNzM3ODksNzguNjI0IEMxMDQuNjQ0NTYxLDc3LjUxMDQgMTAzLjUxNTU0MSw3Ni44IDEwMi4yNzc0NjksNzYuOCBMNzYuNDQ3OTQzLDc2LjggTDc2LjQ3NjgwOTksNDQuOCBMMTI3LjEwMzA2Niw0NC44IEwxOTAuMTQ1MzI4LDE3Ny4zNzI4IEMxOTAuNjc0NTU2LDE3OC40ODY0IDE5MS44MDM1NzUsMTc5LjIgMTkzLjA0MTY0NywxNzkuMiBMMjA3Ljk4NTExNywxNzkuMiBMMjA3Ljk4NTExNywyMTEuMiBaIE0yMTEuMTkyNTU4LDE3Mi44IEwxOTUuMDcxOTU4LDE3Mi44IEwxMzIuMDI5Njk2LDQwLjIyNzIgQzEzMS41MDA0NjgsMzkuMTEzNiAxMzAuMzcxNDQ5LDM4LjQgMTI5LjEzMDE2OSwzOC40IEw3My4yNzI1NzYsMzguNCBDNzEuNTA1Mjc1OCwzOC40IDcwLjA2ODM0MjEsMzkuODMwNCA3MC4wNjUxMzQ0LDQxLjU5NjggTDcwLjAyOTg1MjgsNzkuOTk2OCBDNzAuMDI5ODUyOCw4MC44NDggNzAuMzYzNDI2Niw4MS42NjA4IDcwLjk2OTYzMyw4Mi4yNjI0IEM3MS41Njk0MjQ2LDgyLjg2NCA3Mi4zODQxMTQ2LDgzLjIgNzMuMjM3Mjk0MSw4My4yIEwxMDAuMjUzNTczLDgzLjIgTDE2My41OTA5MiwyMTUuNzc2IEMxNjQuMTIzMzU1LDIxNi44ODk2IDE2NS4yNDU5NiwyMTcuNiAxNjYuNDg0MDMyLDIxNy42IEwyMTEuMTkyNTU4LDIxNy42IEMyMTIuOTY2Mjc0LDIxNy42IDIxNC40LDIxNi4xNjY0IDIxNC40LDIxNC40IEwyMTQuNCwxNzYgQzIxNC40LDE3NC4yMzM2IDIxMi45NjYyNzQsMTcyLjggMjExLjE5MjU1OCwxNzIuOCBMMjExLjE5MjU1OCwxNzIuOCBaIiBmaWxsPSIjRkZGRkZGIj48L3BhdGg+CiAgICA8L2c+Cjwvc3ZnPg==", "multiValueHeaders": { "Content-Type": ["image/svg+xml"] }, "isBase64Encoded": true, "statusCode": 200 } ``` ### Debug mode You can enable debug mode via `debug` param, or via `POWERTOOLS_DEV` [environment variable](../../../#environment-variables). This will enable full tracebacks errors in the response, print request and responses, and set CORS in development mode. Danger This might reveal sensitive information in your logs and relax CORS restrictions, use it sparingly. It's best to use for local development only! 
``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(debug=True) @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ### OpenAPI When you enable [Data Validation](#data-validation), we use a combination of Pydantic Models and [OpenAPI](https://www.openapis.org/) type annotations to add constraints to your API's parameters. OpenAPI schema version depends on the installed version of Pydantic Pydantic v1 generates [valid OpenAPI 3.0.3 schemas](https://docs.pydantic.dev/1.10/usage/schema/), and Pydantic v2 generates [valid OpenAPI 3.1.0 schemas](https://docs.pydantic.dev/latest/why/#json-schema). In OpenAPI documentation tools like [SwaggerUI](#enabling-swaggerui), these annotations become readable descriptions, offering a self-explanatory API interface. This reduces boilerplate code while improving functionality and enabling auto-documentation. Note We don't have support for files, form data, and header parameters at the moment. If you're interested in this, please [open an issue](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=feature-request%2Ctriage&projects=&template=feature_request.yml&title=Feature+request%3A+TITLE). #### Customizing OpenAPI parameters Whenever you use OpenAPI parameters to validate [query strings](./#validating-query-strings) or [path parameters](./#validating-path-parameters), you can enhance validation and OpenAPI documentation by using any of these parameters: | Field name | Type | Description | | --- | --- | --- | | `alias` | `str` | Alternative name for a field, used when serializing and deserializing data | | `validation_alias` | `str` | Alternative name for a field during validation (but not serialization) | | `serialization_alias` | `str` | Alternative name for a field during serialization (but not during validation) | | `description` | `str` | Human-readable description | | `gt` | `float` | Greater than. If set, value must be greater than this. Only applicable to numbers | | `ge` | `float` | Greater than or equal. If set, value must be greater than or equal to this. Only applicable to numbers | | `lt` | `float` | Less than. If set, value must be less than this. Only applicable to numbers | | `le` | `float` | Less than or equal. If set, value must be less than or equal to this. Only applicable to numbers | | `min_length` | `int` | Minimum length for strings | | `max_length` | `int` | Maximum length for strings | | `pattern` | `string` | A regular expression that the string must match. | | `strict` | `bool` | If `True`, strict validation is applied to the field. See [Strict Mode](https://docs.pydantic.dev/latest/concepts/strict_mode/) for details | | `multiple_of` | `float` | Value must be a multiple of this. 
Only applicable to numbers | | `allow_inf_nan` | `bool` | Allow `inf`, `-inf`, `nan`. Only applicable to numbers | | `max_digits` | `int` | Maximum number of allow digits for strings | | `decimal_places` | `int` | Maximum number of decimal places allowed for numbers | | `openapi_examples` | `dict[str, Example]` | A list of examples to be displayed in the SwaggerUI interface. Avoid using the `examples` field for this purpose. | | `deprecated` | `bool` | Marks the field as deprecated | | `include_in_schema` | `bool` | If `False` the field will not be part of the exported OpenAPI schema | | `json_schema_extra` | `JsonDict` | Any additional JSON schema data for the schema property | #### Customizing API operations Customize your API endpoints by adding metadata to endpoint definitions. Here's a breakdown of various customizable fields: | Field Name | Type | Description | | --- | --- | --- | | `summary` | `str` | A concise overview of the main functionality of the endpoint. This brief introduction is usually displayed in autogenerated API documentation and helps consumers quickly understand what the endpoint does. | | `description` | `str` | A more detailed explanation of the endpoint, which can include information about the operation's behavior, including side effects, error states, and other operational guidelines. | | `responses` | `Dict[int, Dict[str, OpenAPIResponse]]` | A dictionary that maps each HTTP status code to a Response Object as defined by the [OpenAPI Specification](https://swagger.io/specification/#response-object). This allows you to describe expected responses, including default or error messages, and their corresponding schemas or models for different status codes. | | `response_description` | `str` | Provides the default textual description of the response sent by the endpoint when the operation is successful. It is intended to give a human-readable understanding of the result. | | `tags` | `List[str]` | Tags are a way to categorize and group endpoints within the API documentation. They can help organize the operations by resources or other heuristic. | | `operation_id` | `str` | A unique identifier for the operation, which can be used for referencing this operation in documentation or code. This ID must be unique across all operations described in the API. | | `include_in_schema` | `bool` | A boolean value that determines whether or not this operation should be included in the OpenAPI schema. Setting it to `False` can hide the endpoint from generated documentation and schema exports, which might be useful for private or experimental endpoints. | | `deprecated` | `bool` | A boolean value that determines whether or not this operation should be marked as deprecated in the OpenAPI schema. 
| To implement these customizations, include extra parameters when defining your routes: ``` import requests from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.utilities.typing import LambdaContext app = APIGatewayRestResolver(enable_validation=True) @app.get( "/todos/<todo_id>", summary="Retrieves a todo item", description="Loads a todo item identified by the `todo_id`", response_description="The todo object", responses={ 200: {"description": "Todo item found"}, 404: { "description": "Item not found", }, }, tags=["Todos"], ) def get_todo_title(todo_id: int) -> str: todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return todo.json()["title"] def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` #### Customizing OpenAPI metadata Defining and customizing OpenAPI metadata gives detailed, top-level information about your API. Use the method `app.configure_openapi` to set and tailor this metadata: | Field Name | Type | Description | | --- | --- | --- | | `title` | `str` | The title for your API. It should be a concise, specific name that can be used to identify the API in documentation or listings. | | `version` | `str` | The version of the API you are documenting. This could reflect the release iteration of the API and helps clients understand the evolution of the API. | | `openapi_version` | `str` | Specifies the version of the OpenAPI Specification on which your API is based. When using Pydantic v1 it defaults to 3.0.3, and when using Pydantic v2, it defaults to 3.1.0. | | `summary` | `str` | A short and informative summary that can provide an overview of what the API does. This can be the same as or different from the title but should add context or information. | | `description` | `str` | A verbose description that can include Markdown formatting, providing a full explanation of the API's purpose, functionalities, and general usage instructions. | | `tags` | `List[str]` | A collection of tags that categorize endpoints for better organization and navigation within the documentation. This can group endpoints by their functionality or other criteria. | | `servers` | `List[Server]` | An array of Server objects, which specify the URL to the server and a description for its environment (production, staging, development, etc.), providing connectivity information. | | `terms_of_service` | `str` | A URL that points to the terms of service for your API. This could provide legal information and user responsibilities related to the usage of the API. | | `contact` | `Contact` | A Contact object containing contact details of the organization or individuals maintaining the API. This may include fields such as name, URL, and email. | | `license_info` | `License` | A License object providing the license details for the API, typically including the name of the license and the URL to the full license text. 
| Include extra parameters when exporting your OpenAPI specification to apply these customizations: ``` import requests from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.openapi.models import Contact, Server from aws_lambda_powertools.utilities.typing import LambdaContext app = APIGatewayRestResolver(enable_validation=True) app.configure_openapi( title="TODO's API", version="1.21.3", summary="API to manage TODOs", description="This API implements all the CRUD operations for the TODO app", tags=["todos"], servers=[Server(url="https://stg.example.org/orders", description="Staging server")], contact=Contact(name="John Smith", email="john@smith.com"), ) @app.get("/todos/<todo_id>") def get_todo_title(todo_id: int) -> str: todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return todo.json()["title"] def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) if __name__ == "__main__": print(app.get_openapi_json_schema()) ``` #### Customizing Swagger UI Customizing the Swagger metadata The `enable_swagger` method accepts the same metadata as described at [Customizing OpenAPI metadata](#customizing-openapi-metadata). The Swagger UI appears by default at the `/swagger` path, but you can customize this to serve the documentation from another path and specify the source for Swagger UI assets. Below is an example configuration for serving Swagger UI from a custom path or CDN, with assets like CSS and JavaScript loading from a chosen CDN base URL. ``` from typing import List import requests from pydantic import BaseModel, EmailStr, Field from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.utilities.typing import LambdaContext app = APIGatewayRestResolver(enable_validation=True) app.enable_swagger(path="/_swagger", swagger_base_url="https://cdn.example.com/path/to/assets/") class Todo(BaseModel): userId: int id_: int = Field(alias="id") title: str completed: bool @app.get("/todos") def get_todos_by_email(email: EmailStr) -> List[Todo]: todos = requests.get(f"https://jsonplaceholder.typicode.com/todos?email={email}") todos.raise_for_status() return todos.json() def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` A Middleware can handle tasks such as adding security headers, user authentication, or other request processing for serving the Swagger UI. ``` from typing import List import requests from pydantic import BaseModel, EmailStr, Field from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response from aws_lambda_powertools.event_handler.middlewares import NextMiddleware from aws_lambda_powertools.utilities.typing import LambdaContext app = APIGatewayRestResolver(enable_validation=True) def swagger_middleware(app: APIGatewayRestResolver, next_middleware: NextMiddleware) -> Response: is_authenticated = ...
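# NOTE: `...` (Ellipsis) above is a placeholder for your own authentication check.
# For illustration only, you could inspect the incoming request instead, e.g.:
# is_authenticated = "Authorization" in app.current_event.headers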
if not is_authenticated: return Response(status_code=400, body="Unauthorized") return next_middleware(app) app.enable_swagger(middlewares=[swagger_middleware]) class Todo(BaseModel): userId: int id_: int = Field(alias="id") title: str completed: bool @app.get("/todos") def get_todos_by_email(email: EmailStr) -> List[Todo]: todos = requests.get(f"https://jsonplaceholder.typicode.com/todos?email={email}") todos.raise_for_status() return todos.json() def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` #### Security schemes Does Powertools implement any of the security schemes? No. Powertools adds support for generating OpenAPI documentation with [security schemes](https://swagger.io/docs/specification/authentication/), but it doesn't implement any of the security schemes itself, so you must implement the security mechanisms separately. Security schemes are declared at the top-level first. You can reference them globally or on a per path *(operation)* level. **However**, if you reference security schemes that are not defined at the top-level it will lead to a `SchemaValidationError` *(invalid OpenAPI spec)*. ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ( APIGatewayRestResolver, ) from aws_lambda_powertools.event_handler.openapi.models import ( OAuth2, OAuthFlowAuthorizationCode, OAuthFlows, ) tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) app.configure_openapi( title="My API", security_schemes={ "oauth": OAuth2( flows=OAuthFlows( authorizationCode=OAuthFlowAuthorizationCode( authorizationUrl="https://xxx.amazoncognito.com/oauth2/authorize", tokenUrl="https://xxx.amazoncognito.com/oauth2/token", ), ), ), }, security=[{"oauth": ["admin"]}], # (1)!) ) @app.get("/") def helloworld() -> dict: return {"hello": "world"} @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context): return app.resolve(event, context) if __name__ == "__main__": print(app.get_openapi_json_schema()) ``` 1. Using the oauth security scheme defined earlier, scoped to the "admin" role. ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ( APIGatewayRestResolver, ) from aws_lambda_powertools.event_handler.openapi.models import ( OAuth2, OAuthFlowAuthorizationCode, OAuthFlows, ) tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) app.configure_openapi( title="My API", security_schemes={ "oauth": OAuth2( flows=OAuthFlows( authorizationCode=OAuthFlowAuthorizationCode( authorizationUrl="https://xxx.amazoncognito.com/oauth2/authorize", tokenUrl="https://xxx.amazoncognito.com/oauth2/token", ), ), ), }, ) @app.get("/", security=[{"oauth": ["admin"]}]) # (1)! def helloworld() -> dict: return {"hello": "world"} @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context): return app.resolve(event, context) if __name__ == "__main__": print(app.get_openapi_json_schema()) ``` 1. Using the oauth security scheme defined bellow, scoped to the "admin" role. 
``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ( APIGatewayRestResolver, ) from aws_lambda_powertools.event_handler.openapi.models import ( OAuth2, OAuthFlowAuthorizationCode, OAuthFlows, ) tracer = Tracer() logger = Logger() app = APIGatewayRestResolver(enable_validation=True) app.configure_openapi( title="My API", security_schemes={ "oauth": OAuth2( flows=OAuthFlows( authorizationCode=OAuthFlowAuthorizationCode( authorizationUrl="https://xxx.amazoncognito.com/oauth2/authorize", tokenUrl="https://xxx.amazoncognito.com/oauth2/token", ), ), ), }, ) @app.get("/protected", security=[{"oauth": ["admin"]}]) def protected() -> dict: return {"hello": "world"} @app.get("/unprotected", security=[{}]) # (1)! def unprotected() -> dict: return {"hello": "world"} @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context): return app.resolve(event, context) if __name__ == "__main__": print(app.get_openapi_json_schema()) ``` 1. To make security optional in a specific route, an empty security requirement ({}) can be included in the array. OpenAPI 3 lets you describe APIs protected using the following security schemes: | Security Scheme | Type | Description | | --- | --- | --- | | [HTTP auth](https://www.iana.org/assignments/http-authschemes/http-authschemes.xhtml) | `HTTPBase` | HTTP authentication schemes using the Authorization header (e.g: [Basic auth](https://swagger.io/docs/specification/authentication/basic-authentication/), [Bearer](https://swagger.io/docs/specification/authentication/bearer-authentication/)) | | [API keys](https://swagger.io/docs/specification/authentication/api-keys/) (e.g: query strings, cookies) | `APIKey` | API keys in headers, query strings or [cookies](https://swagger.io/docs/specification/authentication/cookie-authentication/). | | [OAuth 2](https://swagger.io/docs/specification/authentication/oauth2/) | `OAuth2` | Authorization protocol that gives an API client limited access to user data on a web server. | | [OpenID Connect Discovery](https://swagger.io/docs/specification/authentication/openid-connect-discovery/) | `OpenIdConnect` | Identity layer built [on top of the OAuth 2.0 protocol](https://openid.net/developers/how-connect-works/) and supported by some OAuth 2.0. | | [Mutual TLS](https://swagger.io/specification/#security-scheme-object). | `MutualTLS` | Client/server certificate mutual authentication scheme. | Using OAuth2 with the Swagger UI? You can use the `OAuth2Config` option to configure a default OAuth2 app on the generated Swagger UI. 
``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ( APIGatewayRestResolver, ) from aws_lambda_powertools.event_handler.openapi.models import ( OAuth2, OAuthFlowAuthorizationCode, OAuthFlows, ) from aws_lambda_powertools.event_handler.openapi.swagger_ui import OAuth2Config tracer = Tracer() logger = Logger() oauth2 = OAuth2Config( client_id="xxxxxxxxxxxxxxxxxxxxxxxxxxxx", app_name="OAuth2 app", ) app = APIGatewayRestResolver(enable_validation=True) app.enable_swagger( oauth2_config=oauth2, security_schemes={ "oauth": OAuth2( flows=OAuthFlows( authorizationCode=OAuthFlowAuthorizationCode( authorizationUrl="https://xxx.amazoncognito.com/oauth2/authorize", tokenUrl="https://xxx.amazoncognito.com/oauth2/token", ), ), ), }, security=[{"oauth": []}], ) @app.get("/") def hello() -> str: return "world" @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context): return app.resolve(event, context) ``` #### OpenAPI extensions For a better experience when working with Lambda and Amazon API Gateway, customers can define extensions using the `openapi_extensions` parameter. We support defining OpenAPI extensions at the following levels of the OpenAPI JSON Schema: Root, Servers, Operation, and Security Schemes. Warning We do not support the `x-amazon-apigateway-any-method` and `x-amazon-apigateway-integrations` extensions. ``` from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.openapi.models import APIKey, APIKeyIn, Server app = APIGatewayRestResolver(enable_validation=True) servers = Server( url="http://example.com", description="Example server", openapi_extensions={"x-amazon-apigateway-endpoint-configuration": {"vpcEndpoint": "myendpointid"}}, # (1)! ) @app.get( "/hello", openapi_extensions={"x-amazon-apigateway-integration": {"type": "aws", "uri": "my_lambda_arn"}}, # (2)! ) def hello(): return app.get_openapi_json_schema( servers=[servers], security_schemes={ "apikey": APIKey( name="X-API-KEY", description="API KeY", in_=APIKeyIn.header, openapi_extensions={"x-amazon-apigateway-authorizer": "custom"}, # (3)! ), }, openapi_extensions={"x-amazon-apigateway-gateway-responses": {"DEFAULT_4XX"}}, # (4)! ) def lambda_handler(event, context): return app.resolve(event, context) ``` 1. Server level 1. Operation level 1. Security scheme level 1. Root level ### Custom serializer You can instruct event handler to use a custom serializer to best suit your needs, for example take into account Enums when serializing. 
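As a minimal sketch of the Enum case specifically (the `OrderStatus` enum and `/orders` route below are made up for illustration), a serializer only needs to be a callable that turns your return value into a string:

```
import json
from enum import Enum

from aws_lambda_powertools.event_handler import APIGatewayRestResolver


class OrderStatus(Enum):
    # Hypothetical enum, used only for this sketch
    PENDING = "PENDING"
    SHIPPED = "SHIPPED"


def enum_aware_serializer(obj) -> str:
    """Serialize responses, converting Enum members to their value."""
    return json.dumps(obj, default=lambda o: o.value if isinstance(o, Enum) else str(o))


app = APIGatewayRestResolver(serializer=enum_aware_serializer)


@app.get("/orders/<order_id>/status")
def get_order_status(order_id: str):
    # The Enum member below is converted by enum_aware_serializer
    return {"order_id": order_id, "status": OrderStatus.PENDING}
```

The dataclass-focused example below shows the same mechanism using a `JSONEncoder` subclass.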
``` import json from dataclasses import asdict, dataclass, is_dataclass from json import JSONEncoder import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @dataclass class Todo: userId: str id: str # noqa: A003 VNE003 "id" field is reserved title: str completed: bool class DataclassCustomEncoder(JSONEncoder): """A custom JSON encoder to serialize dataclass obj""" def default(self, obj): # Only called for values that aren't JSON serializable # where `obj` will be an instance of Todo in this example return asdict(obj) if is_dataclass(obj) else super().default(obj) def custom_serializer(obj) -> str: """Your custom serializer function APIGatewayRestResolver will use""" return json.dumps(obj, separators=(",", ":"), cls=DataclassCustomEncoder) app = APIGatewayRestResolver(serializer=custom_serializer) @app.get("/todos") @tracer.capture_method def get_todos(): ret: Response = requests.get("https://jsonplaceholder.typicode.com/todos") ret.raise_for_status() todos = [Todo(**todo) for todo in ret.json()] # for brevity, we'll limit to the first 10 only return {"todos": todos[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ### Custom body deserializer You can customize how the integrated [Event Source Data Classes](https://docs.powertools.aws.dev/lambda/python/latest/utilities/data_classes/#api-gateway-proxy) parse the JSON request body by providing your own deserializer function. By default it is `json.loads` ``` import json from decimal import Decimal from functools import partial from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() app = APIGatewayRestResolver(json_body_deserializer=partial(json.loads, parse_float=Decimal)) @app.get("/body") def get_body(): return app.current_event.json_body # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ### Split routes with Router As you grow the number of routes a given Lambda function should handle, it is natural to either break into smaller Lambda functions, or split routes into separate files to ease maintenance - that's where the `Router` feature is useful. Let's assume you have `split_route.py` as your Lambda function entrypoint and routes in `split_route_module.py`. This is how you'd use the `Router` feature. We import **Router** instead of **APIGatewayRestResolver**; syntax wise is exactly the same. Info This means all methods, including [middleware](#middleware) will work as usual. 
``` from urllib.parse import quote import requests from requests import Response from aws_lambda_powertools import Tracer from aws_lambda_powertools.event_handler.api_gateway import Router tracer = Tracer() router = Router() endpoint = "https://jsonplaceholder.typicode.com/todos" @router.get("/todos") @tracer.capture_method def get_todos(): api_key = router.current_event.headers["X-Api-Key"] todos: Response = requests.get(endpoint, headers={"X-Api-Key": api_key}) todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} @router.get("/todos/<todo_id>") @tracer.capture_method def get_todo_by_id(todo_id: str): # value comes as str api_key = router.current_event.headers["X-Api-Key"] todo_id = quote(todo_id, safe="") todos: Response = requests.get(f"{endpoint}/{todo_id}", headers={"X-Api-Key": api_key}) todos.raise_for_status() return {"todos": todos.json()} ``` We use the `include_router` method to include all routes registered in the `router` global object. Note This method merges routes, [context](#sharing-contextual-data) and [middleware](#middleware) from `Router` into the main resolver instance (`APIGatewayRestResolver()`). ``` import split_route_module from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() app.include_router(split_route_module.router) # (1)! # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. When using [middleware](#middleware) in both `Router` and the main resolver, you can make `Router` middlewares take precedence by calling `include_router` before `app.use()`. #### Route prefix In the previous example, `split_route_module.py` routes had a `/todos` prefix. This might grow over time and become repetitive. When necessary, you can set a prefix when including a router object. This means you could remove the `/todos` prefix altogether. 
``` import split_route_module from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() # prefix '/todos' to any route in `split_route_module.router` app.include_router(split_route_module.router, prefix="/todos") # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` from urllib.parse import quote import requests from requests import Response from aws_lambda_powertools import Tracer from aws_lambda_powertools.event_handler.api_gateway import Router tracer = Tracer() router = Router() endpoint = "https://jsonplaceholder.typicode.com/todos" @router.get("/") @tracer.capture_method def get_todos(): api_key = router.current_event.headers["X-Api-Key"] todos: Response = requests.get(endpoint, headers={"X-Api-Key": api_key}) todos.raise_for_status() # for brevity, we'll limit to the first 10 only return {"todos": todos.json()[:10]} @router.get("/<todo_id>") @tracer.capture_method def get_todo_by_id(todo_id: str): # value comes as str api_key = router.current_event.headers["X-Api-Key"] todo_id = quote(todo_id, safe="") todos: Response = requests.get(f"{endpoint}/{todo_id}", headers={"X-Api-Key": api_key}) todos.raise_for_status() return {"todos": todos.json()} # many more routes ``` #### Specialized router types You can use specialized router classes according to the type of event that you are resolving. This way you'll get type hints from your IDE as you access the `current_event` property. | Router | Resolver | `current_event` type | | --- | --- | --- | | APIGatewayRouter | APIGatewayRestResolver | APIGatewayProxyEvent | | APIGatewayHttpRouter | APIGatewayHttpResolver | APIGatewayProxyEventV2 | | ALBRouter | ALBResolver | ALBEvent | | LambdaFunctionUrlRouter | LambdaFunctionUrlResolver | LambdaFunctionUrlEvent | ``` from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.router import APIGatewayRouter app = APIGatewayRestResolver() router = APIGatewayRouter() @router.get("/me") def get_self(): # router.current_event is an APIGatewayProxyEvent account_id = router.current_event.request_context.account_id return {"account_id": account_id} app.include_router(router) def lambda_handler(event, context): return app.resolve(event, context) ``` #### Sharing contextual data You can use `append_context` when you want to share data between your App and Router instances. Any data you share will be available via the `context` dictionary in your App or Router instance. We always clear data available in `context` after each invocation. This can be useful for middlewares injecting contextual information before a request is processed. 
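For instance, a global middleware can inject contextual data before any route runs. Below is a minimal sketch of that pattern; the `X-Tenant-Id` header and the `inject_tenant_id` name are illustrative, not part of the official examples. The example that follows shows the documented approach of calling `append_context` from the handler and reading it from a `Router`.

```
from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response
from aws_lambda_powertools.event_handler.middlewares import NextMiddleware

app = APIGatewayRestResolver()


def inject_tenant_id(app: APIGatewayRestResolver, next_middleware: NextMiddleware) -> Response:
    # Share data for this invocation only; context is cleared after each request
    app.append_context(tenant_id=app.current_event.headers.get("X-Tenant-Id"))
    return next_middleware(app)


app.use(middlewares=[inject_tenant_id])


@app.get("/todos")
def get_todos():
    # Any route (or Router) handling this invocation can read the shared value
    return {"tenant_id": app.context.get("tenant_id")}


def lambda_handler(event, context):
    return app.resolve(event, context)
```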
``` import split_route_append_context_module from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() app.include_router(split_route_append_context_module.router) # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: app.append_context(is_admin=True) # arbitrary number of key=value data return app.resolve(event, context) ``` ``` import requests from requests import Response from aws_lambda_powertools import Tracer from aws_lambda_powertools.event_handler.api_gateway import Router tracer = Tracer() router = Router() endpoint = "https://jsonplaceholder.typicode.com/todos" @router.get("/todos") @tracer.capture_method def get_todos(): is_admin: bool = router.context.get("is_admin", False) todos = {} if is_admin: todos: Response = requests.get(endpoint) todos.raise_for_status() todos = todos.json()[:10] # for brevity, we'll limit to the first 10 only return {"todos": todos} ``` #### Sample layout This is a sample project layout for a monolithic function with routes split in different files (`/todos`, `/health`). ``` . ├── pyproject.toml # project app & dev dependencies; poetry, pipenv, etc. ├── poetry.lock ├── src │ ├── __init__.py │ ├── requirements.txt # sam build detect it automatically due to CodeUri: src. poetry export --format src/requirements.txt │ └── todos │ ├── __init__.py │ ├── main.py # this will be our todos Lambda fn; it could be split in folders if we want separate fns same code base │ └── routers # routers module │ ├── __init__.py │ ├── health.py # /health routes. from routers import todos; health.router │ └── todos.py # /todos routes. from .routers import todos; todos.router ├── template.yml # SAM. CodeUri: src, Handler: todos.main.lambda_handler └── tests ├── __init__.py ├── unit │ ├── __init__.py │ └── test_todos.py # unit tests for the todos router │ └── test_health.py # unit tests for the health router └── functional ├── __init__.py ├── conftest.py # pytest fixtures for the functional tests └── test_main.py # functional tests for the main lambda handler ``` ### Considerations This utility is optimized for fast startup, minimal feature set, and to quickly on-board customers familiar with frameworks like Flask — it's not meant to be a fully fledged framework. Event Handler naturally leads to a single Lambda function handling multiple routes for a given service, which can be eventually broken into multiple functions. Both single (monolithic) and multiple functions (micro) offer different set of trade-offs worth knowing. Tip TL;DR. Start with a monolithic function, add additional functions with new handlers, and possibly break into micro functions if necessary. #### Monolithic function A monolithic function means that your final code artifact will be deployed to a single function. This is generally the best approach to start. ***Benefits*** - **Code reuse**. It's easier to reason about your service, modularize it and reuse code as it grows. Eventually, it can be turned into a standalone library. - **No custom tooling**. Monolithic functions are treated just like normal Python packages; no upfront investment in tooling. 
- **Faster deployment and debugging**. Whether you use all-at-once, linear, or canary deployments, a monolithic function is a single deployable unit. IDEs like PyCharm and VSCode have tooling to quickly profile, visualize, and step through debug any Python package. ***Downsides*** - **Cold starts**. Frequent deployments and/or high load can diminish the benefit of monolithic functions depending on your latency requirements, due to the [Lambda scaling model](https://docs.aws.amazon.com/lambda/latest/dg/invocation-scaling.html). Always load test to pragmatically balance between your customer experience and development cognitive load. - **Granular security permissions**. The micro function approach enables you to use fine-grained permissions & access controls, separate external dependencies & code signing at the function level. Conversely, you could have multiple functions while duplicating the final code artifact in a monolithic approach. - Regardless, least privilege can be applied to either approach. - **Higher risk per deployment**. A misconfiguration or invalid import can cause disruption if not caught earlier in automated testing. Multiple functions can mitigate misconfigurations but they would still share the same code artifact. You can further minimize risks with multiple environments in your CI/CD pipeline. #### Micro function A micro function means that your final code artifact will be different for each function deployed. This is generally the approach to start with if you're looking for fine-grained control and/or expect high load on certain parts of your service. **Benefits** - **Granular scaling**. A micro function can benefit from the [Lambda scaling model](https://docs.aws.amazon.com/lambda/latest/dg/invocation-scaling.html) to scale differently depending on each part of your application. Concurrency controls and provisioned concurrency can also be used at a granular level for capacity management. - **Discoverability**. Micro functions are easier to visualize when using distributed tracing. Their high-level architectures can be self-explanatory, and complexity is highly visible — assuming each function is named after the business purpose it serves. - **Package size**. An independent function can be significantly smaller (KB vs MB) depending on the external dependencies it requires to perform its purpose. Conversely, a monolithic approach can benefit from [Lambda Layers](https://docs.aws.amazon.com/lambda/latest/dg/invocation-layers.html) to optimize builds for external dependencies. **Downsides** - **Upfront investment**. You need custom build tooling to bundle assets, including [C bindings for runtime compatibility](https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtimes.html). Operations become more elaborate — you need to standardize tracing labels/annotations, structured logging, and metrics to pinpoint root causes. - Engineering discipline is necessary for both approaches. The micro-function approach, however, requires further attention to consistency as the number of functions grows, just like any distributed system. - **Harder to share code**. Shared code must be carefully evaluated to avoid unnecessary deployments when it changes. Equally, if shared code isn't a library, your development, build, and deployment tooling needs to accommodate the distinct layout. - **Slower safe deployments**. Safely deploying multiple functions requires coordination — AWS CodeDeploy deploys and verifies each function sequentially. 
This increases lead time substantially (minutes to hours) depending on the deployment strategy you choose. You can mitigate it by selectively enabling it in prod-like environments only, and where the risk profile is applicable. - Automated testing, operational and security reviews are essential to stability in either approach. **Example** Consider a simplified micro function structured REST API that has two routes: - `/users` - an endpoint that will return all users of the application on `GET` requests - `/users/<user_id>` - an endpoint that looks up a single user's details by ID on `GET` requests Each endpoint will be its own Lambda function that is configured as a [Lambda integration](https://docs.aws.amazon.com/apigateway/latest/developerguide/getting-started-with-lambda-integration.html). This allows you to set different configurations for each Lambda function (memory size, layers, etc.). ``` import json from dataclasses import dataclass from http import HTTPStatus from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() # This would likely be a db lookup users = [ { "user_id": "b0b2a5bf-ee1e-4c5e-9a86-91074052739e", "email": "john.doe@example.com", "active": True, }, { "user_id": "3a9df6b1-938c-4e80-bd4a-0c966f4b1c1e", "email": "jane.smith@example.com", "active": False, }, { "user_id": "aa0d3d09-9cb9-42b9-9e63-1fb17ea52981", "email": "alex.wilson@example.com", "active": True, }, ] @dataclass class User: user_id: str email: str active: bool app = APIGatewayRestResolver() @app.get("/users") def all_active_users(): """HTTP Response for all active users""" all_users = [User(**user) for user in users] all_active_users = [user.__dict__ for user in all_users if user.active] return Response( status_code=HTTPStatus.OK.value, content_type="application/json", body=json.dumps(all_active_users), ) @logger.inject_lambda_context() def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` import json from dataclasses import dataclass from http import HTTPStatus from typing import Union from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import APIGatewayRestResolver, Response from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() # This would likely be a db lookup users = [ { "user_id": "b0b2a5bf-ee1e-4c5e-9a86-91074052739e", "email": "john.doe@example.com", "active": True, }, { "user_id": "3a9df6b1-938c-4e80-bd4a-0c966f4b1c1e", "email": "jane.smith@example.com", "active": False, }, { "user_id": "aa0d3d09-9cb9-42b9-9e63-1fb17ea52981", "email": "alex.wilson@example.com", "active": True, }, ] @dataclass class User: user_id: str email: str active: bool def get_user_by_id(user_id: str) -> Union[User, None]: for user_data in users: if user_data["user_id"] == user_id: return User( user_id=str(user_data["user_id"]), email=str(user_data["email"]), active=bool(user_data["active"]), ) return None app = APIGatewayRestResolver() @app.get("/users/<user_id>") def get_user_details(user_id: str): """HTTP Response for a single user looked up by ID""" user = get_user_by_id(user_id) if user: return Response( status_code=HTTPStatus.OK.value, content_type="application/json", body=json.dumps(user.__dict__), ) else: return Response(status_code=HTTPStatus.NOT_FOUND) @logger.inject_lambda_context() def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` 
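# SAM template for the micro-function example above: each route is backed by its own
# function (AllUsersFunction for /users, UserByIdFunction for /users/{id+}), so settings
# such as memory size can be tuned per endpoint.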
AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: > micro-function-example Globals: Api: TracingEnabled: true Cors: # see CORS section AllowOrigin: "'https://example.com'" AllowHeaders: "'Content-Type,Authorization,X-Amz-Date'" MaxAge: "'300'" BinaryMediaTypes: # see Binary responses section - "*~1*" # converts to */* for any binary type # NOTE: use this stricter version if you're also using CORS; */* doesn't work with CORS # see: https://github.com/aws-powertools/powertools-lambda-python/issues/3373#issuecomment-1821144779 # - "image~1*" # converts to image/* # - "*~1csv" # converts to */csv, eg text/csv, application/csv Function: Timeout: 5 Runtime: python3.12 Resources: # Lambda Function Solely For /users endpoint AllUsersFunction: Type: AWS::Serverless::Function Properties: Handler: app.lambda_handler CodeUri: users Description: Function for /users endpoint Architectures: - x86_64 Tracing: Active Events: UsersPath: Type: Api Properties: Path: /users Method: GET MemorySize: 128 # Each Lambda Function can have it's own memory configuration Environment: Variables: POWERTOOLS_LOG_LEVEL: INFO Tags: LambdaPowertools: python # Lambda Function Solely For /users/{id} endpoint UserByIdFunction: Type: AWS::Serverless::Function Properties: Handler: app.lambda_handler CodeUri: users_by_id Description: Function for /users/{id} endpoint Architectures: - x86_64 Tracing: Active Events: UsersByIdPath: Type: Api Properties: Path: /users/{id+} Method: GET MemorySize: 128 # Each Lambda Function can have it's own memory configuration Environment: Variables: POWERTOOLS_LOG_LEVEL: INFO ``` Note You can see some of the downsides in this example such as some code reuse. If set up with proper build tooling, the `User` class could be shared across functions. This could be accomplished by packaging shared code as a [Lambda Layer](https://docs.aws.amazon.com/lambda/latest/dg/chapter-layers.html) or [Pants](https://www.pantsbuild.org/docs/awslambda-python). ## Testing your code You can test your routes by passing a proxy event request with required params. 
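If a route also reads query strings or a JSON body, extend the minimal event with the corresponding proxy-event fields. Small helpers like the hypothetical `build_get_event` and `build_post_event` below (an illustration, not from the official examples) keep tests readable:

```
import json


def build_get_event(path: str, limit: int = 10) -> dict:
    """Illustrative helper: minimal API Gateway REST proxy event with a query string."""
    return {
        "path": path,
        "httpMethod": "GET",
        "queryStringParameters": {"limit": str(limit)},
        "multiValueQueryStringParameters": {"limit": [str(limit)]},
        "requestContext": {"requestId": "227b78aa-779d-47d4-a48e-ce62120393b8"},  # correlation ID
    }


def build_post_event(path: str, payload: dict) -> dict:
    """Illustrative helper: minimal API Gateway REST proxy event with a JSON body."""
    return {
        "path": path,
        "httpMethod": "POST",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
        "isBase64Encoded": False,
        "requestContext": {"requestId": "227b78aa-779d-47d4-a48e-ce62120393b8"},  # correlation ID
    }
```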
``` from dataclasses import dataclass import assert_rest_api_resolver_response import pytest @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:123456789012:function:test" aws_request_id: str = "da658bd3-2d6f-4e7b-8ec2-937234644fdc" @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def test_lambda_handler(lambda_context): minimal_event = { "path": "/todos", "httpMethod": "GET", "requestContext": {"requestId": "227b78aa-779d-47d4-a48e-ce62120393b8"}, # correlation ID } # Example of API Gateway REST API request event: # https://docs.aws.amazon.com/lambda/latest/dg/services-apigateway.html#apigateway-example-event ret = assert_rest_api_resolver_response.lambda_handler(minimal_event, lambda_context) assert ret["statusCode"] == 200 assert ret["body"] != "" ``` ``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayRestResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` from dataclasses import dataclass import assert_http_api_response_module import pytest @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:123456789012:function:test" aws_request_id: str = "da658bd3-2d6f-4e7b-8ec2-937234644fdc" @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def test_lambda_handler(lambda_context: LambdaContext): minimal_event = { "rawPath": "/todos", "requestContext": { "requestContext": {"requestId": "227b78aa-779d-47d4-a48e-ce62120393b8"}, # correlation ID "http": { "method": "GET", }, "stage": "$default", }, } # Example of API Gateway HTTP API request event: # https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-develop-integrations-lambda.html ret = assert_http_api_response_module.lambda_handler(minimal_event, lambda_context) assert ret["statusCode"] == 200 assert ret["body"] != "" ``` ``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayHttpResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = APIGatewayHttpResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_HTTP) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` from dataclasses import 
dataclass import assert_alb_api_response_module import pytest @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:123456789012:function:test" aws_request_id: str = "da658bd3-2d6f-4e7b-8ec2-937234644fdc" @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def test_lambda_handler(lambda_context: LambdaContext): minimal_event = { "path": "/todos", "httpMethod": "GET", "headers": {"x-amzn-trace-id": "b25827e5-0e30-4d52-85a8-4df449ee4c5a"}, } # Example of Application Load Balancer request event: # https://docs.aws.amazon.com/lambda/latest/dg/services-alb.html ret = assert_alb_api_response_module.lambda_handler(minimal_event, lambda_context) assert ret["statusCode"] == 200 assert ret["body"] != "" ``` ``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import ALBResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = ALBResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPLICATION_LOAD_BALANCER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` from dataclasses import dataclass import assert_function_url_api_response_module import pytest @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:123456789012:function:test" aws_request_id: str = "da658bd3-2d6f-4e7b-8ec2-937234644fdc" @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def test_lambda_handler(lambda_context: LambdaContext): minimal_event = { "rawPath": "/todos", "requestContext": { "requestContext": {"requestId": "227b78aa-779d-47d4-a48e-ce62120393b8"}, # correlation ID "http": { "method": "GET", }, "stage": "$default", }, } # Example of Lambda Function URL request event: # https://docs.aws.amazon.com/lambda/latest/dg/urls-invocation.html#urls-payloads ret = assert_function_url_api_response_module.lambda_handler(minimal_event, lambda_context) assert ret["statusCode"] == 200 assert ret["body"] != "" ``` ``` import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import LambdaFunctionUrlResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = LambdaFunctionUrlResolver() @app.get("/todos") @tracer.capture_method def get_todos(): todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() return {"todos": todos.json()[:10]} # You can continue to use other utilities just as before @logger.inject_lambda_context(correlation_id_path=correlation_paths.LAMBDA_FUNCTION_URL) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ## FAQ **What's the difference between this utility and frameworks like Chalice?** Chalice is a full 
featured microframework that manages application and infrastructure. This utility, however, is largely focused on routing to reduce boilerplate and expects you to set up and manage infrastructure with your framework of choice. That said, [Chalice has native integration with Lambda Powertools](https://aws.github.io/chalice/topics/middleware.html) if you're looking for a more opinionated web framework feature set. **What happened to `ApiGatewayResolver`?** It's been superseded by more explicit resolvers like `APIGatewayRestResolver`, `APIGatewayHttpResolver`, and `ALBResolver`. `ApiGatewayResolver` handled multiple types of event resolvers for convenience via the `proxy_type` param. However, it made it impossible for static checkers like Mypy and IDE IntelliSense to know what properties a `current_event` would have due to late-bound resolution. This provided a suboptimal experience: customers could not discover all available properties beyond the ones common to API Gateway REST, HTTP, and ALB. While manually annotating `app.current_event` would work, it is not the experience we want to provide to customers. `ApiGatewayResolver` will be deprecated in v2 and will emit appropriate warnings as soon as we have a v2 draft.

# GraphQL API

Event Handler for AWS AppSync and Amplify GraphQL Transformer. ``` stateDiagram-v2 direction LR EventSource: AWS Lambda Event Sources EventHandlerResolvers: AWS AppSync Direct invocation

AWS AppSync Batch invocation LambdaInit: Lambda invocation EventHandler: Event Handler EventHandlerResolver: Route event based on GraphQL type/field keys YourLogic: Run your registered resolver function EventHandlerResolverBuilder: Adapts response to Event Source contract LambdaResponse: Lambda response state EventSource { EventHandlerResolvers } EventHandlerResolvers --> LambdaInit LambdaInit --> EventHandler EventHandler --> EventHandlerResolver state EventHandler { [*] --> EventHandlerResolver: app.resolve(event, context) EventHandlerResolver --> YourLogic YourLogic --> EventHandlerResolverBuilder } EventHandler --> LambdaResponse ``` ## Key Features - Choose between strictly matching a GraphQL field name or matching all of them to a single function - Automatically parses API arguments to function arguments - Integrates with [Event Source Data classes utilities](../../../utilities/data_classes/) to access resolver and identity information - Supports async Python 3.8+ functions and generators ## Terminology **[Direct Lambda Resolver](https://docs.aws.amazon.com/appsync/latest/devguide/direct-lambda-reference.html)**. A custom AppSync Resolver to bypass the use of Apache Velocity Template (VTL) and automatically map your function's response to a GraphQL field. **[Amplify GraphQL Transformer](https://docs.amplify.aws/cli/graphql-transformer/function)**. Custom GraphQL directives to define your application's data model using Schema Definition Language *(SDL)*, *e.g., `@function`*. Amplify CLI uses these directives to convert GraphQL SDL into fully descriptive AWS CloudFormation templates. ## Getting started Tip: Designing GraphQL Schemas for the first time? Visit [AWS AppSync schema documentation](https://docs.aws.amazon.com/appsync/latest/devguide/designing-your-schema.html) to understand how to define types, nesting, and pagination. ### Required resources You must have an existing AppSync GraphQL API and IAM permissions to invoke your Lambda function. That said, no additional permissions are needed to use Event Handler, as routing requires no additional dependencies (*standard library only*). This is the sample infrastructure we are using for the initial examples with an AppSync Direct Lambda Resolver. ``` schema { query: Query mutation: Mutation } type Query { # these are fields you can attach resolvers to (type_name: Query, field_name: getTodo) getTodo(id: ID!): Todo listTodos: [Todo] } type Mutation { createTodo(title: String!): Todo } type Todo { id: ID! 
userId: String title: String completed: Boolean } ``` ``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: Hello world Direct Lambda Resolver Globals: Function: Timeout: 5 Runtime: python3.12 Tracing: Active Environment: Variables: # Powertools for AWS Lambda (Python) env vars: https://docs.powertools.aws.dev/lambda/python/latest/#environment-variables POWERTOOLS_LOG_LEVEL: INFO POWERTOOLS_LOGGER_SAMPLE_RATE: 0.1 POWERTOOLS_LOGGER_LOG_EVENT: true POWERTOOLS_SERVICE_NAME: example Resources: TodosFunction: Type: AWS::Serverless::Function Properties: Handler: getting_started_graphql_api_resolver.lambda_handler CodeUri: ../src Description: Sample Direct Lambda Resolver # IAM Permissions and Roles AppSyncServiceRole: Type: "AWS::IAM::Role" Properties: AssumeRolePolicyDocument: Version: "2012-10-17" Statement: - Effect: "Allow" Principal: Service: - "appsync.amazonaws.com" Action: - "sts:AssumeRole" InvokeLambdaResolverPolicy: Type: "AWS::IAM::Policy" Properties: PolicyName: "DirectAppSyncLambda" PolicyDocument: Version: "2012-10-17" Statement: - Effect: "Allow" Action: "lambda:invokeFunction" Resource: - !GetAtt TodosFunction.Arn Roles: - !Ref AppSyncServiceRole # GraphQL API TodosApi: Type: "AWS::AppSync::GraphQLApi" Properties: Name: TodosApi AuthenticationType: "API_KEY" XrayEnabled: true TodosApiKey: Type: AWS::AppSync::ApiKey Properties: ApiId: !GetAtt TodosApi.ApiId TodosApiSchema: Type: "AWS::AppSync::GraphQLSchema" Properties: ApiId: !GetAtt TodosApi.ApiId DefinitionS3Location: ../src/getting_started_schema.graphql Metadata: cfn-lint: config: ignore_checks: - W3002 # allow relative path in DefinitionS3Location # Lambda Direct Data Source and Resolver TodosFunctionDataSource: Type: "AWS::AppSync::DataSource" Properties: ApiId: !GetAtt TodosApi.ApiId Name: "HelloWorldLambdaDirectResolver" Type: "AWS_LAMBDA" ServiceRoleArn: !GetAtt AppSyncServiceRole.Arn LambdaConfig: LambdaFunctionArn: !GetAtt TodosFunction.Arn ListTodosResolver: Type: "AWS::AppSync::Resolver" Properties: ApiId: !GetAtt TodosApi.ApiId TypeName: "Query" FieldName: "listTodos" DataSourceName: !GetAtt TodosFunctionDataSource.Name GetTodoResolver: Type: "AWS::AppSync::Resolver" Properties: ApiId: !GetAtt TodosApi.ApiId TypeName: "Query" FieldName: "getTodo" DataSourceName: !GetAtt TodosFunctionDataSource.Name CreateTodoResolver: Type: "AWS::AppSync::Resolver" Properties: ApiId: !GetAtt TodosApi.ApiId TypeName: "Mutation" FieldName: "createTodo" DataSourceName: !GetAtt TodosFunctionDataSource.Name Outputs: TodosFunction: Description: "Hello World Lambda Function ARN" Value: !GetAtt TodosFunction.Arn TodosApi: Value: !GetAtt TodosApi.GraphQLUrl ``` ### Resolver decorator You can define your functions to match GraphQL types and fields with the `app.resolver()` decorator. What is a type and field? A type would be a top-level **GraphQL Type** like `Query`, `Mutation`, `Todo`. A **GraphQL Field** would be `listTodos` under `Query`, `createTodo` under `Mutation`, etc. Here's an example with two separate functions to resolve `getTodo` and `listTodos` fields within the `Query` type. For completion, we use [Scalar type utilities](#scalar-functions) to generate the right output based on our schema definition. Important GraphQL arguments are passed as function keyword arguments. **Example** The GraphQL Query `getTodo(id: "todo_id_value")` will call `get_todo` as `get_todo(id="todo_id_value")`. 
``` from typing import List, TypedDict import requests from requests import Response from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.data_classes.appsync import scalar_types_utils from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = AppSyncResolver() class Todo(TypedDict, total=False): id: str # noqa AA03 VNE003, required due to GraphQL Schema userId: str title: str completed: bool @app.resolver(type_name="Query", field_name="getTodo") @tracer.capture_method def get_todo( id: str = "", # noqa AA03 VNE003 shadows built-in id to match query argument, e.g., getTodo(id: "some_id") ) -> Todo: logger.info(f"Fetching Todo {id}") todos: Response = requests.get(f"https://jsonplaceholder.typicode.com/todos/{id}") todos.raise_for_status() return todos.json() @app.resolver(type_name="Query", field_name="listTodos") @tracer.capture_method def list_todos() -> List[Todo]: todos: Response = requests.get("https://jsonplaceholder.typicode.com/todos") todos.raise_for_status() # for brevity, we'll limit to the first 10 only return todos.json()[:10] @app.resolver(type_name="Mutation", field_name="createTodo") @tracer.capture_method def create_todo(title: str) -> Todo: payload = {"userId": scalar_types_utils.make_id(), "title": title, "completed": False} # dummy UUID str todo: Response = requests.post("https://jsonplaceholder.typicode.com/todos", json=payload) todo.raise_for_status() return todo.json() @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` schema { query: Query mutation: Mutation } type Query { # these are fields you can attach resolvers to (type_name: Query, field_name: getTodo) getTodo(id: ID!): Todo listTodos: [Todo] } type Mutation { createTodo(title: String!): Todo } type Todo { id: ID! 
userId: String title: String completed: Boolean } ``` ``` { "arguments": { "id": "7e362732-c8cd-4405-b090-144ac9b38960" }, "identity": null, "source": null, "request": { "headers": { "x-forwarded-for": "1.2.3.4, 5.6.7.8", "accept-encoding": "gzip, deflate, br", "cloudfront-viewer-country": "NL", "cloudfront-is-tablet-viewer": "false", "referer": "https://eu-west-1.console.aws.amazon.com/appsync/home?region=eu-west-1", "via": "2.0 9fce949f3749407c8e6a75087e168b47.cloudfront.net (CloudFront)", "cloudfront-forwarded-proto": "https", "origin": "https://eu-west-1.console.aws.amazon.com", "x-api-key": "da1-c33ullkbkze3jg5hf5ddgcs4fq", "content-type": "application/json", "x-amzn-trace-id": "Root=1-606eb2f2-1babc433453a332c43fb4494", "x-amz-cf-id": "SJw16ZOPuMZMINx5Xcxa9pB84oMPSGCzNOfrbJLvd80sPa0waCXzYQ==", "content-length": "114", "x-amz-user-agent": "AWS-Console-AppSync/", "x-forwarded-proto": "https", "host": "ldcvmkdnd5az3lm3gnf5ixvcyy.appsync-api.eu-west-1.amazonaws.com", "accept-language": "en-US,en;q=0.5", "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:78.0) Gecko/20100101 Firefox/78.0", "cloudfront-is-desktop-viewer": "true", "cloudfront-is-mobile-viewer": "false", "accept": "*/*", "x-forwarded-port": "443", "cloudfront-is-smarttv-viewer": "false" } }, "prev": null, "info": { "parentTypeName": "Query", "selectionSetList": [ "title", "id" ], "selectionSetGraphQL": "{\n title\n id\n}", "fieldName": "getTodo", "variables": {} }, "stash": {} } ``` ``` { "arguments": {}, "identity": null, "source": null, "request": { "headers": { "x-forwarded-for": "1.2.3.4, 5.6.7.8", "accept-encoding": "gzip, deflate, br", "cloudfront-viewer-country": "NL", "cloudfront-is-tablet-viewer": "false", "referer": "https://eu-west-1.console.aws.amazon.com/appsync/home?region=eu-west-1", "via": "2.0 9fce949f3749407c8e6a75087e168b47.cloudfront.net (CloudFront)", "cloudfront-forwarded-proto": "https", "origin": "https://eu-west-1.console.aws.amazon.com", "x-api-key": "da1-c33ullkbkze3jg5hf5ddgcs4fq", "content-type": "application/json", "x-amzn-trace-id": "Root=1-606eb2f2-1babc433453a332c43fb4494", "x-amz-cf-id": "SJw16ZOPuMZMINx5Xcxa9pB84oMPSGCzNOfrbJLvd80sPa0waCXzYQ==", "content-length": "114", "x-amz-user-agent": "AWS-Console-AppSync/", "x-forwarded-proto": "https", "host": "ldcvmkdnd5az3lm3gnf5ixvcyy.appsync-api.eu-west-1.amazonaws.com", "accept-language": "en-US,en;q=0.5", "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:78.0) Gecko/20100101 Firefox/78.0", "cloudfront-is-desktop-viewer": "true", "cloudfront-is-mobile-viewer": "false", "accept": "*/*", "x-forwarded-port": "443", "cloudfront-is-smarttv-viewer": "false" } }, "prev": null, "info": { "parentTypeName": "Query", "selectionSetList": [ "id", "title" ], "selectionSetGraphQL": "{\n id\n title\n}", "fieldName": "listTodos", "variables": {} }, "stash": {} } ``` ``` { "arguments": { "title": "Sample todo mutation" }, "identity": null, "source": null, "request": { "headers": { "x-forwarded-for": "203.0.113.1, 203.0.113.18", "cloudfront-viewer-country": "NL", "cloudfront-is-tablet-viewer": "false", "x-amzn-requestid": "fdc4f30b-44c2-475d-b2f9-9da0778d5275", "via": "2.0 f655cacd0d6f7c5dc935ea687af6f3c0.cloudfront.net (CloudFront)", "cloudfront-forwarded-proto": "https", "origin": "https://eu-west-1.console.aws.amazon.com", "content-length": "166", "x-forwarded-proto": "https", "accept-language": "en-US,en;q=0.5", "host": "kiuqayvn4jhhzio6whpnk7xj3a.appsync-api.eu-west-1.amazonaws.com", "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac 
OS X 10.15; rv:102.0) Gecko/20100101 Firefox/102.0", "cloudfront-is-mobile-viewer": "false", "accept": "application/json, text/plain, */*", "cloudfront-viewer-asn": "1136", "cloudfront-is-smarttv-viewer": "false", "accept-encoding": "gzip, deflate, br", "referer": "https://eu-west-1.console.aws.amazon.com/", "content-type": "application/json", "x-api-key": "da2-vsqnxwyzgzf4nh6kvoaidtvs7y", "sec-fetch-mode": "cors", "x-amz-cf-id": "0kxqijFPsbGSWJ1u3Z_sUS4Wu2hRoG_2T77aJPuoh_Q4bXAB3x0a3g==", "x-amzn-trace-id": "Root=1-63fef2cf-6d566e9f4a35b99e6212388e", "sec-fetch-dest": "empty", "x-amz-user-agent": "AWS-Console-AppSync/", "cloudfront-is-desktop-viewer": "true", "sec-fetch-site": "cross-site", "x-forwarded-port": "443" }, "domainName": null }, "prev": null, "info": { "selectionSetList": [ "id", "title", "completed" ], "selectionSetGraphQL": "{\n id\n title\n completed\n}", "fieldName": "createTodo", "parentTypeName": "Mutation", "variables": {} }, "stash": {} } ``` ### Scalar functions When working with [AWS AppSync Scalar types](https://docs.aws.amazon.com/appsync/latest/devguide/scalars.html), you might want to generate the same values for data validation purposes. For convenience, the most commonly used values are available as functions within `scalar_types_utils` module. ``` from aws_lambda_powertools.utilities.data_classes.appsync.scalar_types_utils import ( aws_date, aws_datetime, aws_time, aws_timestamp, make_id, ) # Scalars: https://docs.aws.amazon.com/appsync/latest/devguide/scalars.html my_id: str = make_id() # Scalar: ID! my_date: str = aws_date() # Scalar: AWSDate my_timestamp: str = aws_time() # Scalar: AWSTime my_datetime: str = aws_datetime() # Scalar: AWSDateTime my_epoch_timestamp: int = aws_timestamp() # Scalar: AWSTimestamp ``` Here's a table with their related scalar as a quick reference: | Scalar type | Scalar function | Sample value | | --- | --- | --- | | **ID** | `scalar_types_utils.make_id` | `e916c84d-48b6-484c-bef3-cee3e4d86ebf` | | **AWSDate** | `scalar_types_utils.aws_date` | `2022-07-08Z` | | **AWSTime** | `scalar_types_utils.aws_time` | `15:11:00.189Z` | | **AWSDateTime** | `scalar_types_utils.aws_datetime` | `2022-07-08T15:11:00.189Z` | | **AWSTimestamp** | `scalar_types_utils.aws_timestamp` | `1657293060` | ## Advanced ### Nested mappings Note The following examples use a more advanced schema. These schemas differ from [initial sample infrastructure we used earlier](#required-resources). You can nest `app.resolver()` decorator multiple times when resolving fields with the same return value. 
``` from typing import List, TypedDict from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = AppSyncResolver() class Location(TypedDict, total=False): id: str # noqa AA03 VNE003, required due to GraphQL Schema name: str description: str address: str @app.resolver(field_name="listLocations") @app.resolver(field_name="locations") @tracer.capture_method def get_locations(name: str, description: str = "") -> List[Location]: # match GraphQL Query arguments return [{"name": name, "description": description}] @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` schema { query: Query } type Query { listLocations: [Location] } type Location { id: ID! name: String! description: String address: String } type Merchant { id: String! name: String! description: String locations: [Location] } ``` ### Async functions For the Lambda Python 3.8+ runtime, this utility supports async functions when used in conjunction with `asyncio.run`. ``` import asyncio from typing import List, TypedDict import aiohttp from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.tracing import aiohttp_trace_config from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = AppSyncResolver() class Todo(TypedDict, total=False): id: str # noqa AA03 VNE003, required due to GraphQL Schema userId: str title: str completed: bool @app.resolver(type_name="Query", field_name="listTodos") async def list_todos() -> List[Todo]: async with aiohttp.ClientSession(trace_configs=[aiohttp_trace_config()]) as session: async with session.get("https://jsonplaceholder.typicode.com/todos") as resp: return await resp.json() @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: result = app.resolve(event, context) return asyncio.run(result) ``` ### Amplify GraphQL Transformer Assuming you have [Amplify CLI installed](https://docs.amplify.aws/cli/start/install), create a new API using `amplify add api` and use the following GraphQL Schema. ``` type Merchant @model { id: String! name: String! description: String # Resolves to `common_field` commonField: String @function(name: "merchantInfo-${env}") } type Location { id: ID! name: String! address: String # Resolves to `common_field` commonField: String @function(name: "merchantInfo-${env}") } type Query { # List of locations resolves to `list_locations` listLocations(page: Int, size: Int): [Location] @function(name: "merchantInfo-${env}") # Search for merchants resolves to `find_merchant` findMerchant(search: String!): [Merchant] @function(name: "searchMerchant-${env}") } ``` [Create two new basic Python functions](https://docs.amplify.aws/cli/function#set-up-a-function) via `amplify add function`. Note Amplify CLI-generated functions use `Pipenv` as a dependency manager. Your function source code is located at **`amplify/backend/function/your-function-name`**.
Within your function's folder, add Powertools for AWS Lambda (Python) as a dependency with `pipenv install aws-lambda-powertools`. Use the following code for `merchantInfo` and `searchMerchant` functions respectively. ``` from typing import List, TypedDict from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.data_classes.appsync import scalar_types_utils from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = AppSyncResolver() class Location(TypedDict, total=False): id: str # noqa AA03 VNE003, required due to GraphQL Schema name: str description: str address: str commonField: str @app.resolver(type_name="Query", field_name="listLocations") def list_locations(page: int = 0, size: int = 10) -> List[Location]: return [{"id": scalar_types_utils.make_id(), "name": "Smooth Grooves"}] @app.resolver(field_name="commonField") def common_field() -> str: # Would match all fieldNames matching 'commonField' return scalar_types_utils.make_id() @tracer.capture_lambda_handler @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER) def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` from typing import List, TypedDict from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.data_classes.appsync import scalar_types_utils from aws_lambda_powertools.utilities.typing import LambdaContext app = AppSyncResolver() tracer = Tracer() logger = Logger() class Merchant(TypedDict, total=False): id: str # noqa AA03 VNE003, required due to GraphQL Schema name: str description: str commonField: str @app.resolver(type_name="Query", field_name="findMerchant") def find_merchant(search: str) -> List[Merchant]: merchants: List[Merchant] = [ { "id": scalar_types_utils.make_id(), "name": "Parry-Wood", "description": "Possimus doloremque tempora harum deleniti eum.", }, { "id": scalar_types_utils.make_id(), "name": "Shaw, Owen and Jones", "description": "Aliquam iste architecto suscipit in.", }, ] return [merchant for merchant in merchants if search == merchant["name"]] @tracer.capture_lambda_handler @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER) def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "typeName": "Query", "fieldName": "listLocations", "arguments": { "page": 2, "size": 1 }, "identity": { "claims": { "iat": 1615366261 }, "username": "treid" }, "request": { "headers": { "x-amzn-trace-id": "Root=1-60488877-0b0c4e6727ab2a1c545babd0", "x-forwarded-for": "127.0.0.1", "cloudfront-viewer-country": "NL", "x-api-key": "da1-c33ullkbkze3jg5hf5ddgcs4fq" } } } ``` ``` { "typeName": "Merchant", "fieldName": "commonField", "arguments": {}, "identity": { "claims": { "iat": 1615366261 }, "username": "marieellis" }, "request": { "headers": { "x-amzn-trace-id": "Root=1-60488877-0b0c4e6727ab2a1c545babd0", "x-forwarded-for": "127.0.0.1" } }, } ``` ``` { "typeName": "Query", "fieldName": "findMerchant", "arguments": { "search": "Parry-Wood" }, "identity": { "claims": { "iat": 1615366261 }, "username": "wwilliams" }, "request": { "headers": { "x-amzn-trace-id": "Root=1-60488877-0b0c4e6727ab2a1c545babd0", 
"x-forwarded-for": "127.0.0.1" } }, } ``` ### Custom data models You can subclass [AppSyncResolverEvent](../../../utilities/data_classes/#appsync-resolver) to bring your own set of methods to handle incoming events, by using `data_model` param in the `resolve` method. ``` from typing import List, TypedDict from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.data_classes.appsync import scalar_types_utils from aws_lambda_powertools.utilities.data_classes.appsync_resolver_event import ( AppSyncResolverEvent, ) from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = AppSyncResolver() class Location(TypedDict, total=False): id: str # noqa AA03 VNE003, required due to GraphQL Schema name: str description: str address: str commonField: str class MyCustomModel(AppSyncResolverEvent): @property def country_viewer(self) -> str: return self.request_headers.get("cloudfront-viewer-country", "") @property def api_key(self) -> str: return self.request_headers.get("x-api-key", "") @app.resolver(type_name="Query", field_name="listLocations") def list_locations(page: int = 0, size: int = 10) -> List[Location]: # additional properties/methods will now be available under current_event if app.current_event: logger.debug(f"Request country origin: {app.current_event.country_viewer}") # type: ignore[attr-defined] return [{"id": scalar_types_utils.make_id(), "name": "Perry, James and Carroll"}] @tracer.capture_lambda_handler @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER) def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context, data_model=MyCustomModel) ``` ``` schema { query: Query } type Query { listLocations: [Location] } type Location { id: ID! name: String! description: String address: String } type Merchant { id: String! name: String! description: String locations: [Location] } ``` ``` { "typeName": "Query", "fieldName": "listLocations", "arguments": { "page": 2, "size": 1 }, "identity": { "claims": { "iat": 1615366261 }, "username": "treid" }, "request": { "headers": { "x-amzn-trace-id": "Root=1-60488877-0b0c4e6727ab2a1c545babd0", "x-forwarded-for": "127.0.0.1", "cloudfront-viewer-country": "NL", "x-api-key": "da1-c33ullkbkze3jg5hf5ddgcs4fq" } } } ``` ### Split operations with Router Tip Read the **[considerations section for trade-offs between monolithic and micro functions](../api_gateway/#considerations)**, as it's also applicable here. As you grow the number of related GraphQL operations a given Lambda function should handle, it is natural to split them into separate files to ease maintenance - That's when the `Router` feature comes handy. Let's assume you have `split_operation.py` as your Lambda function entrypoint and routes in `split_operation_module.py`. This is how you'd use the `Router` feature. We import **Router** instead of **AppSyncResolver**; syntax wise is exactly the same. 
``` from typing import List, TypedDict from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler.graphql_appsync.router import Router tracer = Tracer() logger = Logger() router = Router() class Location(TypedDict, total=False): id: str # noqa AA03 VNE003, required due to GraphQL Schema name: str description: str address: str @router.resolver(field_name="listLocations") @router.resolver(field_name="locations") @tracer.capture_method def get_locations(name: str, description: str = "") -> List[Location]: # match GraphQL Query arguments return [{"name": name, "description": description}] ``` We use the `include_router` method to include all `location` operations registered in the `router` global object. ``` import split_operation_module from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = AppSyncResolver() app.include_router(split_operation_module.router) @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` #### Sharing contextual data You can use `append_context` when you want to share data between your App and Router instances. Any data you share will be available via the `context` dictionary available in your App or Router context. Warning For safety, we clear the context after each invocation, except for async single resolvers. For these, call `app.context.clear()` before returning from your function. Tip This can also be useful for middlewares injecting contextual information before a request is processed. ``` import split_operation_append_context_module from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = AppSyncResolver() app.include_router(split_operation_append_context_module.router) @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: app.append_context(is_admin=True) # arbitrary number of key=value data return app.resolve(event, context) ``` ``` from typing import List, TypedDict from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler.appsync import Router tracer = Tracer() logger = Logger() router = Router() class Location(TypedDict, total=False): id: str # noqa AA03 VNE003, required due to GraphQL Schema name: str description: str address: str @router.resolver(field_name="listLocations") @router.resolver(field_name="locations") @tracer.capture_method def get_locations(name: str, description: str = "") -> List[Location]: # match GraphQL Query arguments is_admin: bool = router.context.get("is_admin", False) return [{"name": name, "description": description}] if is_admin else [] ``` ### Exception handling You can use the **`exception_handler`** decorator with any Python exception. This allows you to handle a common exception outside your resolver, for example validation errors. The `exception_handler` function also supports passing a list of exception types you wish to handle with one handler.
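For instance, a single handler can cover several related exception types at once. Here's a minimal sketch of that, based on the statement above; the handler name and the exception types chosen are illustrative only:

```python
from aws_lambda_powertools.event_handler import AppSyncResolver

app = AppSyncResolver()


# One handler registered for a list of exception types (illustrative choice of types)
@app.exception_handler([ValueError, TypeError])
def handle_invalid_input(ex: Exception):
    # Shape the error response however your GraphQL schema expects it
    return {"message": f"invalid input: {ex}"}
```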
``` from aws_lambda_powertools.event_handler import AppSyncResolver app = AppSyncResolver() @app.exception_handler(ValueError) def handle_value_error(ex: ValueError): return {"message": "error"} @app.resolver(field_name="createSomething") def create_something(): raise ValueError("Raising an exception") def lambda_handler(event, context): return app.resolve(event, context) ``` Warning This is not supported when using async single resolvers. ### Batch processing ``` stateDiagram-v2 direction LR LambdaInit: Lambda invocation EventHandler: Event Handler EventHandlerResolver: Route event based on GraphQL type/field keys Client: Client query (listPosts) YourLogic: Run your registered resolver function EventHandlerResolverBuilder: Verifies response is a list AppSyncBatchPostsResolution: query listPosts AppSyncBatchPostsItems: get all posts data (id, title, relatedPosts) AppSyncBatchRelatedPosts: get related posts (id, title, relatedPosts) AppSyncBatchAggregate: aggregate batch resolver event AppSyncBatchLimit: reached batch size limit LambdaResponse: Lambda response Client --> AppSyncBatchResolverMode state AppSyncBatchResolverMode { [*] --> AppSyncBatchPostsResolution AppSyncBatchPostsResolution --> AppSyncBatchPostsItems AppSyncBatchPostsItems --> AppSyncBatchRelatedPosts: N additional queries AppSyncBatchRelatedPosts --> AppSyncBatchRelatedPosts AppSyncBatchRelatedPosts --> AppSyncBatchAggregate AppSyncBatchRelatedPosts --> AppSyncBatchAggregate AppSyncBatchRelatedPosts --> AppSyncBatchAggregate AppSyncBatchAggregate --> AppSyncBatchLimit } AppSyncBatchResolverMode --> LambdaInit: 1x Invoke with N events LambdaInit --> EventHandler state EventHandler { [*] --> EventHandlerResolver: app.resolve(event, context) EventHandlerResolver --> YourLogic YourLogic --> EventHandlerResolverBuilder EventHandlerResolverBuilder --> LambdaResponse } ``` *Batch resolvers mechanics: visualizing N+1 in `relatedPosts` field.* #### Understanding N+1 problem When AWS AppSync has [batching enabled for Lambda Resolvers](https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-lambda-resolvers.html#advanced-use-case-batching), it will group as many requests as possible before invoking your Lambda function, effectively solving the [N+1 problem in GraphQL](https://aws.amazon.com/blogs/mobile/introducing-configurable-batching-size-for-aws-appsync-lambda-resolvers/). For example, say you have a query named `listPosts`. For each post, you also want `relatedPosts`. **Without batching**, AppSync will: 1. Invoke your Lambda function to get the first post 1. Invoke your Lambda function for each related post 1. Repeat the previous step until done ``` sequenceDiagram participant Client participant AppSync participant Lambda participant Database Client->>AppSync: GraphQL Query Note over Client,AppSync: query listPosts {
id
title
relatedPosts { id title }
} AppSync->>Lambda: Fetch N posts (listPosts) Lambda->>Database: Query Database->>Lambda: Posts Lambda-->>AppSync: Return posts (id, title) loop Fetch N related posts (relatedPosts) AppSync->>Lambda: Invoke function (N times) Lambda->>Database: Query Database-->>Lambda: Return related posts Lambda-->>AppSync: Return related posts end AppSync-->>Client: Return posts and their related posts ``` #### Batch resolvers You can use the `@batch_resolver` or `@async_batch_resolver` decorators to receive the entire batch of requests. In this mode, you must return results in the same order as your batch items, so AppSync can associate the results back to the client. ``` from __future__ import annotations from typing import Any from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.utilities.data_classes import AppSyncResolverEvent from aws_lambda_powertools.utilities.typing import LambdaContext app = AppSyncResolver() # mimic DB data for simplicity posts_related = { "1": {"title": "post1"}, "2": {"title": "post2"}, "3": {"title": "post3"}, } def search_batch_posts(posts: list) -> dict[str, Any]: return {post_id: posts_related.get(post_id) for post_id in posts} @app.batch_resolver(type_name="Query", field_name="relatedPosts") def related_posts(event: list[AppSyncResolverEvent]) -> list[Any]: # (1)! # Extract all post_ids in order post_ids: list = [record.source.get("post_id") for record in event] # (2)! # Get unique post_ids while preserving order unique_post_ids = list(dict.fromkeys(post_ids)) # Fetch posts in a single batch operation fetched_posts = search_batch_posts(unique_post_ids) # Return results in original order return [fetched_posts.get(post_id) for post_id in post_ids] def lambda_handler(event, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. The entire batch is sent to the resolver. You need to iterate through it to process all records. 1. We use `post_id` as the unique identifier of each GraphQL request.
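To make the ordering contract concrete, here's a small plain-Python sketch (no AppSync involved) that mirrors the dedup-and-reorder logic above, using the same `post_id` values ("1", "2", "1") as the sample batch event below:

```python
# Same lookup table as the resolver above
posts_related = {"1": {"title": "post1"}, "2": {"title": "post2"}, "3": {"title": "post3"}}

# One post_id per batch item, in the order AppSync sent them
post_ids = ["1", "2", "1"]

# Deduplicate while preserving order, fetch once, then expand back to batch order
unique_post_ids = list(dict.fromkeys(post_ids))  # ["1", "2"]
fetched_posts = {post_id: posts_related.get(post_id) for post_id in unique_post_ids}

response = [fetched_posts.get(post_id) for post_id in post_ids]
assert response == [{"title": "post1"}, {"title": "post2"}, {"title": "post1"}]
```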
``` [ { "arguments":{}, "identity":"None", "source":{ "post_id":"1", "author":"Author1" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } }, { "arguments":{}, "identity":"None", "source":{ "post_id":"2", "author":"Author2" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } }, { "arguments":{}, "identity":"None", "source":{ "post_id":"1", "author":"Author1" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } } ] ``` ``` query MyQuery { getPost(post_id: "2") { relatedPosts { post_id author relatedPosts { post_id author } } } } ``` ##### Processing items individually ``` stateDiagram-v2 direction LR LambdaInit: Lambda invocation EventHandler: Event Handler EventHandlerResolver: Route event based on GraphQL type/field keys Client: Client query (listPosts) YourLogic: Call your registered resolver function N times EventHandlerResolverErrorHandling: Gracefully handle errors with null response EventHandlerResolverBuilder: Aggregate responses to match batch size AppSyncBatchPostsResolution: query listPosts AppSyncBatchPostsItems: get all posts data (id, title, relatedPosts) AppSyncBatchRelatedPosts: get related posts (id, title, relatedPosts) AppSyncBatchAggregate: aggregate batch resolver event AppSyncBatchLimit: reached batch size limit LambdaResponse: Lambda response Client --> AppSyncBatchResolverMode state AppSyncBatchResolverMode { [*] --> AppSyncBatchPostsResolution AppSyncBatchPostsResolution --> AppSyncBatchPostsItems AppSyncBatchPostsItems --> AppSyncBatchRelatedPosts: N additional queries AppSyncBatchRelatedPosts --> AppSyncBatchRelatedPosts AppSyncBatchRelatedPosts --> AppSyncBatchAggregate AppSyncBatchRelatedPosts --> AppSyncBatchAggregate AppSyncBatchRelatedPosts --> AppSyncBatchAggregate AppSyncBatchAggregate --> AppSyncBatchLimit } AppSyncBatchResolverMode --> LambdaInit: 1x Invoke with N events LambdaInit --> EventHandler state EventHandler { [*] --> EventHandlerResolver: app.resolve(event, context) EventHandlerResolver --> YourLogic YourLogic --> EventHandlerResolverErrorHandling EventHandlerResolverErrorHandling --> EventHandlerResolverBuilder EventHandlerResolverBuilder --> LambdaResponse } ``` *Batch resolvers: reducing Lambda invokes but fetching data N times (similar to single resolver).* In rare scenarios, you might want to process each item individually, trading ease of use for increased latency as you handle one batch item at a time. You can toggle the `aggregate` parameter in the `@batch_resolver` decorator for your resolver function to be called N times. This does not resolve the N+1 problem, but shifts it to the Lambda runtime. In this mode, we will: 1. Aggregate each response we receive from your function, in the exact order it is received 1.
Gracefully handle errors by adding `None` in the final response for each batch item that failed processing - You can customize `null` or error responses back to the client in the [AppSync resolver mapping templates](https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-lambda-resolvers.html#returning-individual-errors) ``` from typing import Any, Dict from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.utilities.data_classes import AppSyncResolverEvent from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() app = AppSyncResolver() posts_related = { "1": {"title": "post1"}, "2": {"title": "post2"}, "3": {"title": "post3"}, } @app.batch_resolver(type_name="Query", field_name="relatedPosts", aggregate=False) # (1)! def related_posts(event: AppSyncResolverEvent, post_id: str = "") -> Dict[str, Any]: return posts_related[post_id] def lambda_handler(event, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. You need to disable the aggregated event by using the `aggregate` flag. The resolver receives and processes each record one at a time. ``` [ { "arguments":{}, "identity":"None", "source":{ "post_id":"1", "author":"Author1" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } }, { "arguments":{}, "identity":"None", "source":{ "post_id":"2", "author":"Author2" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } }, { "arguments":{}, "identity":"None", "source":{ "post_id":"1", "author":"Author1" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } } ] ``` ``` query MyQuery { getPost(post_id: "2") { relatedPosts { post_id author relatedPosts { post_id author } } } } ``` ##### Raise on error ``` stateDiagram-v2 direction LR LambdaInit: Lambda invocation EventHandler: Event Handler EventHandlerResolver: Route event based on GraphQL type/field keys Client: Client query (listPosts) YourLogic: Call your registered resolver function N times EventHandlerResolverErrorHandling: Error? EventHandlerResolverHappyPath: No error?
EventHandlerResolverUnhappyPath: Propagate any exception EventHandlerResolverBuilder: Aggregate responses to match batch size AppSyncBatchPostsResolution: query listPosts AppSyncBatchPostsItems: get all posts data (id, title, relatedPosts) AppSyncBatchRelatedPosts: get related posts (id, title, relatedPosts) AppSyncBatchAggregate: aggregate batch resolver event AppSyncBatchLimit: reached batch size limit LambdaResponse: Lambda response LambdaErrorResponse: Lambda error Client --> AppSyncBatchResolverMode state AppSyncBatchResolverMode { [*] --> AppSyncBatchPostsResolution AppSyncBatchPostsResolution --> AppSyncBatchPostsItems AppSyncBatchPostsItems --> AppSyncBatchRelatedPosts: N additional queries AppSyncBatchRelatedPosts --> AppSyncBatchRelatedPosts AppSyncBatchRelatedPosts --> AppSyncBatchAggregate AppSyncBatchRelatedPosts --> AppSyncBatchAggregate AppSyncBatchRelatedPosts --> AppSyncBatchAggregate AppSyncBatchAggregate --> AppSyncBatchLimit } AppSyncBatchResolverMode --> LambdaInit: 1x Invoke with N events LambdaInit --> EventHandler state EventHandler { [*] --> EventHandlerResolver: app.resolve(event, context) EventHandlerResolver --> YourLogic YourLogic --> EventHandlerResolverHappyPath YourLogic --> EventHandlerResolverErrorHandling EventHandlerResolverHappyPath --> EventHandlerResolverBuilder EventHandlerResolverErrorHandling --> EventHandlerResolverUnhappyPath EventHandlerResolverUnhappyPath --> LambdaErrorResponse EventHandlerResolverBuilder --> LambdaResponse } ``` *Batch resolvers: reducing Lambda invokes but fetching data N times (similar to single resolver).* You can toggle the `raise_on_error` parameter in `@batch_resolver` to propagate any exception instead of gracefully returning `None` for a given batch item. This is useful when you want to stop processing immediately in the event of an unhandled or unrecoverable exception. ``` from typing import Any, Dict from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.utilities.data_classes import AppSyncResolverEvent from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() app = AppSyncResolver() posts_related = { "1": {"title": "post1"}, "2": {"title": "post2"}, "3": {"title": "post3"}, } @app.batch_resolver(type_name="Query", field_name="relatedPosts", aggregate=False, raise_on_error=True) # (1)! def related_posts(event: AppSyncResolverEvent, post_id: str = "") -> Dict[str, Any]: return posts_related[post_id] def lambda_handler(event, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. You can enable the error handling by using the `raise_on_error` flag.
``` [ { "arguments":{}, "identity":"None", "source":{ "post_id":"1", "author":"Author1" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } }, { "arguments":{}, "identity":"None", "source":{ "post_id":"2", "author":"Author2" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } }, { "arguments":{}, "identity":"None", "source":{ "post_id":"1", "author":"Author1" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } } ] ``` ``` query MyQuery { getPost(post_id: "2") { relatedPosts { post_id author relatedPosts { post_id author } } } } ``` #### Async batch resolver Similar to `@batch_resolver` explained in [batch resolvers](#batch-resolvers), you can use `async_batch_resolver` to handle async functions. ``` from __future__ import annotations from typing import Any from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.utilities.data_classes import AppSyncResolverEvent from aws_lambda_powertools.utilities.typing import LambdaContext app = AppSyncResolver() # mimic DB data for simplicity posts_related = { "1": {"title": "post1"}, "2": {"title": "post2"}, "3": {"title": "post3"}, } async def search_batch_posts(posts: list) -> dict[str, Any]: return {post_id: posts_related.get(post_id) for post_id in posts} @app.async_batch_resolver(type_name="Query", field_name="relatedPosts") async def related_posts(event: list[AppSyncResolverEvent]) -> list[Any]: # Extract all post_ids in order post_ids: list = [record.source.get("post_id") for record in event] # Get unique post_ids while preserving order unique_post_ids = list(dict.fromkeys(post_ids)) # Fetch posts in a single batch operation fetched_posts = await search_batch_posts(unique_post_ids) # Return results in original order return [fetched_posts.get(post_id) for post_id in post_ids] def lambda_handler(event, context: LambdaContext) -> dict: return app.resolve(event, context) # (1)! ``` 1. `async_batch_resolver` takes care of running and waiting for coroutine completion. ``` [ { "arguments":{}, "identity":"None", "source":{ "post_id":"1", "author":"Author1" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } }, { "arguments":{}, "identity":"None", "source":{ "post_id":"2", "author":"Author2" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } }, { "arguments":{}, "identity":"None", "source":{ "post_id":"1", "author":"Author1" }, "prev":"None", "info":{ "selectionSetList":[ "post_id", "author" ], "selectionSetGraphQL":"{\n post_id\n author\n}", "fieldName":"relatedPosts", "parentTypeName":"Post", "variables":{} } } ] ``` ``` query MyQuery { getPost(post_id: "2") { relatedPosts { post_id author relatedPosts { post_id author } } } } ``` ## Testing your code You can test your resolvers by passing a mocked or actual AppSync Lambda event that you're expecting. You can use either `app.resolve(event, context)` or simply `app(event, context)`. 
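As a quick illustration before the full pytest example, a hand-built event with just the routing keys (`typeName`, `fieldName`) and `arguments` is often enough to exercise a resolver. This is a sketch; it assumes the `assert_graphql_response_module` shown below and passes `None` as the Lambda context, since this particular resolver never reads it:

```python
from assert_graphql_response_module import app  # instance of AppSyncResolver (shown below)

minimal_event = {
    "typeName": "Query",
    "fieldName": "listLocations",
    "arguments": {"name": "Perkins-Reed", "description": ""},
    "identity": {},
    "request": {"headers": {}},
}

# Equivalent to app.resolve(minimal_event, None); the context isn't used by this resolver
result = app(minimal_event, None)
assert result[0]["name"] == "Perkins-Reed"
```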
Here's an example of how you can test your synchronous resolvers: ``` from __future__ import annotations import json from dataclasses import dataclass from pathlib import Path import pytest from assert_graphql_response_module import Location, app # instance of AppSyncResolver @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:123456789012:function:test" aws_request_id: str = "da658bd3-2d6f-4e7b-8ec2-937234644fdc" @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def test_direct_resolver(lambda_context): # GIVEN fake_event = json.loads(Path("assert_graphql_response.json").read_text()) # WHEN result: list[Location] = app(fake_event, lambda_context) # THEN assert result[0]["name"] == "Perkins-Reed" ``` ``` from typing import List, TypedDict from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = AppSyncResolver() class Location(TypedDict, total=False): id: str # noqa AA03 VNE003, required due to GraphQL Schema name: str description: str address: str @app.resolver(field_name="listLocations") @app.resolver(field_name="locations") @tracer.capture_method def get_locations(name: str, description: str = "") -> List[Location]: # match GraphQL Query arguments return [{"name": name, "description": description}] @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "typeName": "Query", "fieldName": "listLocations", "arguments": { "name": "Perkins-Reed", "description": "Nulla sed amet. Earum libero qui sunt perspiciatis. Non aliquid accusamus." }, "selectionSetList": [ "id", "name" ], "identity": { "claims": { "sub": "192879fc-a240-4bf1-ab5a-d6a00f3063f9", "email_verified": true, "iss": "https://cognito-idp.us-west-2.amazonaws.com/us-west-xxxxxxxxxxx", "phone_number_verified": false, "cognito:username": "jdoe", "aud": "7471s60os7h0uu77i1tk27sp9n", "event_id": "bc334ed8-a938-4474-b644-9547e304e606", "token_use": "id", "auth_time": 1599154213, "phone_number": "+19999999999", "exp": 1599157813, "iat": 1599154213, "email": "jdoe@email.com" }, "defaultAuthStrategy": "ALLOW", "groups": null, "issuer": "https://cognito-idp.us-west-2.amazonaws.com/us-west-xxxxxxxxxxx", "sourceIp": [ "1.1.1.1" ], "sub": "192879fc-a240-4bf1-ab5a-d6a00f3063f9", "username": "jdoe" }, "request": { "headers": { "x-amzn-trace-id": "Root=1-60488877-0b0c4e6727ab2a1c545babd0", "x-forwarded-for": "127.0.0.1", "cloudfront-viewer-country": "NL", "x-api-key": "da1-c33ullkbkze3jg5hf5ddgcs4fq" } } } ``` And here's an example for testing asynchronous resolvers. Note that this requires the `pytest-asyncio` package. This tests a specific async GraphQL operation. Note Alternatively, you can continue to call the `lambda_handler` function synchronously, as it runs `asyncio.run` to wait for the coroutine to complete.
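As a sketch of that alternative, you could test the async resolver through the synchronous `lambda_handler`, which runs `asyncio.run` internally; it reuses the `lambda_context` fixture, module, and JSON fixture from the example below:

```python
import json
from pathlib import Path

from assert_async_graphql_response_module import lambda_handler  # module shown below


def test_async_resolver_via_sync_handler(lambda_context):
    # GIVEN the same recorded AppSync event used by the async test below
    fake_event = json.loads(Path("assert_async_graphql_response.json").read_text())

    # WHEN calling the handler synchronously; asyncio.run happens inside it
    result = lambda_handler(fake_event, lambda_context)

    # THEN the coroutine result comes back as a plain list
    assert result[0]["completed"] is False
```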
``` import json from dataclasses import dataclass from pathlib import Path from typing import List import pytest from assert_async_graphql_response_module import ( # instance of AppSyncResolver Todo, app, ) @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:123456789012:function:test" aws_request_id: str = "da658bd3-2d6f-4e7b-8ec2-937234644fdc" @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() @pytest.mark.asyncio async def test_async_direct_resolver(lambda_context): # GIVEN fake_event = json.loads(Path("assert_async_graphql_response.json").read_text()) # WHEN result: List[Todo] = await app(fake_event, lambda_context) # alternatively, you can also run a sync test against `lambda_handler` # since `lambda_handler` awaits the coroutine to complete # THEN assert result[0]["userId"] == 1 assert result[0]["id"] == 1 assert result[0]["completed"] is False ``` ``` import asyncio from typing import List, TypedDict import aiohttp from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import AppSyncResolver from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.tracing import aiohttp_trace_config from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = AppSyncResolver() class Todo(TypedDict, total=False): id: str # noqa AA03 VNE003, required due to GraphQL Schema userId: str title: str completed: bool @app.resolver(type_name="Query", field_name="listTodos") async def list_todos() -> List[Todo]: async with aiohttp.ClientSession(trace_configs=[aiohttp_trace_config()]) as session: async with session.get("https://jsonplaceholder.typicode.com/todos") as resp: result: List[Todo] = await resp.json() return result[:2] # first two results to demo assertion @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER) @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext) -> dict: result = app.resolve(event, context) return asyncio.run(result) ``` ``` { "typeName": "Query", "fieldName": "listTodos", "arguments": {}, "selectionSetList": [ "id", "userId", "completed" ], "identity": { "claims": { "sub": "192879fc-a240-4bf1-ab5a-d6a00f3063f9", "email_verified": true, "iss": "https://cognito-idp.us-west-2.amazonaws.com/us-west-xxxxxxxxxxx", "phone_number_verified": false, "cognito:username": "jdoe", "aud": "7471s60os7h0uu77i1tk27sp9n", "event_id": "bc334ed8-a938-4474-b644-9547e304e606", "token_use": "id", "auth_time": 1599154213, "phone_number": "+19999999999", "exp": 1599157813, "iat": 1599154213, "email": "jdoe@email.com" }, "defaultAuthStrategy": "ALLOW", "groups": null, "issuer": "https://cognito-idp.us-west-2.amazonaws.com/us-west-xxxxxxxxxxx", "sourceIp": [ "1.1.1.1" ], "sub": "192879fc-a240-4bf1-ab5a-d6a00f3063f9", "username": "jdoe" }, "request": { "headers": { "x-amzn-trace-id": "Root=1-60488877-0b0c4e6727ab2a1c545babd0", "x-forwarded-for": "127.0.0.1", "cloudfront-viewer-country": "NL", "x-api-key": "da1-c33ullkbkze3jg5hf5ddgcs4fq" } } } ``` Event Handler for AWS AppSync real-time events. 
``` stateDiagram-v2 direction LR EventSource: AppSync Events EventHandlerResolvers: Publish & Subscribe events LambdaInit: Lambda invocation EventHandler: Event Handler EventHandlerResolver: Route event based on namespace/channel YourLogic: Run your registered handler function EventHandlerResolverBuilder: Adapts response to AppSync contract LambdaResponse: Lambda response state EventSource { EventHandlerResolvers } EventHandlerResolvers --> LambdaInit LambdaInit --> EventHandler EventHandler --> EventHandlerResolver state EventHandler { [*] --> EventHandlerResolver: app.resolve(event, context) EventHandlerResolver --> YourLogic YourLogic --> EventHandlerResolverBuilder } EventHandler --> LambdaResponse ``` ## Key Features - Easily handle publish and subscribe events with dedicated handler methods - Automatic routing based on namespace and channel patterns - Support for wildcard patterns to create catch-all handlers - Support for async functions - Aggregation for batch processing - Graceful error handling for individual events ## Terminology **[AWS AppSync Events](https://docs.aws.amazon.com/appsync/latest/eventapi/event-api-welcome.html)**. A service that enables you to quickly build secure, scalable real-time WebSocket APIs without managing infrastructure or writing API code. It handles connection management, message broadcasting, authentication, and monitoring, reducing time to market and operational costs. ## Getting started Tip: New to AppSync Real-time API? Visit [AWS AppSync Real-time documentation](https://docs.aws.amazon.com/appsync/latest/eventapi/event-api-getting-started.html) to understand how to set up subscriptions and pub/sub messaging. ### Required resources You must have an existing AppSync Events API with real-time capabilities enabled and IAM permissions to invoke your Lambda function. That said, there are no additional permissions required to use Event Handler as routing requires no dependency (*standard library*). 
``` AWSTemplateFormatVersion: '2010-09-09' Transform: AWS::Serverless-2016-10-31 Metadata: cfn-lint: ignore_checks: - E3002 Globals: Function: Timeout: 5 MemorySize: 256 Runtime: python3.13 Tracing: Active Environment: Variables: POWERTOOLS_LOG_LEVEL: INFO POWERTOOLS_SERVICE_NAME: hello Resources: HelloWorldFunction: Type: AWS::Serverless::Function Properties: Handler: index.handler CodeUri: hello_world WebsocketAPI: Type: AWS::AppSync::Api Properties: EventConfig: AuthProviders: - AuthType: API_KEY ConnectionAuthModes: - AuthType: API_KEY DefaultPublishAuthModes: - AuthType: API_KEY DefaultSubscribeAuthModes: - AuthType: API_KEY Name: RealTimeEventAPI NameSpaceDataSource: Type: AWS::AppSync::DataSource Properties: ApiId: !GetAtt WebsocketAPI.ApiId LambdaConfig: LambdaFunctionArn: !GetAtt HelloWorldFunction.Arn Name: powertools_lambda ServiceRoleArn: !GetAtt DataSourceIAMRole.Arn Type: AWS_LAMBDA WebsocketApiKey: Type: AWS::AppSync::ApiKey Properties: ApiId: !GetAtt WebsocketAPI.ApiId WebsocketAPINamespace: Type: AWS::AppSync::ChannelNamespace Properties: ApiId: !GetAtt WebsocketAPI.ApiId Name: powertools HandlerConfigs: OnPublish: Behavior: DIRECT Integration: DataSourceName: powertools_lambda LambdaConfig: InvokeType: REQUEST_RESPONSE OnSubscribe: Behavior: DIRECT Integration: DataSourceName: powertools_lambda LambdaConfig: InvokeType: REQUEST_RESPONSE DataSourceIAMRole: Type: AWS::IAM::Role Properties: AssumeRolePolicyDocument: Version: '2012-10-17' Statement: - Effect: Allow Principal: Service: appsync.amazonaws.com Action: sts:AssumeRole Policies: - PolicyName: LambdaInvokePolicy PolicyDocument: Version: '2012-10-17' Statement: - Effect: Allow Action: - lambda:InvokeFunction Resource: !GetAtt HelloWorldFunction.Arn ``` ### AppSync request and response format AppSync Events uses a specific event format for Lambda requests and responses. In most scenarios, Powertools for AWS simplifies this interaction by automatically formatting resolver returns to match the expected AppSync response structure. ``` { "identity":"None", "result":"None", "request":{ "headers": { "x-forwarded-for": "1.1.1.1, 2.2.2.2", "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36", }, "domainName":"None" }, "info":{ "channel":{ "path":"/default/channel", "segments":[ "default", "channel" ] }, "channelNamespace":{ "name":"default" }, "operation":"PUBLISH" }, "error":"None", "prev":"None", "stash":{ }, "outErrors":[ ], "events":[ { "payload":{ "data":"data_1" }, "id":"1" }, { "payload":{ "data":"data_2" }, "id":"2" } ] } ``` ``` { "events":[ { "payload":{ "data":"data_1" }, "id":"1" }, { "payload":{ "data":"data_2" }, "id":"2" } ] } ``` ``` { "events":[ { "error": "Error message", "id":"1" }, { "payload":{ "data":"data_2" }, "id":"2" } ] } ``` ``` { "error": "Exception - An exception occurred" } ``` #### Events response with error When processing events with Lambda, you can return errors to AppSync in three ways: - **Item specific error:** Return an `error` key within each individual item's response. AppSync Events expects this format for item-specific errors. - **Fail entire request:** Return a JSON object with a top-level `error` key. This signals a general failure, and AppSync treats the entire request as unsuccessful. - **Unauthorized exception**: Raise the **UnauthorizedException** exception to reject a subscribe or publish request with HTTP 403. 
### Resolver decorator Important The event handler automatically parses the incoming event data and invokes the appropriate handler based on the namespace/channel pattern you register. You can define your handlers for different event types using the `app.on_publish()`, `app.async_on_publish()`, and `app.on_subscribe()` methods. By default, the resolver processes messages individually. For batch processing, see the [Aggregated Processing](#aggregated-processing) section. ``` from __future__ import annotations from typing import TYPE_CHECKING, Any from aws_lambda_powertools.event_handler import AppSyncEventsResolver if TYPE_CHECKING: from aws_lambda_powertools.utilities.typing import LambdaContext app = AppSyncEventsResolver() @app.on_publish("/default/channel") def handle_channel1_publish(payload: dict[str, Any]): # (1)! # Process the payload for this specific channel return { "processed": True, "original_payload": payload, } def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` 1. The `payload` argument is mandatory and will be passed as a dictionary. ``` from __future__ import annotations from typing import TYPE_CHECKING from aws_lambda_powertools import Metrics from aws_lambda_powertools.event_handler import AppSyncEventsResolver from aws_lambda_powertools.event_handler.events_appsync.exceptions import UnauthorizedException from aws_lambda_powertools.metrics import MetricUnit if TYPE_CHECKING: from aws_lambda_powertools.utilities.typing import LambdaContext app = AppSyncEventsResolver() metrics = Metrics(namespace="AppSyncEvents", service="GettingStartedWithSubscribeEvents") @app.on_subscribe("/*") def handle_all_subscriptions(): path = app.current_event.info.channel_path # Perform access control checks if not is_authorized(path): raise UnauthorizedException("You are not authorized to subscribe to this channel") metrics.add_dimension(name="channel", value=path) metrics.add_metric(name="subscription", unit=MetricUnit.Count, value=1) return True def is_authorized(path: str): # Your authorization logic here return path != "not_allowed_path_here" @metrics.log_metrics(capture_cold_start_metric=True) def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` ## Advanced ### Wildcard patterns and handler precedence You can use wildcard patterns to create catch-all handlers for multiple channels or namespaces. This is particularly useful for centralizing logic that applies to multiple channels. When an event matches with multiple handlers, the most specific pattern takes precedence. Supported wildcard patterns Only the following patterns are supported: - `/namespace/*` - Matches all channels in the specified namespace - `/*` - Matches all channels in all namespaces Patterns like `/namespace/channel*` or `/namespace/*/subpath` are not supported. More specific routes will always take precedence over less specific ones. For example, `/default/channel1` will take precedence over `/default/*`, which will take precedence over `/*`. 
``` from __future__ import annotations from typing import TYPE_CHECKING, Any from aws_lambda_powertools.event_handler import AppSyncEventsResolver if TYPE_CHECKING: from aws_lambda_powertools.utilities.typing import LambdaContext app = AppSyncEventsResolver() @app.on_publish("/default/channel1") def handle_specific_channel(payload: dict[str, Any]): # This handler will be called for events on /default/channel1 return {"source": "specific_handler", "data": payload} @app.on_publish("/default/*") def handle_default_namespace(payload: dict[str, Any]): # This handler will be called for all channels in the default namespace # EXCEPT for /default/channel1 which has a more specific handler return {"source": "namespace_handler", "data": payload} @app.on_publish("/*") def handle_all_channels(payload: dict[str, Any]): # This handler will be called for all channels in all namespaces # EXCEPT for those that have more specific handlers return {"source": "catch_all_handler", "data": payload} def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` If the event doesn't match any registered handler, the Event Handler will log a warning and skip processing the event. ### Aggregated processing Aggregate Processing When `aggregate=True`, your handler receives a list of all events, requiring you to manage the response format. Ensure your response includes results for each event in the expected [AppSync Request and Response Format](#appsync-request-and-response-format). In some scenarios, you might want to process all events for a channel as a batch rather than individually. This is useful when you need to: - Optimize database operations by making a single batch query - Ensure all events are processed together or not at all - Apply custom error handling logic for the entire batch You can enable this with the `aggregate` parameter: ``` from __future__ import annotations from typing import TYPE_CHECKING, Any import boto3 from boto3.dynamodb.types import TypeSerializer from aws_lambda_powertools.event_handler import AppSyncEventsResolver if TYPE_CHECKING: from aws_lambda_powertools.utilities.typing import LambdaContext dynamodb = boto3.client("dynamodb") serializer = TypeSerializer() app = AppSyncEventsResolver() def marshall(item: dict[str, Any]) -> dict[str, Any]: return {k: serializer.serialize(v) for k, v in item.items()} @app.on_publish("/default/foo/*", aggregate=True) async def handle_default_namespace_batch(payload: list[dict[str, Any]]): # (1)! write_operations: list = [] write_operations.extend({"PutRequest": {"Item": marshall(item)}} for item in payload) if write_operations: dynamodb.batch_write_item( RequestItems={ "your-table-name": write_operations, }, ) return payload def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` 1. The `payload` argument is mandatory and will be passed as a list of dictionary. ### Handling errors You can filter or reject events by raising exceptions in your resolvers or by formatting the payload according to the expected response structure. This instructs AppSync not to propagate that specific message, so subscribers will not receive it. #### Handling errors with individual items When processing items individually with `aggregate=False`, you can raise an exception to fail a specific message. When this happens, the Event Handler will catch it and include the exception name and message in the response. 
``` from __future__ import annotations from typing import TYPE_CHECKING, Any from aws_lambda_powertools.event_handler import AppSyncEventsResolver if TYPE_CHECKING: from aws_lambda_powertools.utilities.typing import LambdaContext app = AppSyncEventsResolver() class ValidationError(Exception): pass @app.on_publish("/default/channel") def handle_channel1_publish(payload: dict[str, Any]): if not is_valid_payload(payload): raise ValidationError("Invalid payload format") return {"processed": payload["data"]} def is_valid_payload(payload: dict[str, Any]): return "data" in payload def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` ``` { "events":[ { "error": "Error message", "id":"1" }, { "payload":{ "data":"data_2" }, "id":"2" } ] } ``` #### Handling errors with batch of items When processing a batch of items with `aggregate=True`, you must format the payload according to the expected response format. ``` from __future__ import annotations from typing import TYPE_CHECKING, Any from aws_lambda_powertools.event_handler import AppSyncEventsResolver if TYPE_CHECKING: from aws_lambda_powertools.utilities.typing import LambdaContext app = AppSyncEventsResolver() @app.on_publish("/default/*", aggregate=True) def handle_default_namespace_batch(payload: list[dict[str, Any]]): results: list = [] # Process all events in the batch together for event in payload: try: # Process each event results.append({"id": event.get("id"), "payload": {"processed": True, "originalEvent": event}}) except Exception as e: # Handle errors for individual events results.append( { "error": str(e), "id": event.get("id"), }, ) return results def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` ``` { "events":[ { "error": "Error message", "id":"1" }, { "payload":{ "data":"data_2" }, "id":"2" } ] } ``` If instead you want to fail the entire batch, you can raise an exception. This will cause the Event Handler to return an error response to AppSync and fail the entire batch. ``` from __future__ import annotations from typing import TYPE_CHECKING, Any from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import AppSyncEventsResolver if TYPE_CHECKING: from aws_lambda_powertools.utilities.typing import LambdaContext app = AppSyncEventsResolver() logger = Logger() class ChannelException(Exception): pass @app.on_publish("/default/*", aggregate=True) def handle_default_namespace_batch(payload: list[dict[str, Any]]): results: list = [] # Process all events in the batch together for event in payload: try: # Process each event results.append({"id": event.get("id"), "payload": {"processed": True, "originalEvent": event}}) except Exception as e: logger.error("Found an error") raise ChannelException("An exception occurred") from e return results def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` ``` { "error": "ChannelException - An exception occurred" } ``` #### Authorization control Raising `UnauthorizedException` will cause the Lambda invocation to fail. You can also do content-based authorization for a channel by raising the `UnauthorizedException` exception. This can cause two situations: - **When working with publish events**, Powertools for AWS stops processing messages and subscribers will not receive any message. - **When working with subscribe events**, the subscription won't be established.
```
from __future__ import annotations

from typing import TYPE_CHECKING, Any

from aws_lambda_powertools.event_handler import AppSyncEventsResolver
from aws_lambda_powertools.event_handler.events_appsync.exceptions import UnauthorizedException

if TYPE_CHECKING:
    from aws_lambda_powertools.utilities.typing import LambdaContext

app = AppSyncEventsResolver()


@app.on_publish("/default/foo")
def handle_specific_channel(payload: dict[str, Any]):
    return payload


@app.on_publish("/*")
def handle_root_channel(payload: dict[str, Any]):
    raise UnauthorizedException("You can only publish to /default/foo")


@app.on_subscribe("/default/foo")
def handle_subscription_specific_channel():
    return True


@app.on_subscribe("/*")
def handle_subscription_root_channel():
    raise UnauthorizedException("You can only subscribe to /default/foo")


def lambda_handler(event: dict, context: LambdaContext):
    return app.resolve(event, context)
```

### Processing events with async resolvers

Use the `@app.async_on_publish()` decorator to process events asynchronously. We use the `asyncio` module to support async functions, and we ensure reliable execution by managing the event loop.

Note: Events order and AppSync Events
AppSync does not rely on event order. As long as each event includes the original `id`, AppSync processes them correctly regardless of the order in which they are received.

```
from __future__ import annotations

import asyncio
from typing import TYPE_CHECKING, Any

from aws_lambda_powertools.event_handler import AppSyncEventsResolver

if TYPE_CHECKING:
    from aws_lambda_powertools.utilities.typing import LambdaContext

app = AppSyncEventsResolver()


@app.async_on_publish("/default/channel1")
async def handle_channel1_publish(payload: dict[str, Any]):
    return await async_process_data(payload)


async def async_process_data(payload: dict[str, Any]):
    await asyncio.sleep(0.1)
    return {"processed": payload, "async": True}


def lambda_handler(event: dict, context: LambdaContext):
    return app.resolve(event, context)
```

### Accessing Lambda context and event

You can access the original Lambda event or context for additional information.
These are accessible via the app instance: ``` from __future__ import annotations from typing import TYPE_CHECKING, Any from aws_lambda_powertools.event_handler import AppSyncEventsResolver from aws_lambda_powertools.utilities.data_classes import AppSyncResolverEventsEvent if TYPE_CHECKING: from aws_lambda_powertools.utilities.typing import LambdaContext app = AppSyncEventsResolver() @app.on_publish("/default/channel1") def handle_channel1_publish(payload: dict[str, Any]): # Access the full event and context lambda_event: AppSyncResolverEventsEvent = app.current_event # Access request headers header_user_agent = lambda_event.request_headers["user-agent"] return { "originalMessage": payload, "userAgent": header_user_agent, } def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` ## Event Handler workflow ### Working with single items ``` sequenceDiagram participant Client participant AppSync participant Lambda participant EventHandler note over Client,EventHandler: Individual Event Processing (aggregate=False) Client->>+AppSync: Send multiple events to channel AppSync->>+Lambda: Invoke Lambda with batch of events Lambda->>+EventHandler: Process events with aggregate=False loop For each event in batch EventHandler->>EventHandler: Process individual event end EventHandler-->>-Lambda: Return array of processed events Lambda-->>-AppSync: Return event-by-event responses AppSync-->>-Client: Report individual event statuses ``` ### Working with aggregated items ``` sequenceDiagram participant Client participant AppSync participant Lambda participant EventHandler note over Client,EventHandler: Aggregate Processing Workflow Client->>+AppSync: Send multiple events to channel AppSync->>+Lambda: Invoke Lambda with batch of events Lambda->>+EventHandler: Process events with aggregate=True EventHandler->>EventHandler: Batch of events EventHandler->>EventHandler: Process entire batch at once EventHandler->>EventHandler: Format response for each event EventHandler-->>-Lambda: Return aggregated results Lambda-->>-AppSync: Return success responses AppSync-->>-Client: Confirm all events processed ``` ### Authorization fails for publish ``` sequenceDiagram participant Client participant AppSync participant Lambda participant EventHandler note over Client,EventHandler: Publish Event Authorization Flow Client->>AppSync: Publish message to channel AppSync->>Lambda: Invoke Lambda with publish event Lambda->>EventHandler: Process publish event alt Authorization Failed EventHandler->>EventHandler: Authorization check fails EventHandler->>Lambda: Raise UnauthorizedException Lambda->>AppSync: Return error response AppSync--xClient: Message not delivered AppSync--xAppSync: No distribution to subscribers else Authorization Passed EventHandler->>Lambda: Return successful response Lambda->>AppSync: Return processed event AppSync->>Client: Acknowledge message AppSync->>AppSync: Distribute to subscribers end ``` ### Authorization fails for subscribe ``` sequenceDiagram participant Client participant AppSync participant Lambda participant EventHandler note over Client,EventHandler: Subscribe Event Authorization Flow Client->>AppSync: Request subscription to channel AppSync->>Lambda: Invoke Lambda with subscribe event Lambda->>EventHandler: Process subscribe event alt Authorization Failed EventHandler->>EventHandler: Authorization check fails EventHandler->>Lambda: Raise UnauthorizedException Lambda->>AppSync: Return error response AppSync--xClient: Subscription denied (HTTP 403) else 
Authorization Passed EventHandler->>Lambda: Return successful response Lambda->>AppSync: Return authorization success AppSync->>Client: Subscription established end ``` ## Testing your code You can test your event handlers by passing a mocked or actual AppSync Events Lambda event. ### Testing publish events ``` import json from pathlib import Path from aws_lambda_powertools.event_handler import AppSyncEventsResolver class LambdaContext: def __init__(self): self.function_name = "test-func" self.memory_limit_in_mb = 128 self.invoked_function_arn = "arn:aws:lambda:eu-west-1:809313241234:function:test-func" self.aws_request_id = "52fdfc07-2182-154f-163f-5f0f9a621d72" def get_remaining_time_in_millis(self) -> int: return 1000 def test_publish_event_with_synchronous_resolver(): """Test handling a publish event with a synchronous resolver.""" # GIVEN a sample publish event with Path.open("getting_started_with_testing_publish_event.json", "r") as f: event = json.load(f) lambda_context = LambdaContext() # GIVEN an AppSyncEventsResolver with a synchronous resolver app = AppSyncEventsResolver() @app.on_publish(path="/default/*") def test_handler(payload): return {"processed": True, "data": payload["data"]} # WHEN we resolve the event result = app.resolve(event, lambda_context) # THEN we should get the correct response expected_result = { "events": [ {"id": "123", "payload": {"processed": True, "data": "test data"}}, ], } assert result == expected_result ``` ``` { "identity":"None", "result":"None", "request":{ "headers": { "x-forwarded-for": "1.1.1.1, 2.2.2.2", "cloudfront-viewer-country": "US", "cloudfront-is-tablet-viewer": "false", "via": "2.0 xxxxxxxxxxxxxxxx.cloudfront.net (CloudFront)", "cloudfront-forwarded-proto": "https", "origin": "https://us-west-1.console.aws.amazon.com", "content-length": "217", "accept-language": "en-US,en;q=0.9", "host": "xxxxxxxxxxxxxxxx.appsync-api.us-west-1.amazonaws.com", "x-forwarded-proto": "https", "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36", "accept": "*/*", "cloudfront-is-mobile-viewer": "false", "cloudfront-is-smarttv-viewer": "false", "accept-encoding": "gzip, deflate, br", "referer": "https://us-west-1.console.aws.amazon.com/appsync/home?region=us-west-1", "content-type": "application/json", "sec-fetch-mode": "cors", "x-amz-cf-id": "3aykhqlUwQeANU-HGY7E_guV5EkNeMMtwyOgiA==", "x-amzn-trace-id": "Root=1-5f512f51-fac632066c5e848ae714", "authorization": "eyJraWQiOiJScWFCSlJqYVJlM0hrSnBTUFpIcVRXazNOW...", "sec-fetch-dest": "empty", "x-amz-user-agent": "AWS-Console-AppSync/", "cloudfront-is-desktop-viewer": "true", "sec-fetch-site": "cross-site", "x-forwarded-port": "443" }, "domainName":"None" }, "info":{ "channel":{ "path":"/default/channel", "segments":[ "default", "channel" ] }, "channelNamespace":{ "name":"default" }, "operation":"PUBLISH" }, "error":"None", "prev":"None", "stash":{ }, "outErrors":[ ], "events":[ { "payload":{ "data": "test data" }, "id":"123" } ] } ``` ### Testing subscribe events ``` import json from pathlib import Path from aws_lambda_powertools.event_handler import AppSyncEventsResolver class LambdaContext: def __init__(self): self.function_name = "test-func" self.memory_limit_in_mb = 128 self.invoked_function_arn = "arn:aws:lambda:eu-west-1:809313241234:function:test-func" self.aws_request_id = "52fdfc07-2182-154f-163f-5f0f9a621d72" def get_remaining_time_in_millis(self) -> int: return 1000 def test_subscribe_event_with_valid_return(): """Test 
error handling during publish event processing.""" # GIVEN a sample publish event with Path.open("getting_started_with_testing_publish_event.json", "r") as f: event = json.load(f) lambda_context = LambdaContext() # GIVEN an AppSyncEventsResolver with a resolver that returns ok app = AppSyncEventsResolver() @app.on_subscribe(path="/default/*") def test_handler(): pass # WHEN we resolve the event result = app.resolve(event, lambda_context) # THEN we should return None because subscribe always must return None assert result is None ``` ``` { "identity":"None", "result":"None", "request":{ "headers": { "x-forwarded-for": "1.1.1.1, 2.2.2.2", "cloudfront-viewer-country": "US", "cloudfront-is-tablet-viewer": "false", "via": "2.0 xxxxxxxxxxxxxxxx.cloudfront.net (CloudFront)", "cloudfront-forwarded-proto": "https", "origin": "https://us-west-1.console.aws.amazon.com", "content-length": "217", "accept-language": "en-US,en;q=0.9", "host": "xxxxxxxxxxxxxxxx.appsync-api.us-west-1.amazonaws.com", "x-forwarded-proto": "https", "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36", "accept": "*/*", "cloudfront-is-mobile-viewer": "false", "cloudfront-is-smarttv-viewer": "false", "accept-encoding": "gzip, deflate, br", "referer": "https://us-west-1.console.aws.amazon.com/appsync/home?region=us-west-1", "content-type": "application/json", "sec-fetch-mode": "cors", "x-amz-cf-id": "3aykhqlUwQeANU-HGY7E_guV5EkNeMMtwyOgiA==", "x-amzn-trace-id": "Root=1-5f512f51-fac632066c5e848ae714", "authorization": "eyJraWQiOiJScWFCSlJqYVJlM0hrSnBTUFpIcVRXazNOW...", "sec-fetch-dest": "empty", "x-amz-user-agent": "AWS-Console-AppSync/", "cloudfront-is-desktop-viewer": "true", "sec-fetch-site": "cross-site", "x-forwarded-port": "443" }, "domainName":"None" }, "info":{ "channel":{ "path":"/default/channel", "segments":[ "default", "channel" ] }, "channelNamespace":{ "name":"default" }, "operation":"SUBSCRIBE" }, "error":"None", "prev":"None", "stash":{ }, "outErrors":[ ], "events":[] } ``` Create [Agents for Amazon Bedrock](https://docs.aws.amazon.com/bedrock/latest/userguide/agents.html#agents-how) using event handlers and auto generation of OpenAPI schemas. ``` flowchart LR Bedrock[LLM] <-- uses --> Agent You[User input] --> Agent Agent -- consults --> OpenAPI Agent[Agents for Amazon Bedrock] -- invokes --> Lambda subgraph OpenAPI Schema end subgraph Lambda[Lambda Function] direction TB Parsing[Parameter Parsing] --> Validation Validation[Parameter Validation] --> Routing Routing --> Code[Your code] Code --> ResponseValidation[Response Validation] ResponseValidation --> ResponseBuilding[Response Building] end subgraph ActionGroup[Action Group] OpenAPI -. generated from .-> Lambda end style Code fill:#ffa500,color:black,font-weight:bold,stroke-width:3px style You stroke:#0F0,stroke-width:2px ``` ## Key features - Minimal boilerplate to build Agents for Amazon Bedrock - Automatic generation of [OpenAPI schemas](https://www.openapis.org/) from your business logic code - Built-in data validation for requests and responses - Similar experience to authoring [REST and HTTP APIs](../api_gateway/) ## Terminology **Data validation** automatically validates the user input and the response of your AWS Lambda function against a set of constraints defined by you. 
**Event handler** is a Powertools for AWS feature that processes an event, runs data parsing and validation, routes the request to a specific function, and returns a response to the caller in the proper format. **[OpenAPI schema](https://www.openapis.org/)** is an industry standard JSON-serialized string that represents the structure and parameters of your API. **Action group** is a collection of two resources where you define the actions that the agent should carry out: an OpenAPI schema to define the APIs that the agent can invoke to carry out its tasks, and a Lambda function to execute those actions. **Large Language Models (LLM)** are very large deep learning models that are pre-trained on vast amounts of data, capable of extracting meanings from a sequence of text and understanding the relationship between words and phrases on it. **Agent for Amazon Bedrock** is an Amazon Bedrock feature to build and deploy conversational agents that can interact with your customers using Large Language Models (LLM) and AWS Lambda functions. ## Getting started All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples) ### Install This is unnecessary if you're installing Powertools for AWS Lambda (Python) via [Lambda Layer/SAR](../../../#lambda-layer). You need to add `pydantic` as a dependency in your preferred tool *e.g., requirements.txt, pyproject.toml*. At this time, we only support Pydantic V2. ### Required resources To build Agents for Amazon Bedrock, you will need: | Requirement | Description | SAM Supported | CDK Supported | | --- | --- | --- | --- | | [Lambda Function](#your-first-agent) | Defines your business logic for the action group | ✅ | ✅ | | [OpenAPI Schema](#generating-openapi-schemas) | API description, structure, and action group parameters | ❌ | ✅ | | [Bedrock Service Role](https://docs.aws.amazon.com/bedrock/latest/userguide/agents-permissions.html) | Allows Amazon Bedrock to invoke foundation models | ✅ | ✅ | | Agents for Bedrock | The service that will combine all the above to create the conversational agent | ❌ | ✅ | Using [AWS SAM](https://aws.amazon.com/serverless/sam/) you can create your Lambda function and the necessary permissions. However, you still have to create your Agent for Amazon Bedrock [using the AWS console](https://docs.aws.amazon.com/bedrock/latest/userguide/agents-create.html). ``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: > Agents for Amazon Bedrock example with Powertools for AWS Lambda (Python) Globals: # https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-specification-template-anatomy-globals.html Function: Timeout: 30 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_SERVICE_NAME: PowertoolsHelloWorld POWERTOOLS_LOG_LEVEL: INFO Resources: ApiFunction: Type: AWS::Serverless::Function Properties: Handler: getting_started.lambda_handler Description: Agent for Amazon Bedrock handler function CodeUri: ../src BedrockPermission: # (1)! 
    Type: AWS::Lambda::Permission
    Properties:
      Action: lambda:InvokeFunction
      FunctionName: !GetAtt ApiFunction.Arn
      Principal: bedrock.amazonaws.com
      SourceAccount: !Sub ${AWS::AccountId}

  BedrockServiceRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - bedrock.amazonaws.com
            Action:
              - sts:AssumeRole
      Policies:
        - PolicyName: bedrock
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action:
                  - bedrock:InvokeModel
                Resource: # (2)!
                  - !Sub arn:aws:bedrock:${AWS::Region}::foundation-model/anthropic.claude-v2
                  - !Sub arn:aws:bedrock:${AWS::Region}::foundation-model/anthropic.claude-v2:1
                  - !Sub arn:aws:bedrock:${AWS::Region}::foundation-model/anthropic.claude-instant-v1

Outputs:
  BedrockServiceRole:
    Description: The role ARN to be used by Amazon Bedrock
    Value: !GetAtt BedrockServiceRole.Arn # (3)!
```

1. Amazon Bedrock needs permissions to invoke this Lambda function
1. Check the [supported foundation models](https://docs.aws.amazon.com/bedrock/latest/userguide/agents-supported.html)
1. You need the role ARN when creating the Agent for Amazon Bedrock

This example uses the [Generative AI CDK constructs](https://awslabs.github.io/generative-ai-cdk-constructs/src/cdk-lib/bedrock/#agents) to create your Agent with [AWS CDK](https://aws.amazon.com/cdk/). These constructs abstract the underlying permission setup and code bundling of your Lambda function.

```
from aws_cdk import (
    Stack,
)
from aws_cdk.aws_lambda import Runtime
from aws_cdk.aws_lambda_python_alpha import PythonFunction
from cdklabs.generative_ai_cdk_constructs import bedrock
from constructs import Construct


class AgentsCdkStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        action_group_function = PythonFunction(
            self,
            "LambdaFunction",
            runtime=Runtime.PYTHON_3_12,
            entry="./lambda",  # (1)!
            index="app.py",
            handler="lambda_handler",
        )

        agent = bedrock.Agent(
            self,
            "Agent",
            foundation_model=bedrock.BedrockFoundationModel.ANTHROPIC_CLAUDE_INSTANT_V1_2,
            instruction="You are a helpful and friendly agent that answers questions about insurance claims.",
        )

        action_group: bedrock.AgentActionGroup = bedrock.AgentActionGroup(
            name="InsureClaimsSupport",
            description="Use these functions for insurance claims support",
            executor=bedrock.ActionGroupExecutor.fromlambda_function(
                lambda_function=action_group_function,
            ),
            enabled=True,
            api_schema=bedrock.ApiSchema.from_local_asset("./lambda/openapi.json"),  # (2)!
        )
        agent.add_action_group(action_group)
```

1. The path to your Lambda function handler
1. The path to the OpenAPI schema describing your API

### Your first Agent

To create an agent, use the `BedrockAgentResolver` to annotate your actions. This is similar to the way [all the other Event Handler](../api_gateway/) resolvers work.

You are required to add a `description` parameter in each endpoint; doing so will improve Bedrock's understanding of your actions.

The resolvers used by Agents for Amazon Bedrock are compatible with all Powertools for AWS Lambda [features](../../../#features). For reference, we use [Logger](../../logger/) and [Tracer](../../tracer/) in this example.
``` from time import time from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import BedrockAgentResolver from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = BedrockAgentResolver() @app.get("/current_time", description="Gets the current time in seconds") # (1)! @tracer.capture_method def current_time() -> int: return int(time()) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) # (2)! ``` 1. `description` is a **required** field that should contain a human readable description of your action 1. We take care of **parsing**, **validating**, **routing** and **responding** to the request. Powertools for AWS Lambda [generates this automatically](#generating-openapi-schemas) from the Lambda handler. ``` { "openapi": "3.0.3", "info": { "title": "Powertools API", "version": "1.0.0" }, "servers": [ { "url": "/" } ], "paths": { "/current_time": { "get": { "summary": "GET /current_time", "description": "Gets the current time in seconds", "operationId": "current_time_current_time_get", "responses": { "200": { "description": "Successful Response", "content": { "application/json": { "schema": { "type": "integer", "title": "Return" } } } }, "422": { "description": "Validation Error", "content": { "application/json": { "schema": { "$ref": "#/components/schemas/HTTPValidationError" } } } } } } } }, "components": { "schemas": { "HTTPValidationError": { "properties": { "detail": { "items": { "$ref": "#/components/schemas/ValidationError" }, "type": "array", "title": "Detail" } }, "type": "object", "title": "HTTPValidationError" }, "ValidationError": { "properties": { "loc": { "items": { "anyOf": [ { "type": "string" }, { "type": "integer" } ] }, "type": "array", "title": "Location" }, "msg": { "type": "string", "title": "Message" }, "type": { "type": "string", "title": "Error Type" } }, "type": "object", "required": [ "loc", "msg", "type" ], "title": "ValidationError" } } } } ``` ``` { "sessionId": "123456789012345", "sessionAttributes": {}, "inputText": "What is the current time?", "promptSessionAttributes": {}, "apiPath": "/current_time", "agent": { "name": "TimeAgent", "version": "DRAFT", "id": "XLHH72XNF2", "alias": "TSTALIASID" }, "httpMethod": "GET", "messageVersion": "1.0", "actionGroup": "CurrentTime" } ``` ``` { "messageVersion": "1.0", "response": { "actionGroup": "CurrentTime", "apiPath": "/current_time", "httpMethod": "GET", "httpStatusCode": 200, "responseBody": { "application/json": { "body": "1704708165" } } } } ``` What happens under the hood? Powertools will handle the request from the Agent, parse, validate, and route it to the correct method in your code. The response is then validated and formatted back to the Agent. ``` sequenceDiagram actor User User->>Agent: What is the current time? 
Agent->>OpenAPI schema: consults OpenAPI schema-->>Agent: GET /current_time Agent-->>Agent: LLM interaction box Powertools participant Lambda participant Parsing participant Validation participant Routing participant Your Code end Agent->>Lambda: GET /current_time activate Lambda Lambda->>Parsing: parses parameters Parsing->>Validation: validates input Validation->>Routing: finds method to call Routing->>Your Code: executes activate Your Code Your Code->>Routing: 1709215709 deactivate Your Code Routing->>Validation: returns output Validation->>Parsing: validates output Parsing->>Lambda: formats response Lambda->>Agent: 1709215709 deactivate Lambda Agent-->>Agent: LLM interaction Agent->>User: "The current time is 14:08:29 GMT" ``` ### Validating input and output You can define the expected format for incoming data and responses by using type annotations. Define constraints using standard Python types, [dataclasses](https://docs.python.org/3/library/dataclasses.html) or [Pydantic models](https://docs.pydantic.dev/latest/concepts/models/). Pydantic is a popular library for data validation using Python type annotations. This example uses [Pydantic's EmailStr](https://docs.pydantic.dev/2.0/usage/types/string_types/#emailstr) to validate the email address passed to the `schedule_meeting` function. The function then returns a boolean indicating if the meeting was successfully scheduled. ``` from pydantic import EmailStr from typing_extensions import Annotated from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import BedrockAgentResolver from aws_lambda_powertools.event_handler.openapi.params import Body, Query from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = BedrockAgentResolver() # (1)! @app.get("/schedule_meeting", description="Schedules a meeting with the team") @tracer.capture_method def schedule_meeting( email: Annotated[EmailStr, Query(description="The email address of the customer")], # (2)! ) -> Annotated[bool, Body(description="Whether the meeting was scheduled successfully")]: # (3)! logger.info("Scheduling a meeting", email=email) return True @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` 1. No need to add the `enable_validation` parameter, as it's enabled by default. 1. Describe each input using human-readable descriptions 1. 
Add the typing annotations to your parameters and return types, and let the event handler take care of the rest ``` { "openapi": "3.0.3", "info": { "title": "Powertools API", "version": "1.0.0" }, "servers": [ { "url": "/" } ], "paths": { "/schedule_meeting": { "get": { "summary": "GET /schedule_meeting", "description": "Schedules a meeting with the team", "operationId": "schedule_meeting_schedule_meeting_get", "parameters": [ { "description": "The email address of the customer", "required": true, "schema": { "type": "string", "format": "email", "title": "Email", "description": "The email address of the customer" }, "name": "email", "in": "query" } ], "responses": { "200": { "description": "Successful Response", "content": { "application/json": { "schema": { "type": "boolean", "title": "Return", "description": "Whether the meeting was scheduled successfully" } } } }, "422": { "description": "Validation Error", "content": { "application/json": { "schema": { "$ref": "#/components/schemas/HTTPValidationError" } } } } } } } }, "components": { "schemas": { "HTTPValidationError": { "properties": { "detail": { "items": { "$ref": "#/components/schemas/ValidationError" }, "type": "array", "title": "Detail" } }, "type": "object", "title": "HTTPValidationError" }, "ValidationError": { "properties": { "loc": { "items": { "anyOf": [ { "type": "string" }, { "type": "integer" } ] }, "type": "array", "title": "Location" }, "msg": { "type": "string", "title": "Message" }, "type": { "type": "string", "title": "Error Type" } }, "type": "object", "required": [ "loc", "msg", "type" ], "title": "ValidationError" } } } } ``` ``` { "sessionId": "123456789012345", "sessionAttributes": {}, "inputText": "Schedule a meeting with the team. My email is foo@example.org", "promptSessionAttributes": {}, "apiPath": "/schedule_meeting", "parameters": [ { "name": "email", "type": "string", "value": "foo@example.org" } ], "agent": { "name": "TimeAgent", "version": "DRAFT", "id": "XLHH72XNF2", "alias": "TSTALIASID" }, "httpMethod": "GET", "messageVersion": "1.0", "actionGroup": "SupportAssistant" } ``` ``` { "messageVersion": "1.0", "response": { "actionGroup": "SupportAssistant", "apiPath": "/schedule_meeting", "httpMethod": "GET", "httpStatusCode": 200, "responseBody": { "application/json": { "body": "true" } } } } ``` #### When validation fails If the request validation fails, your event handler will not be called, and an error message is returned to Bedrock. Similarly, if the response fails validation, your handler will abort the response. What does this mean for my Agent? The event handler will always return a response according to the OpenAPI schema. A validation failure always results in a [422 response](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/422). However, how Amazon Bedrock interprets that failure is non-deterministic, since it depends on the characteristics of the LLM being used. ``` { "sessionId": "123456789012345", "sessionAttributes": {}, "inputText": "Schedule a meeting with the team. 
My email is foo@example@org", "promptSessionAttributes": {}, "apiPath": "/schedule_meeting", "parameters": [ { "name": "email", "type": "string", "value": "foo@example@org" } ], "agent": { "name": "TimeAgent", "version": "DRAFT", "id": "XLHH72XNF2", "alias": "TSTALIASID" }, "httpMethod": "GET", "messageVersion": "1.0", "actionGroup": "SupportAssistant" } ``` ``` { "messageVersion": "1.0", "response": { "actionGroup": "SupportAssistant", "apiPath": "/schedule_meeting", "httpMethod": "GET", "httpStatusCode": 200, "responseBody": { "application/json": { "body": "{\"statusCode\":422,\"detail\":[{\"loc\":[\"query\",\"email\"],\"type\":\"value_error.email\"}]}" } } } } ``` ``` sequenceDiagram Agent->>Lambda: input payload activate Lambda Lambda->>Parsing: parses input parameters Parsing->>Validation: validates input Validation-->Validation: failure box BedrockAgentResolver participant Lambda participant Parsing participant Validation participant Routing participant Your Code end Note right of Validation: Your code is never called Validation->>Agent: 422 response deactivate Lambda ``` ### Generating OpenAPI schemas Use the `get_openapi_json_schema` function provided by the resolver to produce a JSON-serialized string that represents your OpenAPI schema. You can print this string or save it to a file. You'll use the file later when creating the Agent. You'll need to regenerate the OpenAPI schema and update your Agent everytime your API changes. ``` from time import time from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import BedrockAgentResolver from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() logger = Logger() app = BedrockAgentResolver() @app.get("/current_time", description="Gets the current time in seconds") @tracer.capture_method def current_time() -> int: return int(time()) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) if __name__ == "__main__": # (1)! print(app.get_openapi_json_schema()) # (2)! ``` 1. This ensures that it's only executed when running the file directly, and not when running on the Lambda runtime. 1. You can use [additional options](#customizing-openapi-metadata) to customize the OpenAPI schema. 
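The schema shown next uses the default `Powertools API` title and `1.0.0` version. If you want different metadata in the exported schema, a minimal sketch follows; it assumes `get_openapi_json_schema` accepts metadata keyword arguments such as `title` and `version` (as the other Event Handler resolvers do), and the values used are purely illustrative.

```
from time import time

from aws_lambda_powertools.event_handler import BedrockAgentResolver

app = BedrockAgentResolver()


@app.get("/current_time", description="Gets the current time in seconds")
def current_time() -> int:
    return int(time())


if __name__ == "__main__":
    # Sketch only: override the default OpenAPI metadata when exporting the schema.
    # "Insurance Claims API" and "0.1.0" are hypothetical values; verify the accepted
    # keyword arguments against your installed Powertools version.
    print(app.get_openapi_json_schema(title="Insurance Claims API", version="0.1.0"))
```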
``` { "openapi": "3.0.3", "info": { "title": "Powertools API", "version": "1.0.0" }, "servers": [ { "url": "/" } ], "paths": { "/current_time": { "get": { "summary": "GET /current_time", "description": "Gets the current time in seconds", "operationId": "current_time_current_time_get", "responses": { "200": { "description": "Successful Response", "content": { "application/json": { "schema": { "type": "integer", "title": "Return" } } } }, "422": { "description": "Validation Error", "content": { "application/json": { "schema": { "$ref": "#/components/schemas/HTTPValidationError" } } } } } } } }, "components": { "schemas": { "HTTPValidationError": { "properties": { "detail": { "items": { "$ref": "#/components/schemas/ValidationError" }, "type": "array", "title": "Detail" } }, "type": "object", "title": "HTTPValidationError" }, "ValidationError": { "properties": { "loc": { "items": { "anyOf": [ { "type": "string" }, { "type": "integer" } ] }, "type": "array", "title": "Location" }, "msg": { "type": "string", "title": "Message" }, "type": { "type": "string", "title": "Error Type" } }, "type": "object", "required": [ "loc", "msg", "type" ], "title": "ValidationError" } } } } ``` To get the OpenAPI schema, run the Python script from your terminal. The script will generate the schema directly to standard output, which you can redirect to a file. ``` python3 app.py > schema.json ``` ### Crafting effective OpenAPI schemas Working with Agents for Amazon Bedrock will introduce [non-deterministic behaviour to your system](https://docs.aws.amazon.com/bedrock/latest/userguide/agents-how.html#agents-rt). Why is that? Amazon Bedrock uses LLMs to understand and respond to user input. These models are trained on vast amounts of data and are capable of extracting meanings from a sequence of text and understanding the relationship between words and phrases on it. However, this means that the same input can result in different outputs, depending on the characteristics of the LLM being used. The OpenAPI schema provides context and semantics to the Agent that will support the decision process for invoking our Lambda function. Sparse or ambiguous schemas can result in unexpected outcomes. We recommend enriching your OpenAPI schema with as many details as possible to help the Agent understand your functions, and make correct invocations. To achieve that, keep the following suggestions in mind: - Always describe your function behaviour using the `description` field in your annotations - When refactoring, update your description field to match the function outcomes - Use distinct `description` for each function to have clear separation of semantics ### Video walkthrough To create an Agent for Amazon Bedrock, refer to the [official documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/agents-create.html) provided by AWS. The following video demonstrates the end-to-end process: During the creation process, you should use the schema [previously generated](#generating-openapi-schemas) when prompted for an OpenAPI specification. ## Advanced ### Accessing custom request fields The event sent by Agents for Amazon Bedrock into your Lambda function contains a [number of extra event fields](#request_fields_table), exposed in the `app.current_event` field. Why is this useful? You can for instance identify new conversations (`session_id`) or store and analyze entire conversations (`input_text`). In this example, we [append correlation data](../../logger/#appending-additional-keys) to all generated logs. 
This can be used to aggregate logs by `session_id` and observe the entire conversation between a user and the Agent. ``` from time import time from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import BedrockAgentResolver from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() app = BedrockAgentResolver() @app.get("/current_time", description="Gets the current time in seconds") # (1)! def current_time() -> int: logger.append_keys( session_id=app.current_event.session_id, action_group=app.current_event.action_group, input_text=app.current_event.input_text, ) logger.info("Serving current_time") return int(time()) @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` The input event fields are: | Name | Type | Description | | --- | --- | --- | | message_version | `str` | The version of the message that identifies the format of the event data going into the Lambda function and the expected format of the response from a Lambda function. Amazon Bedrock only supports version 1.0. | | agent | `BedrockAgentInfo` | Contains information about the name, ID, alias, and version of the agent that the action group belongs to. | | input_text | `str` | The user input for the conversation turn. | | session_id | `str` | The unique identifier of the agent session. | | action_group | `str` | The name of the action group. | | api_path | `str` | The path to the API operation, as defined in the OpenAPI schema. | | http_method | `str` | The method of the API operation, as defined in the OpenAPI schema. | | parameters | `List[BedrockAgentProperty]` | Contains a list of objects. Each object contains the name, type, and value of a parameter in the API operation, as defined in the OpenAPI schema. | | request_body | `BedrockAgentRequestBody` | Contains the request body and its properties, as defined in the OpenAPI schema. | | session_attributes | `Dict[str, str]` | Contains session attributes and their values. | | prompt_session_attributes | `Dict[str, str]` | Contains prompt attributes and their values. | ### Additional metadata To enrich the view that Agents for Amazon Bedrock has of your Lambda functions, use a combination of [Pydantic Models](https://docs.pydantic.dev/latest/concepts/models/) and [OpenAPI](https://www.openapis.org/) type annotations to add constraints to your APIs parameters. When is this useful? Adding constraints to your function parameters can help you to enforce data validation and improve the understanding of your APIs by Amazon Bedrock. #### Customizing OpenAPI parameters Whenever you use OpenAPI parameters to validate [query strings](../api_gateway/#validating-query-strings) or [path parameters](../api_gateway/#validating-path-parameters), you can enhance validation and OpenAPI documentation by using any of these parameters: | Field name | Type | Description | | --- | --- | --- | | `alias` | `str` | Alternative name for a field, used when serializing and deserializing data | | `validation_alias` | `str` | Alternative name for a field during validation (but not serialization) | | `serialization_alias` | `str` | Alternative name for a field during serialization (but not during validation) | | `description` | `str` | Human-readable description | | `gt` | `float` | Greater than. If set, value must be greater than this. Only applicable to numbers | | `ge` | `float` | Greater than or equal. If set, value must be greater than or equal to this. 
Only applicable to numbers | | `lt` | `float` | Less than. If set, value must be less than this. Only applicable to numbers | | `le` | `float` | Less than or equal. If set, value must be less than or equal to this. Only applicable to numbers | | `min_length` | `int` | Minimum length for strings | | `max_length` | `int` | Maximum length for strings | | `pattern` | `string` | A regular expression that the string must match. | | `strict` | `bool` | If `True`, strict validation is applied to the field. See [Strict Mode](https://docs.pydantic.dev/latest/concepts/strict_mode/) for details | | `multiple_of` | `float` | Value must be a multiple of this. Only applicable to numbers | | `allow_inf_nan` | `bool` | Allow `inf`, `-inf`, `nan`. Only applicable to numbers | | `max_digits` | `int` | Maximum number of allow digits for strings | | `decimal_places` | `int` | Maximum number of decimal places allowed for numbers | | `openapi_examples` | `dict[str, Example]` | A list of examples to be displayed in the SwaggerUI interface. Avoid using the `examples` field for this purpose. | | `deprecated` | `bool` | Marks the field as deprecated | | `include_in_schema` | `bool` | If `False` the field will not be part of the exported OpenAPI schema | | `json_schema_extra` | `JsonDict` | Any additional JSON schema data for the schema property | To implement these customizations, include extra constraints when defining your parameters: ``` import requests from typing_extensions import Annotated from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import BedrockAgentResolver from aws_lambda_powertools.event_handler.openapi.params import Body, Query from aws_lambda_powertools.utilities.typing import LambdaContext app = BedrockAgentResolver() logger = Logger() @app.post( "/todos", description="Creates a TODO", ) def create_todo( title: Annotated[str, Query(max_length=200, strict=True, description="The TODO title")], # (1)! ) -> Annotated[bool, Body(description="Was the TODO created correctly?")]: todo = requests.post("https://jsonplaceholder.typicode.com/todos", data={"title": title}) try: todo.raise_for_status() return True except Exception: logger.exception("Error creating TODO") return False def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` 1. Title should not be larger than 200 characters and [strict mode](https://docs.pydantic.dev/latest/concepts/strict_mode/) is activated #### Customizing API operations Customize your API endpoints by adding metadata to endpoint definitions. Here's a breakdown of various customizable fields: | Field Name | Type | Description | | --- | --- | --- | | `summary` | `str` | A concise overview of the main functionality of the endpoint. This brief introduction is usually displayed in autogenerated API documentation and helps consumers quickly understand what the endpoint does. | | `description` | `str` | A more detailed explanation of the endpoint, which can include information about the operation's behavior, including side effects, error states, and other operational guidelines. | | `responses` | `Dict[int, Dict[str, OpenAPIResponse]]` | A dictionary that maps each HTTP status code to a Response Object as defined by the [OpenAPI Specification](https://swagger.io/specification/#response-object). This allows you to describe expected responses, including default or error messages, and their corresponding schemas or models for different status codes. 
| | `response_description` | `str` | Provides the default textual description of the response sent by the endpoint when the operation is successful. It is intended to give a human-readable understanding of the result. | | `tags` | `List[str]` | Tags are a way to categorize and group endpoints within the API documentation. They can help organize the operations by resources or other heuristic. | | `operation_id` | `str` | A unique identifier for the operation, which can be used for referencing this operation in documentation or code. This ID must be unique across all operations described in the API. | | `include_in_schema` | `bool` | A boolean value that determines whether or not this operation should be included in the OpenAPI schema. Setting it to `False` can hide the endpoint from generated documentation and schema exports, which might be useful for private or experimental endpoints. | | `deprecated` | `bool` | A boolean value that determines whether or not this operation should be marked as deprecated in the OpenAPI schema. | To implement these customizations, include extra parameters when defining your routes: ``` import requests from typing_extensions import Annotated from aws_lambda_powertools.event_handler import BedrockAgentResolver from aws_lambda_powertools.event_handler.openapi.params import Body, Path from aws_lambda_powertools.utilities.typing import LambdaContext app = BedrockAgentResolver() @app.get( "/todos/", summary="Retrieves a TODO item, returning it's title", description="Loads a TODO item identified by the `todo_id`", response_description="The TODO title", responses={ 200: {"description": "TODO item found"}, 404: { "description": "TODO not found", }, }, tags=["todos"], ) def get_todo_title( todo_id: Annotated[int, Path(description="The ID of the TODO item from which to retrieve the title")], ) -> Annotated[str, Body(description="The TODO title")]: todo = requests.get(f"https://jsonplaceholder.typicode.com/todos/{todo_id}") todo.raise_for_status() return todo.json()["title"] def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` #### Enabling user confirmation You can enable user confirmation with Bedrock Agents to have your application ask for explicit user approval before invoking an action. ``` from time import time from aws_lambda_powertools import Logger from aws_lambda_powertools.event_handler import BedrockAgentResolver from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() app = BedrockAgentResolver() @app.get( "/current_time", description="Gets the current time in seconds", openapi_extensions={"x-requireConfirmation": "ENABLED"}, # (1)! ) def current_time() -> int: return int(time()) @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) if __name__ == "__main__": print(app.get_openapi_json_schema()) ``` 1. Add an openapi extension ### Fine grained responses Note The default response only includes the essential fields to keep the payload size minimal, as AWS Lambda has a maximum response size of 25 KB. You can use `BedrockResponse` class to add additional fields as needed, such as [session attributes, prompt session attributes, and knowledge base configurations](https://docs.aws.amazon.com/bedrock/latest/userguide/agents-lambda.html#agents-lambda-response). 
```
from http import HTTPStatus

from aws_lambda_powertools import Logger, Tracer
from aws_lambda_powertools.event_handler import BedrockAgentResolver
from aws_lambda_powertools.event_handler.api_gateway import BedrockResponse
from aws_lambda_powertools.utilities.typing import LambdaContext

tracer = Tracer()
logger = Logger()
app = BedrockAgentResolver()


@app.get("/return_with_session", description="Returns a hello world with session attributes")
@tracer.capture_method
def hello_world():
    return BedrockResponse(
        status_code=HTTPStatus.OK.value,
        body={"message": "Hello from Bedrock!"},
        session_attributes={"user_id": "123"},
        prompt_session_attributes={"context": "testing"},
        knowledge_bases_configuration=[
            {
                "knowledgeBaseId": "kb-123",
                "retrievalConfiguration": {
                    "vectorSearchConfiguration": {"numberOfResults": 3, "overrideSearchType": "HYBRID"},
                },
            },
        ],
    )


@logger.inject_lambda_context
@tracer.capture_lambda_handler
def lambda_handler(event: dict, context: LambdaContext):
    return app.resolve(event, context)
```

## Testing your code

Test your routes by passing an [Agent for Amazon Bedrock proxy event](https://docs.aws.amazon.com/bedrock/latest/userguide/agents-lambda.html#agents-lambda-input) request:

```
from dataclasses import dataclass

import assert_bedrock_agent_response_module
import pytest


@dataclass
class LambdaContext:
    function_name: str = "test"
    memory_limit_in_mb: int = 128
    invoked_function_arn: str = "arn:aws:lambda:eu-west-1:123456789012:function:test"
    aws_request_id: str = "da658bd3-2d6f-4e7b-8ec2-937234644fdc"


@pytest.fixture
def lambda_context() -> LambdaContext:
    return LambdaContext()


def test_lambda_handler(lambda_context: LambdaContext):
    minimal_event = {
        "apiPath": "/current_time",
        "httpMethod": "GET",
        "inputText": "What is the current time?",
    }
    # Example of Bedrock Agent API request event:
    # https://docs.aws.amazon.com/bedrock/latest/userguide/agents-lambda.html#agents-lambda-input
    ret = assert_bedrock_agent_response_module.lambda_handler(minimal_event, lambda_context)

    assert ret["response"]["httpStatusCode"] == 200
    assert ret["response"]["responseBody"]["application/json"]["body"] != ""
```

```
import time

from typing_extensions import Annotated

from aws_lambda_powertools import Logger, Tracer
from aws_lambda_powertools.event_handler import BedrockAgentResolver
from aws_lambda_powertools.event_handler.openapi.params import Body
from aws_lambda_powertools.utilities.typing import LambdaContext

tracer = Tracer()
logger = Logger()
app = BedrockAgentResolver()


@app.get("/current_time", description="Gets the current time")
@tracer.capture_method
def current_time() -> Annotated[int, Body(description="Current time in milliseconds")]:
    return round(time.time() * 1000)


@logger.inject_lambda_context
@tracer.capture_lambda_handler
def lambda_handler(event: dict, context: LambdaContext) -> dict:
    return app.resolve(event, context)
```

Metrics creates custom metrics asynchronously by logging metrics to standard output following [Amazon CloudWatch Embedded Metric Format (EMF)](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Embedded_Metric_Format.html). These metrics can be visualized through [Amazon CloudWatch Console](https://console.aws.amazon.com/cloudwatch/).
## Key features - Aggregate up to 100 metrics using a single CloudWatch EMF object (large JSON blob) - Validate against common metric definitions mistakes (metric unit, values, max dimensions, max metrics, etc) - Metrics are created asynchronously by CloudWatch service, no custom stacks needed - Context manager to create a one off metric with a different dimension ## Terminologies If you're new to Amazon CloudWatch, there are five terminologies you must be aware of before using this utility: - **Namespace**. It's the highest level container that will group multiple metrics from multiple services for a given application, for example `ServerlessEcommerce`. - **Dimensions**. Metrics metadata in key-value format. They help you slice and dice metrics visualization, for example `ColdStart` metric by Payment `service`. - **Metric**. It's the name of the metric, for example: `SuccessfulBooking` or `UpdatedBooking`. - **Unit**. It's a value representing the unit of measure for the corresponding metric, for example: `Count` or `Seconds`. - **Resolution**. It's a value representing the storage resolution for the corresponding metric. Metrics can be either Standard or High resolution. Read more [here](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/publishingMetrics.html#high-resolution-metrics). Metric terminology, visually explained ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). Metric has two global settings that will be used across all metrics emitted: | Setting | Description | Environment variable | Constructor parameter | | --- | --- | --- | --- | | **Metric namespace** | Logical container where all metrics will be placed e.g. `ServerlessAirline` | `POWERTOOLS_METRICS_NAMESPACE` | `namespace` | | **Service** | Optionally, sets **service** metric dimension across all metrics e.g. `payment` | `POWERTOOLS_SERVICE_NAME` | `service` | Info `POWERTOOLS_METRICS_DISABLED` will not disable default metrics created by AWS services. Tip Use your application or main service as the metric namespace to easily group all metrics. ``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: Powertools for AWS Lambda (Python) version Globals: Function: Timeout: 5 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_SERVICE_NAME: booking POWERTOOLS_METRICS_NAMESPACE: ServerlessAirline POWERTOOLS_METRICS_FUNCTION_NAME: my-function-name Layers: # Find the latest Layer version in the official documentation # https://docs.powertools.aws.dev/lambda/python/latest/#lambda-layer - !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15 Resources: CaptureLambdaHandlerExample: Type: AWS::Serverless::Function Properties: CodeUri: ../src Handler: capture_lambda_handler.handler ``` Note For brevity, all code snippets in this page will rely on environment variables above being set. This ensures we instantiate `metrics = Metrics()` over `metrics = Metrics(service="booking", namespace="ServerlessAirline")`, etc. ### Creating metrics You can create metrics using `add_metric`, and you can create dimensions for all your aggregate metrics using `add_dimension` method. Tip You can initialize Metrics in any other module too. It'll keep track of your aggregate metrics in memory to optimize costs (one blob instead of multiples). 
```
from aws_lambda_powertools import Metrics
from aws_lambda_powertools.metrics import MetricUnit
from aws_lambda_powertools.utilities.typing import LambdaContext

metrics = Metrics()


@metrics.log_metrics  # ensures metrics are flushed upon request completion/failure
def lambda_handler(event: dict, context: LambdaContext):
    metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1)
```

```
import os

from aws_lambda_powertools import Metrics
from aws_lambda_powertools.metrics import MetricUnit
from aws_lambda_powertools.utilities.typing import LambdaContext

STAGE = os.getenv("STAGE", "dev")
metrics = Metrics()


@metrics.log_metrics  # ensures metrics are flushed upon request completion/failure
def lambda_handler(event: dict, context: LambdaContext):
    metrics.add_dimension(name="environment", value=STAGE)
    metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1)
```

Tip: Autocomplete Metric Units
The `MetricUnit` enum facilitates finding a supported metric unit by CloudWatch. Alternatively, you can pass the value as a string if you already know it, *e.g. `unit="Count"`*.

Note: Metrics overflow
CloudWatch EMF supports a max of 100 metrics per batch. Metrics utility will flush all metrics when adding the 100th metric. Subsequent metrics (101st+) will be aggregated into a new EMF object, for your convenience.

Warning: Do not create metrics or dimensions outside the handler
Metrics or dimensions added in the global scope will only be added during cold start. Disregard if that's the intended behavior.

### Adding high-resolution metrics

You can create [high-resolution metrics](https://aws.amazon.com/about-aws/whats-new/2023/02/amazon-cloudwatch-high-resolution-metric-extraction-structured-logs/) by passing the `resolution` parameter to `add_metric`.

When is it useful? High-resolution metrics are data with a granularity of one second and are very useful in several situations such as telemetry, time series, real-time incident management, and others.

```
from aws_lambda_powertools import Metrics
from aws_lambda_powertools.metrics import MetricResolution, MetricUnit
from aws_lambda_powertools.utilities.typing import LambdaContext

metrics = Metrics()


@metrics.log_metrics  # ensures metrics are flushed upon request completion/failure
def lambda_handler(event: dict, context: LambdaContext):
    metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1, resolution=MetricResolution.High)
```

Tip: Autocomplete Metric Resolutions
The `MetricResolution` enum facilitates finding a supported metric resolution by CloudWatch. Alternatively, you can pass the values 1 or 60 (must be one of them) as an integer, *e.g. `resolution=1`*.

### Adding multi-value metrics

You can call `add_metric()` with the same metric name multiple times. The values will be grouped together in a list.
``` import os from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_dimension(name="environment", value=STAGE) metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=1) metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=8) ``` ``` { "_aws": { "Timestamp": 1656685750622, "CloudWatchMetrics": [ { "Namespace": "ServerlessAirline", "Dimensions": [ [ "environment", "service" ] ], "Metrics": [ { "Name": "TurbineReads", "Unit": "Count" } ] } ] }, "environment": "dev", "service": "booking", "TurbineReads": [ 1.0, 8.0 ] } ``` ### Adding default dimensions You can use `set_default_dimensions` method, or `default_dimensions` parameter in `log_metrics` decorator, to persist dimensions across Lambda invocations. If you'd like to remove them at some point, you can use `clear_default_dimensions` method. ``` import os from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") metrics = Metrics() metrics.set_default_dimensions(environment=STAGE, another="one") @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=1) metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=8) ``` ``` import os from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") metrics = Metrics() DEFAULT_DIMENSIONS = {"environment": STAGE, "another": "one"} # ensures metrics are flushed upon request completion/failure @metrics.log_metrics(default_dimensions=DEFAULT_DIMENSIONS) def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=1) metrics.add_metric(name="TurbineReads", unit=MetricUnit.Count, value=8) ``` **Note:** Dimensions with empty values will not be included. ### Changing default timestamp When creating metrics, we use the current timestamp. If you want to change the timestamp of all the metrics you create, utilize the `set_timestamp` function. You can specify a datetime object or an integer representing an epoch timestamp in milliseconds. Note that when specifying the timestamp using an integer, it must adhere to the epoch timezone format in milliseconds. Info If you need to use different timestamps across multiple metrics, opt for [single_metric](#working-with-different-timestamp). 
``` import datetime from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) metric_timestamp = int((datetime.datetime.now() - datetime.timedelta(days=2)).timestamp() * 1000) metrics.set_timestamp(metric_timestamp) ``` ### Flushing metrics As you finish adding all your metrics, you need to serialize and flush them to standard output. You can do that automatically with the `log_metrics` decorator. This decorator also **validates**, **serializes**, and **flushes** all your metrics. During metrics validation, if no metrics are provided then a warning will be logged, but no exception will be raised. ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) ``` ``` { "_aws": { "Timestamp": 1656686788803, "CloudWatchMetrics": [ { "Namespace": "ServerlessAirline", "Dimensions": [ [ "service" ] ], "Metrics": [ { "Name": "SuccessfulBooking", "Unit": "Count" } ] } ] }, "service": "booking", "SuccessfulBooking": [ 1.0 ] } ``` Tip: Metric validation If metrics are provided, and any of the following criteria are not met, **`SchemaValidationError`** exception will be raised: - Maximum of 29 user-defined dimensions - Namespace is set, and no more than one - Metric units must be [supported by CloudWatch](https://docs.aws.amazon.com/AmazonCloudWatch/latest/APIReference/API_MetricDatum.html) #### Raising SchemaValidationError on empty metrics If you want to ensure at least one metric is always emitted, you can pass `raise_on_empty_metrics` to the **log_metrics** decorator: ``` from aws_lambda_powertools.metrics import Metrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics(raise_on_empty_metrics=True) def lambda_handler(event: dict, context: LambdaContext): # no metrics being created will now raise SchemaValidationError ... ``` Suppressing warning messages on empty metrics If you expect your function to execute without publishing metrics every time, you can suppress the warning with **`warnings.filterwarnings("ignore", "No application metrics to publish*")`**. ### Capturing cold start metric You can optionally capture cold start metrics with `log_metrics` decorator via `capture_cold_start_metric` param. ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics(capture_cold_start_metric=True) def lambda_handler(event: dict, context: LambdaContext): ... 
``` ``` { "_aws": { "Timestamp": 1656687493142, "CloudWatchMetrics": [ { "Namespace": "ServerlessAirline", "Dimensions": [ [ "function_name", "service" ] ], "Metrics": [ { "Name": "ColdStart", "Unit": "Count" } ] } ] }, "function_name": "test", "service": "booking", "ColdStart": [ 1.0 ] } ``` If it's a cold start invocation, this feature will: - Create a separate EMF blob solely containing a metric named `ColdStart` - Add `function_name` and `service` dimensions This has the advantage of keeping cold start metric separate from your application metrics, where you might have unrelated dimensions. Info We do not emit 0 as a value for ColdStart metric for cost reasons. [Let us know](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=feature-request%2C+triage&template=feature_request.md&title=) if you'd prefer a flag to override it. #### Customizing function name for cold start metrics When emitting cold start metrics, the `function_name` dimension defaults to `context.function_name`. If you want to change the value you can set the `function_name` parameter in the metrics constructor, or define the environment variable `POWERTOOLS_METRICS_FUNCTION_NAME`. The priority of the `function_name` dimension value is defined as: 1. `function_name` constructor option 1. `POWERTOOLS_METRICS_FUNCTION_NAME` environment variable 1. `context.function_name` property ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics(function_name="my-function-name") @metrics.log_metrics(capture_cold_start_metric=True) def lambda_handler(event: dict, context: LambdaContext): ... ``` ### Environment variables The following environment variable is available to configure Metrics at a global scope: | Setting | Description | Environment variable | Default | | --- | --- | --- | --- | | **Namespace Name** | Sets **namespace** used for metrics. | `POWERTOOLS_METRICS_NAMESPACE` | `None` | | **Service** | Sets **service** metric dimension across all metrics e.g. `payment` | `POWERTOOLS_SERVICE_NAME` | `None` | | **Function Name** | Function name used as dimension for the **ColdStart** metric. | `POWERTOOLS_METRICS_FUNCTION_NAME` | `None` | | **Disable Powertools Metrics** | **Disables** all metrics emitted by Powertools. | `POWERTOOLS_METRICS_DISABLED` | `None` | `POWERTOOLS_METRICS_NAMESPACE` is also available on a per-instance basis with the `namespace` parameter, which will consequently override the environment variable value. ## Advanced ### Adding metadata You can add high-cardinality data as part of your Metrics log with `add_metadata` method. This is useful when you want to search highly contextual information along with your metrics in your logs. 
Info **This will not be available during metrics visualization** - Use **dimensions** for this purpose ``` from uuid import uuid4 from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) metrics.add_metadata(key="booking_id", value=f"{uuid4()}") ``` ``` { "_aws": { "Timestamp": 1656688250155, "CloudWatchMetrics": [ { "Namespace": "ServerlessAirline", "Dimensions": [ [ "service" ] ], "Metrics": [ { "Name": "SuccessfulBooking", "Unit": "Count" } ] } ] }, "service": "booking", "booking_id": "00347014-341d-4b8e-8421-a89d3d588ab3", "SuccessfulBooking": [ 1.0 ] } ``` ### Single metric CloudWatch EMF uses the same dimensions and timestamp across all your metrics. Use `single_metric` if you have a metric that should have different dimensions or timestamp. #### Working with different dimensions Generally, using different dimensions would be an edge case since you [pay for unique metric](https://aws.amazon.com/cloudwatch/pricing). Keep the following formula in mind: **unique metric = (metric_name + dimension_name + dimension_value)** ``` import os from aws_lambda_powertools import single_metric from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") def lambda_handler(event: dict, context: LambdaContext): with single_metric(name="MySingleMetric", unit=MetricUnit.Count, value=1) as metric: metric.add_dimension(name="environment", value=STAGE) ``` ``` { "_aws": { "Timestamp": 1656689267834, "CloudWatchMetrics": [ { "Namespace": "ServerlessAirline", "Dimensions": [ [ "environment", "service" ] ], "Metrics": [ { "Name": "MySingleMetric", "Unit": "Count" } ] } ] }, "environment": "dev", "service": "booking", "MySingleMetric": [ 1.0 ] } ``` By default it will skip all previously defined dimensions including default dimensions. Use `default_dimensions` keyword argument if you want to reuse default dimensions or specify custom dimensions from a dictionary. ``` import os from aws_lambda_powertools import single_metric from aws_lambda_powertools.metrics import Metrics, MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") metrics = Metrics() metrics.set_default_dimensions(environment=STAGE) def lambda_handler(event: dict, context: LambdaContext): with single_metric( name="RecordsCount", unit=MetricUnit.Count, value=10, default_dimensions=metrics.default_dimensions, ) as metric: metric.add_dimension(name="TableName", value="Users") ``` ``` import os from aws_lambda_powertools import single_metric from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext STAGE = os.getenv("STAGE", "dev") def lambda_handler(event: dict, context: LambdaContext): with single_metric( name="RecordsCount", unit=MetricUnit.Count, value=10, default_dimensions={"environment": STAGE}, ) as metric: metric.add_dimension(name="TableName", value="Users") ``` #### Working with different timestamp When working with multiple metrics, customers may need different timestamps between them. In such cases, utilize `single_metric` to flush individual metrics with specific timestamps. 
``` from aws_lambda_powertools import Logger, single_metric from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def lambda_handler(event: dict, context: LambdaContext): for record in event: record_id: str = record.get("record_id") amount: int = record.get("amount") timestamp: int = record.get("timestamp") with single_metric(name="Orders", unit=MetricUnit.Count, value=amount, namespace="Powertools") as metric: logger.info(f"Processing record id {record_id}") metric.set_timestamp(timestamp) ``` ``` [ { "record_id": "6ba7b810-9dad-11d1-80b4-00c04fd430c8", "amount": 10, "timestamp": 1648195200000 }, { "record_id": "6ba7b811-9dad-11d1-80b4-00c04fd430c8", "amount": 30, "timestamp": 1648224000000 }, { "record_id": "6ba7b812-9dad-11d1-80b4-00c04fd430c8", "amount": 25, "timestamp": 1648209600000 }, { "record_id": "6ba7b813-9dad-11d1-80b4-00c04fd430c8", "amount": 40, "timestamp": 1648177200000 }, { "record_id": "6ba7b814-9dad-11d1-80b4-00c04fd430c8", "amount": 32, "timestamp": 1648216800000 } ] ``` ### Flushing metrics manually If you are using the [AWS Lambda Web Adapter](https://github.com/awslabs/aws-lambda-web-adapter) project, or a middleware with custom metric logic, you can use `flush_metrics()`. This method will serialize and print metrics to standard output, and clear in-memory metrics data. Warning This does not capture Cold Start metrics, and metric data validation still applies. Contrary to the `log_metrics` decorator, you are now also responsible for flushing metrics in the event of an exception. ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() def book_flight(flight_id: str, **kwargs): # logic to book flight ... metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) def lambda_handler(event: dict, context: LambdaContext): try: book_flight(flight_id=event.get("flight_id", "")) finally: metrics.flush_metrics() ``` ### Metrics isolation You can use the `EphemeralMetrics` class when looking to isolate multiple instances of metrics with distinct namespaces and/or dimensions. A typical use case is multi-tenancy, or emitting the same metrics for distinct applications. ``` from aws_lambda_powertools.metrics import EphemeralMetrics, MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = EphemeralMetrics() @metrics.log_metrics def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) ``` **Differences between `EphemeralMetrics` and `Metrics`** `EphemeralMetrics` has only one difference while keeping nearly the exact same set of features: | Feature | Metrics | EphemeralMetrics | | --- | --- | --- | | **Share data across instances** (metrics, dimensions, metadata, etc.) | Yes | - | Why not change the default `Metrics` behaviour to not share data across instances? This is an intentional design to prevent accidental data deduplication or data loss issues due to the [CloudWatch EMF](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Embedded_Metric_Format_Specification.html) metric dimension constraint.
In CloudWatch, there are two metric ingestion mechanisms: [EMF (async)](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Embedded_Metric_Format_Specification.html) and [`PutMetricData` API (sync)](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/cloudwatch.html#CloudWatch.Client.put_metric_data). The former creates metrics asynchronously via CloudWatch Logs, and the latter uses a synchronous and more flexible ingestion API. Key concept CloudWatch [considers a metric unique](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/cloudwatch_concepts.html#Metric) by a combination of metric **name**, metric **namespace**, and zero or more metric **dimensions**. With EMF, metric dimensions are shared with any metrics you define. With the `PutMetricData` API, you can set a [list](https://docs.aws.amazon.com/AmazonCloudWatch/latest/APIReference/API_MetricDatum.html) defining one or more metrics with distinct dimensions. This is a subtle yet important distinction. Imagine you had the following metrics to emit: | Metric Name | Dimension | Intent | | --- | --- | --- | | **SuccessfulBooking** | service="booking", **tenant_id**="sample" | Application metric | | **IntegrationLatency** | service="booking", function_name="sample" | Operational metric | | **ColdStart** | service="booking", function_name="sample" | Operational metric | The `tenant_id` dimension could vary, leading to two common issues: 1. `ColdStart` metric will be created multiple times (N * number of unique tenant_id dimension values), despite the `function_name` being the same 1. `IntegrationLatency` metric will also be created multiple times due to `tenant_id` as well as `function_name` (which may or may not be intentional) These issues are exacerbated when you create **(A)** metric dimensions conditionally, **(B)** multiple metrics instances throughout your code instead of reusing them (globals). Subsequent metrics instances will have (or lack) different metric dimensions, resulting in different metrics and data points with the same name. Intentional design to address these scenarios **On 1**, when you enable the [capture_cold_start_metric feature](#capturing-cold-start-metric), we transparently create and flush an additional EMF JSON Blob that is independent from your application metrics. This prevents data pollution. **On 2**, you can use `EphemeralMetrics` to create an additional EMF JSON Blob from your application metric (`SuccessfulBooking`). This ensures that `IntegrationLatency` operational metric data points aren't tied to any dynamic dimension values like `tenant_id`. That is why `Metrics` shares data across instances by default, as that covers 80% of use cases and different personas using Powertools. This allows them to instantiate `Metrics` in multiple places throughout their code - be it a separate file, a middleware, or an abstraction that sets default dimensions. ### Observability providers > An observability provider is an [AWS Lambda Partner](https://docs.aws.amazon.com/lambda/latest/dg/extensions-api-partners.html) offering a platform for logging, metrics, traces, etc. We provide a thin wrapper on top of the most requested observability providers. We strive to keep the UX as similar as possible while keeping our value-add features. Missing your preferred provider? Please create a [feature request](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=feature-request%2Ctriage&projects=&template=feature_request.yml&title=Feature+request%3A+TITLE).
Current providers: | Provider | Notes | | --- | --- | | [Datadog](./datadog) | Uses Datadog SDK and Datadog Lambda Extension by default | ## Testing your code ### Setting environment variables Tip Ignore this section, if: - You are explicitly setting namespace/default dimension via `namespace` and `service` parameters - You're not instantiating `Metrics` in the global namespace For example, `Metrics(namespace="ServerlessAirline", service="booking")` Make sure to set `POWERTOOLS_METRICS_NAMESPACE` and `POWERTOOLS_SERVICE_NAME` before running your tests to prevent failing on `SchemaValidation` exception. You can set it before you run tests or via pytest plugins like [dotenv](https://pypi.org/project/pytest-dotenv/). ``` POWERTOOLS_SERVICE_NAME="booking" POWERTOOLS_METRICS_NAMESPACE="ServerlessAirline" python -m pytest ``` ### Clearing metrics `Metrics` keep metrics in memory across multiple instances. If you need to test this behavior, you can use the following Pytest fixture to ensure metrics are reset incl. cold start: ``` import pytest from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics.provider import cold_start @pytest.fixture(scope="function", autouse=True) def reset_metric_set(): # Clear out every metric data prior to every test metrics = Metrics() metrics.clear_metrics() cold_start.is_cold_start = True # ensure each test has cold start metrics.clear_default_dimensions() # remove persisted default dimensions, if any yield ``` ### Functional testing You can read standard output and assert whether metrics have been flushed. Here's an example using `pytest` with `capsys` built-in fixture: ``` import json import add_metrics def test_log_metrics(capsys): add_metrics.lambda_handler({}, {}) log = capsys.readouterr().out.strip() # remove any extra line metrics_output = json.loads(log) # deserialize JSON str # THEN we should have no exceptions # and a valid EMF object should be flushed correctly assert "SuccessfulBooking" in log # basic string assertion in JSON str assert "SuccessfulBooking" in metrics_output["_aws"]["CloudWatchMetrics"][0]["Metrics"][0]["Name"] ``` ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) ``` This will be needed when using `capture_cold_start_metric=True`, or when both `Metrics` and `single_metric` are used. 
``` import json from dataclasses import dataclass import assert_multiple_emf_blobs_module import pytest @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test" aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72" @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def capture_metrics_output_multiple_emf_objects(capsys): return [json.loads(line.strip()) for line in capsys.readouterr().out.split("\n") if line] def test_log_metrics(capsys, lambda_context: LambdaContext): assert_multiple_emf_blobs_module.lambda_handler({}, lambda_context) cold_start_blob, custom_metrics_blob = capture_metrics_output_multiple_emf_objects(capsys) # Since `capture_cold_start_metric` is used # we should have one JSON blob for cold start metric and one for the application assert cold_start_blob["ColdStart"] == [1.0] assert cold_start_blob["function_name"] == "test" assert "SuccessfulBooking" in custom_metrics_blob ``` ``` from aws_lambda_powertools import Metrics from aws_lambda_powertools.metrics import MetricUnit from aws_lambda_powertools.utilities.typing import LambdaContext metrics = Metrics() @metrics.log_metrics(capture_cold_start_metric=True) def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", unit=MetricUnit.Count, value=1) ``` Tip For more elaborate assertions and comparisons, check out [our functional testing for Metrics utility.](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/tests/functional/metrics/required_dependencies/test_metrics_cloudwatch_emf.py)
# Datadog This observability provider creates custom metrics by flushing metrics to [Datadog Lambda extension](https://docs.datadoghq.com/serverless/installation/python/?tab=datadogcli), or to standard output via [Datadog Forwarder](https://docs.datadoghq.com/logs/guide/forwarder/?tab=cloudformation). These metrics can be visualized in the [Datadog console](https://app.datadoghq.com/metric/explore). ``` stateDiagram-v2 direction LR LambdaFn: Your Lambda function LambdaCode: DatadogMetrics DatadogSDK: Datadog SDK DatadogExtension: Datadog Lambda Extension Datadog: Datadog Dashboard LambdaExtension: Lambda Extension LambdaFn --> LambdaCode LambdaCode --> DatadogSDK DatadogSDK --> DatadogExtension DatadogExtension --> Datadog: async state LambdaExtension { DatadogExtension } ``` ## Key features - Flush metrics to Datadog extension or standard output - Validate against common metric definition mistakes - Support for adding default tags ## Terminologies If you're new to Datadog Metrics, there are three terms you must be aware of before using this utility: - **Namespace**. It's the highest level container that will group multiple metrics from multiple services for a given application, for example `ServerlessEcommerce`. - **Metric**. It's the name of the metric, for example: SuccessfulBooking or UpdatedBooking. - **Tags**. Metrics metadata in key-value pair format. They help provide contextual information, and filter or organize metrics. You can read more details in the [Datadog official documentation](https://docs.datadoghq.com/metrics/custom_metrics/). ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). ### Install > **Using Datadog Forwarder?** You can skip this step.
We recommend using [Datadog SDK](https://docs.datadoghq.com/serverless/installation/python/) and Datadog Lambda Extension with this feature for optimal results. For Datadog SDK, you can add `aws-lambda-powertools[datadog]` as a dependency in your preferred tool, or as a Lambda Layer in the following example: ``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: Powertools for AWS Lambda (Python) version Globals: Function: Timeout: 5 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_METRICS_NAMESPACE: ServerlessAirline # [Production setup] # DATADOG_API_KEY_SECRET_ARN: "" # [Development only] DD_API_KEY: "" # Configuration details: https://docs.datadoghq.com/serverless/installation/python/?tab=datadogcli DD_SITE: datadoghq.com Layers: # Find the latest Layer version in the official documentation # https://docs.powertools.aws.dev/lambda/python/latest/#lambda-layer - !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPythonV3-python312-x86_64:15 # Find the latest Layer version in the Datadog official documentation # Datadog SDK # Latest versions: https://github.com/DataDog/datadog-lambda-python/releases - !Sub arn:aws:lambda:${AWS::Region}:464622532012:layer:Datadog-Python312:78 # Datadog Lambda Extension # Latest versions: https://github.com/DataDog/datadog-lambda-extension/releases - !Sub arn:aws:lambda:${AWS::Region}:464622532012:layer:Datadog-Extension:45 Resources: CaptureLambdaHandlerExample: Type: AWS::Serverless::Function Properties: CodeUri: ../src Handler: capture_lambda_handler.handler ``` ### Creating metrics You can create metrics using `add_metric`. By default, we will generate the current timestamp for you. Alternatively, you can use the `timestamp` parameter to set a custom one in epoch time. ``` from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", value=1) ``` ``` import time from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", value=1, timestamp=int(time.time())) ``` Warning: Do not create metrics outside the handler Metrics added in the global scope will only be added during cold start. Disregard if that's the intended behavior. ### Adding tags You can add any number of tags to your metrics via keyword arguments (`key=value`). They are helpful to filter, organize, and aggregate your metrics later. We will emit a warning for tags [beyond the 200 chars limit](https://docs.datadoghq.com/getting_started/tagging/).
``` from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", value=1, tag1="powertools", tag2="python") ``` ### Adding default tags You can persist tags across Lambda invocations and `DatadogMetrics` instances via `set_default_tags` method, or `default_tags` parameter in the `log_metrics` decorator. If you'd like to remove them at some point, you can use the `clear_default_tags` method. Metric tag takes precedence over default tags of the same name When adding tags with the same name via `add_metric` and `set_default_tags`, `add_metric` takes precedence. ``` from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics() metrics.set_default_tags(tag1="powertools", tag2="python") @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", value=1) ``` ``` from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics() default_tags = {"tag1": "powertools", "tag2": "python"} @metrics.log_metrics(default_tags=default_tags) # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", value=1) ``` ### Flushing metrics Use `log_metrics` decorator to automatically serialize and flush your metrics (SDK or Forwarder) at the end of your invocation. This decorator also ensures metrics are flushed in the event of an exception, including warning you in case you forgot to add metrics. ``` from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", value=1, tag1="powertools", tag2="python") ``` ``` { "m":"SuccessfulBooking", "v":1, "e":1691707076, "t":[ "tag1:powertools", "tag2:python" ] } ``` #### Raising SchemaValidationError on empty metrics Use `raise_on_empty_metrics=True` if you want to ensure at least one metric is always emitted. ``` from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics() @metrics.log_metrics(raise_on_empty_metrics=True) # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): # no metrics being created will now raise SchemaValidationError return ``` Suppressing warning messages on empty metrics If you expect your function to execute without publishing metrics every time, you can suppress the warning with **`warnings.filterwarnings("ignore", "No application metrics to publish*")`**. ### Capturing cold start metric You can optionally capture cold start metrics with `log_metrics` decorator via `capture_cold_start_metric` param. 
``` from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics() @metrics.log_metrics(capture_cold_start_metric=True) def lambda_handler(event: dict, context: LambdaContext): return ``` ``` { "m":"ColdStart", "v":1, "e":1691707488, "t":[ "function_name:HelloWorldFunction" ] } ``` If it's a cold start invocation, this feature will: - Create a separate Datadog metric solely containing a metric named `ColdStart` - Add `function_name` metric tag This has the advantage of keeping cold start metric separate from your application metrics, where you might have unrelated tags. Info We do not emit 0 as a value for ColdStart metric for cost reasons. [Let us know](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=feature-request%2C+triage&template=feature_request.md&title=) if you'd prefer a flag to override it. ### Environment variables You can use any of the following environment variables to configure `DatadogMetrics`: | Setting | Description | Environment variable | Constructor parameter | | --- | --- | --- | --- | | **Metric namespace** | Logical container where all metrics will be placed e.g. `ServerlessAirline` | `POWERTOOLS_METRICS_NAMESPACE` | `namespace` | | **Flush to log** | Use this when you want to flush metrics to be exported through Datadog Forwarder | `DD_FLUSH_TO_LOG` | `flush_to_log` | | **Disable Powertools Metrics** | Optionally, disables all Powertools metrics. | `POWERTOOLS_METRICS_DISABLED` | N/A | Info `POWERTOOLS_METRICS_DISABLED` will not disable default metrics created by AWS services. ## Advanced ### Flushing metrics manually If you are using the [AWS Lambda Web Adapter](https://github.com/awslabs/aws-lambda-web-adapter) project, or a middleware with custom metric logic, you can use `flush_metrics()`. This method will serialize, print metrics available to standard output, and clear in-memory metrics data. Warning This does not capture Cold Start metrics, and metric data validation still applies. Contrary to the `log_metrics` decorator, you are now also responsible to flush metrics in the event of an exception. ``` from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics() def book_flight(flight_id: str, **kwargs): # logic to book flight ... metrics.add_metric(name="SuccessfulBooking", value=1) def lambda_handler(event: dict, context: LambdaContext): try: book_flight(flight_id=event.get("flight_id", "")) finally: metrics.flush_metrics() ``` ### Integrating with Datadog Forwarder Use `flush_to_log=True` in `DatadogMetrics` to integrate with the legacy [Datadog Forwarder](https://docs.datadoghq.com/logs/guide/forwarder/?tab=cloudformation). This will serialize and flush metrics to standard output. 
``` from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics(flush_to_log=True) @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", value=1) ``` ``` { "m":"SuccessfulBooking", "v":1, "e":1691768022, "t":[ ] } ``` ## Testing your code ### Setting environment variables Tip Ignore this section, if: - You are explicitly setting namespace via `namespace` parameter - You're not instantiating `DatadogMetrics` in the global namespace For example, `DatadogMetrics(namespace="ServerlessAirline")` Make sure to set `POWERTOOLS_METRICS_NAMESPACE` before running your tests to prevent failing on `SchemaValidation` exception. You can set it before you run tests or via pytest plugins like [dotenv](https://pypi.org/project/pytest-dotenv/). ``` POWERTOOLS_METRICS_NAMESPACE="ServerlessAirline" DD_FLUSH_TO_LOG="True" python -m pytest # (1)! ``` 1. **`DD_FLUSH_TO_LOG=True`** makes it easier to test by flushing final metrics to standard output. ### Clearing metrics `DatadogMetrics` keep metrics in memory across multiple instances. If you need to test this behavior, you can use the following Pytest fixture to ensure metrics are reset incl. cold start: ``` import pytest from aws_lambda_powertools.metrics.provider import cold_start from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics @pytest.fixture(scope="function", autouse=True) def reset_metric_set(): # Clear out every metric data prior to every test metrics = DatadogMetrics() metrics.clear_metrics() cold_start.is_cold_start = True # ensure each test has cold start yield ``` ### Functional testing You can read standard output and assert whether metrics have been flushed. Here's an example using `pytest` with `capsys` built-in fixture: ``` import add_datadog_metrics def test_log_metrics(capsys): add_datadog_metrics.lambda_handler({}, {}) log = capsys.readouterr().out.strip() # remove any extra line assert "SuccessfulBooking" in log # basic string assertion in JSON str ``` ``` from aws_lambda_powertools.metrics.provider.datadog import DatadogMetrics from aws_lambda_powertools.utilities.typing import LambdaContext metrics = DatadogMetrics() @metrics.log_metrics # ensures metrics are flushed upon request completion/failure def lambda_handler(event: dict, context: LambdaContext): metrics.add_metric(name="SuccessfulBooking", value=1) ``` Tip For more elaborate assertions and comparisons, check out [our functional testing for DatadogMetrics utility.](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/tests/functional/metrics/datadog/test_metrics_datadog.py) # Utilities The batch processing utility handles partial failures when processing batches from Amazon SQS, Amazon Kinesis Data Streams, and Amazon DynamoDB Streams. ``` stateDiagram-v2 direction LR BatchSource: Amazon SQS

Amazon Kinesis Data Streams
Amazon DynamoDB Streams
LambdaInit: Lambda invocation BatchProcessor: Batch Processor RecordHandler: Record Handler function YourLogic: Your logic to process each batch item LambdaResponse: Lambda response BatchSource --> LambdaInit LambdaInit --> BatchProcessor BatchProcessor --> RecordHandler state BatchProcessor { [*] --> RecordHandler: Your function RecordHandler --> YourLogic } RecordHandler --> BatchProcessor: Collect results BatchProcessor --> LambdaResponse: Report items that failed processing ``` ## Key features - Reports batch item failures to reduce the number of retries for a record upon errors - Simple interface to process each batch record - Integrates with [Event Source Data Classes](../data_classes/) and [Parser (Pydantic)](../parser/) for self-documenting record schema - Build your own batch processor by extending primitives ## Background When using SQS, Kinesis Data Streams, or DynamoDB Streams as a Lambda event source, your Lambda functions are triggered with a batch of messages. If your function fails to process any message from the batch, the entire batch returns to your queue or stream. This same batch is then retried until one of these conditions happens first: **a)** your Lambda function returns a successful response, **b)** records reach the maximum retry attempts, or **c)** records expire. ``` journey section Conditions Successful response: 5: Success Maximum retries: 3: Failure Records expired: 1: Failure ``` This behavior changes when you enable the Report Batch Item Failures feature in your Lambda function event source configuration: - [**SQS queues**](#sqs-standard). Only messages reported as failure will return to the queue for a retry, while successful ones will be deleted. - [**Kinesis data streams**](#kinesis-and-dynamodb-streams) and [**DynamoDB streams**](#kinesis-and-dynamodb-streams). A single reported failure will use its sequence number as the stream checkpoint. Multiple reported failures will use the lowest sequence number as the checkpoint. Warning: This utility lowers the chance of processing records more than once; it does not guarantee it We recommend implementing processing logic in an [idempotent manner](../idempotency/) wherever possible. You can find more details on how Lambda works with either [SQS](https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html), [Kinesis](https://docs.aws.amazon.com/lambda/latest/dg/with-kinesis.html), or [DynamoDB](https://docs.aws.amazon.com/lambda/latest/dg/with-ddb.html) in the AWS Documentation. ## Getting started For this feature to work, you need to **(1)** configure your Lambda function event source to use `ReportBatchItemFailures`, and **(2)** return [a specific response](https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html#services-sqs-batchfailurereporting) to report which records failed to be processed. You use your preferred deployment framework to set the correct configuration, while this utility handles the correct response to be returned. ### Required resources The remaining sections of the documentation will rely on these samples. For completeness, this demonstrates IAM permissions and a Dead Letter Queue where batch records will be sent after 2 retries have been attempted. You do not need any additional IAM permissions to use this utility, except for what each event source requires.
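Before the resource templates below, here is a minimal illustration of the specific response mentioned in step **(2)** above; this utility assembles it for you, and the identifier here is only a placeholder (real sample responses appear later in this section):

```
{
    "batchItemFailures": [
        {
            "itemIdentifier": "<messageId or sequence number of the failed record>"
        }
    ]
}
```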
``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: partial batch response sample Globals: Function: Timeout: 5 MemorySize: 256 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_LOG_LEVEL: INFO POWERTOOLS_SERVICE_NAME: hello Resources: HelloWorldFunction: Type: AWS::Serverless::Function Properties: Handler: app.lambda_handler CodeUri: hello_world Policies: - SQSPollerPolicy: QueueName: !GetAtt SampleQueue.QueueName Events: Batch: Type: SQS Properties: Queue: !GetAtt SampleQueue.Arn FunctionResponseTypes: - ReportBatchItemFailures SampleDLQ: Type: AWS::SQS::Queue SampleQueue: Type: AWS::SQS::Queue Properties: VisibilityTimeout: 30 # Fn timeout * 6 SqsManagedSseEnabled: true RedrivePolicy: maxReceiveCount: 2 deadLetterTargetArn: !GetAtt SampleDLQ.Arn ``` ``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: partial batch response sample Globals: Function: Timeout: 5 MemorySize: 256 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_LOG_LEVEL: INFO POWERTOOLS_SERVICE_NAME: hello Resources: HelloWorldFunction: Type: AWS::Serverless::Function Properties: Handler: app.lambda_handler CodeUri: hello_world Policies: # Lambda Destinations require additional permissions # to send failure records to DLQ from Kinesis/DynamoDB - Version: "2012-10-17" Statement: Effect: "Allow" Action: - sqs:GetQueueAttributes - sqs:GetQueueUrl - sqs:SendMessage Resource: !GetAtt SampleDLQ.Arn Events: KinesisStream: Type: Kinesis Properties: Stream: !GetAtt SampleStream.Arn BatchSize: 100 StartingPosition: LATEST MaximumRetryAttempts: 2 DestinationConfig: OnFailure: Destination: !GetAtt SampleDLQ.Arn FunctionResponseTypes: - ReportBatchItemFailures SampleDLQ: Type: AWS::SQS::Queue SampleStream: Type: AWS::Kinesis::Stream Properties: ShardCount: 1 StreamEncryption: EncryptionType: KMS KeyId: alias/aws/kinesis ``` ``` AWSTemplateFormatVersion: '2010-09-09' Transform: AWS::Serverless-2016-10-31 Description: partial batch response sample Globals: Function: Timeout: 5 MemorySize: 256 Runtime: python3.12 Tracing: Active Environment: Variables: POWERTOOLS_LOG_LEVEL: INFO POWERTOOLS_SERVICE_NAME: hello Resources: HelloWorldFunction: Type: AWS::Serverless::Function Properties: Handler: app.lambda_handler CodeUri: hello_world Policies: # Lambda Destinations require additional permissions # to send failure records from Kinesis/DynamoDB - Version: "2012-10-17" Statement: Effect: "Allow" Action: - sqs:GetQueueAttributes - sqs:GetQueueUrl - sqs:SendMessage Resource: !GetAtt SampleDLQ.Arn Events: DynamoDBStream: Type: DynamoDB Properties: Stream: !GetAtt SampleTable.StreamArn StartingPosition: LATEST MaximumRetryAttempts: 2 DestinationConfig: OnFailure: Destination: !GetAtt SampleDLQ.Arn FunctionResponseTypes: - ReportBatchItemFailures SampleDLQ: Type: AWS::SQS::Queue SampleTable: Type: AWS::DynamoDB::Table Properties: BillingMode: PAY_PER_REQUEST AttributeDefinitions: - AttributeName: pk AttributeType: S - AttributeName: sk AttributeType: S KeySchema: - AttributeName: pk KeyType: HASH - AttributeName: sk KeyType: RANGE SSESpecification: SSEEnabled: true StreamSpecification: StreamViewType: NEW_AND_OLD_IMAGES ``` ### Processing messages from SQS Processing batches from SQS works in three stages: 1. Instantiate **`BatchProcessor`** and choose **`EventType.SQS`** for the event type 1. Define your function to handle each batch record, and use [`SQSRecord`](../data_classes/#sqs) type annotation for autocompletion 1. 
Use **`process_partial_response`** to kick off processing This code example uses Tracer and Logger for completion. ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import ( BatchProcessor, EventType, process_partial_response, ) from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.typing import LambdaContext processor = BatchProcessor(event_type=EventType.SQS) # (1)! tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: SQSRecord): # (2)! payload: str = record.json_body # if json string data, otherwise record.body for str logger.info(payload) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): return process_partial_response( # (3)! event=event, record_handler=record_handler, processor=processor, context=context, ) ``` 1. **Step 1**. Creates a partial failure batch processor for SQS queues. See [partial failure mechanics for details](#partial-failure-mechanics) 1. **Step 2**. Defines a function to receive one record at a time from the batch 1. **Step 3**. Kicks off processing ``` import json from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import BatchProcessor, EventType from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.typing import LambdaContext processor = BatchProcessor(event_type=EventType.SQS) tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: SQSRecord): payload: str = record.body if payload: item: dict = json.loads(payload) logger.info(item) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): batch = event["Records"] with processor(records=batch, handler=record_handler): processed_messages = processor.process() # kick off processing, return list[tuple] logger.info(f"Processed ${len(processed_messages)} messages") return processor.response() ``` The second record failed to be processed, therefore the processor added its message ID in the response. ``` { "batchItemFailures": [ { "itemIdentifier": "244fc6b4-87a3-44ab-83d2-361172410c3a" } ] } ``` ``` { "Records": [ { "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a", "body": "{\"Message\": \"success\"}", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1545082649183", "SenderId": "AIDAIENQZJOLO23YVJ4VO", "ApproximateFirstReceiveTimestamp": "1545082649185" }, "messageAttributes": {}, "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3", "eventSource": "aws:sqs", "eventSourceARN": "arn:aws:sqs:us-east-2: 123456789012:my-queue", "awsRegion": "us-east-1" }, { "messageId": "244fc6b4-87a3-44ab-83d2-361172410c3a", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a", "body": "SGVsbG8sIHRoaXMgaXMgYSB0ZXN0Lg==", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1545082649183", "SenderId": "AIDAIENQZJOLO23YVJ4VO", "ApproximateFirstReceiveTimestamp": "1545082649185" }, "messageAttributes": {}, "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3", "eventSource": "aws:sqs", "eventSourceARN": "arn:aws:sqs:us-east-2: 123456789012:my-queue", "awsRegion": "us-east-1" } ] } ``` #### FIFO queues When working with [SQS FIFO queues](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html), a batch may include messages from different group IDs. 
By default, we will stop processing at the first failure and mark unprocessed messages as failed to preserve ordering. However, this behavior may not be optimal for customers who wish to proceed with processing messages from a different group ID. Enable the `skip_group_on_error` option for seamless processing of messages from various group IDs. This setup ensures that messages from a failed group ID are sent back to SQS, enabling uninterrupted processing of messages from the subsequent group ID. ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import ( SqsFifoPartialProcessor, process_partial_response, ) from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.typing import LambdaContext processor = SqsFifoPartialProcessor() # (1)! tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: SQSRecord): payload: str = record.json_body # if json string data, otherwise record.body for str logger.info(payload) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): return process_partial_response(event=event, record_handler=record_handler, processor=processor, context=context) ``` 1. **Step 1**. Creates a partial failure batch processor for SQS FIFO queues. See [partial failure mechanics for details](#partial-failure-mechanics) ``` import json from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import SqsFifoPartialProcessor from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.typing import LambdaContext processor = SqsFifoPartialProcessor() tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: SQSRecord): payload: str = record.body if payload: item: dict = json.loads(payload) logger.info(item) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): batch = event["Records"] with processor(records=batch, handler=record_handler): processor.process() # kick off processing, return List[Tuple] return processor.response() ``` ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import ( SqsFifoPartialProcessor, process_partial_response, ) from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.typing import LambdaContext processor = SqsFifoPartialProcessor(skip_group_on_error=True) tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: SQSRecord): payload: str = record.json_body # if json string data, otherwise record.body for str logger.info(payload) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): return process_partial_response(event=event, record_handler=record_handler, processor=processor, context=context) ``` ### Processing messages from Kinesis Processing batches from Kinesis works in three stages: 1. Instantiate **`BatchProcessor`** and choose **`EventType.KinesisDataStreams`** for the event type 1. Define your function to handle each batch record, and use [`KinesisStreamRecord`](../data_classes/#kinesis-streams) type annotation for autocompletion 1. Use **`process_partial_response`** to kick off processing This code example uses Tracer and Logger for completion. 
``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import ( BatchProcessor, EventType, process_partial_response, ) from aws_lambda_powertools.utilities.data_classes.kinesis_stream_event import ( KinesisStreamRecord, ) from aws_lambda_powertools.utilities.typing import LambdaContext processor = BatchProcessor(event_type=EventType.KinesisDataStreams) # (1)! tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: KinesisStreamRecord): logger.info(record.kinesis.data_as_text) payload: dict = record.kinesis.data_as_json() logger.info(payload) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): return process_partial_response(event=event, record_handler=record_handler, processor=processor, context=context) ``` 1. **Step 1**. Creates a partial failure batch processor for Kinesis Data Streams. See [partial failure mechanics for details](#partial-failure-mechanics) ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import BatchProcessor, EventType from aws_lambda_powertools.utilities.data_classes.kinesis_stream_event import ( KinesisStreamRecord, ) from aws_lambda_powertools.utilities.typing import LambdaContext processor = BatchProcessor(event_type=EventType.KinesisDataStreams) tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: KinesisStreamRecord): logger.info(record.kinesis.data_as_text) payload: dict = record.kinesis.data_as_json() logger.info(payload) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): batch = event["Records"] with processor(records=batch, handler=record_handler): processed_messages = processor.process() # kick off processing, return list[tuple] logger.info(f"Processed ${len(processed_messages)} messages") return processor.response() ``` The second record failed to be processed, therefore the processor added its sequence number in the response. ``` { "batchItemFailures": [ { "itemIdentifier": "6006958808509702859251049540584488075644979031228738" } ] } ``` ``` { "Records": [ { "kinesis": { "kinesisSchemaVersion": "1.0", "partitionKey": "1", "sequenceNumber": "4107859083838847772757075850904226111829882106684065", "data": "eyJNZXNzYWdlIjogInN1Y2Nlc3MifQ==", "approximateArrivalTimestamp": 1545084650.987 }, "eventSource": "aws:kinesis", "eventVersion": "1.0", "eventID": "shardId-000000000006:4107859083838847772757075850904226111829882106684065", "eventName": "aws:kinesis:record", "invokeIdentityArn": "arn:aws:iam::123456789012:role/lambda-role", "awsRegion": "us-east-2", "eventSourceARN": "arn:aws:kinesis:us-east-2:123456789012:stream/lambda-stream" }, { "kinesis": { "kinesisSchemaVersion": "1.0", "partitionKey": "1", "sequenceNumber": "6006958808509702859251049540584488075644979031228738", "data": "c3VjY2Vzcw==", "approximateArrivalTimestamp": 1545084650.987 }, "eventSource": "aws:kinesis", "eventVersion": "1.0", "eventID": "shardId-000000000006:6006958808509702859251049540584488075644979031228738", "eventName": "aws:kinesis:record", "invokeIdentityArn": "arn:aws:iam::123456789012:role/lambda-role", "awsRegion": "us-east-2", "eventSourceARN": "arn:aws:kinesis:us-east-2:123456789012:stream/lambda-stream" } ] } ``` ### Processing messages from DynamoDB Processing batches from DynamoDB Streams works in three stages: 1. 
Instantiate **`BatchProcessor`** and choose **`EventType.DynamoDBStreams`** for the event type 1. Define your function to handle each batch record, and use [`DynamoDBRecord`](../data_classes/#dynamodb-streams) type annotation for autocompletion 1. Use **`process_partial_response`** to kick off processing This code example uses Tracer and Logger for completion. ``` import json from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import ( BatchProcessor, EventType, process_partial_response, ) from aws_lambda_powertools.utilities.data_classes.dynamo_db_stream_event import ( DynamoDBRecord, ) from aws_lambda_powertools.utilities.typing import LambdaContext processor = BatchProcessor(event_type=EventType.DynamoDBStreams) # (1)! tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: DynamoDBRecord): if record.dynamodb and record.dynamodb.new_image: logger.info(record.dynamodb.new_image) message = record.dynamodb.new_image.get("Message") if message: payload: dict = json.loads(message) logger.info(payload) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): return process_partial_response(event=event, record_handler=record_handler, processor=processor, context=context) ``` 1. **Step 1**. Creates a partial failure batch processor for DynamoDB Streams. See [partial failure mechanics for details](#partial-failure-mechanics) ``` import json from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import BatchProcessor, EventType from aws_lambda_powertools.utilities.data_classes.dynamo_db_stream_event import ( DynamoDBRecord, ) from aws_lambda_powertools.utilities.typing import LambdaContext processor = BatchProcessor(event_type=EventType.DynamoDBStreams) tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: DynamoDBRecord): if record.dynamodb and record.dynamodb.new_image: logger.info(record.dynamodb.new_image) message = record.dynamodb.new_image.get("Message") if message: payload: dict = json.loads(message) logger.info(payload) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): batch = event["Records"] with processor(records=batch, handler=record_handler): processed_messages = processor.process() # kick off processing, return list[tuple] logger.info(f"Processed ${len(processed_messages)} messages") return processor.response() ``` The second record failed to be processed, therefore the processor added its sequence number in the response. ``` { "batchItemFailures": [ { "itemIdentifier": "8640712661" } ] } ``` ``` { "Records": [ { "eventID": "1", "eventVersion": "1.0", "dynamodb": { "Keys": { "Id": { "N": "101" } }, "NewImage": { "Message": { "S": "failure" } }, "StreamViewType": "NEW_AND_OLD_IMAGES", "SequenceNumber": "3275880929", "SizeBytes": 26 }, "awsRegion": "us-west-2", "eventName": "INSERT", "eventSourceARN": "eventsource_arn", "eventSource": "aws:dynamodb" }, { "eventID": "1", "eventVersion": "1.0", "dynamodb": { "Keys": { "Id": { "N": "101" } }, "NewImage": { "SomethingElse": { "S": "success" } }, "StreamViewType": "NEW_AND_OLD_IMAGES", "SequenceNumber": "8640712661", "SizeBytes": 26 }, "awsRegion": "us-west-2", "eventName": "INSERT", "eventSourceARN": "eventsource_arn", "eventSource": "aws:dynamodb" } ] } ``` ### Error handling By default, we catch any exception raised by your record handler function. 
This allows us to **(1)** continue processing the batch, **(2)** collect each batch item that failed processing, and **(3)** return the appropriate response correctly without failing your Lambda function execution. ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import ( BatchProcessor, EventType, process_partial_response, ) from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.typing import LambdaContext processor = BatchProcessor(event_type=EventType.SQS) tracer = Tracer() logger = Logger() class InvalidPayload(Exception): ... @tracer.capture_method def record_handler(record: SQSRecord): payload: str = record.body logger.info(payload) if not payload: raise InvalidPayload("Payload does not contain minimum information to be processed.") # (1)! @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): return process_partial_response( # (2)! event=event, record_handler=record_handler, processor=processor, context=context, ) ``` 1. Any exception works here. See [extending BatchProcessor section, if you want to override this behavior.](#extending-batchprocessor) 1. Exceptions raised in `record_handler` will propagate to `process_partial_response`. We catch them and include each failed batch item identifier in the response dictionary (see `Sample response` tab). ``` { "batchItemFailures": [ { "itemIdentifier": "244fc6b4-87a3-44ab-83d2-361172410c3a" } ] } ``` ### Partial failure mechanics All batch items will be passed to the record handler for processing, even if exceptions are thrown - Here's the behavior after completing the batch: - **All records successfully processed**. We will return an empty list of item failures `{'batchItemFailures': []}` - **Partial success with some exceptions**. We will return a list of all item IDs/sequence numbers that failed processing - **All records failed to be processed**. We will raise `BatchProcessingError` exception with a list of all exceptions raised when processing The following sequence diagrams explain how each Batch processor behaves under different scenarios. #### SQS Standard > Read more about [Batch Failure Reporting feature in AWS Lambda](https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html#services-sqs-batchfailurereporting). Sequence diagram to explain how [`BatchProcessor` works](#processing-messages-from-sqs) with SQS Standard queues. ``` sequenceDiagram autonumber participant SQS queue participant Lambda service participant Lambda function Lambda service->>SQS queue: Poll Lambda service->>Lambda function: Invoke (batch event) Lambda function->>Lambda service: Report some failed messages activate SQS queue Lambda service->>SQS queue: Delete successful messages SQS queue-->>SQS queue: Failed messages return Note over SQS queue,Lambda service: Process repeat deactivate SQS queue ``` *SQS mechanism with Batch Item Failures* #### SQS FIFO > Read more about [Batch Failure Reporting feature in AWS Lambda](https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html#services-sqs-batchfailurereporting). Sequence diagram to explain how [`SqsFifoPartialProcessor` works](#fifo-queues) with SQS FIFO queues without `skip_group_on_error` flag. 
``` sequenceDiagram autonumber participant SQS queue participant Lambda service participant Lambda function Lambda service->>SQS queue: Poll Lambda service->>Lambda function: Invoke (batch event) activate Lambda function Lambda function-->Lambda function: Process 2 out of 10 batch items Lambda function--xLambda function: Fail on 3rd batch item Lambda function->>Lambda service: Report 3rd batch item and unprocessed messages as failure deactivate Lambda function activate SQS queue Lambda service->>SQS queue: Delete successful messages (1-2) SQS queue-->>SQS queue: Failed messages return (3-10) deactivate SQS queue ``` *SQS FIFO mechanism with Batch Item Failures* Sequence diagram to explain how [`SqsFifoPartialProcessor` works](#fifo-queues) with SQS FIFO queues with `skip_group_on_error` flag. ``` sequenceDiagram autonumber participant SQS queue participant Lambda service participant Lambda function Lambda service->>SQS queue: Poll Lambda service->>Lambda function: Invoke (batch event) activate Lambda function Lambda function-->Lambda function: Process 2 out of 10 batch items Lambda function--xLambda function: Fail on 3rd batch item Lambda function-->Lambda function: Process messages from another MessageGroupID Lambda function->>Lambda service: Report 3rd batch item and all messages within the same MessageGroupID as failure deactivate Lambda function activate SQS queue Lambda service->>SQS queue: Delete successful messages processed SQS queue-->>SQS queue: Failed messages return deactivate SQS queue ``` *SQS FIFO mechanism with Batch Item Failures* #### Kinesis and DynamoDB Streams > Read more about [Batch Failure Reporting feature](https://docs.aws.amazon.com/lambda/latest/dg/with-kinesis.html#services-kinesis-batchfailurereporting). Sequence diagram to explain how `BatchProcessor` works with both [Kinesis Data Streams](#processing-messages-from-kinesis) and [DynamoDB Streams](#processing-messages-from-dynamodb). For brevity, we will use `Streams` to refer to either services. For theory on stream checkpoints, see this [blog post](https://aws.amazon.com/blogs/compute/optimizing-batch-processing-with-custom-checkpoints-in-aws-lambda/) ``` sequenceDiagram autonumber participant Streams participant Lambda service participant Lambda function Lambda service->>Streams: Poll latest records Lambda service->>Lambda function: Invoke (batch event) activate Lambda function Lambda function-->Lambda function: Process 2 out of 10 batch items Lambda function--xLambda function: Fail on 3rd batch item Lambda function-->Lambda function: Continue processing batch items (4-10) Lambda function->>Lambda service: Report batch item as failure (3) deactivate Lambda function activate Streams Lambda service->>Streams: Checkpoints to sequence number from 3rd batch item Lambda service->>Streams: Poll records starting from updated checkpoint deactivate Streams ``` *Kinesis and DynamoDB streams mechanism with single batch item failure* The behavior changes slightly when there are multiple item failures. Stream checkpoint is updated to the lowest sequence number reported. Note that the batch item sequence number could be different from batch item number in the illustration. 
```
sequenceDiagram
    autonumber
    participant Streams
    participant Lambda service
    participant Lambda function
    Lambda service->>Streams: Poll latest records
    Lambda service->>Lambda function: Invoke (batch event)
    activate Lambda function
    Lambda function-->Lambda function: Process 2 out of 10 batch items
    Lambda function--xLambda function: Fail on 3-5 batch items
    Lambda function-->Lambda function: Continue processing batch items (6-10)
    Lambda function->>Lambda service: Report batch items as failure (3-5)
    deactivate Lambda function
    activate Streams
    Lambda service->>Streams: Checkpoints to lowest sequence number
    Lambda service->>Streams: Poll records starting from updated checkpoint
    deactivate Streams
```
*Kinesis and DynamoDB streams mechanism with multiple batch item failures*

### Processing messages asynchronously

> New to AsyncIO? Read this [comprehensive guide first](https://realpython.com/async-io-python/).

You can use the `AsyncBatchProcessor` class and the `async_process_partial_response` function to process messages concurrently.

When is this useful? Your use case might be able to process multiple records at the same time without conflicting with one another. For example, imagine you need to process multiple loyalty points and incrementally save them in the database. While you await the database to confirm your records are saved, you could start processing another request concurrently.

The reason this is not the default behavior is that not all use cases can handle concurrency safely (e.g., loyalty points must be updated in order).

```
import httpx  # external dependency

from aws_lambda_powertools.utilities.batch import (
    AsyncBatchProcessor,
    EventType,
    async_process_partial_response,
)
from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord
from aws_lambda_powertools.utilities.typing import LambdaContext

processor = AsyncBatchProcessor(event_type=EventType.SQS)


async def async_record_handler(record: SQSRecord):
    # Yield control back to the event loop to schedule other tasks
    # while you await a response from httpbin.org
    async with httpx.AsyncClient() as client:
        ret = await client.get("https://httpbin.org/get")

    return ret.status_code


def lambda_handler(event, context: LambdaContext):
    return async_process_partial_response(
        event=event,
        record_handler=async_record_handler,
        processor=processor,
        context=context,
    )
```

Using tracer? `AsyncBatchProcessor` uses `asyncio.gather`. This might cause [side effects and reach trace limits at high concurrency](../../core/tracer/#concurrent-asynchronous-functions).

## Advanced

### Pydantic integration

You can bring your own Pydantic models via the **`model`** parameter when inheriting from **`SqsRecordModel`**, **`KinesisDataStreamRecord`**, or **`DynamoDBStreamRecordModel`**.

Inheritance is important because we need to access message IDs and sequence numbers from these records in the event of failure. Mypy is fully integrated with this utility, so it should identify whether you're passing an incorrect model.
``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import ( BatchProcessor, EventType, process_partial_response, ) from aws_lambda_powertools.utilities.parser import BaseModel from aws_lambda_powertools.utilities.parser.models import SqsRecordModel from aws_lambda_powertools.utilities.parser.types import Json from aws_lambda_powertools.utilities.typing import LambdaContext class Order(BaseModel): item: dict class OrderSqsRecord(SqsRecordModel): # type: ignore[override] body: Json[Order] # deserialize order data from JSON string processor = BatchProcessor(event_type=EventType.SQS, model=OrderSqsRecord) tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: OrderSqsRecord): logger.info(record.body.item) return record.body.item @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): return process_partial_response(event=event, record_handler=record_handler, processor=processor, context=context) ``` ``` { "Records": [ { "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a", "body": "{\"item\": {\"laptop\": \"amd\"}}", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1545082649183", "SenderId": "AIDAIENQZJOLO23YVJ4VO", "ApproximateFirstReceiveTimestamp": "1545082649185" }, "messageAttributes": {}, "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3", "eventSource": "aws:sqs", "eventSourceARN": "arn:aws:sqs:us-east-2: 123456789012:my-queue", "awsRegion": "us-east-1" }, { "messageId": "244fc6b4-87a3-44ab-83d2-361172410c3a", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a", "body": "{\"item\": {\"keyboard\": \"classic\"}}", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1545082649183", "SenderId": "AIDAIENQZJOLO23YVJ4VO", "ApproximateFirstReceiveTimestamp": "1545082649185" }, "messageAttributes": {}, "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3", "eventSource": "aws:sqs", "eventSourceARN": "arn:aws:sqs:us-east-2: 123456789012:my-queue", "awsRegion": "us-east-1" } ] } ``` ``` from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import ( BatchProcessor, EventType, process_partial_response, ) from aws_lambda_powertools.utilities.parser import BaseModel from aws_lambda_powertools.utilities.parser.models import ( KinesisDataStreamRecord, KinesisDataStreamRecordPayload, ) from aws_lambda_powertools.utilities.parser.types import Json from aws_lambda_powertools.utilities.typing import LambdaContext class Order(BaseModel): item: dict class OrderKinesisPayloadRecord(KinesisDataStreamRecordPayload): # type: ignore[override] data: Json[Order] class OrderKinesisRecord(KinesisDataStreamRecord): # type: ignore[override] kinesis: OrderKinesisPayloadRecord processor = BatchProcessor(event_type=EventType.KinesisDataStreams, model=OrderKinesisRecord) tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: OrderKinesisRecord): logger.info(record.kinesis.data.item) return record.kinesis.data.item @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): return process_partial_response(event=event, record_handler=record_handler, processor=processor, context=context) ``` ``` { "Records": [ { "kinesis": { "kinesisSchemaVersion": "1.0", "partitionKey": "1", "sequenceNumber": "4107859083838847772757075850904226111829882106684065", "data": "eyJpdGVtIjogeyJsYXB0b3AiOiAiYW1kIn19Cg==", 
"approximateArrivalTimestamp": 1545084650.987 }, "eventSource": "aws:kinesis", "eventVersion": "1.0", "eventID": "shardId-000000000006:4107859083838847772757075850904226111829882106684065", "eventName": "aws:kinesis:record", "invokeIdentityArn": "arn:aws:iam::123456789012:role/lambda-role", "awsRegion": "us-east-2", "eventSourceARN": "arn:aws:kinesis:us-east-2:123456789012:stream/lambda-stream" }, { "kinesis": { "kinesisSchemaVersion": "1.0", "partitionKey": "1", "sequenceNumber": "6006958808509702859251049540584488075644979031228738", "data": "eyJpdGVtIjogeyJrZXlib2FyZCI6ICJjbGFzc2ljIn19Cg==", "approximateArrivalTimestamp": 1545084650.987 }, "eventSource": "aws:kinesis", "eventVersion": "1.0", "eventID": "shardId-000000000006:6006958808509702859251049540584488075644979031228738", "eventName": "aws:kinesis:record", "invokeIdentityArn": "arn:aws:iam::123456789012:role/lambda-role", "awsRegion": "us-east-2", "eventSourceARN": "arn:aws:kinesis:us-east-2:123456789012:stream/lambda-stream" } ] } ``` ``` import json from typing import Dict, Optional from typing_extensions import Literal from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import ( BatchProcessor, EventType, process_partial_response, ) from aws_lambda_powertools.utilities.parser import BaseModel, field_validator from aws_lambda_powertools.utilities.parser.models import ( DynamoDBStreamChangedRecordModel, DynamoDBStreamRecordModel, ) from aws_lambda_powertools.utilities.typing import LambdaContext class Order(BaseModel): item: dict class OrderDynamoDB(BaseModel): Message: Order # auto transform json string # so Pydantic can auto-initialize nested Order model @field_validator("Message", mode="before") def transform_message_to_dict(cls, value: Dict[Literal["S"], str]): return json.loads(value["S"]) class OrderDynamoDBChangeRecord(DynamoDBStreamChangedRecordModel): # type: ignore[override] NewImage: Optional[OrderDynamoDB] OldImage: Optional[OrderDynamoDB] class OrderDynamoDBRecord(DynamoDBStreamRecordModel): # type: ignore[override] dynamodb: OrderDynamoDBChangeRecord processor = BatchProcessor(event_type=EventType.DynamoDBStreams, model=OrderDynamoDBRecord) tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: OrderDynamoDBRecord): if record.dynamodb.NewImage and record.dynamodb.NewImage.Message: logger.info(record.dynamodb.NewImage.Message.item) return record.dynamodb.NewImage.Message.item @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): return process_partial_response(event=event, record_handler=record_handler, processor=processor, context=context) ``` ``` { "Records": [ { "eventID": "1", "eventVersion": "1.0", "dynamodb": { "Keys": { "Id": { "N": "101" } }, "NewImage": { "Message": { "S": "{\"item\": {\"laptop\": \"amd\"}}" } }, "StreamViewType": "NEW_AND_OLD_IMAGES", "SequenceNumber": "3275880929", "SizeBytes": 26 }, "awsRegion": "us-west-2", "eventName": "INSERT", "eventSourceARN": "eventsource_arn", "eventSource": "aws:dynamodb" }, { "eventID": "1", "eventVersion": "1.0", "dynamodb": { "Keys": { "Id": { "N": "101" } }, "NewImage": { "SomethingElse": { "S": "success" } }, "StreamViewType": "NEW_AND_OLD_IMAGES", "SequenceNumber": "8640712661", "SizeBytes": 26 }, "awsRegion": "us-west-2", "eventName": "INSERT", "eventSourceARN": "eventsource_arn", "eventSource": "aws:dynamodb" } ] } ``` ### Working with full batch failures By default, the `BatchProcessor` will raise `BatchProcessingError` if all 
records in the batch fail to process. We do this to reflect the failure in your operational metrics.

When working with functions that handle batches with a small number of records, or when you use errors as a flow control mechanism, this behavior might not be desirable as your function might generate an unnaturally high number of errors. When this happens, the [Lambda service will scale down the concurrency of your function](https://docs.aws.amazon.com/lambda/latest/dg/services-sqs-errorhandling.html#services-sqs-backoff-strategy), potentially impacting performance.

For these scenarios, you can set the `raise_on_entire_batch_failure` option to `False`.

```
from aws_lambda_powertools import Logger, Tracer
from aws_lambda_powertools.utilities.batch import (
    BatchProcessor,
    EventType,
    process_partial_response,
)
from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord
from aws_lambda_powertools.utilities.typing import LambdaContext

processor = BatchProcessor(event_type=EventType.SQS, raise_on_entire_batch_failure=False)
tracer = Tracer()
logger = Logger()


@tracer.capture_method
def record_handler(record: SQSRecord):
    payload: str = record.json_body  # if json string data, otherwise record.body for str
    logger.info(payload)


@logger.inject_lambda_context
@tracer.capture_lambda_handler
def lambda_handler(event, context: LambdaContext):
    return process_partial_response(
        event=event,
        record_handler=record_handler,
        processor=processor,
        context=context,
    )
```

### Accessing processed messages

Use the context manager to access a list of all returned values from your `record_handler` function.

- **When successful**. We include a tuple with **1/** `success`, **2/** the result of `record_handler`, and **3/** the batch item
- **When failed**. We include a tuple with **1/** `fail`, **2/** the exception as a string, and **3/** the batch item serialized as an Event Source Data Class or Pydantic model. If a Pydantic model fails validation early, we serialize its failure record as an Event Source Data Class to be able to collect message IDs/sequence numbers etc.

```
from __future__ import annotations

import json

from typing_extensions import Literal

from aws_lambda_powertools import Logger, Tracer
from aws_lambda_powertools.utilities.batch import BatchProcessor, EventType
from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord
from aws_lambda_powertools.utilities.typing import LambdaContext

processor = BatchProcessor(event_type=EventType.SQS)
tracer = Tracer()
logger = Logger()


@tracer.capture_method
def record_handler(record: SQSRecord):
    payload: str = record.body
    if payload:
        item: dict = json.loads(payload)
        logger.info(item)


@logger.inject_lambda_context
@tracer.capture_lambda_handler
def lambda_handler(event, context: LambdaContext):
    batch = event["Records"]  # (1)!
    with processor(records=batch, handler=record_handler):
        processed_messages: list[tuple] = processor.process()

    for message in processed_messages:
        status: Literal["success", "fail"] = message[0]
        cause: str = message[1]  # (2)!
        record: SQSRecord = message[2]

        logger.info(status, record=record, cause=cause)

    return processor.response()
```

1. Context manager requires the records list. This is typically handled by `process_partial_response`.
1. Cause contains the `exception` str if failed, or `success` otherwise.
``` [ ( "fail", " ), ( "success", "success", {'messageId': '88891c36-32eb-4a25-9905-654a32916893', 'receiptHandle': 'AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a', 'body': 'success', 'attributes': {'ApproximateReceiveCount': '1', 'SentTimestamp': '1545082649183', 'SenderId': 'AIDAIENQZJOLO23YVJ4VO', 'ApproximateFirstReceiveTimestamp': '1545082649185'}, 'messageAttributes': {}, 'md5OfBody': 'e4e68fb7bd0e697a0ae8f1bb342846b3', 'eventSource': 'aws:sqs', 'eventSourceARN': 'arn:aws:sqs:us-east-2:123456789012:my-queue', 'awsRegion': 'us-east-1'} ) ] ``` 1. Sample exception could have raised from within `record_handler` function. ``` [ ( "fail", # (1)! ":1 validation error for OrderSqs\nbody\n JSON object must be str, bytes or bytearray (type=type_error.json)", ), ( "success", "success", {'messageId': '88891c36-32eb-4a25-9905-654a32916893', 'receiptHandle': 'AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a', 'body': 'success', 'attributes': {'ApproximateReceiveCount': '1', 'SentTimestamp': '1545082649183', 'SenderId': 'AIDAIENQZJOLO23YVJ4VO', 'ApproximateFirstReceiveTimestamp': '1545082649185'}, 'messageAttributes': {}, 'md5OfBody': 'e4e68fb7bd0e697a0ae8f1bb342846b3', 'eventSource': 'aws:sqs', 'eventSourceARN': 'arn:aws:sqs:us-east-2:123456789012:my-queue', 'awsRegion': 'us-east-1'} ), ( "fail", # (2)! ":Failed to process record.", OrderSqs(messageId='9d0bfba5-d213-4b64-89bd-f4fbd7e58358', receiptHandle='AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a', body=Order(item={'type': 'fail'}), attributes=SqsAttributesModel(ApproximateReceiveCount='1', ApproximateFirstReceiveTimestamp=datetime.datetime(2018, 12, 17, 21, 37, 29, 185000, tzinfo=datetime.timezone.utc), MessageDeduplicationId=None, MessageGroupId=None, SenderId='AIDAIENQZJOLO23YVJ4VO', SentTimestamp=datetime.datetime(2018, 12, 17, 21, 37, 29, 183000, tzinfo=datetime.timezone.utc), SequenceNumber=None, AWSTraceHeader=None), messageAttributes={}, md5OfBody='e4e68fb7bd0e697a0ae8f1bb342846b3', md5OfMessageAttributes=None, eventSource='aws:sqs', eventSourceARN='arn:aws:sqs:us-east-2:123456789012:my-queue', awsRegion='us-east-1') ) ] ``` 1. Sample when a model fails validation early. Batch item (3rd item) is serialized to the respective Event Source Data Class event type. 1. Sample when model validated successfully but another exception was raised during processing. ### Accessing Lambda Context Within your `record_handler` function, you might need access to the Lambda context to determine how much time you have left before your function times out. We can automatically inject the [Lambda context](https://docs.aws.amazon.com/lambda/latest/dg/python-context.html) into your `record_handler` if your function signature has a parameter named `lambda_context`. When using a context manager, you also need to pass the Lambda context object like in the example below. 
``` from typing import Optional from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import ( BatchProcessor, EventType, process_partial_response, ) from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.typing import LambdaContext processor = BatchProcessor(event_type=EventType.SQS) tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: SQSRecord, lambda_context: Optional[LambdaContext] = None): if lambda_context is not None: remaining_time = lambda_context.get_remaining_time_in_millis() logger.info(remaining_time) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): return process_partial_response(event=event, record_handler=record_handler, processor=processor, context=context) ``` ``` from typing import Optional from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.utilities.batch import BatchProcessor, EventType from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.typing import LambdaContext processor = BatchProcessor(event_type=EventType.SQS) tracer = Tracer() logger = Logger() @tracer.capture_method def record_handler(record: SQSRecord, lambda_context: Optional[LambdaContext] = None): if lambda_context is not None: remaining_time = lambda_context.get_remaining_time_in_millis() logger.info(remaining_time) @logger.inject_lambda_context @tracer.capture_lambda_handler def lambda_handler(event, context: LambdaContext): batch = event["Records"] with processor(records=batch, handler=record_handler, lambda_context=context): result = processor.process() return result ``` ### Extending BatchProcessor You might want to bring custom logic to the existing `BatchProcessor` to slightly override how we handle successes and failures. For these scenarios, you can subclass `BatchProcessor` and quickly override `success_handler` and `failure_handler` methods: - **`success_handler()`** is called for each successfully processed record - **`failure_handler()`** is called for each failed record Note These functions have a common `record` argument. For backward compatibility reasons, their type is not the same: - `success_handler`: `record` type is `dict[str, Any]`, the raw record data. - `failure_handler`: `record` type can be an Event Source Data Class or your [Pydantic model](#pydantic-integration). During Pydantic validation errors, we fall back and serialize `record` to Event Source Data Class to not break the processing pipeline. Let's suppose you'd like to add metrics to track successes and failures of your batch records. 
```
import json
from typing import Any, Dict

from aws_lambda_powertools import Logger, Metrics, Tracer
from aws_lambda_powertools.metrics import MetricUnit
from aws_lambda_powertools.utilities.batch import (
    BatchProcessor,
    EventType,
    ExceptionInfo,
    FailureResponse,
    process_partial_response,
)
from aws_lambda_powertools.utilities.batch.base import SuccessResponse
from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord
from aws_lambda_powertools.utilities.typing import LambdaContext


class MyProcessor(BatchProcessor):
    def success_handler(self, record: Dict[str, Any], result: Any) -> SuccessResponse:
        metrics.add_metric(name="BatchRecordSuccesses", unit=MetricUnit.Count, value=1)
        return super().success_handler(record, result)

    def failure_handler(self, record: SQSRecord, exception: ExceptionInfo) -> FailureResponse:
        metrics.add_metric(name="BatchRecordFailures", unit=MetricUnit.Count, value=1)
        return super().failure_handler(record, exception)


processor = MyProcessor(event_type=EventType.SQS)
metrics = Metrics(namespace="test")
logger = Logger()
tracer = Tracer()


@tracer.capture_method
def record_handler(record: SQSRecord):
    payload: str = record.body
    if payload:
        item: dict = json.loads(payload)
        logger.info(item)


@metrics.log_metrics(capture_cold_start_metric=True)
def lambda_handler(event, context: LambdaContext):
    return process_partial_response(
        event=event,
        record_handler=record_handler,
        processor=processor,
        context=context,
    )
```

### Create your own partial processor

You can create your own partial batch processor from scratch by inheriting the `BasePartialProcessor` class, and implementing `_prepare()`, `_clean()`, `_process_record()` and `_async_process_record()`.

```
classDiagram
    direction LR
    class BasePartialProcessor {
        <<abstract>>
        +_prepare()
        +_clean()
        +_process_record(record: Dict)
        +_async_process_record()
    }

    class YourCustomProcessor {
        +_prepare()
        +_clean()
        +_process_record(record: Dict)
        +_async_process_record()
    }

    BasePartialProcessor <|-- YourCustomProcessor : implement
```
*Visual representation to bring your own processor*

- **`_process_record()`** – handles all processing logic for each individual message of a batch, including calling the `record_handler` (`self.handler`)
- **`_prepare()`** – called once as part of the processor initialization
- **`_clean()`** – teardown logic called once after `_process_record` completes
- **`_async_process_record()`** – if you need to implement asynchronous logic, use this method; otherwise, define it in your class with empty logic
- **`response()`** – called upon completion of processing

You can utilize this class to instantiate a new processor and then pass it to the `process_partial_response` function.
``` import copy import os import sys from random import randint from typing import Any import boto3 from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.batch import ( BasePartialProcessor, process_partial_response, ) from aws_lambda_powertools.utilities.batch.types import PartialItemFailureResponse table_name = os.getenv("TABLE_NAME", "table_store_batch") logger = Logger() class MyPartialProcessor(BasePartialProcessor): DEFAULT_RESPONSE: PartialItemFailureResponse = {"batchItemFailures": []} """ Process a record and stores successful results at a Amazon DynamoDB Table Parameters ---------- table_name: str DynamoDB table name to write results to """ def __init__(self, table_name: str): self.table_name = table_name self.batch_response: PartialItemFailureResponse = copy.deepcopy(self.DEFAULT_RESPONSE) super().__init__() def _prepare(self): # It's called once, *before* processing # Creates table resource and clean previous results self.ddb_table = boto3.resource("dynamodb").Table(self.table_name) self.success_messages.clear() def response(self) -> PartialItemFailureResponse: return self.batch_response def _clean(self): # It's called once, *after* closing processing all records (closing the context manager) # Here we're sending, at once, all successful messages to a ddb table with self.ddb_table.batch_writer() as batch: for result in self.success_messages: batch.put_item(Item=result) def _process_record(self, record): # It handles how your record is processed # Here we're keeping the status of each run # where self.handler is the record_handler function passed as an argument try: result = self.handler(record) # record_handler passed to decorator/context manager return self.success_handler(record, result) except Exception as exc: logger.error(exc) return self.failure_handler(record, sys.exc_info()) def success_handler(self, record, result: Any): entry = ("success", result, record) self.success_messages.append(record) return entry async def _async_process_record(self, record: dict): raise NotImplementedError() processor = MyPartialProcessor(table_name) def record_handler(record): return randint(0, 100) def lambda_handler(event, context): return process_partial_response(event=event, record_handler=record_handler, processor=processor, context=context) ``` ``` Transform: AWS::Serverless-2016-10-31 Resources: IdempotencyTable: Type: AWS::DynamoDB::Table Properties: AttributeDefinitions: - AttributeName: messageId AttributeType: S KeySchema: - AttributeName: messageId KeyType: HASH TimeToLiveSpecification: AttributeName: expiration Enabled: true BillingMode: PAY_PER_REQUEST ``` ``` { "Records": [ { "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a", "body": "{\"Message\": \"success\"}" }, { "messageId": "244fc6b4-87a3-44ab-83d2-361172410c3a", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a", "body": "SGVsbG8sIHRoaXMgaXMgYSB0ZXN0Lg==" } ] } ``` ### Caveats #### Tracer response auto-capture for large batch sizes When using Tracer to capture responses for each batch record processing, you might exceed 64K of tracing data depending on what you return from your `record_handler` function, or how big is your batch size. If that's the case, you can configure [Tracer to disable response auto-capturing](../../core/tracer/#disabling-response-auto-capture). 
```
import json

from aws_lambda_powertools import Logger, Tracer
from aws_lambda_powertools.utilities.batch import (
    BatchProcessor,
    EventType,
    process_partial_response,
)
from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord
from aws_lambda_powertools.utilities.typing import LambdaContext

processor = BatchProcessor(event_type=EventType.SQS)
tracer = Tracer()
logger = Logger()


@tracer.capture_method(capture_response=False)
def record_handler(record: SQSRecord):
    payload: str = record.body
    if payload:
        item: dict = json.loads(payload)
        logger.info(item)


@logger.inject_lambda_context
@tracer.capture_lambda_handler
def lambda_handler(event, context: LambdaContext):
    return process_partial_response(
        event=event,
        record_handler=record_handler,
        processor=processor,
        context=context,
    )
```

## Testing your code

As there are no external calls, you can unit test your code with `BatchProcessor` quite easily.

**Example**: Given an SQS batch where the first batch record succeeds and the second fails processing, we should have a single item reported in the function response.

```
import json
from dataclasses import dataclass
from pathlib import Path

import pytest
from getting_started_with_test_app import lambda_handler, processor


def load_event(path: Path):
    with path.open() as f:
        return json.load(f)


@dataclass
class LambdaContext:
    function_name: str = "test"
    memory_limit_in_mb: int = 128
    invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test"
    aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72"


@pytest.fixture
def lambda_context() -> LambdaContext:
    return LambdaContext()


@pytest.fixture()
def sqs_event():
    """Generates SQS event"""
    return load_event(path=Path("events/sqs_event.json"))


def test_app_batch_partial_response(sqs_event, lambda_context: LambdaContext):
    # GIVEN
    processor_result = processor  # access processor for additional assertions
    successful_record = sqs_event["Records"][0]
    failed_record = sqs_event["Records"][1]
    expected_response = {"batchItemFailures": [{"itemIdentifier": failed_record["messageId"]}]}

    # WHEN
    ret = lambda_handler(sqs_event, lambda_context)

    # THEN
    assert ret == expected_response
    assert len(processor_result.fail_messages) == 1
    assert processor_result.success_messages[0] == successful_record
```

```
import json

from aws_lambda_powertools import Logger, Tracer
from aws_lambda_powertools.utilities.batch import (
    BatchProcessor,
    EventType,
    process_partial_response,
)
from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord
from aws_lambda_powertools.utilities.typing import LambdaContext

processor = BatchProcessor(event_type=EventType.SQS)
tracer = Tracer()
logger = Logger()


@tracer.capture_method
def record_handler(record: SQSRecord):
    payload: str = record.body
    if payload:
        item: dict = json.loads(payload)
        logger.info(item)


@logger.inject_lambda_context
@tracer.capture_lambda_handler
def lambda_handler(event, context: LambdaContext):
    return process_partial_response(
        event=event,
        record_handler=record_handler,
        processor=processor,
        context=context,
    )
```

``` { "Records": [ { "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a", "body": "{\"Message\": \"success\"}", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1545082649183", "SenderId": "AIDAIENQZJOLO23YVJ4VO", "ApproximateFirstReceiveTimestamp": "1545082649185" }, "messageAttributes": {}, "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3", "eventSource": "aws:sqs", "eventSourceARN":
"arn:aws:sqs:us-east-2: 123456789012:my-queue", "awsRegion": "us-east-1" }, { "messageId": "244fc6b4-87a3-44ab-83d2-361172410c3a", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a", "body": "SGVsbG8sIHRoaXMgaXMgYSB0ZXN0Lg==", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1545082649183", "SenderId": "AIDAIENQZJOLO23YVJ4VO", "ApproximateFirstReceiveTimestamp": "1545082649185" }, "messageAttributes": {}, "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3", "eventSource": "aws:sqs", "eventSourceARN": "arn:aws:sqs:us-east-2: 123456789012:my-queue", "awsRegion": "us-east-1" } ] } ``` ## FAQ ### Choosing between method and context manager Use context manager when you want access to the processed messages or handle `BatchProcessingError` exception when all records within the batch fail to be processed. ### Integrating exception handling with Sentry.io When using Sentry.io for error monitoring, you can override `failure_handler` to capture each processing exception with Sentry SDK: > Credits to [Charles-Axel Dein](https://github.com/aws-powertools/powertools-lambda-python/issues/293#issuecomment-781961732) ``` from sentry_sdk import capture_exception from aws_lambda_powertools.utilities.batch import BatchProcessor, FailureResponse class MyProcessor(BatchProcessor): def failure_handler(self, record, exception) -> FailureResponse: capture_exception() # send exception to Sentry return super().failure_handler(record, exception) ``` Event Source Data Classes provides self-describing and strongly-typed classes for various AWS Lambda event sources. ## Key features - Type hinting and code completion for common event types - Helper functions for decoding/deserializing nested fields - Docstrings for fields contained in event schemas - Standardized attribute-based access to event properties ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). There are two ways to use Event Source Data Classes in your Lambda functions. **Method 1: Direct Initialization** You can initialize the appropriate data class by passing the Lambda event object to its constructor. ``` from aws_lambda_powertools.utilities.data_classes import APIGatewayProxyEvent def lambda_handler(event: dict, context): api_event = APIGatewayProxyEvent(event) if "hello" in api_event.path and api_event.http_method == "GET": return {"statusCode": 200, "body": f"Hello from path: {api_event.path}"} else: return {"statusCode": 400, "body": "No Hello from path"} ``` ``` { "resource": "/helloworld", "path": "/hello", "httpMethod": "GET", "headers": { "Accept": "*/*", "Host": "api.example.com" }, "queryStringParameters": { "name": "John" }, "pathParameters": null, "stageVariables": null, "requestContext": { "requestId": "c6af9ac6-7b61-11e6-9a41-93e8deadbeef", "stage": "prod" }, "body": null, "isBase64Encoded": false } ``` **Method 2: Using the event_source Decorator** Alternatively, you can use the `event_source` decorator to automatically parse the event. 
``` from aws_lambda_powertools.utilities.data_classes import APIGatewayProxyEvent, event_source @event_source(data_class=APIGatewayProxyEvent) def lambda_handler(event: APIGatewayProxyEvent, context): if "hello" in event.path and event.http_method == "GET": return {"statusCode": 200, "body": f"Hello from path: {event.path}"} else: return {"statusCode": 400, "body": "No Hello from path"} ``` ``` { "resource": "/helloworld", "path": "/hello", "httpMethod": "GET", "headers": { "Accept": "*/*", "Host": "api.example.com" }, "queryStringParameters": { "name": "John" }, "pathParameters": null, "stageVariables": null, "requestContext": { "requestId": "c6af9ac6-7b61-11e6-9a41-93e8deadbeef", "stage": "prod" }, "body": null, "isBase64Encoded": false } ``` ### Autocomplete with self-documented properties and methods Event Source Data Classes has the ability to leverage IDE autocompletion and inline documentation. When using the APIGatewayProxyEvent class, for example, the IDE will offer autocomplete suggestions for various properties and methods. ## Supported event sources Each event source is linked to its corresponding GitHub file with the full set of properties, methods, and docstrings specific to each event type. | Event Source | Data_class | Properties | | --- | --- | --- | | [Active MQ](#active-mq) | `ActiveMQEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/active_mq_event.py) | | [API Gateway Authorizer](#api-gateway-authorizer) | `APIGatewayAuthorizerRequestEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/api_gateway_authorizer_event.py) | | [API Gateway Authorizer V2](#api-gateway-authorizer-v2) | `APIGatewayAuthorizerEventV2` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/api_gateway_authorizer_event.py) | | [API Gateway Proxy](#api-gateway-proxy) | `APIGatewayProxyEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/api_gateway_proxy_event.py) | | [API Gateway Proxy V2](#api-gateway-proxy-v2) | `APIGatewayProxyEventV2` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/api_gateway_proxy_event.py) | | [Application Load Balancer](#application-load-balancer) | `ALBEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/alb_event.py) | | [AppSync Authorizer](#appsync-authorizer) | `AppSyncAuthorizerEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/appsync_authorizer_event.py) | | [AppSync Resolver](#appsync-resolver) | `AppSyncResolverEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/appsync_resolver_event.py) | | [AWS Config Rule](#aws-config-rule) | `AWSConfigRuleEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/aws_config_rule_event.py) | | [Bedrock Agent](#bedrock-agent) | `BedrockAgent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/bedrock_agent_event.py) | | [CloudFormation 
Custom Resource](#cloudformation-custom-resource) | `CloudFormationCustomResourceEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/cloudformation_custom_resource_event.py) | | [CloudWatch Alarm State Change Action](#cloudwatch-alarm-state-change-action) | `CloudWatchAlarmEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/cloud_watch_alarm_event.py) | | [CloudWatch Dashboard Custom Widget](#cloudwatch-dashboard-custom-widget) | `CloudWatchDashboardCustomWidgetEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/cloud_watch_custom_widget_event.py) | | [CloudWatch Logs](#cloudwatch-logs) | `CloudWatchLogsEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/cloud_watch_logs_event.py) | | [CodeDeploy Lifecycle Hook](#codedeploy-lifecycle-hook) | `CodeDeployLifecycleHookEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/code_deploy_lifecycle_hook_event.py) | | [CodePipeline Job Event](#codepipeline-job) | `CodePipelineJobEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/code_pipeline_job_event.py) | | [Cognito User Pool](#cognito-user-pool) | Multiple available under `cognito_user_pool_event` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/cognito_user_pool_event.py) | | [Connect Contact Flow](#connect-contact-flow) | `ConnectContactFlowEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/connect_contact_flow_event.py) | | [DynamoDB streams](#dynamodb-streams) | `DynamoDBStreamEvent`, `DynamoDBRecordEventName` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/dynamo_db_stream_event.py) | | [EventBridge](#eventbridge) | `EventBridgeEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/event_bridge_event.py) | | [Kafka](#kafka) | `KafkaEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/kafka_event.py) | | [Kinesis Data Stream](#kinesis-streams) | `KinesisStreamEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/kinesis_stream_event.py) | | [Kinesis Firehose Delivery Stream](#kinesis-firehose-delivery-stream) | `KinesisFirehoseEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/kinesis_firehose_event.py) | | [Lambda Function URL](#lambda-function-url) | `LambdaFunctionUrlEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/lambda_function_url_event.py) | | [Rabbit MQ](#rabbit-mq) | `RabbitMQEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/rabbit_mq_event.py) | | [S3](#s3) | 
`S3Event` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/s3_event.py) | | [S3 Batch Operations](#s3-batch-operations) | `S3BatchOperationEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/s3_batch_operation_event.py) | | [S3 Object Lambda](#s3-object-lambda) | `S3ObjectLambdaEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/s3_object_event.py) | | [S3 EventBridge Notification](#s3-eventbridge-notification) | `S3EventBridgeNotificationEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/s3_event.py) | | [SES](#ses) | `SESEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/ses_event.py) | | [SNS](#sns) | `SNSEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/sns_event.py) | | [SQS](#sqs) | `SQSEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/sqs_event.py) | | [TransferFamilyAuthorizer] | `TransferFamilyAuthorizer` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/transfer_family_event.py) | | [TransferFamilyAuthorizerResponse] | `TransferFamilyAuthorizerResponse` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/transfer_family_event.py) | | [VPC Lattice V2](#vpc-lattice-v2) | `VPCLatticeV2Event` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/vpc_lattice.py) | | [VPC Lattice V1](#vpc-lattice-v1) | `VPCLatticeEvent` | [Github](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/vpc_lattice.py) | | [IoT Core Thing Created/Updated/Deleted](#iot-core-thing-createdupdateddeleted) | `IoTCoreThingEvent` | [GitHub](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/iot_registry_event.py#L33) | | [IoT Core Thing Type Created/Updated/Deprecated/Undeprecated/Deleted](#iot-core-thing-type-createdupdateddeprecatedundeprecateddeleted) | `IoTCoreThingTypeEvent` | [GitHub](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/iot_registry_event.py#L96) | | [IoT Core Thing Type Associated/Disassociated with a Thing](#iot-core-thing-type-associateddisassociated-with-a-thing) | `IoTCoreThingTypeAssociationEvent` | [GitHub](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/iot_registry_event.py#L173) | | [IoT Core Thing Group Created/Updated/Deleted](#iot-core-thing-group-createdupdateddeleted) | `IoTCoreThingGroupEvent` | [GitHub](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/iot_registry_event.py#L214) | | [IoT Thing Added/Removed from Thing Group](#iot-thing-addedremoved-from-thing-group) | `IoTCoreAddOrRemoveFromThingGroupEvent` | 
[GitHub](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/iot_registry_event.py#L304) | | [IoT Child Group Added/Deleted from Parent Group](#iot-child-group-addeddeleted-from-parent-group) | `IoTCoreAddOrDeleteFromThingGroupEvent` | [GitHub](https://github.com/aws-powertools/powertools-lambda-python/blob/develop/aws_lambda_powertools/utilities/data_classes/iot_registry_event.py#L366) | Info The examples showcase a subset of Event Source Data Classes capabilities - for comprehensive details, leverage your IDE's autocompletion, refer to type hints and docstrings, and explore the [full API reference](https://docs.powertools.aws.dev/lambda/python/latest/api/utilities/data_classes/) for complete property listings of each event source. ### Active MQ It is used for [Active MQ payloads](https://docs.aws.amazon.com/lambda/latest/dg/with-mq.html), also see the [AWS blog post](https://aws.amazon.com/blogs/compute/using-amazon-mq-as-an-event-source-for-aws-lambda/) for more details. ``` import json from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.active_mq_event import ActiveMQEvent logger = Logger() @event_source(data_class=ActiveMQEvent) def lambda_handler(event: ActiveMQEvent, context): for message in event.messages: msg = message.message_id msg_pn = message.destination_physicalname logger.info(f"Message ID: {msg} and physical name: {msg_pn}") return {"statusCode": 200, "body": json.dumps("Processing complete")} ``` ``` { "eventSource": "aws:amq", "eventSourceArn": "arn:aws:mq:us-west-2:112556298976:broker:test:b-9bcfa592-423a-4942-879d-eb284b418fc8", "messages": [ { "messageID": "ID:b-9bcfa592-423a-4942-879d-eb284b418fc8-1.mq.us-west-2.amazonaws.com-37557-1234520418293-4:1:1:1:1", "messageType": "jms/text-message", "data": "QUJDOkFBQUE=", "connectionId": "myJMSCoID", "redelivered": false, "destination": { "physicalName": "testQueue" }, "timestamp": 1598827811958, "brokerInTime": 1598827811958, "brokerOutTime": 1598827811959, "properties": { "testKey": "testValue" } }, { "messageID": "ID:b-9bcfa592-423a-4942-879d-eb284b418fc8-1.mq.us-west-2.amazonaws.com-37557-1234520418293-4:1:1:1:1", "messageType": "jms/text-message", "data": "eyJ0aW1lb3V0IjowLCJkYXRhIjoiQ1pybWYwR3c4T3Y0YnFMUXhENEUifQ==", "connectionId": "myJMSCoID2", "redelivered": false, "destination": { "physicalName": "testQueue" }, "timestamp": 1598827811958, "brokerInTime": 1598827811958, "brokerOutTime": 1598827811959, "properties": { "testKey": "testValue" } }, { "messageID": "ID:b-9bcfa592-423a-4942-879d-eb284b418fc8-1.mq.us-west-2.amazonaws.com-37557-1234520418293-4:1:1:1:1", "messageType": "jms/bytes-message", "data": "3DTOOW7crj51prgVLQaGQ82S48k=", "connectionId": "myJMSCoID1", "persistent": false, "destination": { "physicalName": "testQueue" }, "timestamp": 1598827811958, "brokerInTime": 1598827811958, "brokerOutTime": 1598827811959, "properties": { "testKey": "testValue" } } ] } ``` ### API Gateway Authorizer It is used for [API Gateway Rest API Lambda Authorizer payload](https://docs.aws.amazon.com/apigateway/latest/developerguide/apigateway-use-lambda-authorizer.html). Use **`APIGatewayAuthorizerRequestEvent`** for type `REQUEST` and **`APIGatewayAuthorizerTokenEvent`** for type `TOKEN`. 
``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.api_gateway_authorizer_event import ( APIGatewayAuthorizerRequestEvent, APIGatewayAuthorizerResponse, ) @event_source(data_class=APIGatewayAuthorizerRequestEvent) def lambda_handler(event: APIGatewayAuthorizerRequestEvent, context): # Simple auth check (replace with your actual auth logic) is_authorized = event.headers.get("HeaderAuth1") == "headerValue1" if not is_authorized: return {"principalId": "", "policyDocument": {"Version": "2012-10-17", "Statement": []}} arn = event.parsed_arn policy = APIGatewayAuthorizerResponse( principal_id="user", context={"user": "example"}, region=arn.region, aws_account_id=arn.aws_account_id, api_id=arn.api_id, stage=arn.stage, ) policy.allow_all_routes() return policy.asdict() ``` ``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.api_gateway_authorizer_event import ( APIGatewayAuthorizerRequestEvent, APIGatewayAuthorizerResponseWebSocket, ) @event_source(data_class=APIGatewayAuthorizerRequestEvent) def lambda_handler(event: APIGatewayAuthorizerRequestEvent, context): # Simple auth check (replace with your actual auth logic) is_authorized = event.headers.get("HeaderAuth1") == "headerValue1" if not is_authorized: return {"principalId": "", "policyDocument": {"Version": "2012-10-17", "Statement": []}} arn = event.parsed_arn policy = APIGatewayAuthorizerResponseWebSocket( principal_id="user", context={"user": "example"}, region=arn.region, aws_account_id=arn.aws_account_id, api_id=arn.api_id, stage=arn.stage, ) policy.allow_all_routes() return policy.asdict() ``` ``` { "version": "1.0", "type": "REQUEST", "methodArn": "arn:aws:execute-api:us-east-1:123456789012:abcdef123/test/GET/request", "identitySource": "user1,123", "authorizationToken": "user1,123", "resource": "/request", "path": "/request", "httpMethod": "GET", "headers": { "X-AMZ-Date": "20170718T062915Z", "Accept": "*/*", "HeaderAuth1": "headerValue1", "CloudFront-Viewer-Country": "US", "CloudFront-Forwarded-Proto": "https", "CloudFront-Is-Tablet-Viewer": "false", "CloudFront-Is-Mobile-Viewer": "false", "User-Agent": "..." 
}, "multiValueHeaders": { "Header1": [ "value1" ], "Origin": [ "https://aws.amazon.com" ], "Header2": [ "value1", "value2" ] }, "queryStringParameters": { "QueryString1": "queryValue1" }, "pathParameters": {}, "stageVariables": { "StageVar1": "stageValue1" }, "requestContext": { "accountId": "123456789012", "apiId": "abcdef123", "domainName": "3npb9j1tlk.execute-api.us-west-1.amazonaws.com", "domainPrefix": "3npb9j1tlk", "extendedRequestId": "EXqgWgXxSK4EJug=", "httpMethod": "GET", "identity": { "accessKey": null, "accountId": null, "caller": null, "cognitoAmr": null, "cognitoAuthenticationProvider": null, "cognitoAuthenticationType": null, "cognitoIdentityId": null, "cognitoIdentityPoolId": null, "principalOrgId": null, "apiKey": "...", "sourceIp": "test-invoke-source-ip", "user": null, "userAgent": "PostmanRuntime/7.28.3", "userArn": null, "clientCert": { "clientCertPem": "CERT_CONTENT", "subjectDN": "www.example.com", "issuerDN": "Example issuer", "serialNumber": "a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1", "validity": { "notBefore": "May 28 12:30:02 2019 GMT", "notAfter": "Aug 5 09:36:04 2021 GMT" } } }, "path": "/request", "protocol": "HTTP/1.1", "requestId": "EXqgWgXxSK4EJug=", "requestTime": "20/Aug/2021:14:36:50 +0000", "requestTimeEpoch": 1629470210043, "resourceId": "ANY /request", "resourcePath": "/request", "stage": "test" } } ``` ``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.api_gateway_authorizer_event import ( APIGatewayAuthorizerResponse, APIGatewayAuthorizerTokenEvent, ) @event_source(data_class=APIGatewayAuthorizerTokenEvent) def lambda_handler(event: APIGatewayAuthorizerTokenEvent, context): # Simple token check (replace with your actual token validation logic) is_valid_token = event.authorization_token == "allow" if not is_valid_token: return {"principalId": "", "policyDocument": {"Version": "2012-10-17", "Statement": []}} arn = event.parsed_arn policy = APIGatewayAuthorizerResponse( principal_id="user", context={"user": "example"}, region=arn.region, aws_account_id=arn.aws_account_id, api_id=arn.api_id, stage=arn.stage, ) policy.allow_all_routes() return policy.asdict() ``` ``` { "type": "TOKEN", "authorizationToken": "allow", "methodArn": "arn:aws:execute-api:us-west-2:123456789012:ymy8tbxw7b/*/GET/" } ``` ### API Gateway Authorizer V2 It is used for [API Gateway HTTP API Lambda Authorizer payload version 2](https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-lambda-authorizer.html). See also [this blog post](https://aws.amazon.com/blogs/compute/introducing-iam-and-lambda-authorizers-for-amazon-api-gateway-http-apis/) for more details. 
``` from secrets import compare_digest from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.api_gateway_authorizer_event import ( APIGatewayAuthorizerEventV2, APIGatewayAuthorizerResponseV2, ) def get_user_by_token(token): if compare_digest(token, "value"): return {"name": "Foo"} return None @event_source(data_class=APIGatewayAuthorizerEventV2) def lambda_handler(event: APIGatewayAuthorizerEventV2, context): user = get_user_by_token(event.headers.get("Authorization")) if user is None: # No user was found, so we return not authorized return APIGatewayAuthorizerResponseV2(authorize=False).asdict() # Found the user and setting the details in the context response = APIGatewayAuthorizerResponseV2( authorize=True, context=user, ) return response.asdict() ``` ``` { "version": "2.0", "type": "REQUEST", "routeArn": "arn:aws:execute-api:us-east-1:123456789012:abcdef123/test/GET/request", "identitySource": ["user1", "123"], "routeKey": "GET /merchants", "rawPath": "/merchants", "rawQueryString": "parameter1=value1¶meter1=value2¶meter2=value", "cookies": ["cookie1", "cookie2"], "headers": { "x-amzn-trace-id": "Root=1-611cc4a7-0746ebee281cfd967db97b64", "Header1": "value1", "Header2": "value2", "Authorization": "value" }, "queryStringParameters": { "parameter1": "value1,value2", "parameter2": "value" }, "requestContext": { "accountId": "123456789012", "apiId": "api-id", "authentication": { "clientCert": { "clientCertPem": "CERT_CONTENT", "subjectDN": "www.example.com", "issuerDN": "Example issuer", "serialNumber": "a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1", "validity": { "notBefore": "May 28 12:30:02 2019 GMT", "notAfter": "Aug 5 09:36:04 2021 GMT" } } }, "domainName": "id.execute-api.us-east-1.amazonaws.com", "domainPrefix": "id", "http": { "method": "POST", "path": "/merchants", "protocol": "HTTP/1.1", "sourceIp": "10.10.10.10", "userAgent": "agent" }, "requestId": "id", "routeKey": "GET /merchants", "stage": "$default", "time": "12/Mar/2020:19:03:58 +0000", "timeEpoch": 1583348638390 }, "pathParameters": { "parameter1": "value1" }, "stageVariables": { "stageVariable1": "value1", "stageVariable2": "value2" } } ``` ### API Gateway Proxy It is used for either API Gateway REST API or HTTP API using v1 proxy event. ``` from aws_lambda_powertools.utilities.data_classes import APIGatewayProxyEvent, event_source @event_source(data_class=APIGatewayProxyEvent) def lambda_handler(event: APIGatewayProxyEvent, context): if "hello" in event.path and event.http_method == "GET": return {"statusCode": 200, "body": f"Hello from path: {event.path}"} else: return {"statusCode": 400, "body": "No Hello from path"} ``` ``` { "resource": "/helloworld", "path": "/hello", "httpMethod": "GET", "headers": { "Accept": "*/*", "Host": "api.example.com" }, "queryStringParameters": { "name": "John" }, "pathParameters": null, "stageVariables": null, "requestContext": { "requestId": "c6af9ac6-7b61-11e6-9a41-93e8deadbeef", "stage": "prod" }, "body": null, "isBase64Encoded": false } ``` ### API Gateway Proxy V2 It is used for HTTP API using v2 proxy event. 
```
from aws_lambda_powertools.utilities.data_classes import APIGatewayProxyEventV2, event_source


@event_source(data_class=APIGatewayProxyEventV2)
def lambda_handler(event: APIGatewayProxyEventV2, context):
    if "hello" in event.path and event.http_method == "POST":
        return {"statusCode": 200, "body": f"Hello from path: {event.path}"}
    else:
        return {"statusCode": 400, "body": "No Hello from path"}
```

```
{
  "version": "2.0",
  "routeKey": "$default",
  "rawPath": "/my/path",
  "rawQueryString": "parameter1=value1&parameter1=value2&parameter2=value",
  "cookies": ["cookie1", "cookie2"],
  "headers": {
    "Header1": "value1",
    "Header2": "value1,value2"
  },
  "queryStringParameters": {
    "parameter1": "value1,value2",
    "parameter2": "value"
  },
  "requestContext": {
    "accountId": "123456789012",
    "apiId": "api-id",
    "authentication": {
      "clientCert": {
        "clientCertPem": "CERT_CONTENT",
        "subjectDN": "www.example.com",
        "issuerDN": "Example issuer",
        "serialNumber": "a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1",
        "validity": {
          "notBefore": "May 28 12:30:02 2019 GMT",
          "notAfter": "Aug 5 09:36:04 2021 GMT"
        }
      }
    },
    "authorizer": {
      "jwt": {
        "claims": {
          "claim1": "value1",
          "claim2": "value2"
        },
        "scopes": ["scope1", "scope2"]
      }
    },
    "domainName": "id.execute-api.us-east-1.amazonaws.com",
    "domainPrefix": "id",
    "http": {
      "method": "POST",
      "path": "/my/path",
      "protocol": "HTTP/1.1",
      "sourceIp": "192.168.0.1/32",
      "userAgent": "agent"
    },
    "requestId": "id",
    "routeKey": "$default",
    "stage": "$default",
    "time": "12/Mar/2020:19:03:58 +0000",
    "timeEpoch": 1583348638390
  },
  "body": "{\"message\": \"hello world\", \"username\": \"tom\"}",
  "pathParameters": {
    "parameter1": "value1"
  },
  "isBase64Encoded": false,
  "stageVariables": {
    "stageVariable1": "value1",
    "stageVariable2": "value2"
  }
}
```

### Application Load Balancer

It is used for the [Application load balancer](https://docs.aws.amazon.com/elasticloadbalancing/latest/application/introduction.html) event.

```
from aws_lambda_powertools.utilities.data_classes import ALBEvent, event_source


@event_source(data_class=ALBEvent)
def lambda_handler(event: ALBEvent, context):
    if "lambda" in event.path and event.http_method == "GET":
        return {"statusCode": 200, "body": f"Hello from path: {event.path}"}
    else:
        return {"statusCode": 400, "body": "No Hello from path"}
```

```
{
  "requestContext": {
    "elb": {
      "targetGroupArn": "arn:aws:elasticloadbalancing:us-east-2:123456789012:targetgroup/lambda-279XGJDqGZ5rsrHC2Fjr/49e9d65c45c6791a"
    }
  },
  "httpMethod": "GET",
  "path": "/lambda",
  "queryStringParameters": {
    "query": "1234ABCD"
  },
  "headers": {
    "accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8",
    "accept-encoding": "gzip",
    "accept-language": "en-US,en;q=0.9",
    "connection": "keep-alive",
    "host": "lambda-alb-123578498.us-east-2.elb.amazonaws.com",
    "upgrade-insecure-requests": "1",
    "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36",
    "x-amzn-trace-id": "Root=1-5c536348-3d683b8b04734faae651f476",
    "x-forwarded-for": "72.12.164.125",
    "x-forwarded-port": "80",
    "x-forwarded-proto": "http",
    "x-imforwards": "20"
  },
  "body": "Test",
  "isBase64Encoded": false
}
```

### AppSync Authorizer

Used when building an [AWS_LAMBDA Authorization](https://docs.aws.amazon.com/appsync/latest/devguide/security-authz.html#aws-lambda-authorization) with AppSync.
See the blog post [Introducing Lambda authorization for AWS AppSync GraphQL APIs](https://aws.amazon.com/blogs/mobile/appsync-lambda-auth/) or read the Amplify documentation on using [AWS Lambda for authorization](https://docs.amplify.aws/lib/graphqlapi/authz/q/platform/js#aws-lambda) with AppSync. ``` from typing import Dict from aws_lambda_powertools.logging import correlation_paths from aws_lambda_powertools.logging.logger import Logger from aws_lambda_powertools.utilities.data_classes.appsync_authorizer_event import ( AppSyncAuthorizerEvent, AppSyncAuthorizerResponse, ) from aws_lambda_powertools.utilities.data_classes.event_source import event_source logger = Logger() def get_user_by_token(token: str): """Look up a user by token""" ... @logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_AUTHORIZER) @event_source(data_class=AppSyncAuthorizerEvent) def lambda_handler(event: AppSyncAuthorizerEvent, context) -> Dict: user = get_user_by_token(event.authorization_token) if not user: # No user found, return not authorized return AppSyncAuthorizerResponse().asdict() return AppSyncAuthorizerResponse( authorize=True, resolver_context={"id": user.id}, # Only allow admins to delete events deny_fields=None if user.is_admin else ["Mutation.deleteEvent"], ).asdict() ``` ``` { "authorizationToken": "BE9DC5E3-D410-4733-AF76-70178092E681", "requestContext": { "apiId": "giy7kumfmvcqvbedntjwjvagii", "accountId": "254688921111", "requestId": "b80ed838-14c6-4500-b4c3-b694c7bef086", "queryString": "mutation MyNewTask($desc: String!) {\n createTask(description: $desc, owner: \"ccc\", taskStatus: \"cc\", title: \"ccc\") {\n id\n }\n}\n", "operationName": "MyNewTask", "variables": { "desc": "Foo" } } } ``` ### AppSync Resolver Used when building Lambda GraphQL Resolvers with [Amplify GraphQL Transform Library](https://docs.amplify.aws/cli/graphql-transformer/function) (`@function`), and [AppSync Direct Lambda Resolvers](https://aws.amazon.com/blogs/mobile/appsync-direct-lambda/). The example serves as an AppSync resolver for the `locations` field of the `Merchant` type. It uses the `@event_source` decorator to parse the AppSync event, handles pagination and filtering for locations, and demonstrates `AppSyncIdentityCognito`.
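Because `locations` is a field on the `Merchant` type, AppSync also passes the parent object in `event.source`; a minimal sketch of scoping the lookup to that parent is shown here (the `id` key and the `fetch_locations_for_merchant` helper are hypothetical), ahead of the complete example below:

```
from typing import Dict, List

from aws_lambda_powertools.utilities.data_classes import event_source
from aws_lambda_powertools.utilities.data_classes.appsync_resolver_event import AppSyncResolverEvent


def fetch_locations_for_merchant(merchant_id: str) -> List[Dict]:
    # Hypothetical data-access helper; replace with your own storage layer
    return [{"id": "1", "name": "Location 1", "merchantId": merchant_id}]


@event_source(data_class=AppSyncResolverEvent)
def lambda_handler(event: AppSyncResolverEvent, context):
    # event.source carries the parent object being resolved (the Merchant), when present
    merchant_id = (event.source or {}).get("id", "unknown")
    return fetch_locations_for_merchant(merchant_id)
```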
``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.appsync_resolver_event import ( AppSyncIdentityCognito, AppSyncResolverEvent, ) from aws_lambda_powertools.utilities.typing import LambdaContext @event_source(data_class=AppSyncResolverEvent) def lambda_handler(event: AppSyncResolverEvent, context: LambdaContext): # Access the AppSync event details type_name = event.type_name field_name = event.field_name arguments = event.arguments source = event.source print(f"Resolving field '{field_name}' for type '{type_name}'") print(f"Arguments: {arguments}") print(f"Source: {source}") # Check if the identity is Cognito-based if isinstance(event.identity, AppSyncIdentityCognito): user_id = event.identity.sub username = event.identity.username print(f"Request from Cognito user: {username} (ID: {user_id})") else: print("Request is not from a Cognito-authenticated user") if type_name == "Merchant" and field_name == "locations": page = arguments.get("page", 1) size = arguments.get("size", 10) name_filter = arguments.get("name") # Here you would typically fetch locations from a database # This is a mock implementation locations = [ {"id": "1", "name": "Location 1", "address": "123 Main St"}, {"id": "2", "name": "Location 2", "address": "456 Elm St"}, {"id": "3", "name": "Location 3", "address": "789 Oak St"}, ] # Apply name filter if provided if name_filter: locations = [loc for loc in locations if name_filter.lower() in loc["name"].lower()] # Apply pagination start = (page - 1) * size end = start + size paginated_locations = locations[start:end] return { "items": paginated_locations, "totalCount": len(locations), "nextToken": str(page + 1) if end < len(locations) else None, } else: raise Exception(f"Unhandled field: {field_name} for type: {type_name}") ``` ``` { "typeName": "Merchant", "fieldName": "locations", "arguments": { "page": 2, "size": 1, "name": "value" }, "identity": { "claims": { "sub": "07920713-4526-4642-9c88-2953512de441", "iss": "https://cognito-idp.us-east-1.amazonaws.com/us-east-1_POOL_ID", "aud": "58rc9bf5kkti90ctmvioppukm9", "event_id": "7f4c9383-abf6-48b7-b821-91643968b755", "token_use": "id", "auth_time": 1615366261, "name": "Michael Brewer", "exp": 1615369861, "iat": 1615366261 }, "defaultAuthStrategy": "ALLOW", "groups": null, "issuer": "https://cognito-idp.us-east-1.amazonaws.com/us-east-1_POOL_ID", "sourceIp": [ "11.215.2.22" ], "sub": "07920713-4526-4642-9c88-2953512de441", "username": "mike" }, "source": { "name": "Value", "nested": { "name": "value", "list": [] } }, "request": { "headers": { "x-forwarded-for": "11.215.2.22, 64.44.173.11", "cloudfront-viewer-country": "US", "cloudfront-is-tablet-viewer": "false", "via": "2.0 SOMETHING.cloudfront.net (CloudFront)", "cloudfront-forwarded-proto": "https", "origin": "https://console.aws.amazon.com", "content-length": "156", "accept-language": "en-US,en;q=0.9", "host": "SOMETHING.appsync-api.us-east-1.amazonaws.com", "x-forwarded-proto": "https", "sec-gpc": "1", "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) etc.", "accept": "*/*", "cloudfront-is-mobile-viewer": "false", "cloudfront-is-smarttv-viewer": "false", "accept-encoding": "gzip, deflate, br", "referer": "https://console.aws.amazon.com/", "content-type": "application/json", "sec-fetch-mode": "cors", "x-amz-cf-id": "Fo5VIuvP6V6anIEt62WzFDCK45mzM4yEdpt5BYxOl9OFqafd-WR0cA==", "x-amzn-trace-id": "Root=1-60488877-0b0c4e6727ab2a1c545babd0", "authorization": "AUTH-HEADER", "sec-fetch-dest": 
"empty", "x-amz-user-agent": "AWS-Console-AppSync/", "cloudfront-is-desktop-viewer": "true", "sec-fetch-site": "cross-site", "x-forwarded-port": "443" } }, "prev": { "result": {} } } ``` ### AWS Config Rule The example utilizes AWSConfigRuleEvent to parse the incoming event. The function logs the message type of the invoking event and returns a simple success response. The example event receives a Scheduled Event Notification, but could also be ItemChanged and Oversized. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import ( AWSConfigRuleEvent, event_source, ) logger = Logger() @event_source(data_class=AWSConfigRuleEvent) def lambda_handler(event: AWSConfigRuleEvent, context): message_type = event.invoking_event.message_type logger.info(f"Logging {message_type} event rule", invoke_event=event.raw_invoking_event) return {"Success": "OK"} ``` ``` { "version":"1.0", "invokingEvent":"{\"awsAccountId\":\"0123456789012\",\"notificationCreationTime\":\"2023-04-27T13:26:17.741Z\",\"messageType\":\"ScheduledNotification\",\"recordVersion\":\"1.0\"}", "ruleParameters":"{\"test\":\"x\"}", "resultToken":"eyJlbmNyeXB0ZWREYXRhIjpbLTQyLDEyNiw1MiwtMzcsLTI5LDExNCwxMjYsLTk3LDcxLDIyLC0xMTAsMTEyLC0zMSwtOTMsLTQ5LC0xMDEsODIsMyw1NCw0OSwzLC02OSwtNzEsLTcyLDYyLDgxLC03MiwtODEsNTAsMzUsLTUwLC03NSwtMTE4LC0xMTgsNzcsMTIsLTEsMTQsMTIwLC03MCwxMTAsLTMsNTAsLTYwLDEwNSwtNTcsNDUsMTAyLC0xMDksLTYxLC0xMDEsLTYxLDQsNDcsLTg0LC0yNSwxMTIsNTQsLTcxLC0xMDksNDUsMTksMTIzLC0yNiwxMiwtOTYsLTczLDU0LC0xMDksOTIsNDgsLTU5LC04MywtMzIsODIsLTM2LC05MCwxOSw5OCw3Nyw3OCw0MCw4MCw3OCwtMTA1LDg3LC0xMTMsLTExNiwtNzIsMzAsLTY4LC00MCwtODksMTA5LC0xMDgsLTEyOCwyMiw3Miw3NywtMjEsNzYsODksOTQsLTU5LDgxLC0xMjEsLTEwNywtNjcsNjMsLTcsODIsLTg5LC00NiwtMzQsLTkyLDEyMiwtOTAsMTcsLTEyMywyMCwtODUsLTU5LC03MCw4MSwyNyw2Miw3NCwtODAsODAsMzcsNDAsMTE2LDkxLC0yNCw1MSwtNDEsLTc5LDI4LDEyMCw1MywtMTIyLC04MywxMjYsLTc4LDI1LC05OCwtMzYsMTMsMzIsODYsLTI1LDQ4LDMsLTEwMiwtMTYsMjQsLTMsODUsNDQsLTI4LDE0LDIyLDI3LC0xMjIsMTE4LDEwMSw3Myw1LDE4LDU4LC02NCwyMywtODYsLTExNCwyNCwwLDEwMCwyLDExNywtNjIsLTExOSwtMTI4LDE4LDY1LDkwLDE0LC0xMDIsMjEsODUsMTAwLDExNyw1NSwyOSwxMjcsNTQsNzcsNzIsNzQsMzIsNzgsMywtMTExLDExOCwtNzAsLTg2LDEyNywtNzQsNjAsMjIsNDgsMzcsODcsMTMsMCwtMTA1LDUsLTEyMiwtNzEsLTEwMCwxMDQsLTEyNiwtMTYsNzksLTMwLDEyMCw3NywtNzYsLTQxLC0xMDksMiw5NywtMTAxLC0xLDE1LDEyMywxMTksMTA4LDkxLC0yMCwtMTI1LC05NiwyLC05MiwtMTUsLTY3LC03NiwxMjEsMTA0LDEwNSw2NCwtNjIsMTAyLDgsNCwxMjEsLTQ1LC04MCwtODEsLTgsMTE4LDQ0LC04MiwtNDEsLTg0LDczLC0zNiwxMTcsODAsLTY5LC03MywxNCwtMTgsNzIsMzEsLTUsLTExMSwtMTI3LC00MywzNCwtOCw1NywxMDMsLTQyLDE4LC0zMywxMTcsLTI2LC0xMjQsLTEyNCwxNSw4OCwyMywxNiwtNTcsNTQsLTYsLTEwMiwxMTYsLTk5LC00NSwxMDAsLTM1LDg3LDM3LDYsOTgsMiwxMTIsNjAsLTMzLDE3LDI2LDk5LC0xMDUsNDgsLTEwNCwtMTE5LDc4LDYsLTU4LDk1LDksNDEsLTE2LDk2LDQxLC0yMiw5Niw3MiwxMTYsLTk1LC0xMDUsLTM2LC0xMjMsLTU1LDkxLC00NiwtNywtOTIsMzksNDUsODQsMTYsLTEyNCwtMTIyLC02OCwxLC0yOCwxMjIsLTYwLDgyLDEwMywtNTQsLTkyLDI3LC05OSwtMTI4LDY1LDcsLTcyLC0xMjcsNjIsLTIyLDIsLTExLDE4LC04OSwtMTA2LC03NCw3MSw4NiwtMTE2LC0yNSwtMTE1LC05Niw1NywtMzQsMjIsLTEyNCwtMTI1LC00LC00MSw0MiwtNTcsLTEwMyw0NSw3OCwxNCwtMTA2LDExMSw5OCwtOTQsLTcxLDUsNzUsMTksLTEyNCwtMzAsMzQsLTUwLDc1LC04NCwtNTAsLTU2LDUxLC0xNSwtMzYsNjEsLTk0LC03OSwtNDUsMTI2LC03NywtMTA1LC0yLC05MywtNiw4LC0zLDYsLTQyLDQ2LDEyNSw1LC05OCwxMyw2NywtMTAsLTEzLC05NCwtNzgsLTEyNywxMjEsLTI2LC04LC0xMDEsLTkxLDEyMSwtNDAsLTEyNCwtNjQsODQsLTcyLDYzLDE5LC04NF0sIm1hdGVyaWFsU2V0U2VyaWFsTnVtYmVyIjoxLCJpdlBhcmFtZXRlclNwZWMiOnsiaXYiOlszLC0xMCwtODUsMTE0LC05MCwxMTUsNzcsNTUsNTQsMTUsMzgsODQsLTExNiwxNCwtNDAsMjhdfX0=", "eventLeftScope":false, 
"executionRoleArn":"arn:aws:iam::0123456789012:role/aws-service-role/config.amazonaws.com/AWSServiceRoleForConfig", "configRuleArn":"arn:aws:config:us-east-1:0123456789012:config-rule/config-rule-pdmyw1", "configRuleName":"rule-ec2-test", "configRuleId":"config-rule-pdmyw1", "accountId":"0123456789012", "evaluationMode":"DETECTIVE" } ``` ### Bedrock Agent The example handles [Bedrock Agent event](https://aws.amazon.com/bedrock/agents/) with `BedrockAgentEvent` to parse the incoming event. The function logs the action group and input text, then returns a structured response compatible with Bedrock Agent's expected format, including a mock response body. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import BedrockAgentEvent, event_source logger = Logger() @event_source(data_class=BedrockAgentEvent) def lambda_handler(event: BedrockAgentEvent, context) -> dict: input_text = event.input_text logger.info(f"Bedrock Agent {event.action_group} invoked with input", input_text=input_text) return { "message_version": "1.0", "responses": [ { "action_group": event.action_group, "api_path": event.api_path, "http_method": event.http_method, "http_status_code": 200, "response_body": {"application/json": {"body": "This is the response"}}, }, ], } ``` ``` { "actionGroup": "ClaimManagementActionGroup", "messageVersion": "1.0", "sessionId": "12345678912345", "sessionAttributes": {}, "promptSessionAttributes": {}, "inputText": "I want to claim my insurance", "agent": { "alias": "TSTALIASID", "name": "test", "version": "DRAFT", "id": "8ZXY0W8P1H" }, "httpMethod": "GET", "apiPath": "/claims" } ``` ### CloudFormation Custom Resource The example focuses on the `Create` request type, generating a unique physical resource ID and logging the process. The function is structured to potentially handle `Update` and `Delete` operations as well. 
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import ( CloudFormationCustomResourceEvent, event_source, ) from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @event_source(data_class=CloudFormationCustomResourceEvent) def lambda_handler(event: CloudFormationCustomResourceEvent, context: LambdaContext): request_type = event.request_type if request_type == "Create": return on_create(event, context) else: raise ValueError(f"Invalid request type: {request_type}") def on_create(event: CloudFormationCustomResourceEvent, context: LambdaContext): props = event.resource_properties logger.info(f"Create new resource with props {props}.") physical_id = f"MyResource-{context.aws_request_id}" return {"PhysicalResourceId": physical_id, "Data": {"Message": "Resource created successfully"}} ``` ``` { "RequestType": "Create", "ServiceToken": "arn:aws:lambda:us-east-1:xxx:function:xxxx-CrbuiltinfunctionidProvi-2vKAalSppmKe", "ResponseURL": "https://cloudformation-custom-resource-response-useast1.s3.amazonaws.com/7F%7Cb1f50fdfc25f3b", "StackId": "arn:aws:cloudformation:us-east-1:xxxx:stack/xxxx/271845b0-f2e8-11ed-90ac-0eeb25b8ae21", "RequestId": "xxxxx-d2a0-4dfb-ab1f-xxxxxx", "LogicalResourceId": "xxxxxxxxx", "ResourceType": "Custom::MyType", "ResourceProperties": { "ServiceToken": "arn:aws:lambda:us-east-1:xxxxx:function:xxxxx", "MyProps": "ss" } } ``` ### CloudWatch Dashboard Custom Widget The example for `CloudWatchDashboardCustomWidgetEvent` logs the dashboard name, extracts key information like widget ID and time range, and returns a formatted response with a title and markdown content. Read more about [custom widgets for CloudWatch dashboards](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/add_custom_widget_samples.html).
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import CloudWatchDashboardCustomWidgetEvent, event_source logger = Logger() @event_source(data_class=CloudWatchDashboardCustomWidgetEvent) def lambda_handler(event: CloudWatchDashboardCustomWidgetEvent, context): if event.widget_context is None: logger.warning("No widget context provided") return {"title": "Error", "markdown": "Widget context is missing"} logger.info(f"Processing custom widget for dashboard: {event.widget_context.dashboard_name}") # Access specific event properties widget_id = event.widget_context.widget_id time_range = event.widget_context.time_range if time_range is None: logger.warning("No time range provided") return {"title": f"Custom Widget {widget_id}", "markdown": "Time range is missing"} # Your custom widget logic here return { "title": f"Custom Widget {widget_id}", "markdown": f""" Dashboard: {event.widget_context.dashboard_name} Time Range: {time_range.start} to {time_range.end} Theme: {event.widget_context.theme or "default"} """, } ``` ``` { "original": "param-to-widget", "widgetContext": { "dashboardName": "Name-of-current-dashboard", "widgetId": "widget-16", "domain": "https://us-east-1.console.aws.amazon.com", "accountId": "123456789123", "locale": "en", "timezone": { "label": "UTC", "offsetISO": "+00:00", "offsetInMinutes": 0 }, "period": 300, "isAutoPeriod": true, "timeRange": { "mode": "relative", "start": 1627236199729, "end": 1627322599729, "relativeStart": 86400012, "zoom": { "start": 1627276030434, "end": 1627282956521 } }, "theme": "light", "linkCharts": true, "title": "Tweets for Amazon website problem", "forms": { "all": {} }, "params": { "original": "param-to-widget" }, "width": 588, "height": 369 } } ``` ### CloudWatch Alarm State Change Action [CloudWatch supports Lambda as an alarm state change action](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/AlarmThatSendsEmail.html#alarms-and-actions). You can use the `CloudWatchAlarmEvent` data class to access fields such as the alarm information, current state, and previous state. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import CloudWatchAlarmEvent, event_source from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @event_source(data_class=CloudWatchAlarmEvent) def lambda_handler(event: CloudWatchAlarmEvent, context: LambdaContext) -> dict: logger.info(f"Alarm {event.alarm_data.alarm_name} state is {event.alarm_data.state.value}") # You can now work with event. For example, you can enrich the received data, and # decide on how you want to route the alarm.
return { "name": event.alarm_data.alarm_name, "arn": event.alarm_arn, "urgent": "Priority: P1" in (event.alarm_data.configuration.description or ""), } ``` ``` { "source": "aws.cloudwatch", "alarmArn": "arn:aws:cloudwatch:eu-west-1:912397435824:alarm:test_alarm", "accountId": "123456789012", "time": "2024-02-17T11:53:08.431+0000", "region": "eu-west-1", "alarmData": { "alarmName": "Test alert", "state": { "value": "ALARM", "reason": "Threshold Crossed: 1 out of the last 1 datapoints [1.0 (17/02/24 11:51:00)] was less than the threshold (10.0) (minimum 1 datapoint for OK -> ALARM transition).", "reasonData": "{\"version\":\"1.0\",\"queryDate\":\"2024-02-17T11:53:08.423+0000\",\"startDate\":\"2024-02-17T11:51:00.000+0000\",\"statistic\":\"SampleCount\",\"period\":60,\"recentDatapoints\":[1.0],\"threshold\":10.0,\"evaluatedDatapoints\":[{\"timestamp\":\"2024-02-17T11:51:00.000+0000\",\"sampleCount\":1.0,\"value\":1.0}]}", "timestamp": "2024-02-17T11:53:08.431+0000" }, "previousState": { "value": "OK", "reason": "Threshold Crossed: 1 out of the last 1 datapoints [1.0 (17/02/24 11:50:00)] was not greater than the threshold (10.0) (minimum 1 datapoint for ALARM -> OK transition).", "reasonData": "{\"version\":\"1.0\",\"queryDate\":\"2024-02-17T11:51:31.460+0000\",\"startDate\":\"2024-02-17T11:50:00.000+0000\",\"statistic\":\"SampleCount\",\"period\":60,\"recentDatapoints\":[1.0],\"threshold\":10.0,\"evaluatedDatapoints\":[{\"timestamp\":\"2024-02-17T11:50:00.000+0000\",\"sampleCount\":1.0,\"value\":1.0}]}", "timestamp": "2024-02-17T11:51:31.462+0000" }, "configuration": { "description": "This is description **here**", "metrics": [ { "id": "e1", "expression": "m1/m2", "label": "Expression1", "returnData": true }, { "id": "m1", "metricStat": { "metric": { "namespace": "AWS/Lambda", "name": "Invocations", "dimensions": {} }, "period": 60, "stat": "SampleCount" }, "returnData": false }, { "id": "m2", "metricStat": { "metric": { "namespace": "AWS/Lambda", "name": "Duration", "dimensions": {} }, "period": 60, "stat": "SampleCount" }, "returnData": false } ] } } } ``` ### CloudWatch Logs CloudWatch Logs events by default are compressed and base64 encoded. You can use the helper function provided to decode, decompress and parse json data from the event. 
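For context on what the helper does, the `awslogs.data` field is a base64-encoded, gzip-compressed JSON document. A stdlib-only sketch of the same decoding steps, shown purely for illustration (the `parse_logs_data()` helper in the example below wraps them and returns a typed `CloudWatchLogsDecodedData`):

```
import base64
import gzip
import json


def decode_cloudwatch_logs_payload(event: dict) -> dict:
    # awslogs.data -> base64 decode -> gzip decompress -> JSON parse
    compressed = base64.b64decode(event["awslogs"]["data"])
    payload = json.loads(gzip.decompress(compressed))
    # The decoded payload carries logGroup, logStream, subscriptionFilters and logEvents
    return payload
```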
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import CloudWatchLogsEvent, event_source from aws_lambda_powertools.utilities.data_classes.cloud_watch_logs_event import CloudWatchLogsDecodedData logger = Logger() @event_source(data_class=CloudWatchLogsEvent) def lambda_handler(event: CloudWatchLogsEvent, context): decompressed_log: CloudWatchLogsDecodedData = event.parse_logs_data() logger.info(f"Log group: {decompressed_log.log_group}") logger.info(f"Log stream: {decompressed_log.log_stream}") for log_event in decompressed_log.log_events: logger.info(f"Timestamp: {log_event.timestamp}, Message: {log_event.message}") return {"statusCode": 200, "body": f"Processed {len(decompressed_log.log_events)} log events"} ``` ``` { "awslogs": { "data": "H4sIAAAAAAAAAHWPwQqCQBCGX0Xm7EFtK+smZBEUgXoLCdMhFtKV3akI8d0bLYmibvPPN3wz00CJxmQnTO41whwWQRIctmEcB6sQbFC3CjW3XW8kxpOpP+OC22d1Wml1qZkQGtoMsScxaczKN3plG8zlaHIta5KqWsozoTYw3/djzwhpLwivWFGHGpAFe7DL68JlBUk+l7KSN7tCOEJ4M3/qOI49vMHj+zCKdlFqLaU2ZHV2a4Ct/an0/ivdX8oYc1UVX860fQDQiMdxRQEAAA==" } } ``` #### Kinesis integration [When streaming CloudWatch Logs to a Kinesis Data Stream](https://aws.amazon.com/premiumsupport/knowledge-center/streaming-cloudwatch-logs/) (cross-account or not), you can use `extract_cloudwatch_logs_from_event` to decode, decompress and extract logs as `CloudWatchLogsDecodedData` to ease log processing. ``` from typing import List from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.cloud_watch_logs_event import CloudWatchLogsDecodedData from aws_lambda_powertools.utilities.data_classes.kinesis_stream_event import ( KinesisStreamEvent, extract_cloudwatch_logs_from_event, ) @event_source(data_class=KinesisStreamEvent) def lambda_handler(event: KinesisStreamEvent, context): logs: List[CloudWatchLogsDecodedData] = extract_cloudwatch_logs_from_event(event) for log in logs: if log.message_type == "DATA_MESSAGE": return "success" return "nothing to be processed" ``` ``` { "Records": [ { "kinesis": { "kinesisSchemaVersion": "1.0", "partitionKey": "da10bf66b1f54bff5d96eae99149ad1f", "sequenceNumber": "49635052289529725553291405521504870233219489715332317186", "data": "H4sIAAAAAAAAAK2Sa2vbMBSG/4ox+xg3Oror39IlvaztVmJv7WjCUGwl8+ZLZstts5L/vuOsZYUyWGEgJHiP9J7nvOghLF3b2rVLthsXjsLJOBl/uZjG8fh4Gg7C+q5yDcqUAWcSONHEoFzU6+Om7jZYGdq7dljYcpnZ4cZHwLWOJl1Zbs/r9cR6e9RVqc/rKlpXV9eXt+fy27vt8W+L2DfOlr07oXQIMAQyvHlzPk6mcbKgciktF5lQfMU5dZZqzrShLF2uFC60aLtlmzb5prc/ygvvmjYc3YRPFG+LusuurE+/Ikqb1Gd55dq8jV+8isT6+317Rk42J5PTcLFnm966yvd2D2GeISJTYIwCJSQ1BE9OtWZCABWaKMIJAMdDMyU5MYZLhmkxBhQxfY4Re1tiWiAlBsgIVQTE4Cl6tI+T8SwJZu5Hh1dPs1FApOMSDI9WVKmIC+4irTMWQZYpx7QkztrgE06MU4yCx9DmVbgbvABmQJTGtkYAB0NwEwyYQUBpqEFuSbkGrThTRKi/AlP+HHj6fvJa3P9Ap/+Rbja9/PD6POd+0jXW7xM1B8CDsp37w7woXBb8qQDZ6xeurJttEOc/HWpUBxeHKNr74LHwsXXYlsm9flrl/rmFIQeS7m3m1fVs/DlIGpu6nhMiyWQGXNKIMbcCIgkhElKbaZnZpYJUz33s1iV+z/6+StMlR3yphHNcCyxiNEXf2zed6xuEu8XuF2wb6krnAwAA", "approximateArrivalTimestamp": 1668093033.744 }, "eventSource": "aws:kinesis", "eventVersion": "1.0", "eventID": "shardId-000000000000:49635052289529725553291405521504870233219489715332317186", "eventName": "aws:kinesis:record", "invokeIdentityArn": "arn:aws:iam::231436140809:role/pt-1488-CloudWatchKinesisLogsFunctionRole-1M4G2TIWIE49", "awsRegion": "eu-west-1", "eventSourceARN": "arn:aws:kinesis:eu-west-1:231436140809:stream/pt-1488-KinesisStreamCloudWatchLogs-D8tHs0im0aJG" }, { "kinesis": { "kinesisSchemaVersion": "1.0", "partitionKey": 
"cf4c4c2c9a49bdfaf58d7dbbc2b06081", "sequenceNumber": "49635052289529725553291405520881064510298312199003701250", "data": "H4sIAAAAAAAAAK2SW2/TQBCF/4pl8ViTvc7u5i0laVraQhUbWtREaG1PgsGXYK/bhqr/nXVoBRIgUYnXc2bPfHO092GFXWc3mOy2GI7D6SSZfDyfxfFkPgsPwua2xtbLjFPBgQqiifFy2WzmbdNvvTOyt92otFWa29HWRVRoHU37qtqdNZupdfaorzNXNHW0qS+vLm7O4PPr3fxHROxatNWQThgbUTqiZHT94mySzOJkBUqYLOWY8ZQLbaTRkEvDciUYzWzKfETXp13WFtsh/qgoHbZdOL4OnyhelU2fX1qXffIoXdKcFjV2RRf/9iqSmy933Sk53h5PT8LVnm12g7Ub4u7DIveIXFFjFNGUKUlAaMY0EUJKLjkQbxhKGCWeknMKoAGUkYoJ7TFd4St2tvJtDRYxDAg3VB08Ve/j42SySIIFfu396Ek+DkS+xkwAiYhM00isgUV6jXmEMrM5EmMsh+C9v9hfMQ4eS1vW4cPBH4CZVpoTJkEIAp5RUMo8vGFae3JNCCdUccMVgPw7sP4VePZm+lzc/0AH/0i3mF28fX6fSzftW+v2jZKXRgVVt3SHRVliHvx06F4+x6ppd0FcfEMvMR2cH3rR3gWPxrsO/Vau9vqyvlpMPgRJazMcYGgEHHLKBhLGJaBA0JLxNc0JppoS9Cwxbir/B4d5QDBAQSnfFFGp8aa/vxw2uLbHYUH4sHr4Dj5RJxfMAwAA", "approximateArrivalTimestamp": 1668092612.992 }, "eventSource": "aws:kinesis", "eventVersion": "1.0", "eventID": "shardId-000000000000:49635052289529725553291405520881064510298312199003701250", "eventName": "aws:kinesis:record", "invokeIdentityArn": "arn:aws:iam::231436140809:role/pt-1488-CloudWatchKinesisLogsFunctionRole-1M4G2TIWIE49", "awsRegion": "eu-west-1", "eventSourceARN": "arn:aws:kinesis:eu-west-1:231436140809:stream/pt-1488-KinesisStreamCloudWatchLogs-D8tHs0im0aJG" } ] } ``` Alternatively, you can use `extract_cloudwatch_logs_from_record` to seamless integrate with the [Batch utility](../batch/) for more robust log processing. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.batch import ( BatchProcessor, EventType, process_partial_response, ) from aws_lambda_powertools.utilities.data_classes.kinesis_stream_event import ( KinesisStreamRecord, extract_cloudwatch_logs_from_record, ) logger = Logger() processor = BatchProcessor(event_type=EventType.KinesisDataStreams) def record_handler(record: KinesisStreamRecord): log = extract_cloudwatch_logs_from_record(record) logger.info(f"Message type: {log.message_type}") return log.message_type == "DATA_MESSAGE" def lambda_handler(event, context): return process_partial_response( event=event, record_handler=record_handler, processor=processor, context=context, ) ``` ``` { "Records": [ { "kinesis": { "kinesisSchemaVersion": "1.0", "partitionKey": "da10bf66b1f54bff5d96eae99149ad1f", "sequenceNumber": "49635052289529725553291405521504870233219489715332317186", "data": "H4sIAAAAAAAAAK2Sa2vbMBSG/4ox+xg3Oror39IlvaztVmJv7WjCUGwl8+ZLZstts5L/vuOsZYUyWGEgJHiP9J7nvOghLF3b2rVLthsXjsLJOBl/uZjG8fh4Gg7C+q5yDcqUAWcSONHEoFzU6+Om7jZYGdq7dljYcpnZ4cZHwLWOJl1Zbs/r9cR6e9RVqc/rKlpXV9eXt+fy27vt8W+L2DfOlr07oXQIMAQyvHlzPk6mcbKgciktF5lQfMU5dZZqzrShLF2uFC60aLtlmzb5prc/ygvvmjYc3YRPFG+LusuurE+/Ikqb1Gd55dq8jV+8isT6+317Rk42J5PTcLFnm966yvd2D2GeISJTYIwCJSQ1BE9OtWZCABWaKMIJAMdDMyU5MYZLhmkxBhQxfY4Re1tiWiAlBsgIVQTE4Cl6tI+T8SwJZu5Hh1dPs1FApOMSDI9WVKmIC+4irTMWQZYpx7QkztrgE06MU4yCx9DmVbgbvABmQJTGtkYAB0NwEwyYQUBpqEFuSbkGrThTRKi/AlP+HHj6fvJa3P9Ap/+Rbja9/PD6POd+0jXW7xM1B8CDsp37w7woXBb8qQDZ6xeurJttEOc/HWpUBxeHKNr74LHwsXXYlsm9flrl/rmFIQeS7m3m1fVs/DlIGpu6nhMiyWQGXNKIMbcCIgkhElKbaZnZpYJUz33s1iV+z/6+StMlR3yphHNcCyxiNEXf2zed6xuEu8XuF2wb6krnAwAA", "approximateArrivalTimestamp": 1668093033.744 }, "eventSource": "aws:kinesis", "eventVersion": "1.0", "eventID": "shardId-000000000000:49635052289529725553291405521504870233219489715332317186", "eventName": "aws:kinesis:record", "invokeIdentityArn": "arn:aws:iam::231436140809:role/pt-1488-CloudWatchKinesisLogsFunctionRole-1M4G2TIWIE49", "awsRegion": "eu-west-1", "eventSourceARN": 
"arn:aws:kinesis:eu-west-1:231436140809:stream/pt-1488-KinesisStreamCloudWatchLogs-D8tHs0im0aJG" }, { "kinesis": { "kinesisSchemaVersion": "1.0", "partitionKey": "cf4c4c2c9a49bdfaf58d7dbbc2b06081", "sequenceNumber": "49635052289529725553291405520881064510298312199003701250", "data": "H4sIAAAAAAAAAK2SW2/TQBCF/4pl8ViTvc7u5i0laVraQhUbWtREaG1PgsGXYK/bhqr/nXVoBRIgUYnXc2bPfHO092GFXWc3mOy2GI7D6SSZfDyfxfFkPgsPwua2xtbLjFPBgQqiifFy2WzmbdNvvTOyt92otFWa29HWRVRoHU37qtqdNZupdfaorzNXNHW0qS+vLm7O4PPr3fxHROxatNWQThgbUTqiZHT94mySzOJkBUqYLOWY8ZQLbaTRkEvDciUYzWzKfETXp13WFtsh/qgoHbZdOL4OnyhelU2fX1qXffIoXdKcFjV2RRf/9iqSmy933Sk53h5PT8LVnm12g7Ub4u7DIveIXFFjFNGUKUlAaMY0EUJKLjkQbxhKGCWeknMKoAGUkYoJ7TFd4St2tvJtDRYxDAg3VB08Ve/j42SySIIFfu396Ek+DkS+xkwAiYhM00isgUV6jXmEMrM5EmMsh+C9v9hfMQ4eS1vW4cPBH4CZVpoTJkEIAp5RUMo8vGFae3JNCCdUccMVgPw7sP4VePZm+lzc/0AH/0i3mF28fX6fSzftW+v2jZKXRgVVt3SHRVliHvx06F4+x6ppd0FcfEMvMR2cH3rR3gWPxrsO/Vau9vqyvlpMPgRJazMcYGgEHHLKBhLGJaBA0JLxNc0JppoS9Cwxbir/B4d5QDBAQSnfFFGp8aa/vxw2uLbHYUH4sHr4Dj5RJxfMAwAA", "approximateArrivalTimestamp": 1668092612.992 }, "eventSource": "aws:kinesis", "eventVersion": "1.0", "eventID": "shardId-000000000000:49635052289529725553291405520881064510298312199003701250", "eventName": "aws:kinesis:record", "invokeIdentityArn": "arn:aws:iam::231436140809:role/pt-1488-CloudWatchKinesisLogsFunctionRole-1M4G2TIWIE49", "awsRegion": "eu-west-1", "eventSourceARN": "arn:aws:kinesis:eu-west-1:231436140809:stream/pt-1488-KinesisStreamCloudWatchLogs-D8tHs0im0aJG" } ] } ``` ### CodeDeploy LifeCycle Hook CodeDeploy triggers Lambdas with this event when defined in [AppSpec definitions](https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file-structure-hooks.html) to test applications at different stages of deployment. ``` from aws_lambda_powertools.utilities.data_classes import CodeDeployLifecycleHookEvent, event_source @event_source(data_class=CodeDeployLifecycleHookEvent) def lambda_handler(event: CodeDeployLifecycleHookEvent, context): deployment_id = event.deployment_id lifecycle_event_hook_execution_id = event.lifecycle_event_hook_execution_id return {"deployment_id": deployment_id, "lifecycle_event_hook_execution_id": lifecycle_event_hook_execution_id} ``` ``` { "DeploymentId": "d-ABCDEF", "LifecycleEventHookExecutionId": "xxxxxxxxxxxxxxxxxxxxxxxx" } ``` ### CodePipeline Job Data classes and utility functions to help create continuous delivery pipelines tasks with AWS Lambda. 
``` from aws_lambda_powertools.utilities.data_classes import CodePipelineJobEvent, event_source @event_source(data_class=CodePipelineJobEvent) def lambda_handler(event: CodePipelineJobEvent, context): job_id = event.get_id input_bucket = event.input_bucket_name return {"statusCode": 200, "body": f"Processed job {job_id} from bucket {input_bucket}"} ``` ``` { "CodePipeline.job": { "id": "11111111-abcd-1111-abcd-111111abcdef", "accountId": "111111111111", "data": { "actionConfiguration": { "configuration": { "FunctionName": "MyLambdaFunctionForAWSCodePipeline", "UserParameters": "some-input-such-as-a-URL" } }, "inputArtifacts": [ { "name": "ArtifactName", "revision": null, "location": { "type": "S3", "s3Location": { "bucketName": "the name of the bucket configured as the pipeline artifact store in Amazon S3, for example codepipeline-us-east-2-1234567890", "objectKey": "the name of the application, for example CodePipelineDemoApplication.zip" } } } ], "outputArtifacts": [], "artifactCredentials": { "accessKeyId": "AKIAIOSFODNN7EXAMPLE", "secretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY", "sessionToken": "MIICiTCCAfICCQD6m7oRw0uXOjANBgkqhkiG9w0BAQUFADCBiDELMAkGA1UEBhMCVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZWF0dGxlMQ8wDQYDVQQKEwZBbWF6b24xFDASBgNVBAsTC0lBTSBDb25zb2xlMRIwEAYDVQQDEwlUZXN0Q2lsYWMxHzAdBgkqhkiG9w0BCQEWEG5vb25lQGFtYXpvbi5jb20wHhcNMTEwNDI1MjA0NTIxWhcNMTIwNDI0MjA0NTIxWjCBiDELMAkGA1UEBhMCVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZWF0dGxlMQ8wDQYDVQQKEwZBbWF6b24xFDASBgNVBAsTC0lBTSBDb25zb2xlMRIwEAYDVQQDEwlUZXN0Q2lsYWMxHzAdBgkqhkiG9w0BCQEWEG5vb25lQGFtYXpvbi5jb20wgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJAoGBAMaK0dn+a4GmWIWJ21uUSfwfEvySWtC2XADZ4nB+BLYgVIk60CpiwsZ3G93vUEIO3IyNoH/f0wYK8m9TrDHudUZg3qX4waLG5M43q7Wgc/MbQITxOUSQv7c7ugFFDzQGBzZswY6786m86gpEIbb3OhjZnzcvQAaRHhdlQWIMm2nrAgMBAAEwDQYJKoZIhvcNAQEFBQADgYEAtCu4nUhVVxYUntneD9+h8Mg9q6q+auNKyExzyLwaxlAoo7TJHidbtS4J5iNmZgXL0FkbFFBjvSfpJIlJ00zbhNYS5f6GuoEDmFJl0ZxBHjJnyp378OD8uTs7fLvjx79LjSTbNYiytVbZPQUQ5Yaxu2jXnimvw3rrszlaEXAMPLE=" }, "continuationToken": "A continuation token if continuing job" } } } ``` ### Cognito User Pool Cognito User Pools have several [different Lambda trigger sources](https://docs.aws.amazon.com/cognito/latest/developerguide/cognito-user-identity-pools-working-with-aws-lambda-triggers.html#cognito-user-identity-pools-working-with-aws-lambda-trigger-sources), all of which map to a different data class, which can be imported from `aws_lambda_powertools.data_classes.cognito_user_pool_event`: | Trigger/Event Source | Data Class | | --- | --- | | Custom message event | `data_classes.cognito_user_pool_event.CustomMessageTriggerEvent` | | Post authentication | `data_classes.cognito_user_pool_event.PostAuthenticationTriggerEvent` | | Post confirmation | `data_classes.cognito_user_pool_event.PostConfirmationTriggerEvent` | | Pre authentication | `data_classes.cognito_user_pool_event.PreAuthenticationTriggerEvent` | | Pre sign-up | `data_classes.cognito_user_pool_event.PreSignUpTriggerEvent` | | Pre token generation | `data_classes.cognito_user_pool_event.PreTokenGenerationTriggerEvent` | | Pre token generation V2 | `data_classes.cognito_user_pool_event.PreTokenGenerationV2TriggerEvent` | | User migration | `data_classes.cognito_user_pool_event.UserMigrationTriggerEvent` | | Define Auth Challenge | `data_classes.cognito_user_pool_event.DefineAuthChallengeTriggerEvent` | | Create Auth Challenge | `data_classes.cognito_user_pool_event.CreateAuthChallengeTriggerEvent` | | Verify Auth Challenge | 
`data_classes.cognito_user_pool_event.VerifyAuthChallengeResponseTriggerEvent` | | Custom Email Sender | `data_classes.cognito_user_pool_event.CustomEmailSenderTriggerEvent` | | Custom SMS Sender | `data_classes.cognito_user_pool_event.CustomSMSSenderTriggerEvent` | Some examples for the Cognito User Pools Lambda triggers sources: #### Post Confirmation Example ``` from aws_lambda_powertools.utilities.data_classes.cognito_user_pool_event import PostConfirmationTriggerEvent def lambda_handler(event, context): event: PostConfirmationTriggerEvent = PostConfirmationTriggerEvent(event) user_attributes = event.request.user_attributes return {"statusCode": 200, "body": f"User attributes: {user_attributes}"} ``` ``` { "version": "string", "triggerSource": "PostConfirmation_ConfirmSignUp", "region": "us-east-1", "userPoolId": "string", "userName": "userName", "callerContext": { "awsSdkVersion": "awsSdkVersion", "clientId": "clientId" }, "request": { "userAttributes": { "email": "user@example.com", "email_verified": true } }, "response": {} } ``` #### Define Auth Challenge Example Note In this example we are modifying the wrapped dict response fields, so we need to return the json serializable wrapped event in `event.raw_event`. This example is based on the AWS Cognito docs for [Define Auth Challenge Lambda Trigger](https://docs.aws.amazon.com/cognito/latest/developerguide/user-pool-lambda-define-auth-challenge.html). ``` from aws_lambda_powertools.utilities.data_classes.cognito_user_pool_event import DefineAuthChallengeTriggerEvent def lambda_handler(event, context) -> dict: event_obj: DefineAuthChallengeTriggerEvent = DefineAuthChallengeTriggerEvent(event) if len(event_obj.request.session) == 1 and event_obj.request.session[0].challenge_name == "SRP_A": event_obj.response.issue_tokens = False event_obj.response.fail_authentication = False event_obj.response.challenge_name = "PASSWORD_VERIFIER" elif ( len(event_obj.request.session) == 2 and event_obj.request.session[1].challenge_name == "PASSWORD_VERIFIER" and event_obj.request.session[1].challenge_result ): event_obj.response.issue_tokens = False event_obj.response.fail_authentication = False event_obj.response.challenge_name = "CUSTOM_CHALLENGE" elif ( len(event_obj.request.session) == 3 and event_obj.request.session[2].challenge_name == "CUSTOM_CHALLENGE" and event_obj.request.session[2].challenge_result ): event_obj.response.issue_tokens = True event_obj.response.fail_authentication = False else: event_obj.response.issue_tokens = False event_obj.response.fail_authentication = True return event_obj.raw_event ``` ``` { "version": "1", "region": "us-east-1", "userPoolId": "us-east-1_example", "userName": "UserName", "callerContext": { "awsSdkVersion": "awsSdkVersion", "clientId": "clientId" }, "triggerSource": "DefineAuthChallenge_Authentication", "request": { "userAttributes": { "sub": "4A709A36-7D63-4785-829D-4198EF10EBDA", "email_verified": "true", "name": "First Last", "email": "define-auth@mail.com" }, "session" : [ { "challengeName": "PASSWORD_VERIFIER", "challengeResult": true }, { "challengeName": "CUSTOM_CHALLENGE", "challengeResult": true, "challengeMetadata": "CAPTCHA_CHALLENGE" } ], "userNotFound": true }, "response": {} } ``` #### Create Auth Challenge Example This example is based on the AWS Cognito docs for [Create Auth Challenge Lambda Trigger](https://docs.aws.amazon.com/cognito/latest/developerguide/user-pool-lambda-create-auth-challenge.html). 
``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.cognito_user_pool_event import CreateAuthChallengeTriggerEvent @event_source(data_class=CreateAuthChallengeTriggerEvent) def handler(event: CreateAuthChallengeTriggerEvent, context) -> dict: if event.request.challenge_name == "CUSTOM_CHALLENGE": event.response.public_challenge_parameters = {"captchaUrl": "url/123.jpg"} event.response.private_challenge_parameters = {"answer": "5"} event.response.challenge_metadata = "CAPTCHA_CHALLENGE" return event.raw_event ``` ``` { "version": "1", "region": "us-east-1", "userPoolId": "us-east-1_example", "userName": "UserName", "callerContext": { "awsSdkVersion": "awsSdkVersion", "clientId": "clientId" }, "triggerSource": "CreateAuthChallenge_Authentication", "request": { "userAttributes": { "sub": "4A709A36-7D63-4785-829D-4198EF10EBDA", "email_verified": "true", "name": "First Last", "email": "create-auth@mail.com" }, "challengeName": "PASSWORD_VERIFIER", "session" : [ { "challengeName": "CUSTOM_CHALLENGE", "challengeResult": true, "challengeMetadata": "CAPTCHA_CHALLENGE" } ], "userNotFound": false }, "response": {} } ``` #### Verify Auth Challenge Response Example This example is based on the AWS Cognito docs for [Verify Auth Challenge Response Lambda Trigger](https://docs.aws.amazon.com/cognito/latest/developerguide/user-pool-lambda-verify-auth-challenge-response.html). ``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.cognito_user_pool_event import VerifyAuthChallengeResponseTriggerEvent @event_source(data_class=VerifyAuthChallengeResponseTriggerEvent) def lambda_handler(event: VerifyAuthChallengeResponseTriggerEvent, context) -> dict: event.response.answer_correct = ( event.request.private_challenge_parameters.get("answer") == event.request.challenge_answer ) return event.raw_event ``` ``` { "version": "1", "region": "us-east-1", "userPoolId": "us-east-1_example", "userName": "UserName", "callerContext": { "awsSdkVersion": "awsSdkVersion", "clientId": "clientId" }, "triggerSource": "VerifyAuthChallengeResponse_Authentication", "request": { "userAttributes": { "sub": "4A709A36-7D63-4785-829D-4198EF10EBDA", "email_verified": "true", "name": "First Last", "email": "verify-auth@mail.com" }, "privateChallengeParameters": { "answer": "challengeAnswer" }, "clientMetadata" : { "foo": "value" }, "challengeAnswer": "challengeAnswer", "userNotFound": true }, "response": {} } ``` ### Connect Contact Flow The example integrates with [Amazon Connect](https://docs.aws.amazon.com/connect/latest/adminguide/what-is-amazon-connect.html) by handling contact flow events. The function converts the event into a `ConnectContactFlowEvent` object, providing a structured representation of the contact flow data. 
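When the function needs to hand data back to the contact flow (for example, to branch on it or store it as contact attributes), Amazon Connect expects a flat map of string keys and values rather than nested JSON. A minimal sketch under that constraint (the customer-tier lookup is hypothetical), before the full example below:

```
from aws_lambda_powertools.utilities.data_classes.connect_contact_flow_event import ConnectContactFlowEvent


def lambda_handler(event, context):
    event = ConnectContactFlowEvent(event)

    language = event.contact_data.attributes.get("Language", "en-US")
    phone_number = event.contact_data.customer_endpoint.address

    # Returned keys become external attributes in the contact flow,
    # so keep the response flat and string-valued
    return {
        "CustomerTier": "gold",  # hypothetical lookup result
        "PreferredLanguage": language,
        "CallbackNumber": phone_number or "",
    }
```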
``` from aws_lambda_powertools.utilities.data_classes.connect_contact_flow_event import ( ConnectContactFlowChannel, ConnectContactFlowEndpointType, ConnectContactFlowEvent, ConnectContactFlowInitiationMethod, ) def lambda_handler(event, context): event: ConnectContactFlowEvent = ConnectContactFlowEvent(event) assert event.contact_data.attributes == {"Language": "en-US"} assert event.contact_data.channel == ConnectContactFlowChannel.VOICE assert event.contact_data.customer_endpoint.endpoint_type == ConnectContactFlowEndpointType.TELEPHONE_NUMBER assert event.contact_data.initiation_method == ConnectContactFlowInitiationMethod.API ``` ``` { "Name": "ContactFlowEvent", "Details": { "ContactData": { "Attributes": { "Language": "en-US" }, "Channel": "VOICE", "ContactId": "5ca32fbd-8f92-46af-92a5-6b0f970f0efe", "CustomerEndpoint": { "Address": "+11234567890", "Type": "TELEPHONE_NUMBER" }, "InitialContactId": "5ca32fbd-8f92-46af-92a5-6b0f970f0efe", "InitiationMethod": "API", "InstanceARN": "arn:aws:connect:eu-central-1:123456789012:instance/9308c2a1-9bc6-4cea-8290-6c0b4a6d38fa", "MediaStreams": { "Customer": { "Audio": { "StartFragmentNumber": "91343852333181432392682062622220590765191907586", "StartTimestamp": "1565781909613", "StreamARN": "arn:aws:kinesisvideo:eu-central-1:123456789012:stream/connect-contact-a3d73b84-ce0e-479a-a9dc-5637c9d30ac9/1565272947806" } } }, "PreviousContactId": "5ca32fbd-8f92-46af-92a5-6b0f970f0efe", "Queue": { "ARN": "arn:aws:connect:eu-central-1:123456789012:instance/9308c2a1-9bc6-4cea-8290-6c0b4a6d38fa/queue/5cba7cbf-1ecb-4b6d-b8bd-fe91079b3fc8", "Name": "QueueOne" }, "SystemEndpoint": { "Address": "+11234567890", "Type": "TELEPHONE_NUMBER" } }, "Parameters": { "ParameterOne": "One", "ParameterTwo": "Two" } } } ``` ### DynamoDB Streams The DynamoDB data class utility provides the base class for `DynamoDBStreamEvent`, as well as enums for stream view type (`StreamViewType`) and event type. (`DynamoDBRecordEventName`). The class automatically deserializes DynamoDB types into their equivalent Python types. ``` from aws_lambda_powertools.utilities.data_classes.dynamo_db_stream_event import ( DynamoDBRecordEventName, DynamoDBStreamEvent, ) def lambda_handler(event, context): event: DynamoDBStreamEvent = DynamoDBStreamEvent(event) # Multiple records can be delivered in a single event for record in event.records: if record.event_name == DynamoDBRecordEventName.MODIFY: pass elif record.event_name == DynamoDBRecordEventName.INSERT: pass return "success" ``` ``` from aws_lambda_powertools.utilities.data_classes import DynamoDBStreamEvent, event_source from aws_lambda_powertools.utilities.typing import LambdaContext @event_source(data_class=DynamoDBStreamEvent) def lambda_handler(event: DynamoDBStreamEvent, context: LambdaContext): processed_keys = [] for record in event.records: if record.dynamodb and record.dynamodb.keys and "Id" in record.dynamodb.keys: key = record.dynamodb.keys["Id"] processed_keys.append(key) return {"statusCode": 200, "body": f"Processed keys: {processed_keys}"} ``` ``` { "Records": [ { "eventID": "1", "eventVersion": "1.0", "dynamodb": { "ApproximateCreationDateTime": 1693997155.0, "Keys": { "Id": { "N": "101" } }, "NewImage": { "Message": { "S": "New item!" 
}, "Id": { "N": "101" } }, "StreamViewType": "NEW_AND_OLD_IMAGES", "SequenceNumber": "111", "SizeBytes": 26 }, "awsRegion": "us-west-2", "eventName": "INSERT", "eventSourceARN": "eventsource_arn", "eventSource": "aws:dynamodb" }, { "eventID": "2", "eventVersion": "1.0", "dynamodb": { "OldImage": { "Message": { "S": "New item!" }, "Id": { "N": "101" } }, "SequenceNumber": "222", "Keys": { "Id": { "N": "101" } }, "SizeBytes": 59, "NewImage": { "Message": { "S": "This item has changed" }, "Id": { "N": "101" } }, "StreamViewType": "NEW_AND_OLD_IMAGES" }, "awsRegion": "us-west-2", "eventName": "MODIFY", "eventSourceARN": "source_arn", "eventSource": "aws:dynamodb" } ] } ``` ### EventBridge When an event matching a defined rule occurs in EventBridge, it can [automatically trigger a Lambda function](https://docs.aws.amazon.com/lambda/latest/dg/with-eventbridge-scheduler.html), passing the event data as input. ``` from aws_lambda_powertools.utilities.data_classes import EventBridgeEvent, event_source @event_source(data_class=EventBridgeEvent) def lambda_handler(event: EventBridgeEvent, context): detail_type = event.detail_type state = event.detail.get("state") # Do something return {"detail_type": detail_type, "state": state} ``` ``` { "version": "0", "id": "6a7e8feb-b491-4cf7-a9f1-bf3703467718", "detail-type": "EC2 Instance State-change Notification", "source": "aws.ec2", "account": "111122223333", "time": "2017-12-22T18:43:48Z", "region": "us-west-1", "resources": [ "arn:aws:ec2:us-west-1:123456789012:instance/i-1234567890abcdef0" ], "detail": { "instance_id": "i-1234567890abcdef0", "state": "terminated" }, "replay-name": "replay_archive" } ``` ### Kafka This example is based on the AWS docs for [Amazon MSK](https://docs.aws.amazon.com/lambda/latest/dg/with-msk.html) and [self-managed Apache Kafka](https://docs.aws.amazon.com/lambda/latest/dg/with-kafka.html). ``` from aws_lambda_powertools.utilities.data_classes import KafkaEvent, event_source def do_something_with(key: str, value: str): print(f"key: {key}, value: {value}") @event_source(data_class=KafkaEvent) def lambda_handler(event: KafkaEvent, context): for record in event.records: do_something_with(record.topic, record.value) return "success" ``` ``` { "eventSource":"aws:kafka", "eventSourceArn":"arn:aws:kafka:us-east-1:0123456789019:cluster/SalesCluster/abcd1234-abcd-cafe-abab-9876543210ab-4", "bootstrapServers":"b-2.demo-cluster-1.a1bcde.c1.kafka.us-east-1.amazonaws.com:9092,b-1.demo-cluster-1.a1bcde.c1.kafka.us-east-1.amazonaws.com:9092", "records":{ "mytopic-0":[ { "topic":"mytopic", "partition":0, "offset":15, "timestamp":1545084650987, "timestampType":"CREATE_TIME", "key":"cmVjb3JkS2V5", "value":"eyJrZXkiOiJ2YWx1ZSJ9", "headers":[ { "headerKey":[ 104, 101, 97, 100, 101, 114, 86, 97, 108, 117, 101 ] } ] }, { "topic":"mytopic", "partition":0, "offset":15, "timestamp":1545084650987, "timestampType":"CREATE_TIME", "value":"eyJrZXkiOiJ2YWx1ZSJ9", "headers":[ { "headerKey":[ 104, 101, 97, 100, 101, 114, 86, 97, 108, 117, 101 ] } ] }, { "topic":"mytopic", "partition":0, "offset":15, "timestamp":1545084650987, "timestampType":"CREATE_TIME", "key": null, "value":"eyJrZXkiOiJ2YWx1ZSJ9", "headers":[ { "headerKey":[ 104, 101, 97, 100, 101, 114, 86, 97, 108, 117, 101 ] } ] } ] } } ``` ### Kinesis streams Kinesis events by default contain base64 encoded data. You can use the helper function to access the data either as json or plain text, depending on the original payload. 
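To make the encoding concrete: the `data` field of each Kinesis record is plain base64, and `data_as_text()` / `data_as_json()` simply decode it (and, for JSON, deserialize it). A stdlib-only illustration using the first record of the sample event below:

```
import base64

encoded = "SGVsbG8sIHRoaXMgaXMgYSB0ZXN0Lg=="  # "data" field of the first sample record

decoded = base64.b64decode(encoded).decode("utf-8")
print(decoded)  # Hello, this is a test.
```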
``` import json from typing import Any, Dict, Union from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import KinesisStreamEvent, event_source from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @event_source(data_class=KinesisStreamEvent) def lambda_handler(event: KinesisStreamEvent, context: LambdaContext): for record in event.records: kinesis_record = record.kinesis payload: Union[Dict[str, Any], str] try: # Try to parse as JSON first payload = kinesis_record.data_as_json() logger.info("Received JSON data from Kinesis") except json.JSONDecodeError: # If JSON parsing fails, get as text payload = kinesis_record.data_as_text() logger.info("Received text data from Kinesis") process_data(payload) return {"statusCode": 200, "body": "Processed all records successfully"} def process_data(data: Union[Dict[str, Any], str]) -> None: if isinstance(data, dict): # Handle JSON data logger.info(f"Processing JSON data: {data}") # Add your JSON processing logic here else: # Handle text data logger.info(f"Processing text data: {data}") # Add your text processing logic here ``` ``` { "Records": [ { "kinesis": { "kinesisSchemaVersion": "1.0", "partitionKey": "1", "sequenceNumber": "49590338271490256608559692538361571095921575989136588898", "data": "SGVsbG8sIHRoaXMgaXMgYSB0ZXN0Lg==", "approximateArrivalTimestamp": 1545084650.987 }, "eventSource": "aws:kinesis", "eventVersion": "1.0", "eventID": "shardId-000000000006:49590338271490256608559692538361571095921575989136588898", "eventName": "aws:kinesis:record", "invokeIdentityArn": "arn:aws:iam::123456789012:role/lambda-role", "awsRegion": "us-east-2", "eventSourceARN": "arn:aws:kinesis:us-east-2:123456789012:stream/lambda-stream" }, { "kinesis": { "kinesisSchemaVersion": "1.0", "partitionKey": "1", "sequenceNumber": "49590338271490256608559692540925702759324208523137515618", "data": "VGhpcyBpcyBvbmx5IGEgdGVzdC4=", "approximateArrivalTimestamp": 1545084711.166 }, "eventSource": "aws:kinesis", "eventVersion": "1.0", "eventID": "shardId-000000000006:49590338271490256608559692540925702759324208523137515618", "eventName": "aws:kinesis:record", "invokeIdentityArn": "arn:aws:iam::123456789012:role/lambda-role", "awsRegion": "us-east-2", "eventSourceARN": "arn:aws:kinesis:us-east-2:123456789012:stream/lambda-stream" } ], "window": { "start": "2020-12-09T07:04:00Z", "end": "2020-12-09T07:06:00Z" }, "state": { "1": 282, "2": 715 }, "shardId": "shardId-000000000006", "eventSourceARN": "arn:aws:kinesis:us-east-1:123456789012:stream/lambda-stream", "isFinalInvokeForWindow": false, "isWindowTerminatedEarly": false } ``` ### Kinesis Firehose delivery stream When using Kinesis Firehose, you can use a Lambda function to [perform data transformation](https://docs.aws.amazon.com/firehose/latest/dev/data-transformation.html). For each transformed record, you can choose to either: - **A)** Put them back to the delivery stream (default) - **B)** Drop them so consumers don't receive them (e.g., data validation) - **C)** Indicate a record failed data transformation and should be retried To do that, you can use `KinesisFirehoseDataTransformationResponse` class along with helper functions to make it easier to decode and encode base64 data in the stream. 
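Those three outcomes map onto the `result` field of each record you return: `Ok` (the default) puts the transformed record back on the delivery stream, `Dropped` removes it, and `ProcessingFailed` marks it as a failed transformation (see the note after the third example). A minimal per-record sketch ahead of the fuller examples below; the validation rule is hypothetical:

```
from aws_lambda_powertools.utilities.data_classes import (
    KinesisFirehoseDataTransformationRecord,
    KinesisFirehoseDataTransformationResponse,
    KinesisFirehoseEvent,
    event_source,
)


def looks_valid(payload: str) -> bool:
    # Hypothetical validation rule for this sketch
    return bool(payload.strip())


@event_source(data_class=KinesisFirehoseEvent)
def lambda_handler(event: KinesisFirehoseEvent, context):
    result = KinesisFirehoseDataTransformationResponse()

    for record in event.records:
        payload = record.data_as_text  # base64-decoded record payload
        outcome = "Ok" if looks_valid(payload) else "Dropped"
        result.add_record(
            KinesisFirehoseDataTransformationRecord(
                record_id=record.record_id,
                data=record.data,  # payload unchanged, still base64 as received
                result=outcome,
            ),
        )

    return result.asdict()
```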
``` from aws_lambda_powertools.utilities.data_classes import ( KinesisFirehoseDataTransformationResponse, KinesisFirehoseEvent, event_source, ) from aws_lambda_powertools.utilities.serialization import base64_from_json from aws_lambda_powertools.utilities.typing import LambdaContext @event_source(data_class=KinesisFirehoseEvent) def lambda_handler(event: KinesisFirehoseEvent, context: LambdaContext): result = KinesisFirehoseDataTransformationResponse() for record in event.records: # get original data using data_as_text property data = record.data_as_text # (1)! ## generate data to return transformed_data = {"new_data": "transformed data using Powertools", "original_payload": data} processed_record = record.build_data_transformation_response( data=base64_from_json(transformed_data), # (2)! ) result.add_record(processed_record) # return transformed records return result.asdict() ``` 1. **Ingesting JSON payloads?** Use `record.data_as_json` to easily deserialize them. 1. For your convenience, `base64_from_json` serializes a dict to JSON, then encode as base64 data. ``` from json import JSONDecodeError from typing import Dict from aws_lambda_powertools.utilities.data_classes import ( KinesisFirehoseDataTransformationRecord, KinesisFirehoseDataTransformationResponse, KinesisFirehoseEvent, event_source, ) from aws_lambda_powertools.utilities.serialization import base64_from_json from aws_lambda_powertools.utilities.typing import LambdaContext @event_source(data_class=KinesisFirehoseEvent) def lambda_handler(event: KinesisFirehoseEvent, context: LambdaContext): result = KinesisFirehoseDataTransformationResponse() for record in event.records: try: payload: Dict = record.data_as_json # decodes and deserialize base64 JSON string ## generate data to return transformed_data = {"tool_used": "powertools_dataclass", "original_payload": payload} processed_record = KinesisFirehoseDataTransformationRecord( record_id=record.record_id, data=base64_from_json(transformed_data), ) except JSONDecodeError: # (1)! # our producers ingest JSON payloads only; drop malformed records from the stream processed_record = KinesisFirehoseDataTransformationRecord( record_id=record.record_id, data=record.data, result="Dropped", ) result.add_record(processed_record) # return transformed records return result.asdict() ``` 1. This exception would be generated from `record.data_as_json` if invalid payload. ``` from aws_lambda_powertools.utilities.data_classes import ( KinesisFirehoseDataTransformationRecord, KinesisFirehoseDataTransformationResponse, KinesisFirehoseEvent, event_source, ) from aws_lambda_powertools.utilities.serialization import base64_from_json from aws_lambda_powertools.utilities.typing import LambdaContext @event_source(data_class=KinesisFirehoseEvent) def lambda_handler(event: dict, context: LambdaContext): firehose_event = KinesisFirehoseEvent(event) result = KinesisFirehoseDataTransformationResponse() for record in firehose_event.records: try: payload = record.data_as_text # base64 decoded data as str # generate data to return transformed_data = {"tool_used": "powertools_dataclass", "original_payload": payload} # Default result is Ok processed_record = KinesisFirehoseDataTransformationRecord( record_id=record.record_id, data=base64_from_json(transformed_data), ) except Exception: # add Failed result to processing results, send back to kinesis for retry processed_record = KinesisFirehoseDataTransformationRecord( record_id=record.record_id, data=record.data, result="ProcessingFailed", # (1)! 
) result.add_record(processed_record) # return transformed records return result.asdict() ``` 1. This record will now be sent to your [S3 bucket in the `processing-failed` folder](https://docs.aws.amazon.com/firehose/latest/dev/data-transformation.html#data-transformation-failure-handling). ``` { "invocationId": "2b4d1ad9-2f48-94bd-a088-767c317e994a", "sourceKinesisStreamArn":"arn:aws:kinesis:us-east-1:123456789012:stream/kinesis-source", "deliveryStreamArn": "arn:aws:firehose:us-east-2:123456789012:deliverystream/delivery-stream-name", "region": "us-east-2", "records": [ { "data": "SGVsbG8gV29ybGQ=", "recordId": "record1", "approximateArrivalTimestamp": 1664028820148, "kinesisRecordMetadata": { "shardId": "shardId-000000000000", "partitionKey": "4d1ad2b9-24f8-4b9d-a088-76e9947c317a", "approximateArrivalTimestamp": 1664028820148, "sequenceNumber": "49546986683135544286507457936321625675700192471156785154", "subsequenceNumber": 0 } }, { "data": "eyJIZWxsbyI6ICJXb3JsZCJ9", "recordId": "record2", "approximateArrivalTimestamp": 1664028793294, "kinesisRecordMetadata": { "shardId": "shardId-000000000001", "partitionKey": "4d1ad2b9-24f8-4b9d-a088-76e9947c318a", "approximateArrivalTimestamp": 1664028793294, "sequenceNumber": "49546986683135544286507457936321625675700192471156785155", "subsequenceNumber": 0 } } ] } ``` ### Lambda Function URL [Lambda Function URLs](https://docs.aws.amazon.com/lambda/latest/dg/urls-invocation.html) provide a direct HTTP endpoint for invoking Lambda functions. This feature allows functions to receive and process HTTP requests without the need for additional services like API Gateway. ``` from aws_lambda_powertools.utilities.data_classes import LambdaFunctionUrlEvent, event_source @event_source(data_class=LambdaFunctionUrlEvent) def lambda_handler(event: LambdaFunctionUrlEvent, context): if event.request_context.http.method == "GET": return {"statusCode": 200, "body": "Hello World!"} ``` ``` { "version":"2.0", "routeKey":"$default", "rawPath":"/", "rawQueryString":"", "headers":{ "sec-fetch-mode":"navigate", "x-amzn-tls-version":"TLSv1.2", "sec-fetch-site":"cross-site", "accept-language":"pt-BR,pt;q=0.9", "x-forwarded-proto":"https", "x-forwarded-port":"443", "x-forwarded-for":"123.123.123.123", "sec-fetch-user":"?1", "accept":"text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9", "x-amzn-tls-cipher-suite":"ECDHE-RSA-AES128-GCM-SHA256", "sec-ch-ua":"\" Not A;Brand\";v=\"99\", \"Chromium\";v=\"102\", \"Google Chrome\";v=\"102\"", "sec-ch-ua-mobile":"?0", "x-amzn-trace-id":"Root=1-62ecd163-5f302e550dcde3b12402207d", "sec-ch-ua-platform":"\"Linux\"", "host":".lambda-url.us-east-1.on.aws", "upgrade-insecure-requests":"1", "cache-control":"max-age=0", "accept-encoding":"gzip, deflate, br", "sec-fetch-dest":"document", "user-agent":"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.0.0 Safari/537.36" }, "requestContext":{ "accountId":"anonymous", "apiId":"", "domainName":".lambda-url.us-east-1.on.aws", "domainPrefix":"", "http":{ "method":"GET", "path":"/", "protocol":"HTTP/1.1", "sourceIp":"123.123.123.123", "userAgent":"agent" }, "requestId":"id", "routeKey":"$default", "stage":"$default", "time":"05/Aug/2022:08:14:39 +0000", "timeEpoch":1659687279885 }, "isBase64Encoded":false } ``` ### Rabbit MQ It is used for [Rabbit MQ payloads](https://docs.aws.amazon.com/lambda/latest/dg/with-mq.html). 
See the [blog post](https://aws.amazon.com/blogs/compute/using-amazon-mq-for-rabbitmq-as-an-event-source-for-lambda/) for more details. ``` from typing import Dict from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.rabbit_mq_event import RabbitMQEvent logger = Logger() @event_source(data_class=RabbitMQEvent) def lambda_handler(event: RabbitMQEvent, context): for queue_name, messages in event.rmq_messages_by_queue.items(): logger.debug(f"Messages for queue: {queue_name}") for message in messages: logger.debug(f"MessageID: {message.basic_properties.message_id}") data: Dict = message.json_data logger.debug(f"Process json in base64 encoded data str {data}") return { "queue_name": queue_name, "message_id": message.basic_properties.message_id, } ``` ``` { "eventSource": "aws:rmq", "eventSourceArn": "arn:aws:mq:us-west-2:112556298976:broker:pizzaBroker:b-9bcfa592-423a-4942-879d-eb284b418fc8", "rmqMessagesByQueue": { "pizzaQueue::/": [ { "basicProperties": { "contentType": "text/plain", "contentEncoding": null, "headers": { "header1": { "bytes": [ 118, 97, 108, 117, 101, 49 ] }, "header2": { "bytes": [ 118, 97, 108, 117, 101, 50 ] }, "numberInHeader": 10 }, "deliveryMode": 1, "priority": 34, "correlationId": null, "replyTo": null, "expiration": "60000", "messageId": null, "timestamp": "Jan 1, 1970, 12:33:41 AM", "type": null, "userId": "AIDACKCEVSQ6C2EXAMPLE", "appId": null, "clusterId": null, "bodySize": 80 }, "redelivered": false, "data": "eyJ0aW1lb3V0IjowLCJkYXRhIjoiQ1pybWYwR3c4T3Y0YnFMUXhENEUifQ==" } ] } } ``` ### S3 Integration with Amazon S3 enables automatic, serverless processing of object-level events in S3 buckets. When triggered by actions like object creation or deletion, Lambda functions receive detailed event information, allowing for real-time file processing, data transformations, and automated workflows. 
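One detail worth noting before the example below: object keys in S3 notifications are URL-encoded (spaces arrive as `+` and special characters as `%XX`), which is why the handler decodes them with `unquote_plus`. A small illustration with a hypothetical key:

```
from urllib.parse import unquote_plus

# A hypothetical object key as it would appear in the notification payload
raw_key = "reports/2024+Q1/sales+summary%3Dfinal.csv"

print(unquote_plus(raw_key))  # reports/2024 Q1/sales summary=final.csv
```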
``` from urllib.parse import unquote_plus from aws_lambda_powertools.utilities.data_classes import S3Event, event_source @event_source(data_class=S3Event) def lambda_handler(event: S3Event, context): bucket_name = event.bucket_name # Multiple records can be delivered in a single event for record in event.records: object_key = unquote_plus(record.s3.get_object.key) object_etag = record.s3.get_object.etag return { "bucket": bucket_name, "object_key": object_key, "object_etag": object_etag, } ``` ``` { "Records": [ { "eventVersion": "2.1", "eventSource": "aws:s3", "awsRegion": "us-east-2", "eventTime": "2019-09-03T19:37:27.192Z", "eventName": "ObjectCreated:Put", "userIdentity": { "principalId": "AWS:AIDAINPONIXQXHT3IKHL2" }, "requestParameters": { "sourceIPAddress": "205.255.255.255" }, "responseElements": { "x-amz-request-id": "D82B88E5F771F645", "x-amz-id-2": "vlR7PnpV2Ce81l0PRw6jlUpck7Jo5ZsQjryTjKlc5aLWGVHPZLj5NeC6qMa0emYBDXOo6QBU0Wo=" }, "s3": { "s3SchemaVersion": "1.0", "configurationId": "828aa6fc-f7b5-4305-8584-487c791949c1", "bucket": { "name": "lambda-artifacts-deafc19498e3f2df", "ownerIdentity": { "principalId": "A3I5XTEXAMAI3E" }, "arn": "arn:aws:s3:::lambda-artifacts-deafc19498e3f2df" }, "object": { "key": "b21b84d653bb07b05b1e6b33684dc11b", "size": 1305107, "eTag": "b21b84d653bb07b05b1e6b33684dc11b", "sequencer": "0C0F6F405D6ED209E1" } } } ] } ``` ### S3 Batch Operations This example is based on the AWS S3 Batch Operations documentation [Example Lambda function for S3 Batch Operations](https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-invoke-lambda.html). ``` import boto3 from botocore.exceptions import ClientError from aws_lambda_powertools.utilities.data_classes import S3BatchOperationEvent, S3BatchOperationResponse, event_source from aws_lambda_powertools.utilities.typing import LambdaContext @event_source(data_class=S3BatchOperationEvent) def lambda_handler(event: S3BatchOperationEvent, context: LambdaContext): response = S3BatchOperationResponse(event.invocation_schema_version, event.invocation_id, "PermanentFailure") task = event.task src_key: str = task.s3_key src_bucket: str = task.s3_bucket s3 = boto3.client("s3", region_name="us-east-1") try: dest_bucket, dest_key = do_some_work(s3, src_bucket, src_key) result = task.build_task_batch_response("Succeeded", f"s3://{dest_bucket}/{dest_key}") except ClientError as e: error_code = e.response["Error"]["Code"] error_message = e.response["Error"]["Message"] if error_code == "RequestTimeout": result = task.build_task_batch_response("TemporaryFailure", "Retry request to Amazon S3 due to timeout.") else: result = task.build_task_batch_response("PermanentFailure", f"{error_code}: {error_message}") except Exception as e: result = task.build_task_batch_response("PermanentFailure", str(e)) finally: response.add_result(result) return response.asdict() def do_some_work(s3_client, src_bucket: str, src_key: str): ... 
``` ``` { "invocationSchemaVersion": "2.0", "invocationId": "YXNkbGZqYWRmaiBhc2RmdW9hZHNmZGpmaGFzbGtkaGZza2RmaAo", "job": { "id": "f3cc4f60-61f6-4a2b-8a21-d07600c373ce", "userArguments": { "k1": "v1", "k2": "v2" } }, "tasks": [ { "taskId": "dGFza2lkZ29lc2hlcmUK", "s3Key": "prefix/dataset/dataset.20231222.json.gz", "s3VersionId": null, "s3Bucket": "powertools-dataset" } ] } ``` ### S3 Object Lambda This example is based on the AWS Blog post [Introducing Amazon S3 Object Lambda – Use Your Code to Process Data as It Is Being Retrieved from S3](https://aws.amazon.com/blogs/aws/introducing-amazon-s3-object-lambda-use-your-code-to-process-data-as-it-is-being-retrieved-from-s3/). ``` import boto3 import requests from aws_lambda_powertools import Logger from aws_lambda_powertools.logging.correlation_paths import S3_OBJECT_LAMBDA from aws_lambda_powertools.utilities.data_classes.s3_object_event import S3ObjectLambdaEvent logger = Logger() session = boto3.session.Session() s3 = session.client("s3") @logger.inject_lambda_context(correlation_id_path=S3_OBJECT_LAMBDA, log_event=True) def lambda_handler(event, context): event = S3ObjectLambdaEvent(event) # Get object from S3 response = requests.get(event.input_s3_url) original_object = response.content.decode("utf-8") # Make changes to the object about to be returned transformed_object = original_object.upper() # Write object back to S3 Object Lambda s3.write_get_object_response( Body=transformed_object, RequestRoute=event.request_route, RequestToken=event.request_token, ) return {"status_code": 200} ``` ``` { "xAmzRequestId": "1a5ed718-5f53-471d-b6fe-5cf62d88d02a", "getObjectContext": { "inputS3Url": "https://myap-123412341234.s3-accesspoint.us-east-1.amazonaws.com/s3.txt?X-Amz-Security-Token=...", "outputRoute": "io-iad-cell001", "outputToken": "..." }, "configuration": { "accessPointArn": "arn:aws:s3-object-lambda:us-east-1:123412341234:accesspoint/myolap", "supportingAccessPointArn": "arn:aws:s3:us-east-1:123412341234:accesspoint/myap", "payload": "test" }, "userRequest": { "url": "/s3.txt", "headers": { "Host": "myolap-123412341234.s3-object-lambda.us-east-1.amazonaws.com", "Accept-Encoding": "identity", "X-Amz-Content-SHA256": "e3b0c44297fc1c149afbf4c8995fb92427ae41e4649b934ca495991b7852b855" } }, "userIdentity": { "type": "IAMUser", "principalId": "...", "arn": "arn:aws:iam::123412341234:user/myuser", "accountId": "123412341234", "accessKeyId": "..." }, "protocolVersion": "1.00" } ``` ### S3 EventBridge Notification [S3 EventBridge notifications](https://docs.aws.amazon.com/AmazonS3/latest/userguide/EventBridge.html) enhance Lambda's ability to process S3 events by routing them through Amazon EventBridge. This integration offers advanced filtering, multiple destination support, and standardized CloudEvents format. 
``` from aws_lambda_powertools.utilities.data_classes import S3EventBridgeNotificationEvent, event_source @event_source(data_class=S3EventBridgeNotificationEvent) def lambda_handler(event: S3EventBridgeNotificationEvent, context): bucket_name = event.detail.bucket.name file_key = event.detail.object.key if event.detail_type == "Object Created": print(f"Object {file_key} created in bucket {bucket_name}") return { "bucket": bucket_name, "file_key": file_key, } ``` ``` { "version": "0", "id": "f5f1e65c-dc3a-93ca-6c1e-b1647eac7963", "detail-type": "Object Created", "source": "aws.s3", "account": "123456789012", "time": "2023-03-08T17:50:14Z", "region": "eu-west-1", "resources": [ "arn:aws:s3:::example-bucket" ], "detail": { "version": "0", "bucket": { "name": "example-bucket" }, "object": { "key": "IMG_m7fzo3.jpg", "size": 184662, "etag": "4e68adba0abe2dc8653dc3354e14c01d", "sequencer": "006408CAD69598B05E" }, "request-id": "57H08PA84AB1JZW0", "requester": "123456789012", "source-ip-address": "34.252.34.74", "reason": "PutObject" } } ``` ### Secrets Manager AWS Secrets Manager rotation uses an AWS Lambda function to update the secret. [Click here](https://docs.aws.amazon.com/secretsmanager/latest/userguide/rotating-secrets.html) for more information about rotating AWS Secrets Manager secrets. ``` from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.data_classes import SecretsManagerEvent, event_source secrets_provider = parameters.SecretsProvider() @event_source(data_class=SecretsManagerEvent) def lambda_handler(event: SecretsManagerEvent, context): # Getting secret value using Parameter utility # See https://docs.powertools.aws.dev/lambda/python/latest/utilities/parameters/ secret = secrets_provider.get(event.secret_id, VersionId=event.version_id, VersionStage="AWSCURRENT") # You need to work with secrets afterwards # Check more examples: https://github.com/aws-samples/aws-secrets-manager-rotation-lambdas return secret ``` ``` { "SecretId":"arn:aws:secretsmanager:us-west-2:123456789012:secret:MyTestDatabaseSecret-a1b2c3", "ClientRequestToken":"550e8400-e29b-41d4-a716-446655440000", "Step":"createSecret" } ``` ### SES The integration with Simple Email Service (SES) enables serverless email processing. When configured, SES can trigger Lambda functions in response to incoming emails or delivery status notifications. The Lambda function receives an SES event containing details like sender, recipients, and email content. ``` from aws_lambda_powertools.utilities.data_classes import SESEvent, event_source @event_source(data_class=SESEvent) def lambda_handler(event: SESEvent, context): # Multiple records can be delivered in a single event for record in event.records: mail = record.ses.mail common_headers = mail.common_headers return { "mail": mail, "common_headers": common_headers, } ``` ``` { "Records": [ { "eventVersion": "1.0", "ses": { "mail": { "commonHeaders": { "from": [ "Jane Doe " ], "to": [ "johndoe@example.com" ], "returnPath": "janedoe@example.com", "messageId": "<0123456789example.com>", "date": "Wed, 7 Oct 2015 12:34:56 -0700", "subject": "Test Subject" }, "source": "janedoe@example.com", "timestamp": "1970-01-01T00:00:00.000Z", "destination": [ "johndoe@example.com" ], "headers": [ { "name": "Return-Path", "value": "" }, { "name": "Received", "value": "from mailer.example.com (mailer.example.com [203.0.113.1]) by ..." }, { "name": "DKIM-Signature", "value": "v=1; a=rsa-sha256; c=relaxed/relaxed; d=example.com; s=example; ..." 
}, { "name": "MIME-Version", "value": "1.0" }, { "name": "From", "value": "Jane Doe " }, { "name": "Date", "value": "Wed, 7 Oct 2015 12:34:56 -0700" }, { "name": "Message-ID", "value": "<0123456789example.com>" }, { "name": "Subject", "value": "Test Subject" }, { "name": "To", "value": "johndoe@example.com" }, { "name": "Content-Type", "value": "text/plain; charset=UTF-8" } ], "headersTruncated": false, "messageId": "o3vrnil0e2ic28tr" }, "receipt": { "recipients": [ "johndoe@example.com" ], "timestamp": "1970-01-01T00:00:00.000Z", "spamVerdict": { "status": "PASS" }, "dkimVerdict": { "status": "PASS" }, "dmarcPolicy": "reject", "processingTimeMillis": 574, "action": { "type": "Lambda", "invocationType": "Event", "functionArn": "arn:aws:lambda:us-west-2:012345678912:function:Example" }, "dmarcVerdict": { "status": "PASS" }, "spfVerdict": { "status": "PASS" }, "virusVerdict": { "status": "PASS" } } }, "eventSource": "aws:ses" } ] } ``` ### SNS The integration with Simple Notification Service (SNS) enables serverless message processing. When configured, SNS can trigger Lambda functions in response to published messages or notifications. The Lambda function receives an SNS event containing details like the message body, subject, and metadata. ``` from aws_lambda_powertools.utilities.data_classes import SNSEvent, event_source @event_source(data_class=SNSEvent) def lambda_handler(event: SNSEvent, context): # Multiple records can be delivered in a single event for record in event.records: message = record.sns.message subject = record.sns.subject return { "message": message, "subject": subject, } ``` ``` { "Records": [ { "EventVersion": "1.0", "EventSubscriptionArn": "arn:aws:sns:us-east-2:123456789012:sns-la ...", "EventSource": "aws:sns", "Sns": { "SignatureVersion": "1", "Timestamp": "2019-01-02T12:45:07.000Z", "Signature": "tcc6faL2yUC6dgZdmrwh1Y4cGa/ebXEkAi6RibDsvpi+tE/1+82j...65r==", "SigningCertUrl": "https://sns.us-east-2.amazonaws.com/SimpleNotification", "MessageId": "95df01b4-ee98-5cb9-9903-4c221d41eb5e", "Message": "Hello from SNS!", "MessageAttributes": { "Test": { "Type": "String", "Value": "TestString" }, "TestBinary": { "Type": "Binary", "Value": "TestBinary" } }, "Type": "Notification", "UnsubscribeUrl": "https://sns.us-east-2.amazonaws.com/?Action=Unsubscribe", "TopicArn": "arn:aws:sns:us-east-2:123456789012:sns-lambda", "Subject": "TestInvoke" } } ] } ``` ### SQS The integration with Simple Queue Service (SQS) enables serverless queue processing. When configured, SQS can trigger Lambda functions in response to messages in the queue. The Lambda function receives an SQS event containing details like message body, attributes, and metadata. 
``` from aws_lambda_powertools.utilities.data_classes import SQSEvent, SQSRecord, event_source @event_source(data_class=SQSEvent) def lambda_handler(event: SQSEvent, context): # Multiple records can be delivered in a single event for record in event.records: message, message_id = process_record(record) return { "message": message, "message_id": message_id, } def process_record(record: SQSRecord): message = record.body message_id = record.message_id return message, message_id ``` ``` { "Records": [ { "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a...", "body": "Test message.", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1545082649183", "SenderId": "AIDAIENQZJOLO23YVJ4VO", "ApproximateFirstReceiveTimestamp": "1545082649185" }, "messageAttributes": { "testAttr": { "stringValue": "100", "binaryValue": "base64Str", "dataType": "Number" } }, "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3", "eventSource": "aws:sqs", "eventSourceARN": "arn:aws:sqs:us-east-2:123456789012:my-queue", "awsRegion": "us-east-2" }, { "messageId": "2e1424d4-f796-459a-8184-9c92662be6da", "receiptHandle": "AQEBzWwaftRI0KuVm4tP+/7q1rGgNqicHq...", "body": "{\"message\": \"foo1\"}", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1545082650636", "SenderId": "AIDAIENQZJOLO23YVJ4VO", "ApproximateFirstReceiveTimestamp": "1545082650649" }, "messageAttributes": {}, "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3", "eventSource": "aws:sqs", "eventSourceARN": "arn:aws:sqs:us-east-2:123456789012:my-queue", "awsRegion": "us-east-2" } ] } ``` ### VPC Lattice V2 You can register your Lambda functions as targets within an Amazon VPC Lattice service network. By doing this, your Lambda function becomes a service within the network, and clients that have access to the VPC Lattice service network can call your service using [Payload V2](https://docs.aws.amazon.com/lambda/latest/dg/services-vpc-lattice.html#vpc-lattice-receiving-events). [Click here](https://docs.aws.amazon.com/lambda/latest/dg/services-vpc-lattice.html) for more information about using AWS Lambda with Amazon VPC Lattice. 
``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import VPCLatticeEventV2, event_source from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @event_source(data_class=VPCLatticeEventV2) def lambda_handler(event: VPCLatticeEventV2, context: LambdaContext): logger.info(event.body) response = { "isBase64Encoded": False, "statusCode": 200, "statusDescription": "200 OK", "headers": {"Content-Type": "application/text"}, "body": "VPC Lattice V2 Event ✨🎉✨", } return response ``` ``` { "version": "2.0", "path": "/todos", "method": "GET", "headers": { "user_agent": "curl/7.64.1", "x-forwarded-for": "10.213.229.10", "host": "test-lambda-service-3908sdf9u3u.dkfjd93.vpc-lattice-svcs.us-east-2.on.aws", "accept": "*/*" }, "queryStringParameters": { "order-id": "1" }, "body": "{\"message\": \"Hello from Lambda!\"}", "requestContext": { "serviceNetworkArn": "arn:aws:vpc-lattice:us-east-2:123456789012:servicenetwork/sn-0bf3f2882e9cc805a", "serviceArn": "arn:aws:vpc-lattice:us-east-2:123456789012:service/svc-0a40eebed65f8d69c", "targetGroupArn": "arn:aws:vpc-lattice:us-east-2:123456789012:targetgroup/tg-6d0ecf831eec9f09", "identity": { "sourceVpcArn": "arn:aws:ec2:region:123456789012:vpc/vpc-0b8276c84697e7339", "type" : "AWS_IAM", "principal": "arn:aws:sts::123456789012:assumed-role/example-role/057d00f8b51257ba3c853a0f248943cf", "sessionName": "057d00f8b51257ba3c853a0f248943cf", "x509SanDns": "example.com" }, "region": "us-east-2", "timeEpoch": "1696331543569073" } } ``` ### VPC Lattice V1 You can register your Lambda functions as targets within an Amazon VPC Lattice service network. By doing this, your Lambda function becomes a service within the network, and clients that have access to the VPC Lattice service network can call your service. [Click here](https://docs.aws.amazon.com/lambda/latest/dg/services-vpc-lattice.html) for more information about using AWS Lambda with Amazon VPC Lattice. ``` from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_classes import VPCLatticeEvent, event_source from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() @event_source(data_class=VPCLatticeEvent) def lambda_handler(event: VPCLatticeEvent, context: LambdaContext): logger.info(event.body) response = { "isBase64Encoded": False, "statusCode": 200, "headers": {"Content-Type": "application/text"}, "body": "Event Response to VPC Lattice 🔥🚀🔥", } return response ``` ``` { "raw_path": "/testpath", "method": "GET", "headers": { "user_agent": "curl/7.64.1", "x-forwarded-for": "10.213.229.10", "host": "test-lambda-service-3908sdf9u3u.dkfjd93.vpc-lattice-svcs.us-east-2.on.aws", "accept": "*/*" }, "query_string_parameters": { "order-id": "1" }, "body": "eyJ0ZXN0IjogImV2ZW50In0=", "is_base64_encoded": true } ``` ### IoT Core Events #### IoT Core Thing Created/Updated/Deleted You can use IoT Core registry events to trigger your lambda functions. More information on this specific one can be found [here](https://docs.aws.amazon.com/iot/latest/developerguide/registry-events.html#registry-events-thing). 
``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.iot_registry_event import IoTCoreThingEvent @event_source(data_class=IoTCoreThingEvent) def lambda_handler(event: IoTCoreThingEvent, context): print(f"Received IoT Core event type {event.event_type}") ``` ``` { "eventType": "THING_EVENT", "eventId": "f5ae9b94-8b8e-4d8e-8c8f-b3266dd89853", "timestamp": 1234567890123, "operation": "CREATED", "accountId": "123456789012", "thingId": "b604f69c-aa9a-4d4a-829e-c480e958a0b5", "thingName": "MyThing", "versionNumber": 1, "thingTypeName": null, "attributes": {"attribute3": "value3", "attribute1": "value1", "attribute2": "value2"} } ``` #### IoT Core Thing Type Created/Updated/Deprecated/Undeprecated/Deleted You can use IoT Core registry events to trigger your lambda functions. More information on this specific one can be found [here](https://docs.aws.amazon.com/iot/latest/developerguide/registry-events.html#registry-events-thingtype-crud). ``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.iot_registry_event import IoTCoreThingTypeEvent @event_source(data_class=IoTCoreThingTypeEvent) def lambda_handler(event: IoTCoreThingTypeEvent, context): print(f"Received IoT Core event type {event.event_type}") ``` ``` { "eventType": "THING_TYPE_EVENT", "eventId": "8827376c-4b05-49a3-9b3b-733729df7ed5", "timestamp": 1234567890123, "operation": "CREATED", "accountId": "123456789012", "thingTypeId": "c530ae83-32aa-4592-94d3-da29879d1aac", "thingTypeName": "MyThingType", "isDeprecated": false, "deprecationDate": null, "searchableAttributes": ["attribute1", "attribute2", "attribute3"], "propagatingAttributes": [ {"userPropertyKey": "key", "thingAttribute": "model"}, {"userPropertyKey": "key", "connectionAttribute": "iot:ClientId"} ], "description": "My thing type" } ``` #### IoT Core Thing Type Associated/Disassociated with a Thing You can use IoT Core registry events to trigger your lambda functions. More information on this specific one can be found [here](https://docs.aws.amazon.com/iot/latest/developerguide/registry-events.html#registry-events-thingtype-assoc). ``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.iot_registry_event import IoTCoreThingTypeAssociationEvent @event_source(data_class=IoTCoreThingTypeAssociationEvent) def lambda_handler(event: IoTCoreThingTypeAssociationEvent, context): print(f"Received IoT Core event type {event.event_type}") ``` ``` { "eventId": "87f8e095-531c-47b3-aab5-5171364d138d", "eventType": "THING_TYPE_ASSOCIATION_EVENT", "operation": "ADDED", "thingId": "b604f69c-aa9a-4d4a-829e-c480e958a0b5", "thingName": "myThing", "thingTypeName": "MyThingType", "timestamp": 1234567890123 } ``` #### IoT Core Thing Group Created/Updated/Deleted You can use IoT Core registry events to trigger your lambda functions. More information on this specific one can be found [here](https://docs.aws.amazon.com/iot/latest/developerguide/registry-events.html#registry-events-thinggroup-crud). 
``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.iot_registry_event import IoTCoreThingGroupEvent @event_source(data_class=IoTCoreThingGroupEvent) def lambda_handler(event: IoTCoreThingGroupEvent, context): print(f"Received IoT Core event type {event.event_type}") ``` ``` { "eventType": "THING_GROUP_EVENT", "eventId": "8b9ea8626aeaa1e42100f3f32b975899", "timestamp": 1603995417409, "operation": "UPDATED", "accountId": "571EXAMPLE833", "thingGroupId": "8757eec8-bb37-4cca-a6fa-403b003d139f", "thingGroupName": "Tg_level5", "versionNumber": 3, "parentGroupName": "Tg_level4", "parentGroupId": "5fce366a-7875-4c0e-870b-79d8d1dce119", "description": "New description for Tg_level5", "rootToParentThingGroups": [ { "groupArn": "arn:aws:iot:us-west-2:571EXAMPLE833:thinggroup/TgTopLevel", "groupId": "36aa0482-f80d-4e13-9bff-1c0a75c055f6" }, { "groupArn": "arn:aws:iot:us-west-2:571EXAMPLE833:thinggroup/Tg_level1", "groupId": "bc1643e1-5a85-4eac-b45a-92509cbe2a77" }, { "groupArn": "arn:aws:iot:us-west-2:571EXAMPLE833:thinggroup/Tg_level2", "groupId": "0476f3d2-9beb-48bb-ae2c-ea8bd6458158" }, { "groupArn": "arn:aws:iot:us-west-2:571EXAMPLE833:thinggroup/Tg_level3", "groupId": "1d9d4ffe-a6b0-48d6-9de6-2e54d1eae78f" }, { "groupArn": "arn:aws:iot:us-west-2:571EXAMPLE833:thinggroup/Tg_level4", "groupId": "5fce366a-7875-4c0e-870b-79d8d1dce119" } ], "attributes": {"attribute1": "value1", "attribute3": "value3", "attribute2": "value2"}, "dynamicGroupMappingId": null } ``` #### IoT Thing Added/Removed from Thing Group You can use IoT Core registry events to trigger your lambda functions. More information on this specific one can be found [here](https://docs.aws.amazon.com/iot/latest/developerguide/registry-events.html#registry-events-thinggroup-addremove). ``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.iot_registry_event import IoTCoreAddOrRemoveFromThingGroupEvent @event_source(data_class=IoTCoreAddOrRemoveFromThingGroupEvent) def lambda_handler(event: IoTCoreAddOrRemoveFromThingGroupEvent, context): print(f"Received IoT Core event type {event.event_type}") ``` ``` { "eventType": "THING_GROUP_MEMBERSHIP_EVENT", "eventId": "d684bd5f-6f6e-48e1-950c-766ac7f02fd1", "timestamp": 1234567890123, "operation": "ADDED", "accountId": "123456789012", "groupArn": "arn:aws:iot:ap-northeast-2:123456789012:thinggroup/MyChildThingGroup", "groupId": "06838589-373f-4312-b1f2-53f2192291c4", "thingArn": "arn:aws:iot:ap-northeast-2:123456789012:thing/MyThing", "thingId": "b604f69c-aa9a-4d4a-829e-c480e958a0b5", "membershipId": "8505ebf8-4d32-4286-80e9-c23a4a16bbd8" } ``` #### IoT Child Group Added/Deleted from Parent Group You can use IoT Core registry events to trigger your lambda functions. More information on this specific one can be found [here](https://docs.aws.amazon.com/iot/latest/developerguide/registry-events.html#registry-events-thinggroup-adddelete). 
``` from aws_lambda_powertools.utilities.data_classes import event_source from aws_lambda_powertools.utilities.data_classes.iot_registry_event import IoTCoreAddOrDeleteFromThingGroupEvent @event_source(data_class=IoTCoreAddOrDeleteFromThingGroupEvent) def lambda_handler(event: IoTCoreAddOrDeleteFromThingGroupEvent, context): print(f"Received IoT Core event type {event.event_type}") ``` ``` { "eventType": "THING_GROUP_HIERARCHY_EVENT", "eventId": "264192c7-b573-46ef-ab7b-489fcd47da41", "timestamp": 1234567890123, "operation": "ADDED", "accountId": "123456789012", "thingGroupId": "8f82a106-6b1d-4331-8984-a84db5f6f8cb", "thingGroupName": "MyRootThingGroup", "childGroupId": "06838589-373f-4312-b1f2-53f2192291c4", "childGroupName": "MyChildThingGroup" } ``` ## Advanced ### Debugging Alternatively, you can print out the fields to obtain more information. All classes come with a `__str__` method that generates a dictionary string which can be quite useful for debugging. However, certain events may contain sensitive fields such as `secret_access_key` and `session_token`, which are labeled as `[SENSITIVE]` to prevent any accidental disclosure of confidential information. If we fail to deserialize a field value (e.g., JSON), they will appear as `[Cannot be deserialized]` ``` from aws_lambda_powertools.utilities.data_classes import ( CodePipelineJobEvent, event_source, ) @event_source(data_class=CodePipelineJobEvent) def lambda_handler(event, context): print(event) ``` ``` { "CodePipeline.job": { "id": "11111111-abcd-1111-abcd-111111abcdef", "accountId": "111111111111", "data": { "actionConfiguration": { "configuration": { "FunctionName": "MyLambdaFunctionForAWSCodePipeline", "UserParameters": "some-input-such-as-a-URL" } }, "inputArtifacts": [ { "name": "ArtifactName", "revision": null, "location": { "type": "S3", "s3Location": { "bucketName": "the name of the bucket configured as the pipeline artifact store in Amazon S3, for example codepipeline-us-east-2-1234567890", "objectKey": "the name of the application, for example CodePipelineDemoApplication.zip" } } } ], "outputArtifacts": [], "artifactCredentials": { "accessKeyId": "AKIAIOSFODNN7EXAMPLE", "secretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY", "sessionToken": "MIICiTCCAfICCQD6m7oRw0uXOjANBgkqhkiG9w0BAQUFADCBiDELMAkGA1UEBhMCVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZWF0dGxlMQ8wDQYDVQQKEwZBbWF6b24xFDASBgNVBAsTC0lBTSBDb25zb2xlMRIwEAYDVQQDEwlUZXN0Q2lsYWMxHzAdBgkqhkiG9w0BCQEWEG5vb25lQGFtYXpvbi5jb20wHhcNMTEwNDI1MjA0NTIxWhcNMTIwNDI0MjA0NTIxWjCBiDELMAkGA1UEBhMCVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZWF0dGxlMQ8wDQYDVQQKEwZBbWF6b24xFDASBgNVBAsTC0lBTSBDb25zb2xlMRIwEAYDVQQDEwlUZXN0Q2lsYWMxHzAdBgkqhkiG9w0BCQEWEG5vb25lQGFtYXpvbi5jb20wgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJAoGBAMaK0dn+a4GmWIWJ21uUSfwfEvySWtC2XADZ4nB+BLYgVIk60CpiwsZ3G93vUEIO3IyNoH/f0wYK8m9TrDHudUZg3qX4waLG5M43q7Wgc/MbQITxOUSQv7c7ugFFDzQGBzZswY6786m86gpEIbb3OhjZnzcvQAaRHhdlQWIMm2nrAgMBAAEwDQYJKoZIhvcNAQEFBQADgYEAtCu4nUhVVxYUntneD9+h8Mg9q6q+auNKyExzyLwaxlAoo7TJHidbtS4J5iNmZgXL0FkbFFBjvSfpJIlJ00zbhNYS5f6GuoEDmFJl0ZxBHjJnyp378OD8uTs7fLvjx79LjSTbNYiytVbZPQUQ5Yaxu2jXnimvw3rrszlaEXAMPLE=" }, "continuationToken": "A continuation token if continuing job" } } } ``` ``` { "account_id":"111111111111", "data":{ "action_configuration":{ "configuration":{ "decoded_user_parameters":"[Cannot be deserialized]", "function_name":"MyLambdaFunctionForAWSCodePipeline", "raw_event":"[SENSITIVE]", "user_parameters":"some-input-such-as-a-URL" }, "raw_event":"[SENSITIVE]" }, "artifact_credentials":{ 
"access_key_id":"AKIAIOSFODNN7EXAMPLE", "expiration_time":"None", "raw_event":"[SENSITIVE]", "secret_access_key":"[SENSITIVE]", "session_token":"[SENSITIVE]" }, "continuation_token":"A continuation token if continuing job", "encryption_key":"None", "input_artifacts":[ { "location":{ "get_type":"S3", "raw_event":"[SENSITIVE]", "s3_location":{ "bucket_name":"the name of the bucket configured as the pipeline artifact store in Amazon S3, for example codepipeline-us-east-2-1234567890", "key":"the name of the application, for example CodePipelineDemoApplication.zip", "object_key":"the name of the application, for example CodePipelineDemoApplication.zip", "raw_event":"[SENSITIVE]" } }, "name":"ArtifactName", "raw_event":"[SENSITIVE]", "revision":"None" } ], "output_artifacts":[ ], "raw_event":"[SENSITIVE]" }, "decoded_user_parameters":"[Cannot be deserialized]", "get_id":"11111111-abcd-1111-abcd-111111abcdef", "input_bucket_name":"the name of the bucket configured as the pipeline artifact store in Amazon S3, for example codepipeline-us-east-2-1234567890", "input_object_key":"the name of the application, for example CodePipelineDemoApplication.zip", "raw_event":"[SENSITIVE]", "user_parameters":"some-input-such-as-a-URL" } ``` The data masking utility can encrypt, decrypt, or irreversibly erase sensitive information to protect data confidentiality. ``` stateDiagram-v2 direction LR LambdaFn: Your Lambda function DataMasking: DataMasking Operation: Possible operations Input: Sensitive value Erase: Erase Encrypt: Encrypt Decrypt: Decrypt Provider: AWS Encryption SDK provider Result: Data transformed (erased, encrypted, or decrypted) LambdaFn --> DataMasking DataMasking --> Operation state Operation { [*] --> Input Input --> Erase: Irreversible Input --> Encrypt Input --> Decrypt Encrypt --> Provider Decrypt --> Provider } Operation --> Result ``` ## Key features - Encrypt, decrypt, or irreversibly erase data with ease - Erase sensitive information in one or more fields within nested data - Seamless integration with [AWS Encryption SDK](https://docs.aws.amazon.com/encryption-sdk/latest/developer-guide/introduction.html) for industry and AWS security best practices ## Terminology **Erasing** replaces sensitive information **irreversibly** with a non-sensitive placeholder *(`*****`)*, or with a customized mask. This operation replaces data in-memory, making it a one-way action. **Encrypting** transforms plaintext into ciphertext using an encryption algorithm and a cryptographic key. It allows you to encrypt any sensitive data, so only allowed personnel to decrypt it. Learn more about encryption [here](https://aws.amazon.com/blogs/security/importance-of-encryption-and-how-aws-can-help/). **Decrypting** transforms ciphertext back into plaintext using a decryption algorithm and the correct decryption key. **Encryption context** is a non-secret `key=value` data used for authentication like `tenant_id:`. This adds extra security and confirms encrypted data relationship with a context. **[Encrypted message](https://docs.aws.amazon.com/encryption-sdk/latest/developer-guide/message-format.html)** is a portable data structure that includes encrypted data along with copies of the encrypted data key. It includes everything Encryption SDK needs to validate authenticity, integrity, and to decrypt with the right master key. **[Envelope encryption](https://docs.aws.amazon.com/encryption-sdk/latest/developer-guide/concepts.html#envelope-encryption)** uses two different keys to encrypt data safely: master and data key. 
The data key encrypts the plaintext, and the master key encrypts the data key. It simplifies key management *(you own the master key)*, isolates compromises to data key, and scales better with large data volumes. ``` graph LR M(Master key) --> |Encrypts| D(Data key) D(Data key) --> |Encrypts| S(Sensitive data) ``` *Envelope encryption visualized.* ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). ### Install Add `aws-lambda-powertools[datamasking]` as a dependency in your preferred tool: *e.g.*, *requirements.txt*, *pyproject.toml*. This will install the [AWS Encryption SDK](https://docs.aws.amazon.com/encryption-sdk/latest/developer-guide/introduction.html). AWS Encryption SDK contains non-Python dependencies. This means you should use [AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/using-sam-cli-build.html#using-sam-cli-build-options-container) or [official build container images](https://gallery.ecr.aws/search?searchTerm=sam%2Fbuild-python&popularRegistries=amazon) when building your application for AWS Lambda. Local development should work as expected. ### Required resources By default, we use Amazon Key Management Service (KMS) for encryption and decryption operations. Before you start, you will need a KMS symmetric key to encrypt and decrypt your data. Your Lambda function will need read and write access to it. **NOTE**. We recommend setting a minimum of 1024MB of memory *(CPU intensive)*, and separate Lambda functions for encrypt and decrypt. For more information, you can see the full reports of our [load tests](https://github.com/aws-powertools/powertools-lambda-python/pull/2197#issuecomment-1730571597) and [traces](https://github.com/aws-powertools/powertools-lambda-python/pull/2197#issuecomment-1732060923). ``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Description: > Powertools for AWS Lambda (Python) data masking example Globals: # https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-specification-template-anatomy-globals.html Function: Timeout: 5 Runtime: python3.11 Tracing: Active Environment: Variables: POWERTOOLS_SERVICE_NAME: PowertoolsHelloWorld POWERTOOLS_LOG_LEVEL: INFO KMS_KEY_ARN: !GetAtt DataMaskingMasterKey.Arn # In production, we recommend you split up the encrypt and decrypt for fine-grained security. # For example, one function can act as the encryption proxy via HTTP requests, data pipeline, etc., # while only authorized personnel can call decrypt via a separate function. Resources: DataMaskingEncryptFunctionExample: Type: AWS::Serverless::Function Properties: Handler: data_masking_function_example.lambda_handler CodeUri: ../src Description: Data Masking encryption function # Cryptographic operations demand more CPU. CPU is proportionally allocated based on memory size. # We recommend allocating a minimum of 1024MB of memory. 
MemorySize: 1024 # DataMaskingDecryptFunctionExample: # Type: AWS::Serverless::Function # Properties: # Handler: data_masking_function_decrypt.lambda_handler # CodeUri: ../src # Description: Data Masking decryption function # MemorySize: 1024 # KMS Key DataMaskingMasterKey: Type: "AWS::KMS::Key" Properties: Description: KMS Key for encryption and decryption using Powertools for AWS Lambda Data masking feature # KMS Key support both IAM Resource Policies and Key Policies # For more details: https://docs.aws.amazon.com/kms/latest/developerguide/key-policies.html KeyPolicy: Version: "2012-10-17" Id: data-masking-enc-dec Statement: # For security reasons, ensure your KMS Key has at least one administrator. # In this example, the root account is granted administrator permissions. # However, we recommended configuring specific IAM Roles for enhanced security in production. - Effect: Allow Principal: AWS: !Sub "arn:aws:iam::${AWS::AccountId}:root" # (1)! Action: "kms:*" Resource: "*" # We must grant Lambda's IAM Role access to the KMS Key - Effect: Allow Principal: AWS: !GetAtt DataMaskingEncryptFunctionExampleRole.Arn # (2)! Action: - kms:Decrypt # to decrypt encrypted data key - kms:GenerateDataKey # to create an unique and random data key for encryption # Encrypt permission is required only when using multiple keys - kms:Encrypt # (3)! Resource: "*" ``` 1. [Key policy examples using IAM Roles](https://docs.aws.amazon.com/kms/latest/developerguide/key-policy-default.html#key-policy-default-allow-administrators) 1. [SAM generated CloudFormation Resources](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-specification-generated-resources-function.html#sam-specification-generated-resources-function-not-role) 1. Required only when using [multiple keys](#using-multiple-keys) ### Erasing data Erasing will remove the original data and replace it with a `*****`. This means you cannot recover erased data, and the data type will change to `str` for all data unless the data to be erased is of an Iterable type (`list`, `tuple`, `set`), in which case the method will return a new object of the same type as the input data but with each element replaced by the string `*****`. ``` from __future__ import annotations from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() data_masker = DataMasking() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> dict: data: dict = event.get("body", {}) logger.info("Erasing fields email, address.street, and company_address") erased = data_masker.erase(data, fields=["email", "address.street", "company_address"]) # (1)! return erased ``` 1. See [working with nested data](#working-with-nested-data) to learn more about the `fields` parameter. If we omit `fields` parameter, the entire dictionary will be erased with `*****`. 
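Before the sample input and output for the example above, here is a minimal sketch of the no-`fields` case mentioned in the note: calling `erase` without `fields` erases the entire payload as described. The handler and event shape are illustrative only.

```
from __future__ import annotations

from aws_lambda_powertools import Logger
from aws_lambda_powertools.utilities.data_masking import DataMasking
from aws_lambda_powertools.utilities.typing import LambdaContext

logger = Logger()
data_masker = DataMasking()


@logger.inject_lambda_context
def lambda_handler(event: dict, context: LambdaContext):
    data: dict = event.get("body", {})

    logger.info("Erasing the entire payload (no fields parameter)")

    # Without `fields`, the whole payload is erased with `*****` (see note above),
    # so nothing from the original data can be recovered afterwards.
    return data_masker.erase(data)
```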
``` { "body": { "id": 1, "name": "John Doe", "age": 30, "email": "johndoe@example.com", "address": { "street": "123 Main St", "city": "Anytown", "state": "CA", "zip": "12345" }, "company_address": { "street": "456 ACME Ave", "city": "Anytown", "state": "CA", "zip": "12345" } } } ``` ``` { "id": 1, "name": "John Doe", "age": 30, "email": "*****", "address": { "street": "*****", "city": "Anytown", "state": "CA", "zip": "12345" }, "company_address": "*****" } ``` #### Custom masking The `erase` method also supports additional flags for more advanced and flexible masking: (bool) Enables dynamic masking behavior when set to `True`, by maintaining the original length and structure of the text replacing with \*. > Expression: `data_masker.erase(data, fields=["address.zip"], dynamic_mask=True)` > > Field result: `'street': '*** **** **'` (str) Specifies a simple pattern for masking data. This pattern is applied directly to the input string, replacing all the original characters. For example, with a `custom_mask` of "XX-XX" applied to "12345", the result would be "XX-XX". > Expression: `data_masker.erase(data, fields=["address.zip"], custom_mask="XX")` > > Field result: `'zip': 'XX'` (str) `regex_pattern` defines a regular expression pattern used to identify parts of the input string that should be masked. This allows for more complex and flexible masking rules. It's used in conjunction with `mask_format`. `mask_format` specifies the format to use when replacing parts of the string matched by `regex_pattern`. It can include placeholders (like \\1, \\2) to refer to captured groups in the regex pattern, allowing some parts of the original string to be preserved. > Expression: `data_masker.erase(data, fields=["email"], regex_pattern=r"(.)(.*)(@.*)", mask_format=r"\1****\3")` > > Field result: `'email': 'j****@example.com'` (dict) Allows you to apply different masking rules (flags) for each data field. 
``` from __future__ import annotations from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.typing import LambdaContext data_masker = DataMasking() def lambda_handler(event: dict, context: LambdaContext) -> dict: data: dict = event.get("body", {}) # Masking rules for each field masking_rules = { "email": {"regex_pattern": "(.)(.*)(@.*)", "mask_format": r"\1****\3"}, "age": {"dynamic_mask": True}, "address.zip": {"custom_mask": "xxx"}, "$.other_address[?(@.postcode > 12000)]": {"custom_mask": "Masked"}, } result = data_masker.erase(data, masking_rules=masking_rules) return result ``` ``` { "body": { "id": 1, "name": "Jane Doe", "age": 30, "email": "janedoe@example.com", "address": { "street": "123 Main St", "city": "Anytown", "state": "CA", "zip": "12345", "postcode": 12345, "product": { "name": "Car" } }, "other_address": [ { "postcode": 11345, "street": "123 Any Drive" }, { "postcode": 67890, "street": "100 Main Street," } ], "company_address": { "street": "456 ACME Ave", "city": "Anytown", "state": "CA", "zip": "12345" } } } ``` ``` { "id": 1, "name": "John Doe", "age": "**", "email": "j****@example.com", "address": { "street": "123 Main St", "city": "Anytown", "state": "CA", "zip": "xxx", "postcode": 12345, "product": { "name": "Car" } }, "other_address": [ { "postcode": 11345, "street": "123 Any Drive" }, "Masked" ], "company_address": { "street": "456 ACME Ave", "city": "Anytown", "state": "CA", "zip": "12345" } } ``` ### Encrypting data About static typing and encryption Encrypting data may lead to a different data type, as it always transforms into a string *(``)*. To encrypt, you will need an [encryption provider](#providers). Here, we will use `AWSEncryptionSDKProvider`. Under the hood, we delegate a [number of operations](#encrypt-operation-with-encryption-sdk-kms) to AWS Encryption SDK to authenticate, create a portable encryption message, and actual data encryption. ``` from __future__ import annotations import os from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.data_masking.provider.kms.aws_encryption_sdk import ( AWSEncryptionSDKProvider, ) from aws_lambda_powertools.utilities.typing import LambdaContext KMS_KEY_ARN = os.getenv("KMS_KEY_ARN", "") encryption_provider = AWSEncryptionSDKProvider(keys=[KMS_KEY_ARN]) # (1)! data_masker = DataMasking(provider=encryption_provider) logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> dict: data: dict = event.get("body", {}) logger.info("Encrypting the whole object") encrypted = data_masker.encrypt(data) return {"body": encrypted} ``` 1. You can use more than one KMS Key for higher availability but increased latency. Encryption SDK will ensure the data key is encrypted with both keys. 
``` { "body": { "id": 1, "name": "John Doe", "age": 30, "email": "johndoe@example.com", "address": { "street": "123 Main St", "city": "Anytown", "state": "CA", "zip": "12345" }, "company_address": { "street": "456 ACME Ave", "city": "Anytown", "state": "CA", "zip": "12345" } } } ``` ``` { "body": "AgV4uF5K2YMtNhYrtviTwKNrUHhqQr73l/jNfukkh+qLOC8AXwABABVhd3MtY3J5cHRvLXB1YmxpYy1rZXkAREEvcjEyaFZHY1R5cjJuTDNKbTJ3UFA3R3ZjaytIdi9hekZqbXVUb25Ya3J5SzFBOUlJZDZxZXpSR1NTVnZDUUxoZz09AAEAB2F3cy1rbXMAS2Fybjphd3M6a21zOnVzLWVhc3QtMToyMDA5ODQxMTIzODY6a2V5LzZkODJiMzRlLTM2NjAtNDRlMi04YWJiLTdmMzA1OGJlYTIxMgC4AQIBAHjxYXAO7wQGd+7qxoyvXAajwqboF5FL/9lgYUNJTB8VtAHBP2hwVgw+zypp7GoMNTPAAAAAfjB8BgkqhkiG9w0BBwagbzBtAgEAMGgGCSqGSIb3DQEHATAeBglghkgBZQMEAS4wEQQMx/B25MTgWwpL7CmuAgEQgDtan3orAOKFUfyNm3v6rFcglb+BVVVDV71fj4aRljhpg1ixsYFaKsoej8NcwRktIiWE+mw9XmTEVb6xFQIAABAA9DeLzlRaRQgTcXMJG0iBu/YTyyDKiROD+bU1Y09X9RBz5LA1nWIENJKq2seAhNSB/////wAAAAEAAAAAAAAAAAAAAAEAAAEBExLJ9wI4n7t+wyPEEP4kjYFBdkmNuLLsVC2Yt8mv9Y1iH2G+/g9SaIcdK57pkoW0ECpBxZVOxCuhmK2s74AJCUdem9McjS1waUKyzYTi9vv2ySNBsABIDwT990rE7jZJ3tEZAqcWZg/eWlxvnksFR/akBWZKsKzFz6lF57+cTgdISCEJRV0E7fcUeCuaMaQGK1Qw2OCmIeHEG5j5iztBkZG2IB2CVND/AbxmDUFHwgjsrJPTzaDYSufcGMoZW1A9X1sLVfqNVKvnOFP5tNY7kPF5eAI9FhGBw8SjTqODXz4k6zuqzy9no8HtXowP265U8NZ5VbVTd/zuVEbZyK5KBqzP1sExW4RhnlpXMoOs9WSuAGcwZQIxANTeEwb9V7CacV2Urt/oCqysUzhoV2AcT2ZjryFqY79Tsg+FRpIx7cBizL4ieRzbhQIwcRasNncO5OZOcmVr0MqHv+gCVznndMgjXJmWwUa7h6skJKmhhMPlN0CsugxtVWnD" } ``` ### Decrypting data About static typing and decryption Decrypting data may lead to a different data type, as encrypted data is always a string *(``)*. To decrypt, you will need an [encryption provider](#providers). Here, we will use `AWSEncryptionSDKProvider`. Under the hood, we delegate a [number of operations](#decrypt-operation-with-encryption-sdk-kms) to AWS Encryption SDK to verify authentication, integrity, and actual ciphertext decryption. **NOTE**. Decryption only works with KMS Key ARN. ``` from __future__ import annotations import os from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.data_masking.provider.kms.aws_encryption_sdk import AWSEncryptionSDKProvider from aws_lambda_powertools.utilities.typing import LambdaContext KMS_KEY_ARN = os.getenv("KMS_KEY_ARN", "") # (1)! encryption_provider = AWSEncryptionSDKProvider(keys=[KMS_KEY_ARN]) # (2)! data_masker = DataMasking(provider=encryption_provider) logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> dict: data: dict = event.get("body", {}) logger.info("Decrypting whole object") decrypted = data_masker.decrypt(data) return decrypted ``` 1. Note that KMS key alias or key ID won't work. 1. You can use more than one KMS Key for higher availability but increased latency. Encryption SDK will call `Decrypt` API with all master keys when trying to decrypt the data key. 
``` { "body": "AgV4uF5K2YMtNhYrtviTwKNrUHhqQr73l/jNfukkh+qLOC8AXwABABVhd3MtY3J5cHRvLXB1YmxpYy1rZXkAREEvcjEyaFZHY1R5cjJuTDNKbTJ3UFA3R3ZjaytIdi9hekZqbXVUb25Ya3J5SzFBOUlJZDZxZXpSR1NTVnZDUUxoZz09AAEAB2F3cy1rbXMAS2Fybjphd3M6a21zOnVzLWVhc3QtMToyMDA5ODQxMTIzODY6a2V5LzZkODJiMzRlLTM2NjAtNDRlMi04YWJiLTdmMzA1OGJlYTIxMgC4AQIBAHjxYXAO7wQGd+7qxoyvXAajwqboF5FL/9lgYUNJTB8VtAHBP2hwVgw+zypp7GoMNTPAAAAAfjB8BgkqhkiG9w0BBwagbzBtAgEAMGgGCSqGSIb3DQEHATAeBglghkgBZQMEAS4wEQQMx/B25MTgWwpL7CmuAgEQgDtan3orAOKFUfyNm3v6rFcglb+BVVVDV71fj4aRljhpg1ixsYFaKsoej8NcwRktIiWE+mw9XmTEVb6xFQIAABAA9DeLzlRaRQgTcXMJG0iBu/YTyyDKiROD+bU1Y09X9RBz5LA1nWIENJKq2seAhNSB/////wAAAAEAAAAAAAAAAAAAAAEAAAEBExLJ9wI4n7t+wyPEEP4kjYFBdkmNuLLsVC2Yt8mv9Y1iH2G+/g9SaIcdK57pkoW0ECpBxZVOxCuhmK2s74AJCUdem9McjS1waUKyzYTi9vv2ySNBsABIDwT990rE7jZJ3tEZAqcWZg/eWlxvnksFR/akBWZKsKzFz6lF57+cTgdISCEJRV0E7fcUeCuaMaQGK1Qw2OCmIeHEG5j5iztBkZG2IB2CVND/AbxmDUFHwgjsrJPTzaDYSufcGMoZW1A9X1sLVfqNVKvnOFP5tNY7kPF5eAI9FhGBw8SjTqODXz4k6zuqzy9no8HtXowP265U8NZ5VbVTd/zuVEbZyK5KBqzP1sExW4RhnlpXMoOs9WSuAGcwZQIxANTeEwb9V7CacV2Urt/oCqysUzhoV2AcT2ZjryFqY79Tsg+FRpIx7cBizL4ieRzbhQIwcRasNncO5OZOcmVr0MqHv+gCVznndMgjXJmWwUa7h6skJKmhhMPlN0CsugxtVWnD" } ``` ``` { "id": 1, "name": "John Doe", "age": 30, "email": "johndoe@example.com", "address": { "street": "123 Main St", "city": "Anytown", "state": "CA", "zip": "12345" }, "company_address": { "street": "456 ACME Ave", "city": "Anytown", "state": "CA", "zip": "12345" } } ``` ### Encryption context for integrity and authenticity For a stronger security posture, you can add metadata to each encryption operation, and verify them during decryption. This is known as additional authenticated data (AAD). These are non-sensitive data that can help protect authenticity and integrity of your encrypted data, and even help to prevent a [confused deputy](https://docs.aws.amazon.com/IAM/latest/UserGuide/confused-deputy.html) situation. Important considerations you should know 1. **Exact match verification on decrypt**. Be careful using random data like `timestamps` as encryption context if you can't provide them on decrypt. 1. **Only `string` values are supported**. We will raise `DataMaskingUnsupportedTypeError` for non-string values. 1. **Use non-sensitive data only**. When using KMS, encryption context is available as plaintext in AWS CloudTrail, unless you [intentionally disabled KMS events](https://docs.aws.amazon.com/kms/latest/developerguide/logging-using-cloudtrail.html#filtering-kms-events). ``` from __future__ import annotations import os from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.data_masking.provider.kms.aws_encryption_sdk import AWSEncryptionSDKProvider from aws_lambda_powertools.utilities.typing import LambdaContext KMS_KEY_ARN = os.getenv("KMS_KEY_ARN", "") encryption_provider = AWSEncryptionSDKProvider(keys=[KMS_KEY_ARN]) data_masker = DataMasking(provider=encryption_provider) logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> str: data = event.get("body", {}) logger.info("Encrypting whole object") encrypted: str = data_masker.encrypt( data, data_classification="confidential", # (1)! data_type="customer-data", tenant_id="a06bf973-0734-4b53-9072-39d7ac5b2cba", ) return encrypted ``` 1. They must match on `decrypt()` otherwise the operation will fail with `DataMaskingContextMismatchError`. 
``` from __future__ import annotations import os from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.data_masking.provider.kms.aws_encryption_sdk import AWSEncryptionSDKProvider from aws_lambda_powertools.utilities.typing import LambdaContext KMS_KEY_ARN = os.getenv("KMS_KEY_ARN", "") encryption_provider = AWSEncryptionSDKProvider(keys=[KMS_KEY_ARN]) data_masker = DataMasking(provider=encryption_provider) logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> dict: data = event.get("body", {}) logger.info("Decrypting whole object") decrypted: dict = data_masker.decrypt( data, data_classification="confidential", # (1)! data_type="customer-data", tenant_id="a06bf973-0734-4b53-9072-39d7ac5b2cba", ) return decrypted ``` 1. They must match otherwise the operation will fail with `DataMaskingContextMismatchError`. ### Choosing parts of your data Current limitations 1. The `fields` parameter is not yet supported in `encrypt` and `decrypt` operations. 1. We support `JSON` data types only - see [data serialization for more details](#data-serialization). You can use the `fields` parameter with the dot notation `.` to choose one or more parts of your data to `erase`. This is useful when you want to keep data structure intact except the confidential fields. When `fields` is present, `erase` behaves differently: | Operation | Behavior | Example | Result | | --- | --- | --- | --- | | `erase` | Replace data while keeping collections type intact. | `{"cards": ["a", "b"]}` | `{"cards": ["*****", "*****"]}` | Here are common scenarios to best visualize how to use `fields`. You want to erase data in the `card_number` field. > Expression: `data_masker.erase(data, fields=["card_number"])` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444" } ``` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "*****" } ``` You want to erase data in the `postcode` field. > Expression: `data_masker.erase(data, fields=["address.postcode"])` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": { "postcode": 12345 } } ``` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": { "postcode": "*****" } } ``` You want to erase data in both `postcode` and `street` fields. > Expression: `data_masker.erase(data, fields=["address.postcode", "address.street"])` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": { "postcode": 12345, "street": "123 Any Street" } } ``` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": { "postcode": "*****", "street": "*****" } } ``` You want to erase data under `address` field. > Expression: `data_masker.erase(data, fields=["address"])` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": [ { "postcode": 12345, "street": "123 Any Street", "country": "United States", "timezone": "America/La_Paz" }, { "postcode": 67890, "street": "100 Main Street", "country": "United States", "timezone": "America/Mazatlan" } ] } ``` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": [ "*****", "*****" ] } ``` You want to erase data under `name` field. 
> Expression: `data_masker.erase(data, fields=["category..name"])` ``` { "category": { "subcategory": { "brand" : { "product": { "name": "Car" } } } } } ``` ``` { "category": { "subcategory": { "brand" : { "product": { "name": "*****" } } } } } ``` You want to erase data under the `street` field located at any index of the address list. > Expression: `data_masker.erase(data, fields=["address[*].street"])` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": [ { "postcode": 12345, "street": "123 Any Drive" }, { "postcode": 67890, "street": "100 Main Street," } ] } ``` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": [ { "postcode": 12345, "street": "*****" }, { "postcode": 67890, "street": "*****" } ] } ``` You want to erase data by slicing a list. > Expression: `data_masker.erase(data, fields=["address[-1].street"])` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": [ { "postcode": 12345, "street": "123 Any Street" }, { "postcode": 67890, "street": "100 Main Street" }, { "postcode": 78495, "street": "111 Any Drive" } ] } ``` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": [ { "postcode": 12345, "street": "123 Any Street" }, { "postcode": 67890, "street": "100 Main Street" }, { "postcode": 78495, "street": "*****" } ] } ``` You want to erase data by matching a field with a conditional expression. > Expression: `data_masker.erase(data, fields=["$.address[?(@.postcode > 12000)]"])` > > `$`: Represents the root of the JSON structure. > > `.address`: Selects the "address" property within the JSON structure. > > `(@.postcode > 12000)`: Specifies the condition that elements should meet. It selects elements where the value of the `postcode` property is `greater than 12000`. ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": [ { "postcode": 12345, "street": "123 Any Drive" }, { "postcode": 67890, "street": "111 Main Street" }, { "postcode": 11111, "street": "100 Any Street" } ] } ``` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": [ { "postcode": 12345, "street": "*****" }, { "postcode": 67890, "street": "*****" }, { "postcode": 11111, "street": "100 Any Street" } ] } ``` For comprehensive guidance on using JSONPath syntax, please refer to the official documentation available at [jsonpath-ng](https://github.com/h2non/jsonpath-ng#jsonpath-syntax). #### JSON We also support data in JSON string format as input. We automatically deserialize it, then handle each field operation as expected. Note that the return value will be the deserialized JSON with your desired fields updated (see the sketch below). Expression: `data_masker.erase(data, fields=["card_number", "address.postcode"])` ``` '{"name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": {"postcode": 12345}}' ``` ``` { "name": "Carlos", "operation": "non sensitive", "card_number": "*****", "address": { "postcode": "*****" } } ``` ## Advanced ### Data serialization Extended input support We support `Pydantic models`, `Dataclasses`, and custom classes with `dict()` or `__dict__` for input. These types are automatically converted into dictionaries before `masking` and `encrypting` operations. Please note that we **don't convert back** to the original type, and the returned object will be a dictionary.
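To make the notes above concrete, here is a minimal standalone sketch reusing the JSON string from the previous subsection: `erase` deserializes string input and returns a plain dictionary with the requested fields masked; it does not convert the result back to a string.

```
from __future__ import annotations

from aws_lambda_powertools.utilities.data_masking import DataMasking

data_masker = DataMasking()

payload = '{"name": "Carlos", "operation": "non sensitive", "card_number": "1111 2222 3333 4444", "address": {"postcode": 12345}}'

# The JSON string is deserialized first; the return value is a dictionary
# with the requested fields erased, not a re-serialized string.
erased = data_masker.erase(payload, fields=["card_number", "address.postcode"])

print(erased["card_number"])  # *****
print(erased["address"]["postcode"])  # *****
```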
Before we traverse the data structure, we perform two important operations on input data: 1. If `JSON string`, **deserialize** using default or provided deserializer. 1. If `dictionary or complex types`, **normalize** into `JSON` to prevent traversing unsupported data types. For compatibility or performance, you can optionally pass your own JSON serializer and deserializer to replace `json.dumps` and `json.loads` respectively: ``` from aws_lambda_powertools.utilities.data_masking import DataMasking data_masker = DataMasking() class User: def __init__(self, name, age): self.name = name self.age = age def dict(self): return {"name": self.name, "age": self.age} def lambda_handler(event, context): user = User("powertools", 42) return data_masker.erase(user, fields=["age"]) ``` ``` from pydantic import BaseModel from aws_lambda_powertools.utilities.data_masking import DataMasking data_masker = DataMasking() class User(BaseModel): name: str age: int def lambda_handler(event, context): user = User(name="powertools", age=42) return data_masker.erase(user, fields=["age"]) ``` ``` from dataclasses import dataclass from aws_lambda_powertools.utilities.data_masking import DataMasking data_masker = DataMasking() @dataclass class User: name: str age: int def lambda_handler(event, context): user = User(name="powertools", age=42) return data_masker.erase(user, fields=["age"]) ``` ``` from __future__ import annotations import os import ujson from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.data_masking.provider.kms.aws_encryption_sdk import ( AWSEncryptionSDKProvider, ) from aws_lambda_powertools.utilities.typing import LambdaContext KMS_KEY_ARN = os.getenv("KMS_KEY_ARN", "") encryption_provider = AWSEncryptionSDKProvider( keys=[KMS_KEY_ARN], json_serializer=ujson.dumps, json_deserializer=ujson.loads, ) data_masker = DataMasking(provider=encryption_provider) def lambda_handler(event: dict, context: LambdaContext) -> str: data: dict = event.get("body", {}) return data_masker.encrypt(data) ``` ### Using multiple keys You can use multiple KMS keys from more than one AWS account for higher availability, when instantiating `AWSEncryptionSDKProvider`. ``` from __future__ import annotations import os from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.data_masking.provider.kms.aws_encryption_sdk import ( AWSEncryptionSDKProvider, ) from aws_lambda_powertools.utilities.typing import LambdaContext KMS_KEY_ARN_1 = os.getenv("KMS_KEY_ARN_1", "") KMS_KEY_ARN_2 = os.getenv("KMS_KEY_ARN_2", "") encryption_provider = AWSEncryptionSDKProvider(keys=[KMS_KEY_ARN_1, KMS_KEY_ARN_2]) data_masker = DataMasking(provider=encryption_provider) logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> dict: data: dict = event.get("body", {}) logger.info("Encrypting the whole object") encrypted = data_masker.encrypt(data) return {"body": encrypted} ``` ### Providers #### AWS Encryption SDK You can modify the following values when initializing the `AWSEncryptionSDKProvider` to best accommodate your security and performance thresholds. 
| Parameter | Default | Description | | --- | --- | --- | | **local_cache_capacity** | `100` | The maximum number of entries that can be retained in the local cryptographic materials cache | | **max_cache_age_seconds** | `300` | The maximum time (in seconds) that a cache entry may be kept in the cache | | **max_messages_encrypted** | `4294967296` | The maximum number of messages that may be encrypted under a cache entry | | **max_bytes_encrypted** | `9223372036854775807` | The maximum number of bytes that may be encrypted under a cache entry | If required, you can customize the default values when initializing the `AWSEncryptionSDKProvider` class. ``` from __future__ import annotations import os from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.data_masking.provider.kms.aws_encryption_sdk import ( AWSEncryptionSDKProvider, ) from aws_lambda_powertools.utilities.typing import LambdaContext KMS_KEY_ARN = os.getenv("KMS_KEY_ARN", "") encryption_provider = AWSEncryptionSDKProvider( keys=[KMS_KEY_ARN], local_cache_capacity=200, max_cache_age_seconds=400, max_messages_encrypted=200, max_bytes_encrypted=2000, ) data_masker = DataMasking(provider=encryption_provider) logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> dict: data: dict = event.get("body", {}) logger.info("Encrypting the whole object") encrypted = data_masker.encrypt(data) return {"body": encrypted} ``` ##### Passing additional SDK arguments See the [AWS Encryption SDK docs for more details](https://aws-encryption-sdk-python.readthedocs.io/en/latest/generated/aws_encryption_sdk.html#aws_encryption_sdk.EncryptionSDKClient.encrypt) As an escape hatch mechanism, you can pass additional arguments to the `AWSEncryptionSDKProvider` via the `provider_options` parameter. For example, the AWS Encryption SDK defaults to using the `AES_256_GCM_HKDF_SHA512_COMMIT_KEY_ECDSA_P384` algorithm for encrypting your Data Key. If you want, you have the flexibility to customize and choose a different encryption algorithm. ``` from __future__ import annotations import os from aws_encryption_sdk.identifiers import Algorithm from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.data_masking.provider.kms.aws_encryption_sdk import AWSEncryptionSDKProvider from aws_lambda_powertools.utilities.typing import LambdaContext KMS_KEY_ARN = os.getenv("KMS_KEY_ARN", "") encryption_provider = AWSEncryptionSDKProvider(keys=[KMS_KEY_ARN]) data_masker = DataMasking(provider=encryption_provider) logger = Logger() @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> str: data: dict = event.get("body", {}) logger.info("Encrypting whole object with a different algorithm") provider_options = {"algorithm": Algorithm.AES_256_GCM_HKDF_SHA512_COMMIT_KEY} encrypted = data_masker.encrypt( data, provider_options=provider_options, ) return encrypted ``` ### Data masking request flow The following sequence diagrams explain how `DataMasking` behaves under different scenarios. #### Erase operation Erasing operations occur in-memory and we cannot recover the original value. 
``` sequenceDiagram autonumber participant Client participant Lambda participant DataMasking as Data Masking (in memory) Client->>Lambda: Invoke (event) Lambda->>DataMasking: erase(data) DataMasking->>DataMasking: replaces data with ***** Note over Lambda,DataMasking: No encryption providers involved. DataMasking->>Lambda: data masked Lambda-->>Client: Return response ``` *Simple masking operation* #### Encrypt operation with Encryption SDK (KMS) We call KMS to generate a unique data key that can be used for multiple `encrypt` operations in-memory. This improves performance, reduces cost, and prevents throttling. To make this operation simpler to visualize, we keep caching details in a [separate sequence diagram](#caching-encrypt-operations-with-encryption-sdk). Caching is enabled by default. ``` sequenceDiagram autonumber participant Client participant Lambda participant DataMasking as Data Masking participant EncryptionProvider as Encryption Provider Client->>Lambda: Invoke (event) Lambda->>DataMasking: Init Encryption Provider with master key Note over Lambda,DataMasking: AWSEncryptionSDKProvider([KMS_KEY]) Lambda->>DataMasking: encrypt(data) DataMasking->>EncryptionProvider: Create unique data key Note over DataMasking,EncryptionProvider: KMS GenerateDataKey API DataMasking->>DataMasking: Cache new unique data key DataMasking->>DataMasking: DATA_KEY.encrypt(data) DataMasking->>DataMasking: MASTER_KEY.encrypt(DATA_KEY) DataMasking->>DataMasking: Create encrypted message Note over DataMasking: Encrypted message includes encrypted data, data key encrypted, algorithm, and more. DataMasking->>Lambda: Ciphertext from encrypted message Lambda-->>Client: Return response ``` *Encrypting operation using envelope encryption.* #### Encrypt operation with multiple KMS Keys When encrypting data with multiple KMS keys, the `aws_encryption_sdk` makes additional API calls to encrypt the data with each of the specified keys. ``` sequenceDiagram autonumber participant Client participant Lambda participant DataMasking as Data Masking participant EncryptionProvider as Encryption Provider Client->>Lambda: Invoke (event) Lambda->>DataMasking: Init Encryption Provider with master key Note over Lambda,DataMasking: AWSEncryptionSDKProvider([KEY_1, KEY_2]) Lambda->>DataMasking: encrypt(data) DataMasking->>EncryptionProvider: Create unique data key Note over DataMasking,EncryptionProvider: KMS GenerateDataKey API - KEY_1 DataMasking->>DataMasking: Cache new unique data key DataMasking->>DataMasking: DATA_KEY.encrypt(data) DataMasking->>DataMasking: KEY_1.encrypt(DATA_KEY) loop For every additional KMS Key DataMasking->>EncryptionProvider: Encrypt DATA_KEY Note over DataMasking,EncryptionProvider: KMS Encrypt API - KEY_2 end DataMasking->>DataMasking: Create encrypted message Note over DataMasking: Encrypted message includes encrypted data, all data keys encrypted, algorithm, and more. DataMasking->>Lambda: Ciphertext from encrypted message Lambda-->>Client: Return response ``` *Encrypting operation using envelope encryption.* #### Decrypt operation with Encryption SDK (KMS) We call KMS to decrypt the encrypted data key available in the encrypted message. If successful, we run authentication *(context)* and integrity checks (*algorithm, data key length, etc*) before proceeding. Lastly, we decrypt the original encrypted data, throw away the decrypted data key for security reasons, and return the original plaintext data.
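In code, decryption is a single call on the same `DataMasking` instance. A minimal sketch, assuming the same `AWSEncryptionSDKProvider` setup used in the encrypt examples and an event whose `body` carries ciphertext produced by a previous `encrypt` call:

```
from __future__ import annotations

import os

from aws_lambda_powertools.utilities.data_masking import DataMasking
from aws_lambda_powertools.utilities.data_masking.provider.kms.aws_encryption_sdk import (
    AWSEncryptionSDKProvider,
)
from aws_lambda_powertools.utilities.typing import LambdaContext

KMS_KEY_ARN = os.getenv("KMS_KEY_ARN", "")

encryption_provider = AWSEncryptionSDKProvider(keys=[KMS_KEY_ARN])
data_masker = DataMasking(provider=encryption_provider)


def lambda_handler(event: dict, context: LambdaContext) -> dict:
    # Assumes the incoming "body" holds the ciphertext returned by a previous encrypt call
    encrypted: str = event.get("body", "")
    return data_masker.decrypt(encrypted)
```

The sequence diagram below details what happens behind that `decrypt` call.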
``` sequenceDiagram autonumber participant Client participant Lambda participant DataMasking as Data Masking participant EncryptionProvider as Encryption Provider Client->>Lambda: Invoke (event) Lambda->>DataMasking: Init Encryption Provider with master key Note over Lambda,DataMasking: AWSEncryptionSDKProvider([KMS_KEY]) Lambda->>DataMasking: decrypt(data) DataMasking->>EncryptionProvider: Decrypt encrypted data key Note over DataMasking,EncryptionProvider: KMS Decrypt API DataMasking->>DataMasking: Authentication and integrity checks DataMasking->>DataMasking: DATA_KEY.decrypt(data) DataMasking->>DataMasking: Discards decrypted data key DataMasking->>Lambda: Plaintext Lambda-->>Client: Return response ``` *Decrypting operation using envelope encryption.* #### Caching encrypt operations with Encryption SDK Without caching, every `encrypt()` operation would generate a new data key. This significantly increases latency and cost for ephemeral, short-running environments like Lambda. With caching, we balance ephemeral Lambda environment performance characteristics with [adjustable thresholds](#aws-encryption-sdk) to meet your security needs. Data key recycling We request a new data key when a cached data key exceeds any of the following security thresholds: 1. **Max age in seconds** 1. **Max number of encrypted messages** 1. **Max bytes encrypted** across all operations ``` sequenceDiagram autonumber participant Client participant Lambda participant DataMasking as Data Masking participant EncryptionProvider as Encryption Provider Client->>Lambda: Invoke (event) Lambda->>DataMasking: Init Encryption Provider with master key Note over Lambda,DataMasking: AWSEncryptionSDKProvider([KMS_KEY]) Lambda->>DataMasking: encrypt(data) DataMasking->>EncryptionProvider: Create unique data key Note over DataMasking,EncryptionProvider: KMS GenerateDataKey API DataMasking->>DataMasking: Cache new unique data key DataMasking->>DataMasking: DATA_KEY.encrypt(data) DataMasking->>DataMasking: MASTER_KEY.encrypt(DATA_KEY) DataMasking->>DataMasking: Create encrypted message Note over DataMasking: Encrypted message includes encrypted data, data key encrypted, algorithm, and more. DataMasking->>Lambda: Ciphertext from encrypted message Lambda->>DataMasking: encrypt(another_data) DataMasking->>DataMasking: Searches for data key in cache alt Is Data key in cache? DataMasking->>DataMasking: Reuses data key else Is Data key evicted from cache?
DataMasking->>EncryptionProvider: Create unique data key DataMasking->>DataMasking: MASTER_KEY.encrypt(DATA_KEY) end DataMasking->>DataMasking: DATA_KEY.encrypt(data) DataMasking->>DataMasking: Create encrypted message DataMasking->>Lambda: Ciphertext from encrypted message Lambda-->>Client: Return response ``` *Caching data keys during encrypt operation.* ## Testing your code ### Testing erase operation Testing your code with a simple erase operation ``` from dataclasses import dataclass import pytest import test_lambda_mask @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:111111111:function:test" aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72" def get_remaining_time_in_millis(self) -> int: return 5 @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def test_encrypt_lambda(lambda_context): # GIVEN: A sample event for testing event = {"testkey": "testvalue"} # WHEN: Invoking the lambda_handler function with the sample event and Lambda context result = test_lambda_mask.lambda_handler(event, lambda_context) # THEN: Assert that the result matches the expected output assert result == {"testkey": "*****"} ``` ``` from __future__ import annotations from aws_lambda_powertools.utilities.data_masking import DataMasking from aws_lambda_powertools.utilities.typing import LambdaContext data_masker = DataMasking() def lambda_handler(event: dict, context: LambdaContext) -> dict: data = event erased = data_masker.erase(data, fields=["testkey"]) return erased ``` The feature flags utility provides a simple rule engine to define when one or multiple features should be enabled depending on the input. Info When using `AppConfigStore`, we currently only support AppConfig using [freeform configuration profile](https://docs.aws.amazon.com/appconfig/latest/userguide/appconfig-creating-configuration-and-profile.html#appconfig-creating-configuration-and-profile-free-form-configurations) . ## Key features - Define simple feature flags to dynamically decide when to enable a feature - Fetch one or all feature flags enabled for a given application context - Support for static feature flags to simply turn on/off a feature without rules - Support for time based feature flags - Bring your own Feature Flags Store Provider ## Terminology Feature flags are used to modify behaviour without changing the application's code. These flags can be **static** or **dynamic**. **Static flags**. Indicates something is simply `on` or `off`, for example `TRACER_ENABLED=True`. **Dynamic flags**. Indicates something can have varying states, for example enable a list of premium features for customer X not Y. Tip You can use [Parameters utility](../parameters/) for static flags while this utility can do both static and dynamic feature flags. Warning Be mindful that feature flags can increase the complexity of your application over time; use them sparingly. If you want to learn more about feature flags, their variations and trade-offs, check these articles: - [Feature Toggles (aka Feature Flags) - Pete Hodgson](https://martinfowler.com/articles/feature-toggles.html) - [AWS Lambda Feature Toggles Made Simple - Ran Isenberg](https://isenberg-ran.medium.com/aws-lambda-feature-toggles-made-simple-580b0c444233) - [Feature Flags Getting Started - CloudBees](https://www.cloudbees.com/blog/ultimate-feature-flag-guide) Note AWS AppConfig requires two API calls to fetch configuration for the first time. 
You can improve latency by consolidating your feature settings in a single [Configuration](https://docs.aws.amazon.com/appconfig/latest/userguide/appconfig-creating-configuration-and-profile.html). ## Getting started ### IAM Permissions When using the default store `AppConfigStore`, your Lambda function IAM Role must have `appconfig:GetLatestConfiguration` and `appconfig:StartConfigurationSession` IAM permissions before using this feature. ### Required resources By default, this utility provides [AWS AppConfig](https://docs.aws.amazon.com/appconfig/latest/userguide/what-is-appconfig.html) as a configuration store. The following sample infrastructure will be used throughout this documentation: ``` AWSTemplateFormatVersion: "2010-09-09" Description: Lambda Powertools for Python Feature flags sample template Resources: FeatureStoreApp: Type: AWS::AppConfig::Application Properties: Description: "AppConfig Application for feature toggles" Name: product-catalogue FeatureStoreDevEnv: Type: AWS::AppConfig::Environment Properties: ApplicationId: !Ref FeatureStoreApp Description: "Development Environment for the App Config Store" Name: dev FeatureStoreConfigProfile: Type: AWS::AppConfig::ConfigurationProfile Properties: ApplicationId: !Ref FeatureStoreApp Name: features LocationUri: "hosted" HostedConfigVersion: Type: AWS::AppConfig::HostedConfigurationVersion Properties: ApplicationId: !Ref FeatureStoreApp ConfigurationProfileId: !Ref FeatureStoreConfigProfile Description: 'A sample hosted configuration version' Content: | { "premium_features": { "default": false, "rules": { "customer tier equals premium": { "when_match": true, "conditions": [ { "action": "EQUALS", "key": "tier", "value": "premium" } ] } } }, "ten_percent_off_campaign": { "default": false } } ContentType: 'application/json' ConfigDeployment: Type: AWS::AppConfig::Deployment Properties: ApplicationId: !Ref FeatureStoreApp ConfigurationProfileId: !Ref FeatureStoreConfigProfile ConfigurationVersion: !Ref HostedConfigVersion DeploymentStrategyId: "AppConfig.AllAtOnce" EnvironmentId: !Ref FeatureStoreDevEnv ``` ``` import json import aws_cdk.aws_appconfig as appconfig from aws_cdk import core class SampleFeatureFlagStore(core.Construct): def __init__(self, scope: core.Construct, id_: str) -> None: super().__init__(scope, id_) features_config = { "premium_features": { "default": False, "rules": { "customer tier equals premium": { "when_match": True, "conditions": [{"action": "EQUALS", "key": "tier", "value": "premium"}], } }, }, "ten_percent_off_campaign": {"default": True}, } self.config_app = appconfig.CfnApplication( self, id="app", name="product-catalogue", ) self.config_env = appconfig.CfnEnvironment( self, id="env", application_id=self.config_app.ref, name="dev-env", ) self.config_profile = appconfig.CfnConfigurationProfile( self, id="profile", application_id=self.config_app.ref, location_uri="hosted", name="features", ) self.hosted_cfg_version = appconfig.CfnHostedConfigurationVersion( self, "version", application_id=self.config_app.ref, configuration_profile_id=self.config_profile.ref, content=json.dumps(features_config), content_type="application/json", ) self.app_config_deployment = appconfig.CfnDeployment( self, id="deploy", application_id=self.config_app.ref, configuration_profile_id=self.config_profile.ref, configuration_version=self.hosted_cfg_version.ref, deployment_strategy_id="AppConfig.AllAtOnce", environment_id=self.config_env.ref, ) ``` ### Evaluating a single feature flag To get started, you'd need to initialize 
`AppConfigStore` and `FeatureFlags`. Then call `FeatureFlags` `evaluate` method to fetch, validate, and evaluate your feature. The `evaluate` method supports two optional parameters: - **context**: Value to be evaluated against each rule defined for the given feature - **default**: Sentinel value to use in case we experience any issues with our store, or feature doesn't exist ``` from typing import Any from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app_config = AppConfigStore(environment="dev", application="product-catalogue", name="features") feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): """ This feature flag is enabled under the following conditions: - The request payload contains a field 'tier' with the value 'premium'. Rule condition to be evaluated: "conditions": [ { "action": "EQUALS", "key": "tier", "value": "premium" } ] """ # Get customer's tier from incoming request ctx = {"tier": event.get("tier", "standard")} # Evaluate whether customer's tier has access to premium features # based on `has_premium_features` rules has_premium_features: Any = feature_flags.evaluate(name="premium_features", context=ctx, default=False) if has_premium_features: # enable premium features ... ``` ``` { "username": "lessa", "tier": "premium", "basked_id": "random_id" } ``` ``` { "premium_features": { "default": false, "rules": { "customer tier equals premium": { "when_match": true, "conditions": [ { "action": "EQUALS", "key": "tier", "value": "premium" } ] } } }, "ten_percent_off_campaign": { "default": false } } ``` #### Static flags We have a static flag named `ten_percent_off_campaign`. Meaning, there are no conditional rules, it's either ON or OFF for all customers. In this case, we could omit the `context` parameter and simply evaluate whether we should apply the 10% discount. ``` from typing import Any from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app_config = AppConfigStore(environment="dev", application="product-catalogue", name="features") feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): """ This feature flag is enabled by default for all requests. """ apply_discount: Any = feature_flags.evaluate(name="ten_percent_off_campaign", default=False) price: Any = event.get("price") if apply_discount: # apply 10% discount to product price = price * 0.9 return {"price": price} ``` ``` { "product": "laptop", "price": 1000 } ``` ``` { "ten_percent_off_campaign": { "default": true } } ``` ### Getting all enabled features As you might have noticed, each `evaluate` call means an API call to the Store and the more features you have the more costly this becomes. You can use `get_enabled_features` method for scenarios where you need a list of all enabled features according to the input context. 
``` from __future__ import annotations from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app = APIGatewayRestResolver() app_config = AppConfigStore(environment="dev", application="product-catalogue", name="features") feature_flags = FeatureFlags(store=app_config) @app.get("/products") def list_products(): # getting fields from request # https://docs.powertools.aws.dev/lambda/python/latest/core/event_handler/api_gateway/#accessing-request-details json_body = app.current_event.json_body headers = app.current_event.headers ctx = {**headers, **json_body} # getting price from payload price: float = float(json_body.get("price")) percent_discount: int = 0 # all_features is evaluated to ["premium_features", "geo_customer_campaign", "ten_percent_off_campaign"] all_features: list[str] = feature_flags.get_enabled_features(context=ctx) if "geo_customer_campaign" in all_features: # apply 20% discounts for customers in NL percent_discount += 20 if "ten_percent_off_campaign" in all_features: # apply additional 10% for all customers percent_discount += 10 price = price * (100 - percent_discount) / 100 return {"price": price} def lambda_handler(event: dict, context: LambdaContext): return app.resolve(event, context) ``` ``` { "body": "{\"username\": \"lessa\", \"tier\": \"premium\", \"basked_id\": \"random_id\", \"price\": 1000}", "resource": "/products", "path": "/products", "httpMethod": "GET", "isBase64Encoded": false, "headers": { "CloudFront-Viewer-Country": "NL" } } ``` ``` { "premium_features": { "default": false, "rules": { "customer tier equals premium": { "when_match": true, "conditions": [ { "action": "EQUALS", "key": "tier", "value": "premium" } ] } } }, "ten_percent_off_campaign": { "default": true }, "geo_customer_campaign": { "default": false, "rules": { "customer in temporary discount geo": { "when_match": true, "conditions": [ { "action": "KEY_IN_VALUE", "key": "CloudFront-Viewer-Country", "value": [ "NL", "IE", "UK", "PL", "PT" ] } ] } } } } ``` ### Time based feature flags Feature flags can also return enabled features based on time or datetime ranges. This allows you to have features that are only enabled on certain days of the week, certain time intervals or between certain calendar dates. Use cases: - Enable maintenance mode during a weekend - Disable support/chat feature after working hours - Launch a new feature on a specific date and time You can also have features enabled only at certain times of the day for premium tier customers ``` from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app_config = AppConfigStore(environment="dev", application="product-catalogue", name="features") feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): """ This feature flag is enabled under the following conditions: - The request payload contains a field 'tier' with the value 'premium'. - If the current day is either Saturday or Sunday in America/New_York timezone. 
Rule condition to be evaluated: "conditions": [ { "action": "EQUALS", "key": "tier", "value": "premium" }, { "action": "SCHEDULE_BETWEEN_DAYS_OF_WEEK", "key": "CURRENT_DAY_OF_WEEK", "value": { "DAYS": [ "SATURDAY", "SUNDAY" ], "TIMEZONE": "America/New_York" } } ] """ # Get customer's tier from incoming request ctx = {"tier": event.get("tier", "standard")} # Checking if the weekend premium discount is enabled weekend_premium_discount = feature_flags.evaluate(name="weekend_premium_discount", default=False, context=ctx) if weekend_premium_discount: # Enable special discount on weekends for premium users: return {"message": "The weekend premium discount is enabled."} return {"message": "The weekend premium discount is not enabled."} ``` ``` { "username": "rubefons", "tier": "premium", "basked_id": "random_id" } ``` ``` { "weekend_premium_discount": { "default": false, "rules": { "customer tier equals premium and its time for a discount": { "when_match": true, "conditions": [ { "action": "EQUALS", "key": "tier", "value": "premium" }, { "action": "SCHEDULE_BETWEEN_DAYS_OF_WEEK", "key": "CURRENT_DAY_OF_WEEK", "value": { "DAYS": [ "SATURDAY", "SUNDAY" ], "TIMEZONE": "America/New_York" } } ] } } } } ``` You can also have features enabled only at certain times of the day. ``` from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app_config = AppConfigStore(environment="dev", application="product-catalogue", name="features") feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): """ This feature flag is enabled under the following conditions: - Every day between 17:00 and 19:00 in Europe/Copenhagen timezone Rule condition to be evaluated: "conditions": [ { "action": "SCHEDULE_BETWEEN_TIME_RANGE", "key": "CURRENT_TIME", "value": { "START": "17:00", "END": "19:00", "TIMEZONE": "Europe/Copenhagen" } } ] """ # Checking if the happy hour discount is enabled is_happy_hour = feature_flags.evaluate(name="happy_hour", default=False) if is_happy_hour: # Enable special discount on happy hour: return {"message": "The happy hour discount is enabled."} return {"message": "The happy hour discount is not enabled."} ``` ``` { "happy_hour": { "default": false, "rules": { "is happy hour": { "when_match": true, "conditions": [ { "action": "SCHEDULE_BETWEEN_TIME_RANGE", "key": "CURRENT_TIME", "value": { "START": "17:00", "END": "19:00", "TIMEZONE": "Europe/Copenhagen" } } ] } } } } ``` You can also have features enabled only on specific dates, for example: enable a Christmas sale discount between specific dates.
``` from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app_config = AppConfigStore(environment="dev", application="product-catalogue", name="features") feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): """ This feature flag is enabled under the following conditions: - Start date: December 25th, 2022 at 12:00:00 PM EST - End date: December 31st, 2022 at 11:59:59 PM EST - Timezone: America/New_York Rule condition to be evaluated: "conditions": [ { "action": "SCHEDULE_BETWEEN_DATETIME_RANGE", "key": "CURRENT_DATETIME", "value": { "START": "2022-12-25T12:00:00", "END": "2022-12-31T23:59:59", "TIMEZONE": "America/New_York" } } ] """ # Checking if the Christmas discount is enabled xmas_discount = feature_flags.evaluate(name="christmas_discount", default=False) if xmas_discount: # Enable special discount on Christmas: return {"message": "The Christmas discount is enabled."} return {"message": "The Christmas discount is not enabled."} ``` ``` { "christmas_discount": { "default": false, "rules": { "enable discount during christmas": { "when_match": true, "conditions": [ { "action": "SCHEDULE_BETWEEN_DATETIME_RANGE", "key": "CURRENT_DATETIME", "value": { "START": "2022-12-25T12:00:00", "END": "2022-12-31T23:59:59", "TIMEZONE": "America/New_York" } } ] } } } } ``` How should I use timezones? You can use any [IANA time zone](https://www.iana.org/time-zones) (as originally specified in [PEP 615](https://peps.python.org/pep-0615/)) as part of your rules definition. Powertools for AWS Lambda (Python) takes care of converting and calculating the correct timestamps for you. When using `SCHEDULE_BETWEEN_DATETIME_RANGE`, use timestamps without timezone information, and specify the timezone manually. This way, you'll avoid problems with daylight saving time. ### Modulo Range Segmented Experimentation Feature flags can also be used to run experiments on a segment of users based on modulo range conditions on context variables. This allows you to have features that are only enabled for a certain segment of users, comparing across multiple variants of the same experiment. Use cases: - Enable an experiment for a percentage of users - Scale up an experiment incrementally in production - canary release - Run multiple experiments or variants simultaneously by assigning a spectrum segment to each experiment variant. The modulo range condition takes three values - `BASE`, `START` and `END`. The condition evaluates `START <= CONTEXT_VALUE % BASE <= END`. ``` from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app_config = AppConfigStore(environment="dev", application="product-catalogue", name="features") feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): """ This feature flag is enabled under the following conditions: - The request payload contains a field 'tier' with the value 'standard'. - If the user_id belongs to the spectrum 0-19 modulo 100 (20% of users), on whom we want to run the sale experiment.
Rule condition to be evaluated: "conditions": [ { "action": "EQUALS", "key": "tier", "value": "standard" }, { "action": "MODULO_RANGE", "key": "user_id", "value": { "BASE": 100, "START": 0, "END": 19 } } ] """ # Get customer's tier and identifier from incoming request ctx = {"tier": event.get("tier", "standard"), "user_id": event.get("user_id", 0)} # Checking if the sale_experiment is enabled sale_experiment = feature_flags.evaluate(name="sale_experiment", default=False, context=ctx) if sale_experiment: # Enable special discount for sale experiment segment users: return {"message": "The sale experiment is enabled."} return {"message": "The sale experiment is not enabled."} ``` ``` { "user_id": 134532511, "tier": "standard", "basked_id": "random_id" } ``` ``` { "sale_experiment": { "default": false, "rules": { "experiment 1 segment - 20% users": { "when_match": true, "conditions": [ { "action": "EQUALS", "key": "tier", "value": "standard" }, { "action": "MODULO_RANGE", "key": "user_id", "value": { "BASE": 100, "START": 0, "END": 19 } } ] } } } } ``` You can run multiple experiments on your users with the spectrum of your choice. ``` from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app_config = AppConfigStore(environment="dev", application="product-catalogue", name="features") feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): """ This non-boolean feature flag returns the percentage discount depending on the sale experiment segment: - 10% standard discount if the user_id belongs to the spectrum 0-3 modulo 10 (40% of users). - 15% experiment discount if the user_id belongs to the spectrum 4-6 modulo 10 (30% of users). - 18% experiment discount if the user_id belongs to the spectrum 7-9 modulo 10 (30% of users). Rule conditions to be evaluated: "rules": { "control group - standard 10% discount segment": { "when_match": 10, "conditions": [ { "action": "MODULO_RANGE", "key": "user_id", "value": { "BASE": 10, "START": 0, "END": 3 } } ] }, "test experiment 1 - 15% discount segment": { "when_match": 15, "conditions": [ { "action": "MODULO_RANGE", "key": "user_id", "value": { "BASE": 10, "START": 4, "END": 6 } } ] }, "test experiment 2 - 18% discount segment": { "when_match": 18, "conditions": [ { "action": "MODULO_RANGE", "key": "user_id", "value": { "BASE": 10, "START": 7, "END": 9 } } ] } } """ # Get customer's tier and identifier from incoming request ctx = {"tier": event.get("tier", "standard"), "user_id": event.get("user_id", 0)} # Get sale discount percentage from feature flag. sale_experiment_discount = feature_flags.evaluate(name="sale_experiment_discount", default=0, context=ctx) return {"message": f" {sale_experiment_discount}% discount applied."} ``` ``` { "sale_experiment_discount": { "boolean_type": false, "default": 0, "rules": { "control group - standard 10% discount segment": { "when_match": 10, "conditions": [ { "action": "MODULO_RANGE", "key": "user_id", "value": { "BASE": 10, "START": 0, "END": 3 } } ] }, "test experiment 1 - 15% discount segment": { "when_match": 15, "conditions": [ { "action": "MODULO_RANGE", "key": "user_id", "value": { "BASE": 10, "START": 4, "END": 6 } } ] }, "test experiment 2 - 18% discount segment": { "when_match": 18, "conditions": [ { "action": "MODULO_RANGE", "key": "user_id", "value": { "BASE": 10, "START": 7, "END": 9 } } ] } } } } ``` ### Beyond boolean feature flags When is this useful?
You might have a list of features to unlock for premium customers, unlock a specific set of features for admin users, etc. Feature flags can return any JSON values when `boolean_type` parameter is set to `false`. These can be dictionaries, list, string, integers, etc. ``` from typing import Any from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app_config = AppConfigStore(environment="dev", application="comments", name="config") feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): # Get customer's tier from incoming request ctx = {"tier": event.get("tier", "standard")} # Evaluate `has_premium_features` based on customer's tier premium_features: Any = feature_flags.evaluate(name="premium_features", context=ctx, default=[]) return {"Premium features enabled": premium_features} ``` ``` { "username": "lessa", "tier": "premium", "basked_id": "random_id" } ``` ``` { "premium_features": { "boolean_type": false, "default": [], "rules": { "customer tier equals premium": { "when_match": [ "no_ads", "no_limits", "chat" ], "conditions": [ { "action": "EQUALS", "key": "tier", "value": "premium" } ] } } } } ``` ## Advanced ### Adjusting in-memory cache By default, we cache configuration retrieved from the Store for 5 seconds for performance and reliability reasons. You can override `max_age` parameter when instantiating the store. ``` from typing import Any from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app_config = AppConfigStore(environment="dev", application="product-catalogue", name="features", max_age=300) feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): """ This feature flag is enabled by default for all requests. """ apply_discount: Any = feature_flags.evaluate(name="ten_percent_off_campaign", default=False) price: Any = event.get("price") if apply_discount: # apply 10% discount to product price = price * 0.9 return {"price": price} ``` ``` { "product": "laptop", "price": 1000 } ``` ``` { "ten_percent_off_campaign": { "default": true } } ``` ### Getting fetched configuration When is this useful? You might have application configuration in addition to feature flags in your store. This means you don't need to make another call only to fetch app configuration. You can access the configuration fetched from the store via `get_raw_configuration` property within the store instance. ``` from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags app_config = AppConfigStore( environment="dev", application="product-catalogue", name="configuration", envelope="feature_flags", ) feature_flags = FeatureFlags(store=app_config) config = app_config.get_raw_configuration ... ``` ### Schema This utility expects a certain schema to be stored as JSON within AWS AppConfig. #### Features A feature can simply have its name and a `default` value. This is either on or off, also known as a [static flag](#static-flags). ``` { "global_feature": { "default": true }, "non_boolean_global_feature": { "default": {"group": "read-only"}, "boolean_type": false } } ``` If you need more control and want to provide context such as user group, permissions, location, etc., you need to add rules to your feature flag configuration. #### Rules When adding `rules` to a feature, they must contain: 1. 
A rule name as a key 1. `when_match` boolean or JSON value that should be used when conditions match 1. A list of `conditions` for evaluation ``` { "premium_feature": { "default": false, "rules": { "customer tier equals premium": { "when_match": true, "conditions": [ { "action": "EQUALS", "key": "tier", "value": "premium" } ] } } }, "non_boolean_premium_feature": { "default": [], "rules": { "customer tier equals premium": { "when_match": ["remove_limits", "remove_ads"], "conditions": [ { "action": "EQUALS", "key": "tier", "value": "premium" } ] } } } } ``` You can have multiple rules with different names. The rule engine will return the `when_match` value of the first matching rule, or the `default` value when none of the rules apply. #### Conditions The `conditions` block is a list of conditions that contain `action`, `key`, and `value` keys: ``` { "conditions": [ { "action": "EQUALS", "key": "tier", "value": "premium" } ] } ``` The `action` configuration can have the following values, where **`a`** is the `key` and **`b`** is the `value` above: | Action | Equivalent expression | | --- | --- | | **EQUALS** | `lambda a, b: a == b` | | **NOT_EQUALS** | `lambda a, b: a != b` | | **KEY_GREATER_THAN_VALUE** | `lambda a, b: a > b` | | **KEY_GREATER_THAN_OR_EQUAL_VALUE** | `lambda a, b: a >= b` | | **KEY_LESS_THAN_VALUE** | `lambda a, b: a < b` | | **KEY_LESS_THAN_OR_EQUAL_VALUE** | `lambda a, b: a <= b` | | **STARTSWITH** | `lambda a, b: a.startswith(b)` | | **ENDSWITH** | `lambda a, b: a.endswith(b)` | | **KEY_IN_VALUE** | `lambda a, b: a in b` | | **KEY_NOT_IN_VALUE** | `lambda a, b: a not in b` | | **ANY_IN_VALUE** | `lambda a, b: any of a is in b` | | **ALL_IN_VALUE** | `lambda a, b: all of a is in b` | | **NONE_IN_VALUE** | `lambda a, b: none of a is in b` | | **VALUE_IN_KEY** | `lambda a, b: b in a` | | **VALUE_NOT_IN_KEY** | `lambda a, b: b not in a` | | **SCHEDULE_BETWEEN_TIME_RANGE** | `lambda a, b: b.start <= time(a) <= b.end` | | **SCHEDULE_BETWEEN_DATETIME_RANGE** | `lambda a, b: b.start <= datetime(a) <= b.end` | | **SCHEDULE_BETWEEN_DAYS_OF_WEEK** | `lambda a, b: day_of_week(a) in b` | | **MODULO_RANGE** | `lambda a, b: b.start <= a % b.base <= b.end` | Info The `key` and `value` will be compared to the input from the `context` parameter. Time-based keys For time-based keys, we provide a list of predefined keys. These will automatically get converted to the corresponding timestamp on each invocation of your Lambda function. | Key | Meaning | | --- | --- | | CURRENT_TIME | The current time, 24 hour format (HH:mm) | | CURRENT_DATETIME | The current datetime ([ISO8601](https://en.wikipedia.org/wiki/ISO_8601)) | | CURRENT_DAY_OF_WEEK | The current day of the week (Monday-Sunday) | If not specified, the timezone used for calculations will be UTC. **For multiple conditions**, we will evaluate the list of conditions as a logical `AND`, so all conditions need to match to return the `when_match` value. #### Rule engine flowchart Now that you've seen all properties of a feature flag schema, this flowchart describes how the rule engine decides what value to return. ### Envelope There are scenarios where you might want to include feature flags as part of an existing application configuration. For this to work, you need to use a JMESPath expression via the `envelope` parameter to extract that key as the feature flags configuration.
``` from typing import Any from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext app_config = AppConfigStore( environment="dev", application="product-catalogue", name="feature_flags", envelope="features", ) feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): apply_discount: Any = feature_flags.evaluate(name="ten_percent_off_campaign", default=False) price: Any = event.get("price") if apply_discount: # apply 10% discount to product price = price * 0.9 return {"price": price} ``` ``` { "product": "laptop", "price": 1000 } ``` ``` { "logging": { "level": "INFO", "sampling_rate": 0.1 }, "features": { "ten_percent_off_campaign": { "default": true } } } ``` ### Built-in store provider #### AppConfig AppConfig store provider fetches any JSON document from AWS AppConfig. These are the available options for further customization. | Parameter | Default | Description | | --- | --- | --- | | **environment** | `""` | AWS AppConfig Environment, e.g. `dev` | | **application** | `""` | AWS AppConfig Application, e.g. `product-catalogue` | | **name** | `""` | AWS AppConfig Configuration name, e.g `features` | | **envelope** | `None` | JMESPath expression to use to extract feature flags configuration from AWS AppConfig configuration | | **max_age** | `5` | Number of seconds to cache feature flags configuration fetched from AWS AppConfig | | **jmespath_options** | `None` | For advanced use cases when you want to bring your own [JMESPath functions](https://github.com/jmespath/jmespath.py#custom-functions) | | **logger** | `logging.Logger` | Logger to use for debug. You can optionally supply an instance of Powertools for AWS Lambda (Python) Logger. 
| | **boto3_client** | `None` | [AppConfigData boto3 client](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/appconfigdata.html#AppConfigData.Client) | | **boto3_session** | `None` | [Boto3 session](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html) | | **boto_config** | `None` | [Botocore config](https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html) | ``` from typing import Any from botocore.config import Config from jmespath.functions import Functions, signature from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext boto_config = Config(read_timeout=10, retries={"total_max_attempts": 2}) # Custom JMESPath functions class CustomFunctions(Functions): @signature({"types": ["object"]}) def _func_special_decoder(self, features): # You can add some logic here return features custom_jmespath_options = {"custom_functions": CustomFunctions()} app_config = AppConfigStore( environment="dev", application="product-catalogue", name="features", max_age=120, envelope="special_decoder(features)", # using a custom function defined in CustomFunctions Class boto_config=boto_config, jmespath_options=custom_jmespath_options, ) feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): apply_discount: Any = feature_flags.evaluate(name="ten_percent_off_campaign", default=False) price: Any = event.get("price") if apply_discount: # apply 10% discount to product price = price * 0.9 return {"price": price} ``` ``` { "product": "laptop", "price": 1000 } ``` ``` { "logging": { "level": "INFO", "sampling_rate": 0.1 }, "features": { "ten_percent_off_campaign": { "default": true } } } ``` #### Customizing boto configuration The **`boto_config`** , **`boto3_session`**, and **`boto3_client`** parameters enable you to pass in a custom [botocore config object](https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html), [boto3 session](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html), or a [boto3 client](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/boto3.html) when constructing the AppConfig store provider. 
``` from typing import Any import boto3 from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext boto3_session = boto3.session.Session() app_config = AppConfigStore( environment="dev", application="product-catalogue", name="features", boto3_session=boto3_session, ) feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): apply_discount: Any = feature_flags.evaluate(name="ten_percent_off_campaign", default=False) price: Any = event.get("price") if apply_discount: # apply 10% discount to product price = price * 0.9 return {"price": price} ``` ``` from typing import Any from botocore.config import Config from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext boto_config = Config(read_timeout=10, retries={"total_max_attempts": 2}) app_config = AppConfigStore( environment="dev", application="product-catalogue", name="features", boto_config=boto_config, ) feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): apply_discount: Any = feature_flags.evaluate(name="ten_percent_off_campaign", default=False) price: Any = event.get("price") if apply_discount: # apply 10% discount to product price = price * 0.9 return {"price": price} ``` ``` from typing import Any import boto3 from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext boto3_client = boto3.client("appconfigdata") app_config = AppConfigStore( environment="dev", application="product-catalogue", name="features", boto3_client=boto3_client, ) feature_flags = FeatureFlags(store=app_config) def lambda_handler(event: dict, context: LambdaContext): apply_discount: Any = feature_flags.evaluate(name="ten_percent_off_campaign", default=False) price: Any = event.get("price") if apply_discount: # apply 10% discount to product price = price * 0.9 return {"price": price} ``` ### Create your own store provider You can create your own custom FeatureFlags store provider by inheriting the `StoreProvider` class, and implementing both `get_raw_configuration()` and `get_configuration()` methods to retrieve the configuration from your custom store. - **`get_raw_configuration()`** – get the raw configuration from the store provider and return the parsed JSON dictionary - **`get_configuration()`** – get the configuration from the store provider, parsing it as a JSON dictionary. If an envelope is set, extract the envelope data Here is an example of implementing a custom store provider using Amazon S3, a popular object storage service. Note This is just one example of how you can create your own store provider. Before creating a custom store provider, carefully evaluate your requirements and consider factors such as performance, scalability, and ease of maintenance.
``` from typing import Any from custom_s3_store_provider import S3StoreProvider from aws_lambda_powertools.utilities.feature_flags import FeatureFlags from aws_lambda_powertools.utilities.typing import LambdaContext s3_config_store = S3StoreProvider("your-bucket-name", "working_with_own_s3_store_provider_features.json") feature_flags = FeatureFlags(store=s3_config_store) def lambda_handler(event: dict, context: LambdaContext): apply_discount: Any = feature_flags.evaluate(name="ten_percent_off_campaign", default=False) price: Any = event.get("price") if apply_discount: # apply 10% discount to product price = price * 0.9 return {"price": price} ``` ``` import json from typing import Any, Dict import boto3 from botocore.exceptions import ClientError from aws_lambda_powertools.utilities.feature_flags.base import StoreProvider from aws_lambda_powertools.utilities.feature_flags.exceptions import ( ConfigurationStoreError, ) class S3StoreProvider(StoreProvider): def __init__(self, bucket_name: str, object_key: str): # Initialize the client to your custom store provider super().__init__() self.bucket_name = bucket_name self.object_key = object_key self.client = boto3.client("s3") def _get_s3_object(self) -> Dict[str, Any]: # Retrieve the object content try: response = self.client.get_object(Bucket=self.bucket_name, Key=self.object_key) return json.loads(response["Body"].read().decode()) except ClientError as exc: raise ConfigurationStoreError("Unable to get S3 Store Provider configuration file") from exc def get_configuration(self) -> Dict[str, Any]: return self._get_s3_object() @property def get_raw_configuration(self) -> Dict[str, Any]: return self._get_s3_object() ``` ``` { "product": "laptop", "price": 1000 } ``` ``` { "ten_percent_off_campaign": { "default": true } } ``` ## Testing your code You can unit test your feature flags locally and independently without setting up AWS AppConfig. `AppConfigStore` only fetches a JSON document with a specific schema. This allows you to mock the response and use it to verify the rule evaluation. Warning This excerpt relies on `pytest` and `pytest-mock` dependencies. 
``` from aws_lambda_powertools.utilities.feature_flags import ( AppConfigStore, FeatureFlags, RuleAction, ) def init_feature_flags(mocker, mock_schema, envelope="") -> FeatureFlags: """Mock AppConfig Store get_configuration method to use mock schema instead""" method_to_mock = "aws_lambda_powertools.utilities.feature_flags.AppConfigStore.get_configuration" mocked_get_conf = mocker.patch(method_to_mock) mocked_get_conf.return_value = mock_schema app_conf_store = AppConfigStore( environment="test_env", application="test_app", name="test_conf_name", envelope=envelope, ) return FeatureFlags(store=app_conf_store) def test_flags_condition_match(mocker): # GIVEN expected_value = True mocked_app_config_schema = { "my_feature": { "default": False, "rules": { "tenant id equals 12345": { "when_match": expected_value, "conditions": [ { "action": RuleAction.EQUALS.value, "key": "tenant_id", "value": "12345", }, ], }, }, }, } # WHEN ctx = {"tenant_id": "12345", "username": "a"} feature_flags = init_feature_flags(mocker=mocker, mock_schema=mocked_app_config_schema) flag = feature_flags.evaluate(name="my_feature", context=ctx, default=False) # THEN assert flag == expected_value ``` ## Feature flags vs Parameters vs Env vars | Method | When to use | Requires new deployment on changes | Supported services | | --- | --- | --- | --- | | **[Environment variables](https://docs.aws.amazon.com/lambda/latest/dg/configuration-envvars.html)** | Simple configuration that will rarely if ever change, because changing it requires a Lambda function deployment. | Yes | Lambda | | **[Parameters utility](../parameters/)** | Access to secrets, or fetch parameters in different formats from AWS System Manager Parameter Store or Amazon DynamoDB. | No | Parameter Store, DynamoDB, Secrets Manager, AppConfig | | **Feature flags utility** | Rule engine to define when one or multiple features should be enabled depending on the input. | No | AppConfig | The idempotency utility allows you to retry operations within a time window with the same input, producing the same output. ## Key features - Produces the previous successful result when a function is called repeatedly with the same idempotency key - Choose your idempotency key from one or more fields, or entire payload - Safeguard concurrent requests, timeouts, missing idempotency keys, and payload tampering - Support for Amazon DynamoDB, Valkey, Redis OSS, or any Redis-compatible cache as the persistence layer ## Terminology The property of idempotency means that an operation does not cause additional side effects if it is called more than once with the same input parameters. **Idempotency key** By default, this is a combination of **(a)** Lambda function name, **(b)** fully qualified name of your function, and **(c)** a hash of the entire payload or part(s) of the payload you specify. However, you can customize the key generation by using **(a)** a [custom prefix name](#customizing-the-idempotency-key-generation), while still incorporating **(c)** a hash of the entire payload or part(s) of the payload you specify. **Idempotent request** is an operation with the same input previously processed that is not expired in your persistent storage or in-memory cache. **Persistence layer** is a storage we use to create, read, expire, and delete idempotency records. **Idempotency record** is the data representation of an idempotent request saved in the persistent layer and in its various status. 
We use it to coordinate whether **(a)** a request is idempotent, **(b)** it's not expired, **(c)** JSON response to return, and more. ``` classDiagram direction LR class IdempotencyRecord { idempotency_key str status Status expiry_timestamp int in_progress_expiry_timestamp int response_data str~JSON~ payload_hash str } class Status { <<Enumeration>> INPROGRESS COMPLETE EXPIRED internal_only } IdempotencyRecord -- Status ``` *Idempotency record representation* ## Getting started We use Amazon DynamoDB as the default persistence layer in the documentation. If you prefer Redis, you can learn more from [this section](#redis-database). ### IAM Permissions When using Amazon DynamoDB as the persistence layer, you will need the following IAM permissions: | IAM Permission | Operation | | --- | --- | | **`dynamodb:GetItem`** | Retrieve idempotent record *(strong consistency)* | | **`dynamodb:PutItem`** | New idempotent records, replace expired idempotent records | | **`dynamodb:UpdateItem`** | Complete idempotency transaction, and/or update idempotent records state | | **`dynamodb:DeleteItem`** | Delete idempotent records for unsuccessful idempotency transactions | **First time setting it up?** We provide Infrastructure as Code examples with [AWS Serverless Application Model (SAM)](#aws-serverless-application-model-sam-example), [AWS Cloud Development Kit (CDK)](#aws-cloud-development-kit-cdk), and [Terraform](#terraform) with the required permissions. ### Required resources To start, you'll need: - **Persistent storage** ______________________________________________________________________ [Amazon DynamoDB](#dynamodb-table) or [Valkey/Redis OSS/Redis compatible](#cache-database) - **AWS Lambda function** ______________________________________________________________________ With permissions to use your persistent storage Primary key for any persistence storage We combine the Lambda function name and the [fully qualified name](https://peps.python.org/pep-3155/) for classes/functions to prevent accidental reuse for similar code sharing input/output. Primary key sample: `{lambda_fn_name}.{module_name}.{fn_qualified_name}#{idempotency_key_hash}` #### DynamoDB table Unless you're looking to use an [existing table or customize each attribute](#dynamodbpersistencelayer), you only need the following: | Configuration | Value | Notes | | --- | --- | --- | | Partition key | `id` | | | TTL attribute name | `expiration` | Using AWS Console? This is configurable after table creation | You **can** use a single DynamoDB table for all functions annotated with Idempotency.
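If you are reusing an existing table whose attribute names differ from the defaults above, you can override them when instantiating the persistence layer. A minimal sketch, assuming `DynamoDBPersistenceLayer` in your installed version exposes these attribute-name parameters (verify against its current signature):

```
import os

from aws_lambda_powertools.utilities.idempotency import DynamoDBPersistenceLayer

# Assumed environment variable holding your table name
table = os.getenv("IDEMPOTENCY_TABLE", "")

persistence_layer = DynamoDBPersistenceLayer(
    table_name=table,
    key_attr="id",             # partition key attribute
    expiry_attr="expiration",  # TTL attribute
    status_attr="status",      # idempotency record status
    data_attr="data",          # stored JSON response
)
```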
##### DynamoDB IaC examples ``` Transform: AWS::Serverless-2016-10-31 Resources: IdempotencyTable: Type: AWS::DynamoDB::Table Properties: AttributeDefinitions: - AttributeName: id AttributeType: S KeySchema: - AttributeName: id KeyType: HASH TimeToLiveSpecification: AttributeName: expiration Enabled: true BillingMode: PAY_PER_REQUEST HelloWorldFunction: Type: AWS::Serverless::Function Properties: Runtime: python3.12 Handler: app.py Policies: - Statement: - Sid: AllowDynamodbReadWrite Effect: Allow Action: - dynamodb:PutItem - dynamodb:GetItem - dynamodb:UpdateItem - dynamodb:DeleteItem Resource: !GetAtt IdempotencyTable.Arn Environment: Variables: IDEMPOTENCY_TABLE: !Ref IdempotencyTable ``` ``` from aws_cdk import RemovalPolicy from aws_cdk import aws_dynamodb as dynamodb from aws_cdk import aws_iam as iam from constructs import Construct class IdempotencyConstruct(Construct): def __init__(self, scope: Construct, name: str, lambda_role: iam.Role) -> None: super().__init__(scope, name) self.idempotency_table = dynamodb.Table( self, "IdempotencyTable", partition_key=dynamodb.Attribute(name="id", type=dynamodb.AttributeType.STRING), billing_mode=dynamodb.BillingMode.PAY_PER_REQUEST, removal_policy=RemovalPolicy.DESTROY, time_to_live_attribute="expiration", point_in_time_recovery=True, ) self.idempotency_table.grant( lambda_role, "dynamodb:PutItem", "dynamodb:GetItem", "dynamodb:UpdateItem", "dynamodb:DeleteItem", ) ``` ``` terraform { required_providers { aws = { source = "hashicorp/aws" version = "~> 4.0" } } } provider "aws" { region = "us-east-1" # Replace with your desired AWS region } resource "aws_dynamodb_table" "IdempotencyTable" { name = "IdempotencyTable" billing_mode = "PAY_PER_REQUEST" hash_key = "id" attribute { name = "id" type = "S" } ttl { attribute_name = "expiration" enabled = true } } resource "aws_lambda_function" "IdempotencyFunction" { function_name = "IdempotencyFunction" role = aws_iam_role.IdempotencyFunctionRole.arn runtime = "python3.12" handler = "app.lambda_handler" filename = "lambda.zip" } resource "aws_iam_role" "IdempotencyFunctionRole" { name = "IdempotencyFunctionRole" assume_role_policy = jsonencode({ Version = "2012-10-17" Statement = [ { Sid = "" Effect = "Allow" Principal = { Service = "lambda.amazonaws.com" } Action = "sts:AssumeRole" }, ] }) } resource "aws_iam_policy" "LambdaDynamoDBPolicy" { name = "LambdaDynamoDBPolicy" description = "IAM policy for Lambda function to access DynamoDB" policy = jsonencode({ Version = "2012-10-17" Statement = [ { Sid = "AllowDynamodbReadWrite" Effect = "Allow" Action = [ "dynamodb:PutItem", "dynamodb:GetItem", "dynamodb:UpdateItem", "dynamodb:DeleteItem", ] Resource = aws_dynamodb_table.IdempotencyTable.arn }, ] }) } resource "aws_iam_role_policy_attachment" "IdempotencyFunctionRoleAttachment" { role = aws_iam_role.IdempotencyFunctionRole.name policy_arn = aws_iam_policy.LambdaDynamoDBPolicy.arn } ``` ##### Limitations - **DynamoDB restricts [item sizes to 400KB](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Limits.html#limits-items)**. This means that your annotated function's response must be smaller than 400KB, otherwise your function will fail. Consider [Redis](#redis-database) as an alternative. - **Expect 2 WCU per non-idempotent call**. During the first invocation, we use `PutItem` for locking and `UpdateItem` for completion. Consider reviewing [DynamoDB pricing documentation](https://aws.amazon.com/dynamodb/pricing/) to estimate cost. - **Old boto3 versions can increase costs**.
For cost optimization, we use a conditional `PutItem` to always lock a new idempotency record. If locking fails, it means we already have an idempotency record saving us an additional `GetItem` call. However, this is only supported in boto3 `1.26.194` and higher *([June 30th 2023](https://aws.amazon.com/about-aws/whats-new/2023/06/amazon-dynamodb-cost-failed-conditional-writes/))*. #### Cache database We recommend starting with a managed cache service, such as [Amazon ElastiCache for Valkey and for Redis OSS](https://aws.amazon.com/elasticache/redis/) or [Amazon MemoryDB](https://aws.amazon.com/memorydb/). In both services, you'll need to configure [VPC access](https://docs.aws.amazon.com/lambda/latest/dg/configuration-vpc.html) to your AWS Lambda. ##### Cache configuration Prefer AWS Console/CLI? Follow the official tutorials for [Amazon ElastiCache for Redis](https://docs.aws.amazon.com/AmazonElastiCache/latest/red-ug/LambdaRedis.html) or [Amazon MemoryDB for Redis](https://aws.amazon.com/blogs/database/access-amazon-memorydb-for-redis-from-aws-lambda/) ``` AWSTemplateFormatVersion: "2010-09-09" Transform: AWS::Serverless-2016-10-31 Resources: CacheServerlessIdempotency: Type: AWS::ElastiCache::ServerlessCache Properties: Engine: redis ServerlessCacheName: redis-cache SecurityGroupIds: # (1)! - sg-07d998809154f9d88 SubnetIds: - subnet-{your_subnet_id_1} - subnet-{your_subnet_id_2} IdempotencyFunction: Type: AWS::Serverless::Function Properties: Runtime: python3.13 Handler: app.py VpcConfig: # (1)! SecurityGroupIds: - sg-07d998809154f9d88 SubnetIds: - subnet-{your_subnet_id_1} - subnet-{your_subnet_id_2} Environment: Variables: POWERTOOLS_SERVICE_NAME: sample CACHE_HOST: !GetAtt CacheServerlessIdempotency.Endpoint.Address CACHE_PORT: !GetAtt CacheServerlessIdempotency.Endpoint.Port ``` 1. Replace the Security Group ID and Subnet ID to match your VPC settings. 1. Replace the Security Group ID and Subnet ID to match your VPC settings. Once setup, you can find a quick start and advanced examples for Cache in [the persistent layers section](#cachepersistencelayer). ### Idempotent decorator For simple use cases, you can use the `idempotent` decorator on your Lambda handler function. It will treat the entire event as an idempotency key. That is, the same event will return the previously stored result within a [configurable time window](#adjusting-expiration-window) *(1 hour, by default)*. You can also choose [one or more fields](#choosing-a-payload-subset) as an idempotency key. ``` import os from dataclasses import dataclass, field from uuid import uuid4 from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table) @dataclass class Payment: user_id: str product_id: str payment_id: str = field(default_factory=lambda: f"{uuid4()}") class PaymentError(Exception): ... 
@idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): try: payment: Payment = create_subscription_payment(event) return { "payment_id": payment.payment_id, "message": "success", "statusCode": 200, } except Exception as exc: raise PaymentError(f"Error creating payment {str(exc)}") def create_subscription_payment(event: dict) -> Payment: return Payment(**event) ``` ``` { "user_id": "xyz", "product_id": "123456789" } ``` ### Idempotent_function decorator For full flexibility, you can use the `idempotent_function` decorator for any synchronous Python function. When using this decorator, you **must** call your decorated function using keyword arguments. You can use `data_keyword_argument` to tell us the argument to extract an idempotency key. We support JSON serializable data, [Dataclasses](https://docs.python.org/3.12/library/dataclasses.html), Pydantic Models, and [Event Source Data Classes](../data_classes/) ``` import os from dataclasses import dataclass from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") dynamodb = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig(event_key_jmespath="order_id") # see Choosing a payload subset section @dataclass class OrderItem: sku: str description: str @dataclass class Order: item: OrderItem order_id: int @idempotent_function(data_keyword_argument="order", config=config, persistence_store=dynamodb) def process_order(order: Order): # (1)! return f"processed order {order.order_id}" def lambda_handler(event: dict, context: LambdaContext): # see Lambda timeouts section config.register_lambda_context(context) # (2)! order_item = OrderItem(sku="fake", description="sample") order = Order(item=order_item, order_id=1) # `order` parameter must be called as a keyword argument to work process_order(order=order) ``` 1. Notice how **`data_keyword_argument`** matches the name of the parameter. This allows us to extract one or all fields as idempotency key. 1. Different from `idempotent` decorator, we must explicitly register the Lambda context to [protect against timeouts](#lambda-timeouts). ``` import os from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.parser import BaseModel from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") dynamodb = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig(event_key_jmespath="order_id") # see Choosing a payload subset section class OrderItem(BaseModel): sku: str description: str class Order(BaseModel): item: OrderItem order_id: int @idempotent_function(data_keyword_argument="order", config=config, persistence_store=dynamodb) def process_order(order: Order): return f"processed order {order.order_id}" def lambda_handler(event: dict, context: LambdaContext): config.register_lambda_context(context) # see Lambda timeouts section order_item = OrderItem(sku="fake", description="sample") order = Order(item=order_item, order_id=1) # `order` parameter must be called as a keyword argument to work process_order(order=order) ``` #### Output serialization By default, `idempotent_function` serializes, stores, and returns your annotated function's result as a JSON object. 
You can change this behavior using `output_serializer` parameter. The output serializer supports any JSON serializable data, **Python Dataclasses** and **Pydantic Models**. Info When using the `output_serializer` parameter, the data will continue to be stored in your persistent storage as a JSON string. Function returns must be annotated with a single type, optionally wrapped in `Optional` or `Union` with `None`. Use `PydanticSerializer` to automatically serialize what's retrieved from the persistent storage based on the return type annotated. ``` import os from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.idempotency.serialization.pydantic import PydanticSerializer from aws_lambda_powertools.utilities.parser import BaseModel from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") dynamodb = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig(event_key_jmespath="order_id") # see Choosing a payload subset section class OrderItem(BaseModel): sku: str description: str class Order(BaseModel): item: OrderItem order_id: int class OrderOutput(BaseModel): order_id: int @idempotent_function( data_keyword_argument="order", config=config, persistence_store=dynamodb, output_serializer=PydanticSerializer, ) # order output is inferred from return type def process_order(order: Order) -> OrderOutput: # (1)! return OrderOutput(order_id=order.order_id) def lambda_handler(event: dict, context: LambdaContext): config.register_lambda_context(context) # see Lambda timeouts section order_item = OrderItem(sku="fake", description="sample") order = Order(item=order_item, order_id=1) # `order` parameter must be called as a keyword argument to work process_order(order=order) ``` 1. We'll use `OrderOutput` to instantiate a new object using the data retrieved from persistent storage as input. This ensures the return of the function is not impacted when `@idempotent_function` is used. Alternatively, you can provide an explicit model as an input to `PydanticSerializer`. 
``` import os from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.idempotency.serialization.pydantic import PydanticSerializer from aws_lambda_powertools.utilities.parser import BaseModel from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") dynamodb = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig(event_key_jmespath="order_id") # see Choosing a payload subset section class OrderItem(BaseModel): sku: str description: str class Order(BaseModel): item: OrderItem order_id: int class OrderOutput(BaseModel): order_id: int @idempotent_function( data_keyword_argument="order", config=config, persistence_store=dynamodb, output_serializer=PydanticSerializer(model=OrderOutput), ) def process_order(order: Order): return OrderOutput(order_id=order.order_id) def lambda_handler(event: dict, context: LambdaContext): config.register_lambda_context(context) # see Lambda timeouts section order_item = OrderItem(sku="fake", description="sample") order = Order(item=order_item, order_id=1) # `order` parameter must be called as a keyword argument to work process_order(order=order) ``` Use `DataclassSerializer` to automatically serialize what's retrieved from the persistent storage based on the return type annotated. ``` import os from dataclasses import dataclass from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.idempotency.serialization.dataclass import DataclassSerializer from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") dynamodb = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig(event_key_jmespath="order_id") # see Choosing a payload subset section @dataclass class OrderItem: sku: str description: str @dataclass class Order: item: OrderItem order_id: int @dataclass class OrderOutput: order_id: int @idempotent_function( data_keyword_argument="order", config=config, persistence_store=dynamodb, output_serializer=DataclassSerializer, ) # order output is inferred from return type def process_order(order: Order) -> OrderOutput: # (1)! return OrderOutput(order_id=order.order_id) def lambda_handler(event: dict, context: LambdaContext): config.register_lambda_context(context) # see Lambda timeouts section order_item = OrderItem(sku="fake", description="sample") order = Order(item=order_item, order_id=1) # `order` parameter must be called as a keyword argument to work process_order(order=order) ``` 1. We'll use `OrderOutput` to instantiate a new object using the data retrieved from persistent storage as input. This ensures the return of the function is not impacted when `@idempotent_function` is used. Alternatively, you can provide an explicit model as an input to `DataclassSerializer`. 
``` import os from dataclasses import dataclass from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.idempotency.serialization.dataclass import DataclassSerializer from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") dynamodb = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig(event_key_jmespath="order_id") # see Choosing a payload subset section @dataclass class OrderItem: sku: str description: str @dataclass class Order: item: OrderItem order_id: int @dataclass class OrderOutput: order_id: int @idempotent_function( data_keyword_argument="order", config=config, persistence_store=dynamodb, output_serializer=DataclassSerializer(model=OrderOutput), ) def process_order(order: Order): return OrderOutput(order_id=order.order_id) def lambda_handler(event: dict, context: LambdaContext): config.register_lambda_context(context) # see Lambda timeouts section order_item = OrderItem(sku="fake", description="sample") order = Order(item=order_item, order_id=1) # `order` parameter must be called as a keyword argument to work process_order(order=order) ``` Use `CustomDictSerializer` to have full control over the serialization process for any type. It expects two functions: - **to_dict**. Function to convert any type to a JSON serializable dictionary before it saves into the persistent storage. - **from_dict**. Function to convert from a dictionary retrieved from persistent storage and serialize in its original form. ``` import os from typing import Dict, Type from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.idempotency.serialization.custom_dict import CustomDictSerializer from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") dynamodb = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig(event_key_jmespath="order_id") # see Choosing a payload subset section class OrderItem: def __init__(self, sku: str, description: str): self.sku = sku self.description = description class Order: def __init__(self, item: OrderItem, order_id: int): self.item = item self.order_id = order_id class OrderOutput: def __init__(self, order_id: int): self.order_id = order_id def order_to_dict(x: Type[OrderOutput]) -> Dict: # (1)! return dict(x.__dict__) def dict_to_order(x: Dict) -> OrderOutput: # (2)! return OrderOutput(**x) order_output_serializer = CustomDictSerializer( # (3)! to_dict=order_to_dict, from_dict=dict_to_order, ) @idempotent_function( data_keyword_argument="order", config=config, persistence_store=dynamodb, output_serializer=order_output_serializer, ) def process_order(order: Order) -> OrderOutput: return OrderOutput(order_id=order.order_id) def lambda_handler(event: dict, context: LambdaContext): config.register_lambda_context(context) # see Lambda timeouts section order_item = OrderItem(sku="fake", description="sample") order = Order(item=order_item, order_id=1) # `order` parameter must be called as a keyword argument to work process_order(order=order) ``` 1. This function does the following **1**. Receives the return from `process_order`\ **2**. Converts to dictionary before it can be saved into the persistent storage. 1. This function does the following **1**. 
Receives the dictionary saved into the persistent storage\ **2**. Serializes to `OrderOutput` before `@idempotent` returns to the caller. 1. This serializer receives both functions so it knows which one to call when serializing to and from a dictionary. ### Using in-memory cache In-memory cache is local to each Lambda execution environment. You can enable caching with the `use_local_cache` parameter in `IdempotencyConfig`. When enabled, you can adjust the cache capacity *(default: 256 items)* with `local_cache_max_items`. By default, caching is disabled since we don't know how big your response could be in relation to your configured memory size. ``` import os from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig( event_key_jmespath="powertools_json(body)", # by default, it holds 256 items in a Least-Recently-Used (LRU) manner use_local_cache=True, # (1)! ) @idempotent(config=config, persistence_store=persistence_layer) def lambda_handler(event, context: LambdaContext): return event ``` 1. You can adjust cache capacity with [`local_cache_max_items`](#customizing-the-default-behavior) parameter. ``` { "body": "{\"user_id\":\"xyz\",\"product_id\":\"123456789\"}" } ``` ### Choosing a payload subset Tip: Dealing with always changing payloads When dealing with a more elaborate payload, where parts of the payload always change, you should use the **`event_key_jmespath`** parameter. Use the **`event_key_jmespath`** parameter in [`IdempotencyConfig`](#customizing-the-default-behavior) to select one or more payload parts as your idempotency key. > **Example scenario** In this example, we have a Lambda handler that creates a payment for a user subscribing to a product. We want to ensure that we don't accidentally charge our customer by subscribing them more than once. Imagine the function runs successfully, but the client never receives the response due to a connection issue. It is safe to immediately retry in this instance, as the idempotent decorator will return a previously saved response. We want to use `user_id` and `product_id` fields as our idempotency key. **If we were** to treat the entire request as our idempotency key, a simple HTTP header change would cause our function to run again. Deserializing JSON strings in payloads for increased accuracy. The payload extracted by the `event_key_jmespath` is treated as a string by default. This means there could be differences in whitespace even when the JSON payload itself is identical. To alter this behaviour, we can use the [JMESPath built-in function](../jmespath_functions/#powertools_json-function) `powertools_json()` to treat the payload as a JSON object (dict) rather than a string.
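For intuition, the expression `powertools_json(body).["user_id", "product_id"]` used in the example below is roughly equivalent to the following plain Python. This is a sketch for illustration only; the utility evaluates the expression with JMESPath, not with this code.

```
import json

# Hypothetical API Gateway-style event carrying a JSON string body
event = {"body": "{\"user_id\":\"xyz\",\"product_id\":\"123456789\"}"}

# powertools_json(body) -> parse the JSON string into a dict
body = json.loads(event["body"])

# .["user_id", "product_id"] -> multi-select list of the two fields
idempotency_key_source = [body["user_id"], body["product_id"]]

print(idempotency_key_source)  # ['xyz', '123456789']
```

Whitespace or header changes elsewhere in the event no longer affect the resulting idempotency key.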
``` import json import os from dataclasses import dataclass, field from uuid import uuid4 from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table) # Deserialize JSON string under the "body" key # then extract "user" and "product_id" data config = IdempotencyConfig(event_key_jmespath='powertools_json(body).["user_id", "product_id"]') @dataclass class Payment: user_id: str product_id: str payment_id: str = field(default_factory=lambda: f"{uuid4()}") class PaymentError(Exception): ... @idempotent(config=config, persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): try: payment_info: str = event.get("body", "") payment: Payment = create_subscription_payment(json.loads(payment_info)) return { "payment_id": payment.payment_id, "message": "success", "statusCode": 200, } except Exception as exc: raise PaymentError(f"Error creating payment {str(exc)}") def create_subscription_payment(event: dict) -> Payment: return Payment(**event) ``` ``` { "version": "2.0", "routeKey": "ANY /createpayment", "rawPath": "/createpayment", "rawQueryString": "", "headers": { "Header1": "value1", "Header2": "value2" }, "requestContext": { "accountId": "123456789012", "apiId": "api-id", "domainName": "id.execute-api.us-east-1.amazonaws.com", "domainPrefix": "id", "http": { "method": "POST", "path": "/createpayment", "protocol": "HTTP/1.1", "sourceIp": "ip", "userAgent": "agent" }, "requestId": "id", "routeKey": "ANY /createpayment", "stage": "$default", "time": "10/Feb/2021:13:40:43 +0000", "timeEpoch": 1612964443723 }, "body": "{\"user_id\":\"xyz\",\"product_id\":\"123456789\"}", "isBase64Encoded": false } ``` ### Adjusting expiration window By default, we expire idempotency records after **an hour** (3600 seconds). After that, a transaction with the same payload [will not be considered idempotent](#expired-idempotency-records). You can change this expiration window with the **`expires_after_seconds`** parameter. There is no limit on how long this expiration window can be set to. ``` import os from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig( event_key_jmespath="body", expires_after_seconds=24 * 60 * 60, # 24 hours ) @idempotent(config=config, persistence_store=persistence_layer) def lambda_handler(event, context: LambdaContext): return event ``` ``` { "body": "{\"user_id\":\"xyz\",\"product_id\":\"123456789\"}" } ``` Idempotency record expiration vs DynamoDB time-to-live (TTL) [DynamoDB TTL is a feature](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/howitworks-ttl.html) to remove items after a certain period of time, it may occur within 48 hours of expiration. We don't rely on DynamoDB or any persistence storage layer to determine whether a record is expired to avoid eventual inconsistency states. Instead, Idempotency records saved in the storage layer contain timestamps that can be verified upon retrieval and double checked within Idempotency feature. 
**Why?** A record might still be valid (`COMPLETE`) when we retrieved it, but in some rare cases it might expire a second later. A record could also be [cached in memory](#using-in-memory-cache). You might also want to have idempotent transactions that should expire in seconds. ### Customizing the Idempotency key generation Warning: Changing the idempotency key generation will invalidate existing idempotency records Use the **`key_prefix`** parameter in the `@idempotent` or `@idempotent_function` decorators to define a custom prefix for your Idempotency Key. This allows you to decouple the idempotency key name from function names. It can be useful during application refactoring, for example. ``` import os from dataclasses import dataclass, field from uuid import uuid4 from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table) @dataclass class Payment: user_id: str product_id: str payment_id: str = field(default_factory=lambda: f"{uuid4()}") class PaymentError(Exception): ... @idempotent(persistence_store=persistence_layer, key_prefix="my_custom_prefix") # (1)! def lambda_handler(event: dict, context: LambdaContext): try: payment: Payment = create_subscription_payment(event) return { "payment_id": payment.payment_id, "message": "success", "statusCode": 200, } except Exception as exc: raise PaymentError(f"Error creating payment {str(exc)}") def create_subscription_payment(event: dict) -> Payment: return Payment(**event) ``` 1. The Idempotency record will be something like `my_custom_prefix#c4ca4238a0b923820dcc509a6f75849b` ``` import os from dataclasses import dataclass from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") dynamodb = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig(event_key_jmespath="order_id") # see Choosing a payload subset section @dataclass class OrderItem: sku: str description: str @dataclass class Order: item: OrderItem order_id: int @idempotent_function( data_keyword_argument="order", config=config, persistence_store=dynamodb, key_prefix="my_custom_prefix", # (1)! ) def process_order(order: Order): return f"processed order {order.order_id}" def lambda_handler(event: dict, context: LambdaContext): # see Lambda timeouts section config.register_lambda_context(context) order_item = OrderItem(sku="fake", description="sample") order = Order(item=order_item, order_id=1) # `order` parameter must be called as a keyword argument to work process_order(order=order) ``` 1. The Idempotency record will be something like `my_custom_prefix#c4ca4238a0b923820dcc509a6f75849b` ### Lambda timeouts You can skip this section if you are using the [`@idempotent` decorator](#idempotent-decorator). By default, we protect against [concurrent executions](#handling-concurrent-executions-with-the-same-payload) with the same payload using a locking mechanism. However, if your Lambda function times out before completing the first invocation, it will only accept the same request when the [idempotency record expires](#adjusting-expiration-window).
To prevent extended failures, use **`register_lambda_context`** function from your idempotency config to calculate and include the remaining invocation time in your idempotency record. ``` import os from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig() @idempotent_function(data_keyword_argument="record", persistence_store=persistence_layer, config=config) def record_handler(record: SQSRecord): return {"message": record["body"]} def lambda_handler(event: dict, context: LambdaContext): config.register_lambda_context(context) return record_handler(event) ``` Mechanics If a second invocation happens **after** this timestamp, and the record is marked as `INPROGRESS`, we will run the invocation again as if it was in the `EXPIRED` state. This means that if an invocation expired during execution, it will be quickly executed again on the next retry. ### Handling exceptions There are two failure modes that can cause new invocations to execute your code again despite having the same payload: - **Unhandled exception**. We catch them to delete the idempotency record to prevent inconsistencies, then propagate them. - **Persistent layer errors**. We raise **`IdempotencyPersistenceLayerError`** for any persistence layer errors *e.g., remove idempotency record*. If an exception is handled or raised **outside** your decorated function, then idempotency will be maintained. ``` import os import requests from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.idempotency.exceptions import IdempotencyPersistenceLayerError from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig() @idempotent_function(data_keyword_argument="data", config=config, persistence_store=persistence_layer) def call_external_service(data: dict): # Any exception raised will lead to idempotency record to be deleted result: requests.Response = requests.post( "https://jsonplaceholder.typicode.com/comments/", json=data, ) return result.json() def lambda_handler(event: dict, context: LambdaContext): try: call_external_service(data=event) except IdempotencyPersistenceLayerError as e: # No idempotency, but you can decide to error differently. raise RuntimeError(f"Oops, can't talk to persistence layer. Permissions? error: {e}") # This exception will not impact the idempotency of 'call_external_service' # because it happens in isolation, or outside their scope. raise SyntaxError("Oops, this shouldn't be here.") ``` ### Persistence layers #### DynamoDBPersistenceLayer This persistence layer is built-in, allowing you to use an existing DynamoDB table or create a new one dedicated to idempotency state (recommended). 
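For context on the "2 WCU per non-idempotent call" and conditional-write notes in the limitations above, acquiring the lock boils down to a single conditional write against this table. The sketch below is illustrative only, mirroring the custom persistence layer example later on this page rather than the library's exact implementation, and assumes the default `id` and `expiration` attributes plus a hypothetical table name.

```
import datetime

import boto3

# Hypothetical table name; see the IaC examples earlier on this page
table = boto3.resource("dynamodb").Table("IdempotencyTable")


def acquire_lock(idempotency_key: str, expiry_timestamp: int) -> None:
    """Write an INPROGRESS record only if no non-expired record exists."""
    now = int(datetime.datetime.now().timestamp())
    table.put_item(
        Item={"id": idempotency_key, "expiration": expiry_timestamp, "status": "INPROGRESS"},
        # Fails with ConditionalCheckFailedException when a live record already exists
        ConditionExpression="attribute_not_exists(id) OR expiration < :now",
        ExpressionAttributeValues={":now": now},
    )
```

If the conditional check fails, an idempotency record already exists, which is what saves the extra `GetItem` call mentioned in the limitations.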
``` import os from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer( table_name=table, key_attr="idempotency_key", expiry_attr="expires_at", in_progress_expiry_attr="in_progress_expires_at", status_attr="current_status", data_attr="result_data", validation_key_attr="validation_key", ) @idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext) -> dict: return event ``` ##### Using a composite primary key Use `sort_key_attr` parameter when your table is configured with a composite primary key *(hash+range key)*. When enabled, we will save the idempotency key in the sort key instead. By default, the primary key will now be set to `idempotency#{LAMBDA_FUNCTION_NAME}`. You can optionally set a static value for the partition key using the `static_pk_value` parameter. ``` import os from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table, sort_key_attr="sort_key") @idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext) -> dict: user_id: str = event.get("body", "")["user_id"] return {"message": "success", "user_id": user_id} ``` ``` { "body": "{\"user_id\":\"xyz\",\"product_id\":\"123456789\"}" } ``` Click to expand and learn how table items would look like | id | sort_key | expiration | status | data | | --- | --- | --- | --- | --- | | idempotency#MyLambdaFunction | 1e956ef7da78d0cb890be999aecc0c9e | 1636549553 | COMPLETED | {"user_id": 12391, "message": "success"} | | idempotency#MyLambdaFunction | 2b2cdb5f86361e97b4383087c1ffdf27 | 1636549571 | COMPLETED | {"user_id": 527212, "message": "success"} | | idempotency#MyLambdaFunction | f091d2527ad1c78f05d54cc3f363be80 | 1636549585 | IN_PROGRESS | | ##### DynamoDB attributes You can customize the attribute names during initialization: | Parameter | Required | Default | Description | | --- | --- | --- | --- | | **table_name** | | | Table name to store state | | **key_attr** | | `id` | Partition key of the table. Hashed representation of the payload (unless **sort_key_attr** is specified) | | **expiry_attr** | | `expiration` | Unix timestamp of when record expires | | **in_progress_expiry_attr** | | `in_progress_expiration` | Unix timestamp of when record expires while in progress (in case of the invocation times out) | | **status_attr** | | `status` | Stores status of the lambda execution during and after invocation | | **data_attr** | | `data` | Stores results of successfully executed Lambda handlers | | **validation_key_attr** | | `validation` | Hashed representation of the parts of the event used for validation | | **sort_key_attr** | | | Sort key of the table (if table is configured with a sort key). | | **static_pk_value** | | `idempotency#{LAMBDA_FUNCTION_NAME}` | Static value to use as the partition key. Only used when **sort_key_attr** is set. | #### CachePersistenceLayer The `CachePersistenceLayer` enables you to use Valkey, Redis OSS, or any Redis-compatible cache as the persistence layer for idempotency state. 
We recommend using [`valkey-glide`](https://pypi.org/project/valkey-glide/) for Valkey or [`redis`](https://pypi.org/project/redis/) for Redis. However, any Redis OSS-compatible client should work. For simple setups, initialize `CachePersistenceLayer` with your Cache endpoint and port to connect. Note that for security, we enforce SSL connections by default; to disable it, set `ssl=False`. ``` import os from dataclasses import dataclass, field from uuid import uuid4 from aws_lambda_powertools.utilities.idempotency import ( idempotent, ) from aws_lambda_powertools.utilities.idempotency.persistence.cache import ( CachePersistenceLayer, ) from aws_lambda_powertools.utilities.typing import LambdaContext redis_endpoint = os.getenv("CACHE_CLUSTER_ENDPOINT", "localhost") persistence_layer = CachePersistenceLayer(host=redis_endpoint, port=6379) @dataclass class Payment: user_id: str product_id: str payment_id: str = field(default_factory=lambda: f"{uuid4()}") class PaymentError(Exception): ... @idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): try: payment: Payment = create_subscription_payment(event) return { "payment_id": payment.payment_id, "message": "success", "statusCode": 200, } except Exception as exc: raise PaymentError(f"Error creating payment {str(exc)}") from exc def create_subscription_payment(event: dict) -> Payment: return Payment(**event) ``` ``` import os from dataclasses import dataclass, field from uuid import uuid4 from glide import GlideClient, GlideClientConfiguration, NodeAddress from aws_lambda_powertools.utilities.idempotency import ( idempotent, ) from aws_lambda_powertools.utilities.idempotency.persistence.cache import ( CachePersistenceLayer, ) from aws_lambda_powertools.utilities.typing import LambdaContext cache_endpoint = os.getenv("CACHE_CLUSTER_ENDPOINT", "localhost") client_config = GlideClientConfiguration( addresses=[ NodeAddress( host="localhost", port=6379, ), ], ) client = GlideClient.create(config=client_config) persistence_layer = CachePersistenceLayer(client=client) # type: ignore[arg-type] @dataclass class Payment: user_id: str product_id: str payment_id: str = field(default_factory=lambda: f"{uuid4()}") class PaymentError(Exception): ... @idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): try: payment: Payment = create_subscription_payment(event) return { "payment_id": payment.payment_id, "message": "success", "statusCode": 200, } except Exception as exc: raise PaymentError(f"Error creating payment {str(exc)}") def create_subscription_payment(event: dict) -> Payment: return Payment(**event) ``` ``` import os from dataclasses import dataclass, field from uuid import uuid4 from redis import Redis from aws_lambda_powertools.utilities.idempotency import ( idempotent, ) from aws_lambda_powertools.utilities.idempotency.persistence.cache import ( CachePersistenceLayer, ) from aws_lambda_powertools.utilities.typing import LambdaContext cache_endpoint = os.getenv("CACHE_CLUSTER_ENDPOINT", "localhost") client = Redis( host=cache_endpoint, port=6379, socket_connect_timeout=5, socket_timeout=5, max_connections=1000, ) persistence_layer = CachePersistenceLayer(client=client) @dataclass class Payment: user_id: str product_id: str payment_id: str = field(default_factory=lambda: f"{uuid4()}") class PaymentError(Exception): ... 
@idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): try: payment: Payment = create_subscription_payment(event) return { "payment_id": payment.payment_id, "message": "success", "statusCode": 200, } except Exception as exc: raise PaymentError(f"Error creating payment {str(exc)}") def create_subscription_payment(event: dict) -> Payment: return Payment(**event) ``` ``` { "user_id": "xyz", "product_id": "123456789" } ``` ##### Cache SSL connections We recommend using AWS Secrets Manager to store and rotate certificates safely, and the [Parameters feature](../parameters/) to fetch and cache optimally. For advanced configurations, we recommend using an existing Valkey client for optimal compatibility like SSL certificates and timeout. ``` from __future__ import annotations from typing import Any from glide import BackoffStrategy, GlideClient, GlideClientConfiguration, NodeAddress, ServerCredentials from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.idempotency import IdempotencyConfig, idempotent from aws_lambda_powertools.utilities.idempotency.persistence.cache import ( CachePersistenceLayer, ) cache_values: dict[str, Any] = parameters.get_secret("cache_info", transform="json") # (1)! client_config = GlideClientConfiguration( addresses=[ NodeAddress( host=cache_values.get("CACHE_HOST", "localhost"), port=cache_values.get("CACHE_PORT", 6379), ), ], credentials=ServerCredentials( password=cache_values.get("CACHE_PASSWORD", ""), ), request_timeout=10, use_tls=True, reconnect_strategy=BackoffStrategy(num_of_retries=10, factor=2, exponent_base=1), ) valkey_client = GlideClient.create(config=client_config) persistence_layer = CachePersistenceLayer(client=valkey_client) # type: ignore[arg-type] config = IdempotencyConfig( expires_after_seconds=2 * 60, # 2 minutes ) @idempotent(config=config, persistence_store=persistence_layer) def lambda_handler(event, context): return {"message": "Hello"} ``` 1. JSON stored: ``` { "CACHE_HOST": "127.0.0.1", "CACHE_PORT": "6379", "CACHE_PASSWORD": "cache-secret" } ``` ``` from __future__ import annotations from typing import Any from redis import Redis from aws_lambda_powertools.shared.functions import abs_lambda_path from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.idempotency import IdempotencyConfig, idempotent from aws_lambda_powertools.utilities.idempotency.persistence.cache import ( CachePersistenceLayer, ) cache_values: dict[str, Any] = parameters.get_secret("cache_info", transform="json") # (1)! redis_client = Redis( host=cache_values.get("REDIS_HOST", "localhost"), port=cache_values.get("REDIS_PORT", 6379), password=cache_values.get("REDIS_PASSWORD"), decode_responses=True, socket_timeout=10.0, ssl=True, retry_on_timeout=True, ssl_certfile=f"{abs_lambda_path()}/certs/cache_user.crt", # (2)! ssl_keyfile=f"{abs_lambda_path()}/certs/cache_user_private.key", # (3)! ssl_ca_certs=f"{abs_lambda_path()}/certs/cache_ca.pem", # (4)! ) persistence_layer = CachePersistenceLayer(client=redis_client) config = IdempotencyConfig( expires_after_seconds=2 * 60, # 2 minutes ) @idempotent(config=config, persistence_store=persistence_layer) def lambda_handler(event, context): return {"message": "Hello"} ``` 1. JSON stored: ``` { "CACHE_HOST": "127.0.0.1", "CACHE_PORT": "6379", "CACHE_PASSWORD": "cache-secret" } ``` 1. cache_user.crt file stored in the "certs" directory of your Lambda function 1. 
cache_user_private.key file stored in the "certs" directory of your Lambda function 1. cache_ca.pem file stored in the "certs" directory of your Lambda function ##### Cache attributes You can customize the attribute names during initialization: | Parameter | Required | Default | Description | | --- | --- | --- | --- | | **in_progress_expiry_attr** | | `in_progress_expiration` | Unix timestamp of when record expires while in progress (in case the invocation times out) | | **status_attr** | | `status` | Stores status of the Lambda execution during and after invocation | | **data_attr** | | `data` | Stores results of successfully executed Lambda handlers | | **validation_key_attr** | | `validation` | Hashed representation of the parts of the event used for validation | ``` import os from aws_lambda_powertools.utilities.idempotency import ( idempotent, ) from aws_lambda_powertools.utilities.idempotency.persistence.redis import ( RedisCachePersistenceLayer, ) from aws_lambda_powertools.utilities.typing import LambdaContext redis_endpoint = os.getenv("REDIS_CLUSTER_ENDPOINT", "localhost") persistence_layer = RedisCachePersistenceLayer( host=redis_endpoint, port=6379, in_progress_expiry_attr="in_progress_expiration", status_attr="status", data_attr="data", validation_key_attr="validation", ) @idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext) -> dict: return event ``` ### Common use cases #### Batch processing You can easily integrate with [Batch](../batch/) using the [idempotent_function decorator](#idempotent_function-decorator) to handle idempotency per message/record in a given batch. Choosing a unique batch record attribute In this example, we choose `messageId` as our idempotency key since we know it'll be unique. Depending on your use case, it might be more accurate [to choose another field](#choosing-a-payload-subset) your producer intentionally sets to define uniqueness.
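For instance, if your producer sets an SQS message attribute that uniquely identifies the work (here a hypothetical `orderId` attribute), you could point the idempotency key at it instead of `messageId`. This is a sketch only; the rest of the setup stays the same as the example below.

```
from aws_lambda_powertools.utilities.idempotency import IdempotencyConfig

# Hypothetical: the producer sets an "orderId" message attribute on every SQS message
config = IdempotencyConfig(event_key_jmespath="messageAttributes.orderId.stringValue")
```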
``` import os from typing import Any, Dict from aws_lambda_powertools.utilities.batch import BatchProcessor, EventType, process_partial_response from aws_lambda_powertools.utilities.data_classes.sqs_event import SQSRecord from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.typing import LambdaContext processor = BatchProcessor(event_type=EventType.SQS) table = os.getenv("IDEMPOTENCY_TABLE", "") dynamodb = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig(event_key_jmespath="messageId") @idempotent_function(data_keyword_argument="record", config=config, persistence_store=dynamodb) def record_handler(record: SQSRecord): return {"message": record.body} def lambda_handler(event: Dict[str, Any], context: LambdaContext): config.register_lambda_context(context) # see Lambda timeouts section return process_partial_response( event=event, context=context, processor=processor, record_handler=record_handler, ) ``` ``` { "Records": [ { "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a...", "body": "Test message.", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1545082649183", "SenderId": "replace-to-pass-gitleak", "ApproximateFirstReceiveTimestamp": "1545082649185" }, "messageAttributes": { "testAttr": { "stringValue": "100", "binaryValue": "base64Str", "dataType": "Number" } }, "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3", "eventSource": "aws:sqs", "eventSourceARN": "arn:aws:sqs:us-east-2:123456789012:my-queue", "awsRegion": "us-east-2" } ] } ``` ### Idempotency request flow The following sequence diagrams explain how the Idempotency feature behaves under different scenarios. #### Successful request ``` sequenceDiagram participant Client participant Lambda participant Persistence Layer alt initial request Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Note over Lambda,Persistence Layer: Set record status to INPROGRESS.
Prevents concurrent invocations
with the same payload Lambda-->>Lambda: Call your function Lambda->>Persistence Layer: Update record with result deactivate Persistence Layer Persistence Layer-->>Persistence Layer: Update record Note over Lambda,Persistence Layer: Set record status to COMPLETE.
New invocations with the same payload
now return the same result Lambda-->>Client: Response sent to client else retried request Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Persistence Layer-->>Lambda: Already exists in persistence layer. deactivate Persistence Layer Note over Lambda,Persistence Layer: Record status is COMPLETE and not expired Lambda-->>Client: Same response sent to client end ``` *Idempotent successful request* #### Successful request with cache enabled [In-memory cache is disabled by default](#using-in-memory-cache). ``` sequenceDiagram participant Client participant Lambda participant Persistence Layer alt initial request Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Note over Lambda,Persistence Layer: Set record status to INPROGRESS.
Prevents concurrent invocations
with the same payload Lambda-->>Lambda: Call your function Lambda->>Persistence Layer: Update record with result deactivate Persistence Layer Persistence Layer-->>Persistence Layer: Update record Note over Lambda,Persistence Layer: Set record status to COMPLETE.
New invocations with the same payload
now return the same result Lambda-->>Lambda: Save record and result in memory Lambda-->>Client: Response sent to client else retried request Client->>Lambda: Invoke (event) Lambda-->>Lambda: Get idempotency_key=hash(payload) Note over Lambda,Persistence Layer: Record status is COMPLETE and not expired Lambda-->>Client: Same response sent to client end ``` *Idempotent successful request cached* #### Successful request with response_hook configured ``` sequenceDiagram participant Client participant Lambda participant Response hook participant Persistence Layer alt initial request Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Note over Lambda,Persistence Layer: Set record status to INPROGRESS.
Prevents concurrent invocations
with the same payload Lambda-->>Lambda: Call your function Lambda->>Persistence Layer: Update record with result deactivate Persistence Layer Persistence Layer-->>Persistence Layer: Update record Note over Lambda,Persistence Layer: Set record status to COMPLETE.
New invocations with the same payload
now return the same result Lambda-->>Client: Response sent to client else retried request Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Persistence Layer-->>Response hook: Already exists in persistence layer. deactivate Persistence Layer Note over Response hook,Persistence Layer: Record status is COMPLETE and not expired Response hook->>Lambda: Response hook invoked Lambda-->>Client: Manipulated idempotent response sent to client end ``` *Successful idempotent request with a response hook* #### Expired idempotency records ``` sequenceDiagram participant Client participant Lambda participant Persistence Layer alt initial request Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Note over Lambda,Persistence Layer: Set record status to INPROGRESS.
Prevents concurrent invocations
with the same payload Lambda-->>Lambda: Call your function Lambda->>Persistence Layer: Update record with result deactivate Persistence Layer Persistence Layer-->>Persistence Layer: Update record Note over Lambda,Persistence Layer: Set record status to COMPLETE.
New invocations with the same payload
now return the same result Lambda-->>Client: Response sent to client else retried request Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Persistence Layer-->>Lambda: Already exists in persistence layer. deactivate Persistence Layer Note over Lambda,Persistence Layer: Record status is COMPLETE but expired hours ago loop Repeat initial request process Note over Lambda,Persistence Layer: 1. Set record to INPROGRESS,
2. Call your function,
3. Set record to COMPLETE end Lambda-->>Client: Same response sent to client end ``` *Previous Idempotent request expired* #### Concurrent identical in-flight requests ``` sequenceDiagram participant Client participant Lambda participant Persistence Layer Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Note over Lambda,Persistence Layer: Set record status to INPROGRESS.
Prevents concurrent invocations
with the same payload par Second request Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) Lambda--xLambda: IdempotencyAlreadyInProgressError Lambda->>Client: Error sent to client if unhandled end Lambda-->>Lambda: Call your function Lambda->>Persistence Layer: Update record with result deactivate Persistence Layer Persistence Layer-->>Persistence Layer: Update record Note over Lambda,Persistence Layer: Set record status to COMPLETE.
New invocations with the same payload
now return the same result Lambda-->>Client: Response sent to client ``` *Concurrent identical in-flight requests* #### Unhandled exception ``` sequenceDiagram participant Client participant Lambda participant Persistence Layer Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set (id=event.search(payload)) activate Persistence Layer Note right of Persistence Layer: Locked during this time. Prevents multiple
Lambda invocations with the same
payload running concurrently. Lambda--xLambda: Call handler (event).
Raises exception Lambda->>Persistence Layer: Delete record (id=event.search(payload)) deactivate Persistence Layer Lambda-->>Client: Return error response ``` *Idempotent sequence exception* #### Lambda request timeout ``` sequenceDiagram participant Client participant Lambda participant Persistence Layer alt initial request Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Note over Lambda,Persistence Layer: Set record status to INPROGRESS.
Prevents concurrent invocations
with the same payload Lambda-->>Lambda: Call your function Note right of Lambda: Time out Lambda--xLambda: Time out error Lambda-->>Client: Return error response deactivate Persistence Layer else retry after Lambda timeout elapses Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Note over Lambda,Persistence Layer: Set record status to INPROGRESS.
Reset in_progress_expiry attribute Lambda-->>Lambda: Call your function Lambda->>Persistence Layer: Update record with result deactivate Persistence Layer Persistence Layer-->>Persistence Layer: Update record Lambda-->>Client: Response sent to client end ``` *Idempotent request during and after Lambda timeouts* #### Optional idempotency key ``` sequenceDiagram participant Client participant Lambda participant Persistence Layer alt request with idempotency key Client->>Lambda: Invoke (event) Lambda->>Persistence Layer: Get or set idempotency_key=hash(payload) activate Persistence Layer Note over Lambda,Persistence Layer: Set record status to INPROGRESS.
Prevents concurrent invocations
with the same payload Lambda-->>Lambda: Call your function Lambda->>Persistence Layer: Update record with result deactivate Persistence Layer Persistence Layer-->>Persistence Layer: Update record Note over Lambda,Persistence Layer: Set record status to COMPLETE.
New invocations with the same payload
now return the same result Lambda-->>Client: Response sent to client else request(s) without idempotency key Client->>Lambda: Invoke (event) Note over Lambda: Idempotency key is missing Note over Persistence Layer: Skips any operation to fetch, update, and delete Lambda-->>Lambda: Call your function Lambda-->>Client: Response sent to client end ``` *Optional idempotency key* #### Race condition with Cache ``` graph TD; A(Existing orphan record in cache)-->A1; A1[Two Lambda invoke at same time]-->B1[Lambda handler1]; B1-->B2[Fetch from Cache]; B2-->B3[Handler1 got orphan record]; B3-->B4[Handler1 acquired lock]; B4-->B5[Handler1 overwrite orphan record] B5-->B6[Handler1 continue to execution]; A1-->C1[Lambda handler2]; C1-->C2[Fetch from Cache]; C2-->C3[Handler2 got orphan record]; C3-->C4[Handler2 failed to acquire lock]; C4-->C5[Handler2 wait and fetch from Cache]; C5-->C6[Handler2 return without executing]; B6-->D(Lambda handler executed only once); C6-->D; ``` *Race condition with Cache* ## Advanced ### Customizing the default behavior You can override and further extend idempotency behavior via **`IdempotencyConfig`** with the following options: | Parameter | Default | Description | | --- | --- | --- | | **event_key_jmespath** | `""` | JMESPath expression to extract the idempotency key from the event record using [built-in functions](../jmespath_functions/#built-in-jmespath-functions) | | **payload_validation_jmespath** | `""` | JMESPath expression to validate that the specified fields haven't changed across requests for the same idempotency key *e.g., payload tampering.* | | **raise_on_no_idempotency_key** | `False` | Raise exception if no idempotency key was found in the request | | **expires_after_seconds** | 3600 | The number of seconds to wait before a record is expired, allowing a new transaction with the same idempotency key | | **use_local_cache** | `False` | Whether to cache idempotency results in-memory to save on persistence storage latency and costs | | **local_cache_max_items** | 256 | Max number of items to store in local cache | | **hash_function** | `md5` | Function to use for calculating hashes, as provided by [hashlib](https://docs.python.org/3/library/hashlib.html) in the standard library. | | **response_hook** | `None` | Function to use for processing the stored Idempotent response. This function hook is called when an existing idempotent response is found. See [Manipulating The Idempotent Response](./#manipulating-the-idempotent-response) | ### Handling concurrent executions with the same payload This utility will raise an **`IdempotencyAlreadyInProgressError`** exception if you receive **multiple invocations with the same payload while the first invocation hasn't completed yet**. Info If you receive `IdempotencyAlreadyInProgressError`, you can safely retry the operation. This is a locking mechanism for correctness. Since we don't know the result from the first invocation yet, we can't safely allow another concurrent execution. ### Payload validation Question: What if your function is invoked with the same payload except some outer parameters have changed? Example: A payment transaction for a given productID was requested twice for the same customer, **however the amount to be paid has changed in the second transaction**. By default, we will return the same result as it returned before, however in this instance it may be misleading; we provide a fail fast payload validation to address this edge case. 
With **`payload_validation_jmespath`**, you can provide an additional JMESPath expression to specify which part of the event body should be validated against previous idempotent invocations ``` import os from dataclasses import dataclass, field from uuid import uuid4 from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent, ) from aws_lambda_powertools.utilities.idempotency.exceptions import IdempotencyValidationError from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig( event_key_jmespath='["user_id", "product_id"]', payload_validation_jmespath="amount", ) @dataclass class Payment: user_id: str product_id: str charge_type: str amount: int payment_id: str = field(default_factory=lambda: f"{uuid4()}") class PaymentError(Exception): ... @idempotent(config=config, persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): try: payment: Payment = create_subscription_payment(event) return { "payment_id": payment.payment_id, "message": "success", "statusCode": 200, } except IdempotencyValidationError: logger.exception("Payload tampering detected", payment=payment, failure_type="validation") return { "message": "Unable to process payment at this time. Try again later.", "statusCode": 500, } except Exception as exc: raise PaymentError(f"Error creating payment {str(exc)}") def create_subscription_payment(event: dict) -> Payment: return Payment(**event) ``` ``` { "user_id": 1, "product_id": 1500, "charge_type": "subscription", "amount": 500 } ``` ``` { "user_id": 1, "product_id": 1500, "charge_type": "subscription", "amount": 10 } ``` In this example, the **`user_id`** and **`product_id`** keys are used as the payload to generate the idempotency key, as per **`event_key_jmespath`** parameter. Note If we try to send the same request but with a different amount, we will raise **`IdempotencyValidationError`**. Without payload validation, we would have returned the same result as we did for the initial request. Since we're also returning an amount in the response, this could be quite confusing for the client. By using **`payload_validation_jmespath="amount"`**, we prevent this potentially confusing behavior and instead raise an Exception. ### Making idempotency key required If you want to enforce that an idempotency key is required, you can set **`raise_on_no_idempotency_key`** to `True`. This means that we will raise **`IdempotencyKeyError`** if the evaluation of **`event_key_jmespath`** is `None`. Warning To prevent errors, transactions will not be treated as idempotent if **`raise_on_no_idempotency_key`** is set to `False` and the evaluation of **`event_key_jmespath`** is `None`. Therefore, no data will be fetched, stored, or deleted in the idempotency storage layer. 
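When `raise_on_no_idempotency_key` is enabled and you use the plain `@idempotent` decorator, `IdempotencyKeyError` is raised before your handler runs and surfaces as an invocation error. With `idempotent_function`, you can catch it at the call site and return a clearer client error instead. A minimal sketch, assuming an `IDEMPOTENCY_TABLE` environment variable and an `order_id` key as in the other examples (both illustrative):

```
import os

from aws_lambda_powertools.utilities.idempotency import (
    DynamoDBPersistenceLayer,
    IdempotencyConfig,
    idempotent_function,
)
from aws_lambda_powertools.utilities.idempotency.exceptions import IdempotencyKeyError
from aws_lambda_powertools.utilities.typing import LambdaContext

table = os.getenv("IDEMPOTENCY_TABLE", "")
dynamodb = DynamoDBPersistenceLayer(table_name=table)
config = IdempotencyConfig(event_key_jmespath="order_id", raise_on_no_idempotency_key=True)


@idempotent_function(data_keyword_argument="order", config=config, persistence_store=dynamodb)
def process_order(order: dict) -> dict:
    return {"order_id": order["order_id"], "message": "processed"}


def lambda_handler(event: dict, context: LambdaContext):
    config.register_lambda_context(context)  # see Lambda timeouts section
    try:
        return process_order(order=event)
    except IdempotencyKeyError:
        # "order_id" was missing from the payload; fail fast with a clear client error
        return {"statusCode": 400, "message": "order_id is required"}
```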
``` import os from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig( event_key_jmespath='["user.uid", "order_id"]', raise_on_no_idempotency_key=True, ) @idempotent(config=config, persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext) -> dict: return event ``` ``` { "user": { "uid": "BB0D045C-8878-40C8-889E-38B3CB0A61B1", "name": "Foo" }, "order_id": 10000 } ``` ``` { "user": { "uid": "BB0D045C-8878-40C8-889E-38B3CB0A61B1", "name": "Foo", "order_id": 10000 } } ``` ### Customizing boto configuration The **`boto_config`** and **`boto3_session`** parameters enable you to pass in a custom [botocore config object](https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html) or a custom [boto3 session](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html) when constructing the persistence store. ``` import os import boto3 from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext # See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html#module-boto3.session boto3_session = boto3.session.Session() table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table, boto3_session=boto3_session) config = IdempotencyConfig(event_key_jmespath="body") @idempotent(persistence_store=persistence_layer, config=config) def lambda_handler(event: dict, context: LambdaContext) -> dict: return event ``` ``` import os from botocore.config import Config from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext # See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html#botocore-config boto_config = Config() table = os.getenv("IDEMPOTENCY_TABLE", "") persistence_layer = DynamoDBPersistenceLayer(table_name=table, boto_config=boto_config) config = IdempotencyConfig(event_key_jmespath="body") @idempotent(persistence_store=persistence_layer, config=config) def lambda_handler(event: dict, context: LambdaContext) -> dict: return event ``` ``` { "body": "{\"user_id\":\"xyz\",\"product_id\":\"123456789\"}" } ``` ### Bring your own persistent store This utility provides an abstract base class (ABC), so that you can implement your choice of persistent storage layer. You can create your own persistent store from scratch by inheriting the `BasePersistenceLayer` class, and implementing `_get_record()`, `_put_record()`, `_update_record()` and `_delete_record()`. - **`_get_record()`** – Retrieves an item from the persistence store using an idempotency key and returns it as a `DataRecord` instance. - **`_put_record()`** – Adds a `DataRecord` to the persistence store if it doesn't already exist with that key. Raises an `ItemAlreadyExists` exception if a non-expired entry already exists. - **`_update_record()`** – Updates an item in the persistence store. - **`_delete_record()`** – Removes an item from the persistence store. 
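To get a feel for the minimal contract before the full DynamoDB-backed example below, here is a hedged sketch of a purely in-memory layer. It is only suitable for local unit tests: state lives in a single Python process, so it offers none of the cross-container guarantees a real persistence store provides.

```
import datetime
from typing import Dict

from aws_lambda_powertools.utilities.idempotency import BasePersistenceLayer
from aws_lambda_powertools.utilities.idempotency.exceptions import (
    IdempotencyItemAlreadyExistsError,
    IdempotencyItemNotFoundError,
)
from aws_lambda_powertools.utilities.idempotency.persistence.base import DataRecord


class InMemoryPersistenceLayer(BasePersistenceLayer):
    """Test-only persistence layer backed by a plain dict."""

    def __init__(self):
        super().__init__()
        self._records: Dict[str, DataRecord] = {}

    def _get_record(self, idempotency_key: str) -> DataRecord:
        try:
            return self._records[idempotency_key]
        except KeyError:
            raise IdempotencyItemNotFoundError

    def _put_record(self, data_record: DataRecord) -> None:
        now = int(datetime.datetime.now().timestamp())
        existing = self._records.get(data_record.idempotency_key)
        # Mirror the conditional write: only an expired record may be overwritten
        if existing and existing.expiry_timestamp and existing.expiry_timestamp >= now:
            raise IdempotencyItemAlreadyExistsError
        self._records[data_record.idempotency_key] = data_record

    def _update_record(self, data_record: DataRecord) -> None:
        self._records[data_record.idempotency_key] = data_record

    def _delete_record(self, data_record: DataRecord) -> None:
        self._records.pop(data_record.idempotency_key, None)
```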
``` import datetime import logging from typing import Any, Dict, Optional import boto3 from botocore.config import Config from aws_lambda_powertools.utilities.idempotency import BasePersistenceLayer from aws_lambda_powertools.utilities.idempotency.exceptions import ( IdempotencyItemAlreadyExistsError, IdempotencyItemNotFoundError, ) from aws_lambda_powertools.utilities.idempotency.persistence.base import DataRecord logger = logging.getLogger(__name__) class MyOwnPersistenceLayer(BasePersistenceLayer): def __init__( self, table_name: str, key_attr: str = "id", expiry_attr: str = "expiration", status_attr: str = "status", data_attr: str = "data", validation_key_attr: str = "validation", boto_config: Optional[Config] = None, boto3_session: Optional[boto3.session.Session] = None, ): boto3_session = boto3_session or boto3.session.Session() self._ddb_resource = boto3_session.resource("dynamodb", config=boto_config) self.table_name = table_name self.table = self._ddb_resource.Table(self.table_name) self.key_attr = key_attr self.expiry_attr = expiry_attr self.status_attr = status_attr self.data_attr = data_attr self.validation_key_attr = validation_key_attr super().__init__() def _item_to_data_record(self, item: Dict[str, Any]) -> DataRecord: """ Translate raw item records from DynamoDB to DataRecord Parameters ---------- item: Dict[str, Union[str, int]] Item format from dynamodb response Returns ------- DataRecord representation of item """ return DataRecord( idempotency_key=item[self.key_attr], status=item[self.status_attr], expiry_timestamp=item[self.expiry_attr], response_data=item.get(self.data_attr, ""), payload_hash=item.get(self.validation_key_attr, ""), ) def _get_record(self, idempotency_key) -> DataRecord: response = self.table.get_item(Key={self.key_attr: idempotency_key}, ConsistentRead=True) try: item = response["Item"] except KeyError: raise IdempotencyItemNotFoundError return self._item_to_data_record(item) def _put_record(self, data_record: DataRecord) -> None: item = { self.key_attr: data_record.idempotency_key, self.expiry_attr: data_record.expiry_timestamp, self.status_attr: data_record.status, } if self.payload_validation_enabled: item[self.validation_key_attr] = data_record.payload_hash now = datetime.datetime.now() try: logger.debug(f"Putting record for idempotency key: {data_record.idempotency_key}") self.table.put_item( Item=item, ConditionExpression=f"attribute_not_exists({self.key_attr}) OR {self.expiry_attr} < :now", ExpressionAttributeValues={":now": int(now.timestamp())}, ) except self._ddb_resource.meta.client.exceptions.ConditionalCheckFailedException: logger.debug(f"Failed to put record for already existing idempotency key: {data_record.idempotency_key}") raise IdempotencyItemAlreadyExistsError def _update_record(self, data_record: DataRecord): logger.debug(f"Updating record for idempotency key: {data_record.idempotency_key}") update_expression = "SET #response_data = :response_data, #expiry = :expiry, #status = :status" expression_attr_values = { ":expiry": data_record.expiry_timestamp, ":response_data": data_record.response_data, ":status": data_record.status, } expression_attr_names = { "#response_data": self.data_attr, "#expiry": self.expiry_attr, "#status": self.status_attr, } if self.payload_validation_enabled: update_expression += ", #validation_key = :validation_key" expression_attr_values[":validation_key"] = data_record.payload_hash expression_attr_names["#validation_key"] = self.validation_key_attr self.table.update_item( Key={self.key_attr: 
data_record.idempotency_key}, UpdateExpression=update_expression, ExpressionAttributeValues=expression_attr_values, ExpressionAttributeNames=expression_attr_names, ) def _delete_record(self, data_record: DataRecord) -> None: logger.debug(f"Deleting record for idempotency key: {data_record.idempotency_key}") self.table.delete_item( Key={self.key_attr: data_record.idempotency_key}, ) ``` Danger Pay attention to the documentation for each method - you may need to perform additional checks inside these methods to ensure the idempotency guarantees remain intact. For example, the `_put_record` method needs to raise an exception if a non-expired record already exists in the data store with a matching key. ### Manipulating the Idempotent Response You can set up a `response_hook` in the `IdempotencyConfig` class to manipulate the returned data when an operation is idempotent. The hook function will be called with the current deserialized response object and the Idempotency record. ``` import os import uuid from typing import Dict from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent_function, ) from aws_lambda_powertools.utilities.idempotency.persistence.datarecord import ( DataRecord, ) from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def my_response_hook(response: Dict, idempotent_data: DataRecord) -> Dict: # Insert additional header data into the idempotent response response["x-idempotent-key"] = idempotent_data.idempotency_key # expiry_timestamp can be None so include if set expiry_timestamp = idempotent_data.get_expiration_datetime() if expiry_timestamp: response["x-idempotent-expiration"] = expiry_timestamp.isoformat() # Must return the response here return response table = os.getenv("IDEMPOTENCY_TABLE", "") dynamodb = DynamoDBPersistenceLayer(table_name=table) config = IdempotencyConfig(response_hook=my_response_hook) @idempotent_function(data_keyword_argument="order", config=config, persistence_store=dynamodb) def process_order(order: dict) -> dict: # create the order_id order_id = str(uuid.uuid4()) # create your logic to save the order # append the order_id created order["order_id"] = order_id # return the order return {"order": order} def lambda_handler(event: dict, context: LambdaContext): config.register_lambda_context(context) # see Lambda timeouts section try: logger.info(f"Processing order id {event.get('order_id')}") return process_order(order=event.get("order")) except Exception as err: return {"status_code": 400, "error": f"Error processing {str(err)}"} ``` ``` { "order" : { "user_id": "xyz", "product_id": "123456789", "quantity": 2, "value": 30 } } ``` Info: Using custom de-serialization? The response_hook is called after the custom de-serialization so the payload you process will be the de-serialized version. #### Being a good citizen When using response hooks to manipulate returned data from idempotent operations, it's important to follow best practices to avoid introducing complexity or issues. Keep these guidelines in mind: 1. **Response hook works exclusively when operations are idempotent.** The hook will not be called when an operation is not idempotent, or when the idempotent logic fails. 1. **Catch and Handle Exceptions.** Your response hook code should catch and handle any exceptions that may arise from your logic. Unhandled exceptions will cause the Lambda function to fail unexpectedly. 1.
**Keep Hook Logic Simple** Response hooks should consist of minimal and straightforward logic for manipulating response data. Avoid complex conditional branching and aim for hooks that are easy to reason about. ## Compatibility with other utilities ### JSON Schema Validation The idempotency utility can be used with the `validator` decorator. Ensure that idempotency is the innermost decorator. Warning If you use an envelope with the validator, the event received by the idempotency utility will be the unwrapped event - not the "raw" event Lambda was invoked with. Make sure to account for this behavior if you set the `event_key_jmespath`. ``` import os from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import envelopes, validator table = os.getenv("IDEMPOTENCY_TABLE", "") config = IdempotencyConfig(event_key_jmespath='["message", "username"]') persistence_layer = DynamoDBPersistenceLayer(table_name=table) @validator(envelope=envelopes.API_GATEWAY_HTTP) @idempotent(config=config, persistence_store=persistence_layer) def lambda_handler(event, context: LambdaContext): return {"message": event["message"], "statusCode": 200} ``` ``` { "version": "2.0", "routeKey": "$default", "rawPath": "/my/path", "rawQueryString": "parameter1=value1&parameter1=value2&parameter2=value", "cookies": [ "cookie1", "cookie2" ], "headers": { "Header1": "value1", "Header2": "value1,value2" }, "queryStringParameters": { "parameter1": "value1,value2", "parameter2": "value" }, "requestContext": { "accountId": "123456789012", "apiId": "api-id", "authentication": { "clientCert": { "clientCertPem": "CERT_CONTENT", "subjectDN": "www.example.com", "issuerDN": "Example issuer", "serialNumber": "a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1", "validity": { "notBefore": "May 28 12:30:02 2019 GMT", "notAfter": "Aug 5 09:36:04 2021 GMT" } } }, "authorizer": { "jwt": { "claims": { "claim1": "value1", "claim2": "value2" }, "scopes": [ "scope1", "scope2" ] } }, "domainName": "id.execute-api.us-east-1.amazonaws.com", "domainPrefix": "id", "http": { "method": "POST", "path": "/my/path", "protocol": "HTTP/1.1", "sourceIp": "192.168.0.1/32", "userAgent": "agent" }, "requestId": "id", "routeKey": "$default", "stage": "$default", "time": "12/Mar/2020:19:03:58 +0000", "timeEpoch": 1583348638390 }, "body": "{\"message\": \"hello world\", \"username\": \"tom\"}", "pathParameters": { "parameter1": "value1" }, "isBase64Encoded": false, "stageVariables": { "stageVariable1": "value1", "stageVariable2": "value2" } } ``` Tip: JMESPath Powertools for AWS Lambda (Python) functions are also available Built-in functions from the validation utility, such as `powertools_json`, `powertools_base64`, and `powertools_base64_gzip`, are also available to use in this utility. ### Tracer The idempotency utility can be used with the `tracer` decorator. Ensure that idempotency is the innermost decorator. #### First execution During the first execution with a payload, Lambda performs a `PutItem` followed by an `UpdateItem` operation to persist the record in DynamoDB. #### Subsequent executions On subsequent executions with the same payload, Lambda optimistically tries to save the record in DynamoDB. If the record already exists, DynamoDB returns the item.
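The snippet below is a minimal sketch of that optimistic conditional write in plain boto3 (not the utility's internal implementation); the table name, key, and attribute names are placeholders:

```
import datetime

import boto3
from botocore.exceptions import ClientError

# Placeholder table with an "id" partition key and a numeric "expiration" attribute
table = boto3.resource("dynamodb").Table("IdempotencyTable")
now = int(datetime.datetime.now().timestamp())

try:
    # Optimistically write the record; succeed only if no non-expired record exists yet
    table.put_item(
        Item={"id": "my-idempotency-key", "expiration": now + 3600, "status": "INPROGRESS"},
        ConditionExpression="attribute_not_exists(id) OR expiration < :now",
        ExpressionAttributeValues={":now": now},
    )
except ClientError as exc:
    if exc.response["Error"]["Code"] == "ConditionalCheckFailedException":
        # A non-expired record already exists, so fetch and reuse the stored result
        existing = table.get_item(Key={"id": "my-idempotency-key"}, ConsistentRead=True)["Item"]
    else:
        raise
```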
Explore how to handle conditional write errors in high-concurrency scenarios with DynamoDB in this [blog post](https://aws.amazon.com/pt/blogs/database/handle-conditional-write-errors-in-high-concurrency-scenarios-with-amazon-dynamodb/). ## Testing your code The idempotency utility provides several routes to test your code. ### Disabling the idempotency utility When testing your code, you may wish to disable the idempotency logic altogether and focus on testing your business logic. To do this, you can set the environment variable `POWERTOOLS_IDEMPOTENCY_DISABLED` with a truthy value. If you prefer setting this for specific tests, and are using Pytest, you can use [monkeypatch](https://docs.pytest.org/en/latest/monkeypatch.html) fixture: ``` from dataclasses import dataclass import app_test_disabling_idempotency_utility import pytest @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test" aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72" def get_remaining_time_in_millis(self) -> int: return 5 @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def test_idempotent_lambda_handler(monkeypatch, lambda_context: LambdaContext): # Set POWERTOOLS_IDEMPOTENCY_DISABLED before calling decorated functions monkeypatch.setenv("POWERTOOLS_IDEMPOTENCY_DISABLED", 1) result = app_test_disabling_idempotency_utility.lambda_handler({}, lambda_context) assert result ``` ``` from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext persistence_layer = DynamoDBPersistenceLayer(table_name="IdempotencyTable") @idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): print("expensive operation") return { "payment_id": 12345, "message": "success", "statusCode": 200, } ``` ### Testing with DynamoDB Local To test with [DynamoDB Local](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.DownloadingAndRunning.html), you can replace the `DynamoDB client` used by the persistence layer with one you create inside your tests. This allows you to set the endpoint_url. 
``` from dataclasses import dataclass import app_test_dynamodb_local import boto3 import pytest @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test" aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72" def get_remaining_time_in_millis(self) -> int: return 5 @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def test_idempotent_lambda(lambda_context): # Configure the boto3 to use the endpoint for the DynamoDB Local instance dynamodb_local_client = boto3.client("dynamodb", endpoint_url="http://localhost:8000") app_test_dynamodb_local.persistence_layer.client = dynamodb_local_client # If desired, you can use a different DynamoDB Local table name than what your code already uses # app.persistence_layer.table_name = "another table name" # noqa: ERA001 result = app_test_dynamodb_local.handler({"testkey": "testvalue"}, lambda_context) assert result["payment_id"] == 12345 ``` ``` from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext persistence_layer = DynamoDBPersistenceLayer(table_name="IdempotencyTable") @idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): print("expensive operation") return { "payment_id": 12345, "message": "success", "statusCode": 200, } ``` ### How do I mock all DynamoDB I/O operations The idempotency utility lazily creates the dynamodb [Table](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#table) which it uses to access DynamoDB. This means it is possible to pass a mocked Table resource, or stub various methods. ``` from dataclasses import dataclass from unittest.mock import MagicMock import app_test_io_operations import pytest @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test" aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72" def get_remaining_time_in_millis(self) -> int: return 5 @pytest.fixture def lambda_context() -> LambdaContext: return LambdaContext() def test_idempotent_lambda(lambda_context): mock_client = MagicMock() app_test_io_operations.persistence_layer.client = mock_client result = app_test_io_operations.handler({"testkey": "testvalue"}, lambda_context) mock_client.put_item.assert_called() assert result ``` ``` from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, idempotent, ) from aws_lambda_powertools.utilities.typing import LambdaContext persistence_layer = DynamoDBPersistenceLayer(table_name="IdempotencyTable") @idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): print("expensive operation") return { "payment_id": 12345, "message": "success", "statusCode": 200, } ``` ### Testing with Redis To test locally, you can either utilize [fakeredis-py](https://github.com/cunla/fakeredis-py) for a simulated Redis environment or refer to the [MockRedis](https://github.com/aws-powertools/powertools-lambda-python/blob/ba6532a1c73e20fdaee88c5795fd40e978553e14/tests/functional/idempotency/persistence/test_redis_layer.py#L34-L66) class used in our tests to mock Redis operations. 
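As a minimal sketch of the fakeredis-py option (assuming the `fakeredis` package is installed in your test environment), you can create an in-memory client and pass it to the persistence layer, just like the `MockRedis` example that follows:

```
import fakeredis

from aws_lambda_powertools.utilities.idempotency.persistence.redis import (
    RedisCachePersistenceLayer,
)

# fakeredis provides an in-memory, API-compatible stand-in for a real Redis server
redis_client = fakeredis.FakeStrictRedis(decode_responses=True)

# Pass the fake client to the persistence layer, as with any Redis-compatible client
persistence_layer = RedisCachePersistenceLayer(client=redis_client)
```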
``` from dataclasses import dataclass import pytest from mock_redis import MockRedis from aws_lambda_powertools.utilities.idempotency import ( idempotent, ) from aws_lambda_powertools.utilities.idempotency.persistence.redis import ( RedisCachePersistenceLayer, ) from aws_lambda_powertools.utilities.typing import LambdaContext @pytest.fixture def lambda_context(): @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test" aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72" def get_remaining_time_in_millis(self) -> int: return 1000 return LambdaContext() def test_idempotent_lambda(lambda_context): # Initialize the mock Redis client redis_client = MockRedis(decode_responses=True) # Establish persistence layer using the mock Redis client persistence_layer = RedisCachePersistenceLayer(client=redis_client) # set up idempotency with the Redis persistence layer @idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): print("expensive operation") return { "payment_id": 12345, "message": "success", "statusCode": 200, } # Invoke the simulated lambda handler result = lambda_handler({"testkey": "testvalue"}, lambda_context) assert result["payment_id"] == 12345 ``` ``` import time as t from typing import Dict, Optional # Mock Redis class that includes all operations we use in the Idempotency utility class MockRedis: def __init__(self, decode_responses, cache: Optional[Dict] = None, **kwargs): self.cache = cache or {} self.expire_dict: Dict = {} self.decode_responses = decode_responses self.acl: Dict = {} self.username = "" def hset(self, name, mapping): self.expire_dict.pop(name, {}) self.cache[name] = mapping def from_url(self, url: str): pass def expire(self, name, time): self.expire_dict[name] = t.time() + time # return {} if no match def hgetall(self, name): if self.expire_dict.get(name, t.time() + 1) < t.time(): self.cache.pop(name, {}) return self.cache.get(name, {}) def get_connection_kwargs(self): return {"decode_responses": self.decode_responses} def auth(self, username, **kwargs): self.username = username def delete(self, name): self.cache.pop(name, {}) ``` If you want to set up a real Redis client for integration testing, you can reference the code provided below.
``` from dataclasses import dataclass import pytest import redis from aws_lambda_powertools.utilities.idempotency import ( idempotent, ) from aws_lambda_powertools.utilities.idempotency.persistence.redis import ( RedisCachePersistenceLayer, ) from aws_lambda_powertools.utilities.typing import LambdaContext @pytest.fixture def lambda_context(): @dataclass class LambdaContext: function_name: str = "test" memory_limit_in_mb: int = 128 invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test" aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72" def get_remaining_time_in_millis(self) -> int: return 1000 return LambdaContext() @pytest.fixture def persistence_store_standalone_redis(): # initialize a real Redis client and connect to the port set in the Makefile redis_client = redis.Redis( host="localhost", port=63005, decode_responses=True, ) # return a persistence layer with real Redis return RedisCachePersistenceLayer(client=redis_client) def test_idempotent_lambda(lambda_context, persistence_store_standalone_redis): # Establish persistence layer using the real Redis client persistence_layer = persistence_store_standalone_redis # set up idempotency with the Redis persistence layer @idempotent(persistence_store=persistence_layer) def lambda_handler(event: dict, context: LambdaContext): print("expensive operation") return { "payment_id": 12345, "message": "success", "statusCode": 200, } # Invoke the simulated lambda handler result = lambda_handler({"testkey": "testvalue"}, lambda_context) assert result["payment_id"] == 12345 ``` ``` test-idempotency-redis: # (1)! docker run --name test-idempotency-redis -d -p 63005:6379 redis pytest test_with_real_redis.py;docker stop test-idempotency-redis;docker rm test-idempotency-redis ``` 1. Use this script to set up a temporary Redis Docker container and automatically remove it upon completion ## Extra resources If you're interested in a deep dive on how Amazon uses idempotency when building our APIs, check out [this article](https://aws.amazon.com/builders-library/making-retries-safe-with-idempotent-APIs/). Tip JMESPath is a query language for JSON used by AWS CLI, AWS Python SDK, and Powertools for AWS Lambda (Python). The JMESPath Functions utility provides built-in [JMESPath](https://jmespath.org/) functions to easily deserialize common encoded JSON payloads in Lambda functions. ## Key features - Deserialize JSON from JSON strings, base64, and compressed data - Use JMESPath to extract and combine data recursively - Provides commonly used JMESPath expressions for popular event sources ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). You might have events that contain encoded JSON payloads as strings, base64, or even in compressed format. It is a common use case to decode and extract them partially or fully as part of your Lambda function invocation. Powertools for AWS Lambda (Python) also has utilities like [validation](../validation/), [idempotency](../idempotency/), or [feature flags](../feature_flags/) where you might need to extract a portion of your data before using them. Terminology **Envelope** is the terminology we use for the **JMESPath expression** to extract your JSON object from your data input. We might use those two terms interchangeably. ### Extracting data You can use the `query` function with any [JMESPath expression](https://jmespath.org/tutorial.html). Tip Another common use case is to fetch deeply nested data, filter, flatten, and more.
``` from aws_lambda_powertools.utilities.jmespath_utils import query from aws_lambda_powertools.utilities.typing import LambdaContext def handler(event: dict, context: LambdaContext) -> dict: payload = query(data=event, envelope="powertools_json(body)") customer_id = payload.get("customerId") # now deserialized # also works for fetching and flattening deeply nested data some_data = query(data=event, envelope="deeply_nested[*].some_data[]") return {"customer_id": customer_id, "message": "success", "context": some_data, "statusCode": 200} ``` ``` { "body": "{\"customerId\":\"dd4649e6-2484-4993-acb8-0f9123103394\"}", "deeply_nested": [ { "some_data": [ 1, 2, 3 ] } ] } ``` ### Built-in envelopes We provide built-in envelopes for popular AWS Lambda event sources to easily decode and/or deserialize JSON objects. ``` from __future__ import annotations from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.jmespath_utils import ( envelopes, query, ) from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def handler(event: dict, context: LambdaContext) -> dict: records: list = query(data=event, envelope=envelopes.SQS) for record in records: # records is a list logger.info(record.get("customerId")) # now deserialized return {"message": "success", "statusCode": 200} ``` ``` { "Records": [ { "messageId": "19dd0b57-b21e-4ac1-bd88-01bbb068cb78", "receiptHandle": "MessageReceiptHandle", "body": "{\"customerId\":\"dd4649e6-2484-4993-acb8-0f9123103394\",\"booking\":{\"id\":\"5b2c4803-330b-42b7-811a-c68689425de1\",\"reference\":\"ySz7oA\",\"outboundFlightId\":\"20c0d2f2-56a3-4068-bf20-ff7703db552d\"},\"payment\":{\"receipt\":\"https:\/\/pay.stripe.com\/receipts\/acct_1Dvn7pF4aIiftV70\/ch_3JTC14F4aIiftV700iFq2CHB\/rcpt_K7QsrFln9FgFnzUuBIiNdkkRYGxUL0X\",\"amount\":100}}", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1523232000000", "SenderId": "123456789012", "ApproximateFirstReceiveTimestamp": "1523232000001" }, "messageAttributes": {}, "md5OfBody": "7b270e59b47ff90a553787216d55d91d", "eventSource": "aws:sqs", "eventSourceARN": "arn:aws:sqs:us-east-1:123456789012:MyQueue", "awsRegion": "us-east-1" } ] } ``` These are all built-in envelopes you can use along with their expression as a reference: | Envelope | JMESPath expression | | | --- | --- | --- | | **`API_GATEWAY_HTTP`** | `powertools_json(body)` | | | **`API_GATEWAY_REST`** | `powertools_json(body)` | | | **`CLOUDWATCH_EVENTS_SCHEDULED`** | `detail` | | | **`CLOUDWATCH_LOGS`** | `awslogs.powertools_base64_gzip(data) | powertools_json(@).logEvents[*]` | | | **`EVENTBRIDGE`** | `detail` | | | **`KINESIS_DATA_STREAM`** | `Records[*].kinesis.powertools_json(powertools_base64(data))` | | | **`S3_EVENTBRIDGE_SQS`** | `Records[*].powertools_json(body).detail` | | | **`S3_KINESIS_FIREHOSE`** | `records[*].powertools_json(powertools_base64(data)).Records[0]` | | | **`S3_SNS_KINESIS_FIREHOSE`** | `records[*].powertools_json(powertools_base64(data)).powertools_json(Message).Records[0]` | | | **`S3_SNS_SQS`** | `Records[*].powertools_json(body).powertools_json(Message).Records[0]` | | | **`S3_SQS`** | `Records[*].powertools_json(body).Records[0]` | | | **`SNS`** | `Records[0].Sns.Message | powertools_json(@)` | | | **`SQS`** | `Records[*].powertools_json(body)` | | Using SNS? If you don't require SNS metadata, enable [raw message delivery](https://docs.aws.amazon.com/sns/latest/dg/sns-large-payload-raw-message-delivery.html). 
It will reduce multiple payload layers and size, when using SNS in combination with other services (*e.g., SQS, S3, etc*). ## Advanced ### Built-in JMESPath functions You can use our built-in JMESPath functions within your envelope expression. They handle deserialization for common data formats found in AWS Lambda event sources such as JSON strings, base64, and uncompress gzip data. Info We use these everywhere in Powertools for AWS Lambda (Python) to easily decode and unwrap events from Amazon API Gateway, Amazon Kinesis, AWS CloudWatch Logs, etc. #### powertools_json function Use `powertools_json` function to decode any JSON string anywhere a JMESPath expression is allowed. > **Validation scenario** This sample will deserialize the JSON string within the `data` key before validation. ``` import json from dataclasses import asdict, dataclass, field, is_dataclass from uuid import uuid4 import powertools_json_jmespath_schema as schemas from jmespath.exceptions import JMESPathTypeError from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import SchemaValidationError, validate @dataclass class Order: user_id: int product_id: int quantity: int price: float currency: str order_id: str = field(default_factory=lambda: f"{uuid4()}") class DataclassCustomEncoder(json.JSONEncoder): """A custom JSON encoder to serialize dataclass obj""" def default(self, obj): # Only called for values that aren't JSON serializable # where `obj` will be an instance of Order in this example return asdict(obj) if is_dataclass(obj) else super().default(obj) def lambda_handler(event, context: LambdaContext) -> dict: try: # Validate order against our schema validate(event=event, schema=schemas.INPUT, envelope="powertools_json(payload)") # Deserialize JSON string order as dict # alternatively, query works here too order_payload: dict = json.loads(event.get("payload")) return { "order": json.dumps(Order(**order_payload), cls=DataclassCustomEncoder), "message": "order created", "success": True, } except JMESPathTypeError: # The powertools_json() envelope function must match a valid path return return_error_message("Invalid request.") except SchemaValidationError as exception: # SchemaValidationError indicates where a data mismatch is return return_error_message(str(exception)) except json.JSONDecodeError: return return_error_message("Payload must be valid JSON (base64 encoded).") def return_error_message(message: str) -> dict: return {"order": None, "message": message, "success": False} ``` ``` INPUT = { "$schema": "http://json-schema.org/draft-07/schema", "$id": "http://example.com/example.json", "type": "object", "title": "Sample order schema", "description": "The root schema comprises the entire JSON document.", "examples": [{"user_id": 123, "product_id": 1, "quantity": 2, "price": 10.40, "currency": "USD"}], "required": ["user_id", "product_id", "quantity", "price", "currency"], "properties": { "user_id": { "$id": "#/properties/user_id", "type": "integer", "title": "The unique identifier of the user", "examples": [123], "maxLength": 10, }, "product_id": { "$id": "#/properties/product_id", "type": "integer", "title": "The unique identifier of the product", "examples": [1], "maxLength": 10, }, "quantity": { "$id": "#/properties/quantity", "type": "integer", "title": "The quantity of the product", "examples": [2], "maxLength": 10, }, "price": { "$id": "#/properties/price", "type": "number", "title": "The individual price of the product", "examples": [10.40], 
"maxLength": 10, }, "currency": { "$id": "#/properties/currency", "type": "string", "title": "The currency", "examples": ["The currency of the order"], "maxLength": 100, }, }, } ``` ``` { "payload":"{\"user_id\": 123, \"product_id\": 1, \"quantity\": 2, \"price\": 10.40, \"currency\": \"USD\"}" } ``` > **Idempotency scenario** This sample will deserialize the JSON string within the `body` key before [Idempotency](../idempotency/) processes it. ``` import json from uuid import uuid4 import requests from aws_lambda_powertools.utilities.idempotency import ( DynamoDBPersistenceLayer, IdempotencyConfig, idempotent, ) persistence_layer = DynamoDBPersistenceLayer(table_name="IdempotencyTable") # Treat everything under the "body" key # in the event json object as our payload config = IdempotencyConfig(event_key_jmespath="powertools_json(body)") class PaymentError(Exception): ... @idempotent(config=config, persistence_store=persistence_layer) def handler(event, context) -> dict: body = json.loads(event["body"]) try: payment: dict = create_subscription_payment(user=body["user"], product_id=body["product_id"]) return {"payment_id": payment.get("id"), "message": "success", "statusCode": 200} except requests.HTTPError as e: raise PaymentError("Unable to create payment subscription") from e def create_subscription_payment(user: str, product_id: str) -> dict: payload = {"user": user, "product_id": product_id} ret: requests.Response = requests.post(url="https://httpbin.org/anything", data=payload) ret.raise_for_status() return {"id": f"{uuid4()}", "message": "paid"} ``` ``` { "version":"2.0", "routeKey":"ANY /createpayment", "rawPath":"/createpayment", "rawQueryString":"", "headers": { "Header1": "value1", "Header2": "value2" }, "requestContext":{ "accountId":"123456789012", "apiId":"api-id", "domainName":"id.execute-api.us-east-1.amazonaws.com", "domainPrefix":"id", "http":{ "method":"POST", "path":"/createpayment", "protocol":"HTTP/1.1", "sourceIp":"ip", "userAgent":"agent" }, "requestId":"id", "routeKey":"ANY /createpayment", "stage":"$default", "time":"10/Feb/2021:13:40:43 +0000", "timeEpoch":1612964443723 }, "body":"{\"user\":\"xyz\",\"product_id\":\"123456789\"}", "isBase64Encoded":false } ``` #### powertools_base64 function Use `powertools_base64` function to decode any base64 data. This sample will decode the base64 value within the `data` key, and deserialize the JSON string before validation. 
``` import base64 import binascii import json from dataclasses import asdict, dataclass, field, is_dataclass from uuid import uuid4 import powertools_base64_jmespath_schema as schemas from jmespath.exceptions import JMESPathTypeError from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import SchemaValidationError, validate @dataclass class Order: user_id: int product_id: int quantity: int price: float currency: str order_id: str = field(default_factory=lambda: f"{uuid4()}") class DataclassCustomEncoder(json.JSONEncoder): """A custom JSON encoder to serialize dataclass obj""" def default(self, obj): # Only called for values that aren't JSON serializable # where `obj` will be an instance of Todo in this example return asdict(obj) if is_dataclass(obj) else super().default(obj) def lambda_handler(event, context: LambdaContext) -> dict: # Try to validate the schema try: validate(event=event, schema=schemas.INPUT, envelope="powertools_json(powertools_base64(payload))") # alternatively, query works here too payload_decoded = base64.b64decode(event["payload"]).decode() order_payload: dict = json.loads(payload_decoded) return { "order": json.dumps(Order(**order_payload), cls=DataclassCustomEncoder), "message": "order created", "success": True, } except JMESPathTypeError: return return_error_message( "The powertools_json(powertools_base64()) envelope function must match a valid path.", ) except binascii.Error: return return_error_message("Payload must be a valid base64 encoded string") except json.JSONDecodeError: return return_error_message("Payload must be valid JSON (base64 encoded).") except SchemaValidationError as exception: # SchemaValidationError indicates where a data mismatch is return return_error_message(str(exception)) def return_error_message(message: str) -> dict: return {"order": None, "message": message, "success": False} ``` ``` INPUT = { "$schema": "http://json-schema.org/draft-07/schema", "$id": "http://example.com/example.json", "type": "object", "title": "Sample order schema", "description": "The root schema comprises the entire JSON document.", "examples": [{"user_id": 123, "product_id": 1, "quantity": 2, "price": 10.40, "currency": "USD"}], "required": ["user_id", "product_id", "quantity", "price", "currency"], "properties": { "user_id": { "$id": "#/properties/user_id", "type": "integer", "title": "The unique identifier of the user", "examples": [123], "maxLength": 10, }, "product_id": { "$id": "#/properties/product_id", "type": "integer", "title": "The unique identifier of the product", "examples": [1], "maxLength": 10, }, "quantity": { "$id": "#/properties/quantity", "type": "integer", "title": "The quantity of the product", "examples": [2], "maxLength": 10, }, "price": { "$id": "#/properties/price", "type": "number", "title": "The individual price of the product", "examples": [10.40], "maxLength": 10, }, "currency": { "$id": "#/properties/currency", "type": "string", "title": "The currency", "examples": ["The currency of the order"], "maxLength": 100, }, }, } ``` ``` { "payload":"eyJ1c2VyX2lkIjogMTIzLCAicHJvZHVjdF9pZCI6IDEsICJxdWFudGl0eSI6IDIsICJwcmljZSI6IDEwLjQwLCAiY3VycmVuY3kiOiAiVVNEIn0=" } ``` #### powertools_base64_gzip function Use `powertools_base64_gzip` function to decompress and decode base64 data. This sample will decompress and decode base64 data from Cloudwatch Logs, then use JMESPath pipeline expression to pass the result for decoding its JSON string. 
``` import base64 import binascii import gzip import json import powertools_base64_gzip_jmespath_schema as schemas from jmespath.exceptions import JMESPathTypeError from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import SchemaValidationError, validate def lambda_handler(event, context: LambdaContext) -> dict: try: validate(event=event, schema=schemas.INPUT, envelope="powertools_base64_gzip(payload) | powertools_json(@)") # Alternatively, query works here too encoded_payload = base64.b64decode(event["payload"]) uncompressed_payload = gzip.decompress(encoded_payload).decode() log: dict = json.loads(uncompressed_payload) return { "message": "Logs processed", "log_group": log.get("logGroup"), "owner": log.get("owner"), "success": True, } except JMESPathTypeError: return return_error_message("The powertools_base64_gzip() envelope function must match a valid path.") except binascii.Error: return return_error_message("Payload must be a valid base64 encoded string") except json.JSONDecodeError: return return_error_message("Payload must be valid JSON (base64 encoded).") except SchemaValidationError as exception: # SchemaValidationError indicates where a data mismatch is return return_error_message(str(exception)) def return_error_message(message: str) -> dict: return {"message": message, "success": False} ``` ``` INPUT = { "$schema": "http://json-schema.org/draft-07/schema", "$id": "http://example.com/example.json", "type": "object", "title": "Sample schema", "description": "The root schema comprises the entire JSON document.", "examples": [ { "owner": "123456789012", "logGroup": "/aws/lambda/powertools-example", "logStream": "2022/08/07/[$LATEST]d3a8dcaffc7f4de2b8db132e3e106660", "logEvents": {}, }, ], "required": ["owner", "logGroup", "logStream", "logEvents"], "properties": { "owner": { "$id": "#/properties/owner", "type": "string", "title": "The owner", "examples": ["123456789012"], "maxLength": 12, }, "logGroup": { "$id": "#/properties/logGroup", "type": "string", "title": "The logGroup", "examples": ["/aws/lambda/powertools-example"], "maxLength": 100, }, "logStream": { "$id": "#/properties/logStream", "type": "string", "title": "The logGroup", "examples": ["2022/08/07/[$LATEST]d3a8dcaffc7f4de2b8db132e3e106660"], "maxLength": 100, }, "logEvents": { "$id": "#/properties/logEvents", "type": "array", "title": "The logEvents", "examples": [ "{'id': 'eventId1', 'message': {'username': 'lessa', 'message': 'hello world'}, 'timestamp': 1440442987000}" # noqa E501 ], }, }, } ``` ``` { "payload": "H4sIACZAXl8C/52PzUrEMBhFX2UILpX8tPbHXWHqIOiq3Q1F0ubrWEiakqTWofTdTYYB0YWL2d5zvnuTFellBIOedoiyKH5M0iwnlKH7HZL6dDB6ngLDfLFYctUKjie9gHFaS/sAX1xNEq525QxwFXRGGMEkx4Th491rUZdV3YiIZ6Ljfd+lfSyAtZloacQgAkqSJCGhxM6t7cwwuUGPz4N0YKyvO6I9WDeMPMSo8Z4Ca/kJ6vMEYW5f1MX7W1lVxaG8vqX8hNFdjlc0iCBBSF4ERT/3Pl7RbMGMXF2KZMh/C+gDpNS7RRsp0OaRGzx0/t8e0jgmcczyLCWEePhni/23JWalzjdu0a3ZvgEaNLXeugEAAA==" } ``` ### Bring your own JMESPath function Warning This should only be used for advanced use cases where you have special formats not covered by the built-in functions. For special binary formats that you want to decode before applying JSON Schema validation, you can bring your own [JMESPath function](https://github.com/jmespath/jmespath.py#custom-functions) and any additional option via `jmespath_options` param. To keep Powertools for AWS Lambda (Python) built-in functions, you can subclass from `PowertoolsFunctions`. 
Here is an example of how to decompress messages using [zlib](https://docs.python.org/3/library/zlib.html): ``` import base64 import binascii import zlib from jmespath.exceptions import JMESPathTypeError from jmespath.functions import signature from aws_lambda_powertools.utilities.jmespath_utils import ( PowertoolsFunctions, query, ) class CustomFunctions(PowertoolsFunctions): # only decode if value is a string # see supported data types: https://jmespath.org/specification.html#built-in-functions @signature({"types": ["string"]}) def _func_decode_zlib_compression(self, payload: str): decoded: bytes = base64.b64decode(payload) return zlib.decompress(decoded) custom_jmespath_options = {"custom_functions": CustomFunctions()} def lambda_handler(event, context) -> dict: try: logs = [] logs.append( query( data=event, # NOTE: Use the prefix `_func_` before the name of the function envelope="Records[*].decode_zlib_compression(log)", jmespath_options=custom_jmespath_options, ), ) return {"logs": logs, "message": "Extracted messages", "success": True} except JMESPathTypeError: return return_error_message("The envelope function must match a valid path.") except zlib.error: return return_error_message("Log must be a valid zlib compressed message") except binascii.Error: return return_error_message("Log must be a valid base64 encoded string") def return_error_message(message: str) -> dict: return {"logs": None, "message": message, "success": False} ``` ``` { "Records": [ { "application": "notification", "datetime": "2022-01-01T00:00:00.000Z", "log": "eJxtUNFOwkAQ/JXN+dJq6e22tMD5ZHwQRYSEJpqQhtRy2AvlWq+tEr/eg6DExOzDJjM7M5tZsgCDgGPMKQaKRRAJRFjmRrUphBjRcIQXpy3gkiCvtJZ567jQVkDBwEc7JCK0sk2mSrkGh0IBc2l2qmlUpWEttZJrFz4LS/8YKP12cOjqpjUy23mQl0rqVpw9PWik+ZBGwMoDI9872ViazebJ/expARzGSTLn5BPzfm0sX7RtLTj/+xq3N0V11J+JITELnwHo2VlSzB86zQ+1CFtIiIJGcIWEmP4bDgH2AYH1GLBp9aXKMuORj+C8EF3Do9LdHvbDeBX3Xbip61I+y9eJankUDvwwBmcyTqaPHpRqK+FO5tvKhdvCVDvJCYPjYwiLbJMZdZKwQxZL02+NI3Vs" } ] } ``` Middleware factory provides a decorator factory to create your own middleware to run logic before, and after each Lambda invocation synchronously. ## Key features - Run logic before, after, and handle exceptions - Built-in tracing opt-in capability ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). You might need a custom middleware to abstract non-functional code. These are often custom authorization or any reusable logic you might need to run before/after a Lambda function invocation. ### Middleware with no params You can create your own middleware using `lambda_handler_decorator`. The decorator factory expects 3 arguments in your function signature: - **handler** - Lambda function handler - **event** - Lambda function invocation event - **context** - Lambda function context object ### Middleware with before logic ``` from dataclasses import dataclass, field from typing import Callable from uuid import uuid4 from aws_lambda_powertools.middleware_factory import lambda_handler_decorator from aws_lambda_powertools.utilities.jmespath_utils import ( envelopes, query, ) from aws_lambda_powertools.utilities.typing import LambdaContext @dataclass class Payment: user_id: str order_id: str amount: float status_id: str payment_id: str = field(default_factory=lambda: f"{uuid4()}") class PaymentError(Exception): ... 
@lambda_handler_decorator def middleware_before( handler: Callable[[dict, LambdaContext], dict], event: dict, context: LambdaContext, ) -> dict: # extract payload from a EventBridge event detail: dict = query(data=event, envelope=envelopes.EVENTBRIDGE) # check if status_id exists in payload, otherwise add default state before processing payment if "status_id" not in detail: event["detail"]["status_id"] = "pending" return handler(event, context) @middleware_before def lambda_handler(event: dict, context: LambdaContext) -> dict: try: payment_payload: dict = query(data=event, envelope=envelopes.EVENTBRIDGE) return { "order": Payment(**payment_payload).__dict__, "message": "payment created", "success": True, } except Exception as e: raise PaymentError("Unable to create payment") from e ``` ``` { "version": "0", "id": "9c95e8e4-96a4-ef3f-b739-b6aa5b193afb", "detail-type": "PaymentCreated", "source": "app.payment", "account": "0123456789012", "time": "2022-08-08T20:41:53Z", "region": "eu-east-1", "detail": { "amount": "150.00", "order_id": "8f1f1710-1b30-48a5-a6bd-153fd23b866b", "user_id": "f80e3c51-5b8c-49d5-af7d-c7804966235f" } } ``` ### Middleware with after logic ``` import time from typing import Callable import requests from requests import Response from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.middleware_factory import lambda_handler_decorator from aws_lambda_powertools.utilities.typing import LambdaContext app = APIGatewayRestResolver() @lambda_handler_decorator def middleware_after( handler: Callable[[dict, LambdaContext], dict], event: dict, context: LambdaContext, ) -> dict: start_time = time.time() response = handler(event, context) execution_time = time.time() - start_time # adding custom headers in response object after lambda executing response["headers"]["execution_time"] = execution_time response["headers"]["aws_request_id"] = context.aws_request_id return response @app.post("/todos") def create_todo() -> dict: todo_data: dict = app.current_event.json_body # deserialize json str to dict todo: Response = requests.post("https://jsonplaceholder.typicode.com/todos", data=todo_data) todo.raise_for_status() return {"todo": todo.json()} @middleware_after def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "resource": "/todos", "path": "/todos", "httpMethod": "POST", "body": "{\"title\": \"foo\", \"userId\": 1, \"completed\": false}" } ``` ### Middleware with params You can also have your own keyword arguments after the mandatory arguments. ``` import base64 from dataclasses import dataclass, field from typing import Any, Callable, List from uuid import uuid4 from aws_lambda_powertools.middleware_factory import lambda_handler_decorator from aws_lambda_powertools.utilities.jmespath_utils import ( envelopes, query, ) from aws_lambda_powertools.utilities.typing import LambdaContext @dataclass class Booking: days: int date_from: str date_to: str hotel_id: int country: str city: str guest: dict booking_id: str = field(default_factory=lambda: f"{uuid4()}") class BookingError(Exception): ... 
@lambda_handler_decorator def obfuscate_sensitive_data( handler: Callable[[dict, LambdaContext], dict], event: dict, context: LambdaContext, fields: List, ) -> dict: # extracting payload from a EventBridge event detail: dict = query(data=event, envelope=envelopes.EVENTBRIDGE) guest_data: Any = detail.get("guest") # Obfuscate fields (email, vat, passport) before calling Lambda handler for guest_field in fields: if guest_data.get(guest_field): event["detail"]["guest"][guest_field] = obfuscate_data(str(guest_data.get(guest_field))) return handler(event, context) def obfuscate_data(value: str) -> bytes: # base64 is not effective for obfuscation, this is an example return base64.b64encode(value.encode("ascii")) @obfuscate_sensitive_data(fields=["email", "passport", "vat"]) def lambda_handler(event: dict, context: LambdaContext) -> dict: try: booking_payload: dict = query(data=event, envelope=envelopes.EVENTBRIDGE) return { "book": Booking(**booking_payload).__dict__, "message": "booking created", "success": True, } except Exception as e: raise BookingError("Unable to create booking") from e ``` ``` { "version": "0", "id": "9c95e8e4-96a4-ef3f-b739-b6aa5b193afb", "detail-type": "BookingCreated", "source": "app.booking", "account": "0123456789012", "time": "2022-08-08T20:41:53Z", "region": "eu-east-1", "detail": { "days": 5, "date_from": "2020-08-08", "date_to": "2020-08-13", "hotel_id": "1", "country": "Portugal", "city": "Lisbon", "guest": { "name": "Lambda", "email": "lambda@powertool.tools", "passport": "AA123456", "vat": "123456789" } } } ``` ### Environment variables The following environment variable is available to configure the middleware factory at a global scope: | Setting | Description | Environment variable | Default | | --- | --- | --- | --- | | **Middleware Trace** | Creates sub-segment for each custom middleware. | `POWERTOOLS_TRACE_MIDDLEWARES` | `false` | You can also use [`POWERTOOLS_TRACE_MIDDLEWARES`](#tracing-middleware-execution) on a per-method basis, which will consequently override the environment variable value. ## Advanced For advanced use cases, you can instantiate [Tracer](../../core/tracer/) inside your middleware, and add annotations as well as metadata for additional operational insights. 
``` import time from typing import Callable import requests from requests import Response from aws_lambda_powertools import Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.middleware_factory import lambda_handler_decorator from aws_lambda_powertools.utilities.typing import LambdaContext tracer = Tracer() app = APIGatewayRestResolver() @lambda_handler_decorator(trace_execution=True) def middleware_with_advanced_tracing( handler: Callable[[dict, LambdaContext], dict], event: dict, context: LambdaContext, ) -> dict: tracer.put_metadata(key="resource", value=event.get("resource")) start_time = time.time() response = handler(event, context) execution_time = time.time() - start_time tracer.put_annotation(key="TotalExecutionTime", value=str(execution_time)) # adding custom headers in response object after lambda executing response["headers"]["execution_time"] = execution_time response["headers"]["aws_request_id"] = context.aws_request_id return response @app.get("/products") def create_product() -> dict: product: Response = requests.get("https://dummyjson.com/products/1") product.raise_for_status() return {"product": product.json()} @middleware_with_advanced_tracing def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "resource": "/products", "path": "/products", "httpMethod": "GET" } ``` ### Tracing middleware **execution** If you are making use of [Tracer](../../core/tracer/), you can trace the execution of your middleware to ease operations. This makes use of an existing Tracer instance that you may have initialized anywhere in your code. Warning You must [enable Active Tracing](../../core/tracer/#permissions) in your Lambda function when using this feature, otherwise Lambda cannot send traces to XRay. ``` import time from typing import Callable import requests from requests import Response from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.middleware_factory import lambda_handler_decorator from aws_lambda_powertools.utilities.typing import LambdaContext app = APIGatewayRestResolver() @lambda_handler_decorator(trace_execution=True) def middleware_with_tracing( handler: Callable[[dict, LambdaContext], dict], event: dict, context: LambdaContext, ) -> dict: start_time = time.time() response = handler(event, context) execution_time = time.time() - start_time # adding custom headers in response object after lambda executing response["headers"]["execution_time"] = execution_time response["headers"]["aws_request_id"] = context.aws_request_id return response @app.get("/products") def create_product() -> dict: product: Response = requests.get("https://dummyjson.com/products/1") product.raise_for_status() return {"product": product.json()} @middleware_with_tracing def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` { "resource": "/products", "path": "/products", "httpMethod": "GET" } ``` When executed, your middleware name will [appear in AWS X-Ray Trace details as](../../core/tracer/) `## middleware_name`, in this example the middleware name is `## middleware_with_tracing`. 
### Combining Powertools for AWS Lambda (Python) utilities You can create your own middleware and combine many features of Powertools for AWS Lambda (Python) such as [trace](../../core/logger/), [logs](../../core/logger/), [feature flags](../feature_flags/), [validation](../validation/), [jmespath_functions](../jmespath_functions/) and others to abstract non-functional code. In the example below, we create a Middleware with the following features: - Logs and traces - Validate if the payload contains a specific header - Extract specific keys from event - Automatically add security headers on every execution - Validate if a specific feature flag is enabled - Save execution history to a DynamoDB table ``` import json from typing import Callable from urllib.parse import quote import boto3 import combining_powertools_utilities_schema as schemas import requests from aws_lambda_powertools import Logger, Tracer from aws_lambda_powertools.event_handler import APIGatewayRestResolver from aws_lambda_powertools.event_handler.exceptions import InternalServerError from aws_lambda_powertools.middleware_factory import lambda_handler_decorator from aws_lambda_powertools.utilities.feature_flags import AppConfigStore, FeatureFlags from aws_lambda_powertools.utilities.feature_flags.types import JSONType from aws_lambda_powertools.utilities.jmespath_utils import query from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import SchemaValidationError, validate app = APIGatewayRestResolver() tracer = Tracer() logger = Logger() table_historic = boto3.resource("dynamodb").Table("HistoricTable") app_config = AppConfigStore(environment="dev", application="comments", name="features") feature_flags = FeatureFlags(store=app_config) @lambda_handler_decorator(trace_execution=True) def middleware_custom( handler: Callable[[dict, LambdaContext], dict], event: dict, context: LambdaContext, ) -> dict: # validating the INPUT with the given schema # X-Customer-Id header must be informed in all requests try: validate(event=event, schema=schemas.INPUT) except SchemaValidationError as e: return { "statusCode": 400, "body": json.dumps(str(e)), } # extracting headers and requestContext from event headers = query(data=event, envelope="headers") request_context = query(data=event, envelope="requestContext") logger.debug(f"X-Customer-Id => {headers.get('X-Customer-Id')}") tracer.put_annotation(key="CustomerId", value=headers.get("X-Customer-Id")) response = handler(event, context) # automatically adding security headers to all responses # see: https://securityheaders.com/ logger.info("Injecting security headers") response["headers"]["Referrer-Policy"] = "no-referrer" response["headers"]["Strict-Transport-Security"] = "max-age=15552000; includeSubDomains; preload" response["headers"]["X-DNS-Prefetch-Control"] = "off" response["headers"]["X-Content-Type-Options"] = "nosniff" response["headers"]["X-Permitted-Cross-Domain-Policies"] = "none" response["headers"]["X-Download-Options"] = "noopen" logger.info("Saving api call in history table") save_api_execution_history(str(event.get("path")), headers, request_context) # return lambda execution return response @tracer.capture_method def save_api_execution_history(path: str, headers: dict, request_context: dict) -> None: try: # using the feature flags utility to check if the new feature "save api call to history" is enabled by default # see: https://docs.powertools.aws.dev/lambda/python/latest/utilities/feature_flags/#static-flags 
save_history: JSONType = feature_flags.evaluate(name="save_history", default=False) if save_history: # saving history in dynamodb table tracer.put_metadata(key="execution detail", value=request_context) table_historic.put_item( Item={ "customer_id": headers.get("X-Customer-Id"), "request_id": request_context.get("requestId"), "path": path, "request_time": request_context.get("requestTime"), "source_ip": request_context.get("identity", {}).get("sourceIp"), "http_method": request_context.get("httpMethod"), }, ) return None except Exception: # you can add more logic here to handle exceptions or even save this to a DLQ # but not to make this example too long, we just return None since the Lambda has been successfully executed return None @app.get("/comments") @tracer.capture_method def get_comments(): try: comments: requests.Response = requests.get("https://jsonplaceholder.typicode.com/comments") comments.raise_for_status() return {"comments": comments.json()[:10]} except Exception as exc: raise InternalServerError(str(exc)) from exc @app.get("/comments/") @tracer.capture_method def get_comments_by_id(comment_id: str): try: comment_id = quote(comment_id, safe="") comments: requests.Response = requests.get(f"https://jsonplaceholder.typicode.com/comments/{comment_id}") comments.raise_for_status() return {"comments": comments.json()} except Exception as exc: raise InternalServerError(str(exc)) from exc @middleware_custom def lambda_handler(event: dict, context: LambdaContext) -> dict: return app.resolve(event, context) ``` ``` INPUT = { "$schema": "http://json-schema.org/draft-07/schema#", "$id": "https://example.com/object1661012141.json", "title": "Root", "type": "object", "required": ["headers"], "properties": { "headers": { "$id": "#root/headers", "title": "Headers", "type": "object", "required": ["X-Customer-Id"], "properties": { "X-Customer-Id": { "$id": "#root/headers/X-Customer-Id", "title": "X-customer-id", "type": "string", "default": "", "examples": ["1"], "pattern": "^.*$", }, }, }, }, } ``` ``` { "body":"None", "headers":{ "Accept":"*/*", "Accept-Encoding":"gzip, deflate, br", "Connection":"keep-alive", "Host":"127.0.0.1:3001", "Postman-Token":"a9d49365-ebe1-4bb0-8627-d5e37cdce86d", "User-Agent":"PostmanRuntime/7.29.0", "X-Customer-Id":"1", "X-Forwarded-Port":"3001", "X-Forwarded-Proto":"http" }, "httpMethod":"GET", "isBase64Encoded":false, "multiValueHeaders":{ "Accept":[ "*/*" ], "Accept-Encoding":[ "gzip, deflate, br" ], "Connection":[ "keep-alive" ], "Host":[ "127.0.0.1:3001" ], "Postman-Token":[ "a9d49365-ebe1-4bb0-8627-d5e37cdce86d" ], "User-Agent":[ "PostmanRuntime/7.29.0" ], "X-Customer-Id":[ "1" ], "X-Forwarded-Port":[ "3001" ], "X-Forwarded-Proto":[ "http" ] }, "multiValueQueryStringParameters":"None", "path":"/comments", "pathParameters":"None", "queryStringParameters":"None", "requestContext":{ "accountId":"123456789012", "apiId":"1234567890", "domainName":"127.0.0.1:3001", "extendedRequestId":"None", "httpMethod":"GET", "identity":{ "accountId":"None", "apiKey":"None", "caller":"None", "cognitoAuthenticationProvider":"None", "cognitoAuthenticationType":"None", "cognitoIdentityPoolId":"None", "sourceIp":"127.0.0.1", "user":"None", "userAgent":"Custom User Agent String", "userArn":"None" }, "path":"/comments", "protocol":"HTTP/1.1", "requestId":"56d1a102-6d9d-4f13-b4f7-26751c10a131", "requestTime":"20/Aug/2022:18:18:58 +0000", "requestTimeEpoch":1661019538, "resourceId":"123456", "resourcePath":"/comments", "stage":"Prod" }, "resource":"/comments", "stageVariables":"None", 
"version":"1.0" } ``` ``` AWSTemplateFormatVersion: '2010-09-09' Transform: AWS::Serverless-2016-10-31 Description: Middleware-powertools-utilities example Globals: Function: Timeout: 5 Runtime: python3.12 Tracing: Active Architectures: - x86_64 Environment: Variables: POWERTOOLS_LOG_LEVEL: DEBUG POWERTOOLS_LOGGER_SAMPLE_RATE: 0.1 POWERTOOLS_LOGGER_LOG_EVENT: true POWERTOOLS_SERVICE_NAME: middleware Resources: MiddlewareFunction: Type: AWS::Serverless::Function # More info about Function Resource: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessfunction Properties: CodeUri: middleware/ Handler: app.lambda_handler Description: Middleware function Policies: - AWSLambdaBasicExecutionRole # Managed Policy - Version: '2012-10-17' # Policy Document Statement: - Effect: Allow Action: - dynamodb:PutItem Resource: !GetAtt HistoryTable.Arn - Effect: Allow Action: # https://docs.aws.amazon.com/appconfig/latest/userguide/getting-started-with-appconfig-permissions.html - ssm:GetDocument - ssm:ListDocuments - appconfig:GetLatestConfiguration - appconfig:StartConfigurationSession - appconfig:ListApplications - appconfig:GetApplication - appconfig:ListEnvironments - appconfig:GetEnvironment - appconfig:ListConfigurationProfiles - appconfig:GetConfigurationProfile - appconfig:ListDeploymentStrategies - appconfig:GetDeploymentStrategy - appconfig:GetConfiguration - appconfig:ListDeployments - appconfig:GetDeployment Resource: "*" Events: GetComments: Type: Api Properties: Path: /comments Method: GET GetCommentsById: Type: Api Properties: Path: /comments/{comment_id} Method: GET # DynamoDB table to store historical data HistoryTable: Type: AWS::DynamoDB::Table Properties: TableName: "HistoryTable" AttributeDefinitions: - AttributeName: customer_id AttributeType: S - AttributeName: request_id AttributeType: S KeySchema: - AttributeName: customer_id KeyType: HASH - AttributeName: request_id KeyType: "RANGE" BillingMode: PAY_PER_REQUEST # Feature flags using AppConfig FeatureCommentApp: Type: AWS::AppConfig::Application Properties: Description: "Comments Application for feature toggles" Name: comments FeatureCommentDevEnv: Type: AWS::AppConfig::Environment Properties: ApplicationId: !Ref FeatureCommentApp Description: "Development Environment for the App Config Comments" Name: dev FeatureCommentConfigProfile: Type: AWS::AppConfig::ConfigurationProfile Properties: ApplicationId: !Ref FeatureCommentApp Name: features LocationUri: "hosted" HostedConfigVersion: Type: AWS::AppConfig::HostedConfigurationVersion Properties: ApplicationId: !Ref FeatureCommentApp ConfigurationProfileId: !Ref FeatureCommentConfigProfile Description: 'A sample hosted configuration version' Content: | { "save_history": { "default": true } } ContentType: 'application/json' # this is just an example # change this values according your deployment strategy BasicDeploymentStrategy: Type: AWS::AppConfig::DeploymentStrategy Properties: Name: "Deployment" Description: "Deployment strategy for comments app." 
DeploymentDurationInMinutes: 1 FinalBakeTimeInMinutes: 1 GrowthFactor: 100 GrowthType: LINEAR ReplicateTo: NONE ConfigDeployment: Type: AWS::AppConfig::Deployment Properties: ApplicationId: !Ref FeatureCommentApp ConfigurationProfileId: !Ref FeatureCommentConfigProfile ConfigurationVersion: !Ref HostedConfigVersion DeploymentStrategyId: !Ref BasicDeploymentStrategy EnvironmentId: !Ref FeatureCommentDevEnv ``` ## Tips - Use `trace_execution` to quickly understand the performance impact of your middlewares, and reduce or merge tasks when necessary - When nesting multiple middlewares, always return the handler with event and context, or response - Keep in mind [Python decorators execution order](https://realpython.com/primer-on-python-decorators/#nesting-decorators). Lambda handler is actually called once (top-down) - Async middlewares are not supported The parameters utility provides high-level functions to retrieve one or multiple parameter values from [AWS Systems Manager Parameter Store](https://docs.aws.amazon.com/systems-manager/latest/userguide/systems-manager-parameter-store.html), [AWS Secrets Manager](https://aws.amazon.com/secrets-manager/), [AWS AppConfig](https://docs.aws.amazon.com/appconfig/latest/userguide/what-is-appconfig.html), [Amazon DynamoDB](https://aws.amazon.com/dynamodb/), or bring your own. ## Key features - Retrieve one or multiple parameters from the underlying provider - Cache parameter values for a given amount of time (defaults to 5 minutes) - Transform parameter values from JSON or base 64 encoded strings - Bring Your Own Parameter Store Provider ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). By default, we fetch parameters from System Manager Parameter Store, secrets from Secrets Manager, and application configuration from AppConfig. ### IAM Permissions This utility requires additional permissions to work as expected. Note Different parameter providers require different permissions. | Provider | Function/Method | IAM Permission | | --- | --- | --- | | SSM | **`get_parameter`**, **`SSMProvider.get`** | **`ssm:GetParameter`** | | SSM | **`get_parameters`**, **`SSMProvider.get_multiple`** | **`ssm:GetParametersByPath`** | | SSM | **`get_parameters_by_name`**, **`SSMProvider.get_parameters_by_name`** | **`ssm:GetParameter`** and **`ssm:GetParameters`** | | SSM | **`set_parameter`**, **`SSMProvider.set_parameter`** | **`ssm:PutParameter`** | | SSM | If using **`decrypt=True`** | You must add an additional permission **`kms:Decrypt`** | | Secrets | **`get_secret`**, **`SecretsProvider.get`** | **`secretsmanager:GetSecretValue`** | | Secrets | **`set_secret`**, **`SecretsProvider.set`** | **`secretsmanager:PutSecretValue`** and **`secretsmanager:CreateSecret`** (if creating secrets) | | DynamoDB | **`DynamoDBProvider.get`** | **`dynamodb:GetItem`** | | DynamoDB | **`DynamoDBProvider.get_multiple`** | **`dynamodb:Query`** | | AppConfig | **`get_app_config`**, **`AppConfigProvider.get_app_config`** | **`appconfig:GetLatestConfiguration`** and **`appconfig:StartConfigurationSession`** | ### Fetching parameters You can retrieve a single parameter using the `get_parameter` high-level function. 
``` import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext) -> dict: try: # Retrieve a single parameter endpoint_comments = parameters.get_parameter("/lambda-powertools/endpoint_comments") # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` For multiple parameters, you can use either: - `get_parameters` to recursively fetch all parameters by path. - `get_parameters_by_name` to fetch distinct parameters by their full name. It also accepts custom caching, transform, decrypt per parameter. This is useful when you want to fetch all parameters from a given path, say `/dev`, e.g., `/dev/config`, `/dev/webhook/config` To ease readability in deeply nested paths, we strip the path name. For example: - `/dev/config` -> `config` - `/dev/webhook/config` -> `webhook/config` ``` import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve all parameters within a path e.g., /dev # Say, you had two parameters under `/dev`: /dev/config, /dev/webhook/config all_parameters: dict = parameters.get_parameters("/dev", max_age=20) endpoint_comments = None # We strip the path prefix name for readability and memory usage in deeply nested paths # all_parameters would then look like: ## all_parameters["config"] = value # noqa: ERA001 ## all_parameters["webhook/config"] = value # noqa: ERA001 for parameter, value in all_parameters.items(): if parameter == "endpoint_comments": endpoint_comments = value if endpoint_comments is None: return {"comments": None} # the value of parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10]} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` from __future__ import annotations from typing import Any from aws_lambda_powertools.utilities.parameters.ssm import get_parameters_by_name parameters = { "/develop/service/commons/telemetry/config": {"max_age": 300, "transform": "json"}, "/no_cache_param": {"max_age": 0}, # inherit default values "/develop/service/payment/api/capture/url": {}, } def handler(event, context): # This returns a dict with the parameter name as key response: dict[str, Any] = get_parameters_by_name(parameters=parameters, max_age=60) for parameter, value in response.items(): print(f"{parameter}: {value}") ``` Failing gracefully if one or more parameters cannot be fetched or decrypted. By default, we will raise `GetParameterError` when any parameter fails to be fetched. You can override it by setting `raise_on_error=False`. 
When disabled, we take the following actions: - Add failed parameter name in the `_errors` key, *e.g.*, `{_errors: ["/param1", "/param2"]}` - Keep only successful parameter names and their values in the response - Raise `GetParameterError` if any of your parameters is named `_errors` ``` from __future__ import annotations from typing import Any from aws_lambda_powertools.utilities.parameters.ssm import get_parameters_by_name parameters = { "/develop/service/commons/telemetry/config": {"max_age": 300, "transform": "json"}, # it would fail by default "/this/param/does/not/exist": {}, } def handler(event, context): values: dict[str, Any] = get_parameters_by_name(parameters=parameters, raise_on_error=False) errors: list[str] = values.get("_errors", []) # Handle gracefully, since '/this/param/does/not/exist' will only be available in `_errors` if errors: ... for parameter, value in values.items(): print(f"{parameter}: {value}") ``` ### Setting parameters You can set a parameter using the `set_parameter` high-level function. This will create a new parameter if it doesn't exist. ``` from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext) -> dict: try: # Set a single parameter, returns the version ID of the parameter parameter_version = parameters.set_parameter(name="/mySuper/Parameter", value="PowerToolsIsAwesome") return {"mySuperParameterVersion": parameter_version, "statusCode": 200} except parameters.exceptions.SetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` Sometimes you may be setting a parameter that you will have to update later on. Use the `overwrite` option to overwrite any existing value. If you do not set this option, the parameter value will not be overwritten and an exception will be raised. ``` from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext) -> dict: try: # Set a single parameter, but overwrite if it already exists. # Overwrite is False by default, so we explicitly set it to True updating_parameter = parameters.set_parameter( name="/mySuper/Parameter", value="PowerToolsIsAwesome", overwrite=True, ) return {"mySuperParameterVersion": updating_parameter, "statusCode": 200} except parameters.exceptions.SetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ### Fetching secrets You can fetch secrets stored in Secrets Manager using `get_secret`. 
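Secrets are frequently stored as JSON key/value pairs. In that case, you can combine `get_secret` with `transform="json"` (covered in more detail later on this page) to receive a `dict` instead of a raw string. A minimal sketch; the secret name `/lambda-powertools/db-credentials` and its keys are illustrative, and the full example right after shows fetching a secret alongside an SSM parameter:

```
from aws_lambda_powertools.utilities import parameters
from aws_lambda_powertools.utilities.typing import LambdaContext


def lambda_handler(event: dict, context: LambdaContext) -> dict:
    # Illustrative secret name; assumes its value is a JSON string
    # such as {"username": "...", "password": "..."}
    credentials: dict = parameters.get_secret("/lambda-powertools/db-credentials", transform="json")

    return {"user": credentials["username"], "statusCode": 200}
```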
``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): try: # Usually an endpoint is not sensitive data, so we store it in SSM Parameters endpoint_comments: Any = parameters.get_parameter("/lambda-powertools/endpoint_comments") # An API-KEY is a sensitive data and should be stored in SecretsManager api_key: Any = parameters.get_secret("/lambda-powertools/api-key") headers: dict = {"X-API-Key": api_key} comments: requests.Response = requests.get(endpoint_comments, headers=headers) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ### Setting secrets You can set secrets stored in Secrets Manager using `set_secret`. Note We strive to minimize API calls by attempting to update existing secrets as our primary approach. If a secret doesn't exist, we proceed to create a new one. ``` from typing import Any from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger(serialize_stacktrace=True) def access_token(client_id: str, client_secret: str, audience: str) -> str: # example function that returns a JWT Access Token # add your own logic here return f"{client_id}.{client_secret}.{audience}" def lambda_handler(event: dict, context: LambdaContext): try: client_id: Any = parameters.get_parameter("/aws-powertools/client_id") client_secret: Any = parameters.get_parameter("/aws-powertools/client_secret") audience: Any = parameters.get_parameter("/aws-powertools/audience") jwt_token = access_token(client_id=client_id, client_secret=client_secret, audience=audience) # set-secret will create a new secret if it doesn't exist and return the version id update_secret_version_id = parameters.set_secret(name="/aws-powertools/jwt_token", value=jwt_token) return {"access_token": "updated", "statusCode": 200, "update_secret_version_id": update_secret_version_id} except parameters.exceptions.SetSecretError as error: logger.exception(error) return {"access_token": "updated", "statusCode": 400} ``` ### Fetching app configurations You can fetch application configurations in AWS AppConfig using `get_app_config`. The following will retrieve the latest version and store it in the cache. Warning We make two API calls to fetch each unique configuration name during the first time. This is by design in AppConfig. Please consider adjusting `max_age` parameter to enhance performance. 
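For context, here is roughly what the utility wraps on a first fetch, and why two API calls are needed. This is a sketch using the low-level `appconfigdata` client directly; the application, environment, and configuration profile names are placeholders matching the examples below:

```
import boto3

# appconfigdata is the client the parameters utility uses for AppConfig
client = boto3.client("appconfigdata")

# First call: start a configuration session for the given application/environment/profile
session = client.start_configuration_session(
    ApplicationIdentifier="comments",
    EnvironmentIdentifier="dev",
    ConfigurationProfileIdentifier="config",
)

# Second call: retrieve the configuration data itself
response = client.get_latest_configuration(ConfigurationToken=session["InitialConfigurationToken"])
config_data: bytes = response["Configuration"].read()
```

Subsequent fetches within `max_age` are served from cache, avoiding both calls.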
```
from typing import Any

import requests

from aws_lambda_powertools.utilities import parameters
from aws_lambda_powertools.utilities.typing import LambdaContext


def lambda_handler(event: dict, context: LambdaContext):
    try:
        # Retrieve a single parameter
        endpoint_comments: Any = parameters.get_app_config(name="config", environment="dev", application="comments")

        # the value of this parameter is https://jsonplaceholder.typicode.com/comments/
        comments: requests.Response = requests.get(endpoint_comments)

        return {"comments": comments.json()[:10], "statusCode": 200}
    except parameters.exceptions.GetParameterError as error:
        return {"comments": None, "message": str(error), "statusCode": 400}
```

### Environment variables

The following environment variables are available to configure the parameter utility at a global scope:

| Setting | Description | Environment variable | Default |
| --- | --- | --- | --- |
| **Max Age** | Adjusts how long values are kept in cache (in seconds). | `POWERTOOLS_PARAMETERS_MAX_AGE` | `300` |
| **SSM Decrypt** | Sets whether to decrypt values retrieved from AWS SSM Parameter Store. | `POWERTOOLS_PARAMETERS_SSM_DECRYPT` | `false` |

You can also override [`POWERTOOLS_PARAMETERS_MAX_AGE`](#adjusting-cache-ttl) on a per-call basis with the `max_age` parameter, and [`POWERTOOLS_PARAMETERS_SSM_DECRYPT`](#ssmprovider) with the `decrypt` parameter.

## Advanced

### Adjusting cache TTL

Tip

`max_age` parameter is also available in underlying provider functions like `get()`, `get_multiple()`, etc.

By default, we cache parameters retrieved in-memory for 300 seconds (5 minutes). If you want to change this default value and set the same TTL for all parameters, you can set the `POWERTOOLS_PARAMETERS_MAX_AGE` environment variable. **You can still set `max_age` for individual parameters**.

You can adjust how long we should keep values in cache by using the `max_age` parameter when calling `get_parameter()`, `get_parameters()` and `get_secret()` across all providers.
``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve a single parameter with 20s cache endpoint_comments: Any = parameters.get_parameter("/lambda-powertools/endpoint_comments", max_age=20) # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve multiple parameters from a path prefix all_parameters: Any = parameters.get_parameters("/lambda-powertools/", max_age=20) endpoint_comments = "https://jsonplaceholder.typicode.com/noexists/" for parameter, value in all_parameters.items(): if parameter == "endpoint_comments": endpoint_comments = value # the value of parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10]} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): try: # Usually an endpoint is not sensitive data, so we store it in SSM Parameters endpoint_comments: Any = parameters.get_parameter("/lambda-powertools/endpoint_comments") # An API-KEY is a sensitive data and should be stored in SecretsManager api_key: Any = parameters.get_secret("/lambda-powertools/api-key", max_age=20) headers: dict = {"X-API-Key": api_key} comments: requests.Response = requests.get(endpoint_comments, headers=headers) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve a single parameter endpoint_comments: Any = parameters.get_app_config( name="config", environment="dev", application="comments", max_age=20, ) # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ### Always fetching the latest If you'd like to always ensure you fetch the latest parameter from the store regardless if already available in cache, use `force_fetch` param. 
``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve a single parameter with 20s cache endpoint_comments: Any = parameters.get_parameter("/lambda-powertools/endpoint_comments", force_fetch=True) # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve multiple parameters from a path prefix all_parameters: Any = parameters.get_parameters("/lambda-powertools/", force_fetch=True) endpoint_comments = "https://jsonplaceholder.typicode.com/noexists/" for parameter, value in all_parameters.items(): if parameter == "endpoint_comments": endpoint_comments = value # the value of parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10]} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): try: # Usually an endpoint is not sensitive data, so we store it in SSM Parameters endpoint_comments: Any = parameters.get_parameter("/lambda-powertools/endpoint_comments") # An API-KEY is a sensitive data and should be stored in SecretsManager api_key: Any = parameters.get_secret("/lambda-powertools/api-key", force_fetch=True) headers: dict = {"X-API-Key": api_key} comments: requests.Response = requests.get(endpoint_comments, headers=headers) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve a single parameter endpoint_comments: Any = parameters.get_app_config( name="config", environment="dev", application="comments", force_fetch=True, ) # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ### Built-in provider class For greater flexibility such as configuring the underlying SDK client used by built-in providers, you can use their respective Provider Classes directly. Tip This is useful when you need to customize parameters for the SDK client, such as region, credentials, retries and others. 
For more information, read [botocore.config](https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html) and [boto3.session](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html#module-boto3.session). #### SSMProvider ``` from typing import Any import requests from botocore.config import Config from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext # changing region_name, connect_timeout and retrie configurations # see: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html config = Config(region_name="sa-east-1", connect_timeout=1, retries={"total_max_attempts": 2, "max_attempts": 5}) ssm_provider = parameters.SSMProvider(config=config) def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve a single parameter endpoint_comments: Any = ssm_provider.get("/lambda-powertools/endpoint_comments") # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` from typing import Any import boto3 import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext # assuming role from another account to get parameter there # see: https://docs.aws.amazon.com/STS/latest/APIReference/API_AssumeRole.html sts_client = boto3.client("sts") assumed_role_object = sts_client.assume_role( RoleArn="arn:aws:iam::account-of-role-to-assume:role/name-of-role", RoleSessionName="RoleAssume1", ) credentials = assumed_role_object["Credentials"] # using temporary credentials in your SSMProvider provider # see: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html#module-boto3.session boto3_session = boto3.session.Session( region_name="us-east-1", aws_access_key_id=credentials["AccessKeyId"], aws_secret_access_key=credentials["SecretAccessKey"], aws_session_token=credentials["SessionToken"], ) ssm_provider = parameters.SSMProvider(boto3_session=boto3_session) def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve multiple parameters from a path prefix all_parameters: Any = ssm_provider.get_multiple("/lambda-powertools/") endpoint_comments = "https://jsonplaceholder.typicode.com/noexists/" for parameter, value in all_parameters.items(): if parameter == "endpoint_comments": endpoint_comments = value # the value of parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10]} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` The AWS Systems Manager Parameter Store provider supports two additional arguments for the `get()` and `get_multiple()` methods: | Parameter | Default | Description | | --- | --- | --- | | **decrypt** | `False` | Will automatically decrypt the parameter. | | **recursive** | `True` | For `get_multiple()` only, will fetch all parameter values recursively based on a path prefix. | You can create `SecureString` parameters, which are parameters that have a plaintext parameter name and an encrypted parameter value. If you don't use the `decrypt` argument, you will get an encrypted value. 
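The two arguments can also be combined. As a sketch, fetching an entire path of `SecureString` parameters in one call (the path is a placeholder, and decryption additionally requires `kms:Decrypt`):

```
from aws_lambda_powertools.utilities import parameters

ssm_provider = parameters.SSMProvider()


def lambda_handler(event: dict, context):
    # Fetch every parameter under the path (recursively) and decrypt their values
    secure_values: dict = ssm_provider.get_multiple("/lambda-powertools/secrets", recursive=True, decrypt=True)

    # Keys are the parameter names with the path prefix stripped, e.g. "db/password"
    return {"parameter_names": list(secure_values.keys())}
```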
Read [here](https://docs.aws.amazon.com/kms/latest/developerguide/services-parameter-store.html) about best practices using KMS to secure your parameters. Tip If you want to always decrypt parameters, you can set the `POWERTOOLS_PARAMETERS_SSM_DECRYPT=true` environment variable. **This will override the default value of `false` but you can still set the `decrypt` option for individual parameters**. ``` from typing import Any from uuid import uuid4 import boto3 from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext ec2 = boto3.resource("ec2") ssm_provider = parameters.SSMProvider() def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve the key pair from secure string parameter ec2_pem: Any = ssm_provider.get("/lambda-powertools/ec2_pem", decrypt=True) name_key_pair = f"kp_{uuid4()}" ec2.import_key_pair(KeyName=name_key_pair, PublicKeyMaterial=ec2_pem) ec2.create_instances( ImageId="ami-026b57f3c383c2eec", InstanceType="t2.micro", MinCount=1, MaxCount=1, KeyName=name_key_pair, ) return {"message": "EC2 created", "success": True} except parameters.exceptions.GetParameterError as error: return {"message": f"Error creating EC2 => {str(error)}", "success": False} ``` ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext ssm_provider = parameters.SSMProvider() class ConfigNotFound(Exception): ... def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve multiple parameters from a path prefix # /config = root # /config/endpoint = url # /config/endpoint/query = querystring all_parameters: Any = ssm_provider.get_multiple("/config", recursive=False) endpoint_comments = "https://jsonplaceholder.typicode.com/comments/" for parameter, value in all_parameters.items(): # query parameter is used to query endpoint if "query" in parameter: endpoint_comments = f"{endpoint_comments}{value}" break else: # scheme config was not found because get_multiple is not recursive raise ConfigNotFound("URL query parameter was not found") # the value of parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` #### SecretsProvider ``` from typing import Any import requests from botocore.config import Config from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext config = Config(region_name="sa-east-1", connect_timeout=1, retries={"total_max_attempts": 2, "max_attempts": 5}) ssm_provider = parameters.SecretsProvider(config=config) def lambda_handler(event: dict, context: LambdaContext): try: # Usually an endpoint is not sensitive data, so we store it in SSM Parameters endpoint_comments: Any = parameters.get_parameter("/lambda-powertools/endpoint_comments") # An API-KEY is a sensitive data and should be stored in SecretsManager api_key: Any = ssm_provider.get("/lambda-powertools/api-key") headers: dict = {"X-API-Key": api_key} comments: requests.Response = requests.get(endpoint_comments, headers=headers) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` #### DynamoDBProvider The DynamoDB Provider 
does not have any high-level functions, as it needs to know the name of the DynamoDB table containing the parameters. **DynamoDB table structure for single parameters** For single parameters, you must use `id` as the [partition key](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.CoreComponents.html#HowItWorks.CoreComponents.PrimaryKey) for that table. Example DynamoDB table with `id` partition key and `value` as attribute | id | value | | --- | --- | | my-parameter | my-value | With this table, `dynamodb_provider.get("my-parameter")` will return `my-value`. ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext dynamodb_provider = parameters.DynamoDBProvider(table_name="ParameterTable") def lambda_handler(event: dict, context: LambdaContext): try: # Usually an endpoint is not sensitive data, so we store it in DynamoDB Table endpoint_comments: Any = dynamodb_provider.get("comments_endpoint") comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} # general exception except Exception as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` AWSTemplateFormatVersion: '2010-09-09' Transform: AWS::Serverless-2016-10-31 Description: 'DynamoDB Table example' Resources: ParameterTable: Type: AWS::DynamoDB::Table Properties: TableName: ParameterTable AttributeDefinitions: - AttributeName: id AttributeType: S KeySchema: - AttributeName: id KeyType: HASH TimeToLiveSpecification: AttributeName: expiration Enabled: true BillingMode: PAY_PER_REQUEST ``` You can initialize the DynamoDB provider pointing to [DynamoDB Local](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html) using `endpoint_url` parameter: ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext dynamodb_provider = parameters.DynamoDBProvider(table_name="ParameterTable", endpoint_url="http://localhost:8000") def lambda_handler(event: dict, context: LambdaContext): try: # Usually an endpoint is not sensitive data, so we store it in DynamoDB Table endpoint_comments: Any = dynamodb_provider.get("comments_endpoint") comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} # general exception except Exception as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` **DynamoDB table structure for multiple values parameters** You can retrieve multiple parameters sharing the same `id` by having a sort key named `sk`. Example DynamoDB table with `id` primary key, `sk` as sort key and `value` as attribute | id | sk | value | | --- | --- | --- | | config | endpoint_comments | | | config | limit | 10 | With this table, `dynamodb_provider.get_multiple("config")` will return a dictionary response in the shape of `sk:value`. 
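To make that shape concrete, a short sketch (the values shown in the comment are placeholders for whatever you stored in the table):

```
from aws_lambda_powertools.utilities import parameters

dynamodb_provider = parameters.DynamoDBProvider(table_name="ParameterTable")

values = dynamodb_provider.get_multiple("config")
# `values` is keyed by the sort key (sk), for example:
# {"endpoint_comments": "<endpoint URL stored in the table>", "limit": "10"}
```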
``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext dynamodb_provider = parameters.DynamoDBProvider(table_name="ParameterTable") def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve multiple parameters using HASH KEY all_parameters: Any = dynamodb_provider.get_multiple("config") endpoint_comments = "https://jsonplaceholder.typicode.com/noexists/" limit = 2 for parameter, value in all_parameters.items(): if parameter == "endpoint_comments": endpoint_comments = value if parameter == "limit": limit = int(value) # the value of parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[limit]} # general exception except Exception as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` AWSTemplateFormatVersion: '2010-09-09' Transform: AWS::Serverless-2016-10-31 Description: 'DynamoDB Table example' Resources: ParameterTable: Type: AWS::DynamoDB::Table Properties: TableName: ParameterTable AttributeDefinitions: - AttributeName: id AttributeType: S - AttributeName: sk AttributeType: S KeySchema: - AttributeName: id KeyType: HASH - AttributeName: sk KeyType: RANGE TimeToLiveSpecification: AttributeName: expiration Enabled: true BillingMode: PAY_PER_REQUEST ``` **Customizing DynamoDBProvider** DynamoDB provider can be customized at initialization to match your table structure: | Parameter | Mandatory | Default | Description | | --- | --- | --- | --- | | **table_name** | **Yes** | *(N/A)* | Name of the DynamoDB table containing the parameter values. | | **key_attr** | No | `id` | Hash key for the DynamoDB table. | | **sort_attr** | No | `sk` | Range key for the DynamoDB table. You don't need to set this if you don't use the `get_multiple()` method. | | **value_attr** | No | `value` | Name of the attribute containing the parameter value. 
| ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext dynamodb_provider = parameters.DynamoDBProvider( table_name="ParameterTable", key_attr="IdKeyAttr", sort_attr="SkKeyAttr", value_attr="ValueAttr", ) def lambda_handler(event: dict, context: LambdaContext): try: # Usually an endpoint is not sensitive data, so we store it in DynamoDB Table endpoint_comments: Any = dynamodb_provider.get("comments_endpoint") comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} # general exception except Exception as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` AWSTemplateFormatVersion: '2010-09-09' Transform: AWS::Serverless-2016-10-31 Description: 'DynamoDB Table example' Resources: ParameterTable: Type: AWS::DynamoDB::Table Properties: TableName: ParameterTable AttributeDefinitions: - AttributeName: IdKeyAttr AttributeType: S - AttributeName: SkKeyAttr AttributeType: S KeySchema: - AttributeName: IdKeyAttr KeyType: HASH - AttributeName: SkKeyAttr KeyType: RANGE TimeToLiveSpecification: AttributeName: expiration Enabled: true BillingMode: PAY_PER_REQUEST ``` #### AppConfigProvider ``` from typing import Any import requests from botocore.config import Config from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext config = Config(region_name="sa-east-1") appconf_provider = parameters.AppConfigProvider(environment="dev", application="comments", config=config) def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve a single parameter endpoint_comments: Any = appconf_provider.get("config") # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ### Create your own provider You can create your own custom parameter store provider by inheriting the `BaseProvider` class, and implementing both `_get()` and `_get_multiple()` methods to retrieve a single, or multiple parameters from your custom store. All transformation and caching logic is handled by the `get()` and `get_multiple()` methods from the base provider class. Here are two examples of implementing a custom parameter store. One using an external service like [Hashicorp Vault](https://www.vaultproject.io/), a widely popular key-value and secret storage and the other one using [Amazon S3](https://aws.amazon.com/s3/?nc1=h_ls), a popular object storage. ``` from typing import Any import hvac import requests from custom_provider_vault import VaultProvider from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() # In production you must use Vault over HTTPS and certificates. 
vault_provider = VaultProvider(vault_url="http://192.168.68.105:8200/", vault_token="YOUR_TOKEN") def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve a single parameter endpoint_comments: Any = vault_provider.get("comments_endpoint") # you can get all parameters using get_multiple and specifying vault mount point # # for testing purposes we will not use it all_parameters: Any = vault_provider.get_multiple("/") logger.info(all_parameters) # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments["url"]) return {"comments": comments.json()[:10], "statusCode": 200} except hvac.exceptions.InvalidPath as error: return {"comments": None, "message": str(error), "statusCode": 400} # general exception except Exception as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` from typing import Any, Dict from hvac import Client from aws_lambda_powertools.utilities.parameters import BaseProvider class VaultProvider(BaseProvider): def __init__(self, vault_url: str, vault_token: str) -> None: super().__init__() self.vault_client = Client(url=vault_url, verify=False, timeout=10) self.vault_client.token = vault_token def _get(self, name: str, **sdk_options) -> Dict[str, Any]: # for example proposal, the mountpoint is always /secret kv_configuration = self.vault_client.secrets.kv.v2.read_secret(path=name) return kv_configuration["data"]["data"] def _get_multiple(self, path: str, **sdk_options) -> Dict[str, str]: list_secrets = {} all_secrets = self.vault_client.secrets.kv.v2.list_secrets(path=path) # for example proposal, the mountpoint is always /secret for secret in all_secrets["data"]["keys"]: kv_configuration = self.vault_client.secrets.kv.v2.read_secret(path=secret) for key, value in kv_configuration["data"]["data"].items(): list_secrets[key] = value return list_secrets ``` ``` from typing import Any import requests from custom_provider_s3 import S3Provider from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() s3_provider = S3Provider(bucket_name="bucket_name") def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve a single parameter using key endpoint_comments: Any = s3_provider.get("comments_endpoint") # you can get all parameters using get_multiple and specifying a bucket prefix # # for testing purposes we will not use it all_parameters: Any = s3_provider.get_multiple("/") logger.info(all_parameters) # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} # general exception except Exception as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` import copy from typing import Dict import boto3 from aws_lambda_powertools.utilities.parameters import BaseProvider class S3Provider(BaseProvider): def __init__(self, bucket_name: str): # Initialize the client to your custom parameter store # E.g.: super().__init__() self.bucket_name = bucket_name self.client = boto3.client("s3") def _get(self, name: str, **sdk_options) -> str: # Retrieve a single value # E.g.: sdk_options["Bucket"] = self.bucket_name sdk_options["Key"] = name response = self.client.get_object(**sdk_options) return response["Body"].read().decode() def _get_multiple(self, path: str, **sdk_options) -> Dict[str, str]: # Retrieve multiple values # 
E.g.: list_sdk_options = copy.deepcopy(sdk_options) list_sdk_options["Bucket"] = self.bucket_name list_sdk_options["Prefix"] = path list_response = self.client.list_objects_v2(**list_sdk_options) parameters = {} for obj in list_response.get("Contents", []): get_sdk_options = copy.deepcopy(sdk_options) get_sdk_options["Bucket"] = self.bucket_name get_sdk_options["Key"] = obj["Key"] get_response = self.client.get_object(**get_sdk_options) parameters[obj["Key"]] = get_response["Body"].read().decode() return parameters ``` ### Deserializing values with transform parameter For parameters stored in JSON or Base64 format, you can use the `transform` argument for deserialization. Info The `transform` argument is available across all providers, including the high level functions. ``` from typing import Any import requests from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext) -> dict: try: # Retrieve a single parameter endpoint_comments: Any = parameters.get_parameter("/lambda-powertools/endpoint_comments", transform="json") # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` ``` from typing import Any import requests from botocore.config import Config from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext config = Config(region_name="sa-east-1") appconf_provider = parameters.AppConfigProvider(environment="dev", application="comments", config=config) def lambda_handler(event: dict, context: LambdaContext): try: # Retrieve a single parameter endpoint_comments: Any = appconf_provider.get("config", transform="json") # the value of this parameter is https://jsonplaceholder.typicode.com/comments/ comments: requests.Response = requests.get(endpoint_comments) return {"comments": comments.json()[:10], "statusCode": 200} except parameters.exceptions.GetParameterError as error: return {"comments": None, "message": str(error), "statusCode": 400} ``` #### Partial transform failures with `get_multiple()` If you use `transform` with `get_multiple()`, you can have a single malformed parameter value. To prevent failing the entire request, the method will return a `None` value for the parameters that failed to transform. You can override this by setting the `raise_on_transform_error` argument to `True`. If you do so, a single transform error will raise a **`TransformParameterError`** exception. 
For example, if you have three parameters, */param/a*, */param/b* and */param/c*, but */param/c* is malformed:

```
from typing import Any

from aws_lambda_powertools.utilities import parameters
from aws_lambda_powertools.utilities.typing import LambdaContext

ssm_provider = parameters.SSMProvider()


def lambda_handler(event: dict, context: LambdaContext):
    # This will display:
    # /param/a: [some value]
    # /param/b: [some value]
    # /param/c: None
    values: Any = ssm_provider.get_multiple("/param", transform="json")
    for key, value in values.items():
        print(f"{key}: {value}")

    try:
        # This will raise a TransformParameterError exception
        values = ssm_provider.get_multiple("/param", transform="json", raise_on_transform_error=True)
    except parameters.exceptions.TransformParameterError:
        ...
```

#### Auto-transform values on suffix

If you use `transform` with `get_multiple()`, you might want to retrieve and transform parameters encoded in different formats. You can do this with a single request by using `transform="auto"`. This instructs the provider to infer each parameter's type from its suffix and transform it accordingly.

Info

The `transform="auto"` feature is available across all providers, including the high level functions.

```
from aws_lambda_powertools.utilities import parameters
from aws_lambda_powertools.utilities.typing import LambdaContext

ssm_provider = parameters.SSMProvider()


def lambda_handler(event: dict, context: LambdaContext):
    values = ssm_provider.get_multiple("/param", transform="auto")

    return values
```

For example, if you have two parameters with the following suffixes `.json` and `.binary`:

| Parameter name | Parameter value |
| --- | --- |
| /param/a.json | [some encoded value] |
| /param/a.binary | [some encoded value] |

The `ssm_provider.get_multiple("/param", transform="auto")` call will then return a dictionary like:

```
{
    "a.json": [some value],
    "a.binary": [some value]
}
```

### Passing additional SDK arguments

You can pass arbitrary keyword arguments directly to the underlying SDK method.

```
from aws_lambda_powertools.utilities import parameters
from aws_lambda_powertools.utilities.typing import LambdaContext

secrets_provider = parameters.SecretsProvider()


def lambda_handler(event: dict, context: LambdaContext):
    # The 'VersionId' argument will be passed to the underlying get_secret_value() call.
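    # Keyword arguments are forwarded to the SDK untouched, so they must match the boto3
    # parameter names exactly (here, get_secret_value's 'VersionId').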
value = secrets_provider.get("my-secret", VersionId="e62ec170-6b01-48c7-94f3-d7497851a8d2") return value ``` Here is the mapping between this utility's functions and methods and the underlying SDK: | Provider | Function/Method | Client name | Function name | | --- | --- | --- | --- | | SSM Parameter Store | `get_parameter` | `ssm` | [get_parameter](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ssm.html#SSM.Client.get_parameter) | | SSM Parameter Store | `get_parameters` | `ssm` | [get_parameters_by_path](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ssm.html#SSM.Client.get_parameters_by_path) | | SSM Parameter Store | `SSMProvider.get` | `ssm` | [get_parameter](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ssm.html#SSM.Client.get_parameter) | | SSM Parameter Store | `SSMProvider.get_multiple` | `ssm` | [get_parameters_by_path](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ssm.html#SSM.Client.get_parameters_by_path) | | Secrets Manager | `get_secret` | `secretsmanager` | [get_secret_value](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/secretsmanager.html#SecretsManager.Client.get_secret_value) | | Secrets Manager | `SecretsProvider.get` | `secretsmanager` | [get_secret_value](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/secretsmanager.html#SecretsManager.Client.get_secret_value) | | DynamoDB | `DynamoDBProvider.get` | `dynamodb` | ([Table resource](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#table)) | | DynamoDB | `DynamoDBProvider.get_multiple` | `dynamodb` | ([Table resource](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#table)) | | App Config | `get_app_config` | `appconfigdata` | [start_configuration_session](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/appconfigdata.html#AppConfigData.Client.start_configuration_session) and [get_latest_configuration](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/appconfigdata.html#AppConfigData.Client.get_latest_configuration) | ### Bring your own boto client You can use `boto3_client` parameter via any of the available [Provider Classes](#built-in-provider-class). 
Some providers expect a low-level boto3 client while others expect a high-level boto3 client; here is the mapping for each of them:

| Provider | Type | Boto client construction |
| --- | --- | --- |
| [SSMProvider](#ssmprovider) | low level | `boto3.client("ssm")` |
| [SecretsProvider](#secretsprovider) | low level | `boto3.client("secretsmanager")` |
| [AppConfigProvider](#appconfigprovider) | low level | `boto3.client("appconfigdata")` |
| [DynamoDBProvider](#dynamodbprovider) | high level | `boto3.resource("dynamodb")` |

Bringing them together in a single code snippet would look like this:

```
import boto3
from botocore.config import Config

from aws_lambda_powertools.utilities import parameters

config = Config(region_name="us-west-1")

# construct boto clients with any custom configuration
ssm = boto3.client("ssm", config=config)
secrets = boto3.client("secretsmanager", config=config)
appconfig = boto3.client("appconfigdata", config=config)
dynamodb = boto3.resource("dynamodb", config=config)

ssm_provider = parameters.SSMProvider(boto3_client=ssm)
secrets_provider = parameters.SecretsProvider(boto3_client=secrets)
appconf_provider = parameters.AppConfigProvider(boto3_client=appconfig, environment="my_env", application="my_app")
dynamodb_provider = parameters.DynamoDBProvider(boto3_client=dynamodb, table_name="my-table")
```

When is this useful?

Injecting a custom boto3 client can make unit/snapshot testing easier, including SDK customizations.

### Customizing boto configuration

The **`boto_config`**, **`boto3_session`**, and **`boto3_client`** parameters enable you to pass in a custom [botocore config object](https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html), [boto3 session](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html), or a [boto3 client](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/boto3.html) when constructing any of the built-in provider classes.

Tip

You can use a custom session for retrieving parameters cross-account/region and for snapshot testing. When using VPC private endpoints, you can pass a custom client altogether. It's also useful for testing when injecting fake instances.

```
import boto3

from aws_lambda_powertools.utilities import parameters

boto3_session = boto3.session.Session()
ssm_provider = parameters.SSMProvider(boto3_session=boto3_session)


def handler(event, context):
    # Retrieve a single parameter
    value = ssm_provider.get("/my/parameter")

    return value
```

```
from botocore.config import Config

from aws_lambda_powertools.utilities import parameters

boto_config = Config()
ssm_provider = parameters.SSMProvider(boto_config=boto_config)


def handler(event, context):
    # Retrieve a single parameter
    value = ssm_provider.get("/my/parameter")

    return value
```

```
import boto3

from aws_lambda_powertools.utilities import parameters

boto3_client = boto3.client("ssm")
ssm_provider = parameters.SSMProvider(boto3_client=boto3_client)


def handler(event, context):
    # Retrieve a single parameter
    value = ssm_provider.get("/my/parameter")

    return value
```

## Testing your code

### Mocking parameter values

For unit testing your applications, you can mock the calls to the parameters utility to avoid calling AWS APIs.
This can be achieved in a number of ways - in this example, we use the [pytest monkeypatch fixture](https://docs.pytest.org/en/latest/how-to/monkeypatch.html) to patch the `parameters.get_parameter` method: ``` from src import single_mock def test_handler(monkeypatch): def mockreturn(name): return "mock_value" monkeypatch.setattr(single_mock.parameters, "get_parameter", mockreturn) return_val = single_mock.handler({}, {}) assert return_val.get("message") == "mock_value" ``` ``` from aws_lambda_powertools.utilities import parameters def handler(event, context): # Retrieve a single parameter value = parameters.get_parameter("my-parameter-name") return {"message": value} ``` If we need to use this pattern across multiple tests, we can avoid repetition by refactoring to use our own pytest fixture: ``` import pytest from src import single_mock @pytest.fixture def mock_parameter_response(monkeypatch): def mockreturn(name): return "mock_value" monkeypatch.setattr(single_mock.parameters, "get_parameter", mockreturn) # Pass our fixture as an argument to all tests where we want to mock the get_parameter response def test_handler(mock_parameter_response): return_val = single_mock.handler({}, {}) assert return_val.get("message") == "mock_value" ``` Alternatively, if we need more fully featured mocking (for example checking the arguments passed to `get_parameter`), we can use [unittest.mock](https://docs.python.org/3/library/unittest.mock.html) from the python stdlib instead of pytest's `monkeypatch` fixture. In this example, we use the [patch](https://docs.python.org/3/library/unittest.mock.html#unittest.mock.patch) decorator to replace the `aws_lambda_powertools.utilities.parameters.get_parameter` function with a [MagicMock](https://docs.python.org/3/library/unittest.mock.html#unittest.mock.MagicMock) object named `get_parameter_mock`. ``` from unittest.mock import patch from src import single_mock # Replaces "aws_lambda_powertools.utilities.parameters.get_parameter" with a Mock object @patch("aws_lambda_powertools.utilities.parameters.get_parameter") def test_handler(get_parameter_mock): get_parameter_mock.return_value = "mock_value" return_val = single_mock.handler({}, {}) get_parameter_mock.assert_called_with("my-parameter-name") assert return_val.get("message") == "mock_value" ``` ### Clearing cache Parameters utility caches all parameter values for performance and cost reasons. However, this can have unintended interference in tests using the same parameter name. Within your tests, you can use `clear_cache` method available in [every provider](#built-in-provider-class). When using multiple providers or higher level functions like `get_parameter`, use `clear_caches` standalone function to clear cache globally. 
``` import pytest from src import app @pytest.fixture(scope="function", autouse=True) def clear_parameters_cache(): yield app.ssm_provider.clear_cache() # This will clear SSMProvider cache @pytest.fixture def mock_parameter_response(monkeypatch): def mockreturn(name): return "mock_value" monkeypatch.setattr(app.ssm_provider, "get", mockreturn) # Pass our fixture as an argument to all tests where we want to mock the get_parameter response def test_handler(mock_parameter_response): return_val = app.handler({}, {}) assert return_val.get("message") == "mock_value" ``` ``` import pytest from src import app from aws_lambda_powertools.utilities import parameters @pytest.fixture(scope="function", autouse=True) def clear_parameters_cache(): yield parameters.clear_caches() # This will clear all providers cache @pytest.fixture def mock_parameter_response(monkeypatch): def mockreturn(name): return "mock_value" monkeypatch.setattr(app.ssm_provider, "get", mockreturn) # Pass our fixture as an argument to all tests where we want to mock the get_parameter response def test_handler(mock_parameter_response): return_val = app.handler({}, {}) assert return_val.get("message") == "mock_value" ``` ``` from botocore.config import Config from aws_lambda_powertools.utilities import parameters ssm_provider = parameters.SSMProvider(config=Config(region_name="us-west-1")) def handler(event, context): value = ssm_provider.get("/my/parameter") return {"message": value} ``` The Parser utility simplifies data parsing and validation using [Pydantic](https://pydantic-docs.helpmanual.io/). It allows you to define data models in pure Python classes, parse and validate incoming events, and extract only the data you need. ## Key features - Define data models using Python classes - Parse and validate Lambda event payloads - Built-in support for common AWS event sources - Runtime type checking with user-friendly error messages - Compatible with Pydantic v2.x ## Getting started ### Install Powertools only supports Pydantic v2, so make sure to install the required dependencies for Pydantic v2 before using the Parser. ``` pip install aws-lambda-powertools[parser] ``` This is not necessary if you're installing Powertools for AWS Lambda (Python) via [Lambda Layer/SAR](../../#lambda-layer) You can also add as a dependency in your preferred tool: `e.g., requirements.txt, pyproject.toml`, etc. ### Data Model with Parser You can define models by inheriting from `BaseModel` or any other supported type through `TypeAdapter` to parse incoming events. Pydantic then validates the data, ensuring that all fields conform to the specified types and maintaining data integrity. Info The new TypeAdapter feature provide a flexible way to perform validation and serialization based on a Python type. Read more in the [Pydantic documentation](https://docs.pydantic.dev/latest/api/type_adapter/). #### Event parser The `@event_parser` decorator automatically parses the incoming event into the specified Pydantic model `MyEvent`. If the input doesn't match the model's structure or type requirements, it raises a `ValidationError` directly from Pydantic. 
``` from pydantic import BaseModel from aws_lambda_powertools.utilities.parser import event_parser class MyEvent(BaseModel): id: int name: str @event_parser(model=MyEvent) def lambda_handler(event: MyEvent, context): # if your model is valid, you can return return {"statusCode": 200, "body": f"Hello {event.name}, your ID is {event.id}"} ``` ``` { "id": "12345", "name": "Jane Doe" } ``` #### Parse function You can use the `parse()` function when you need to have flexibility with different event formats, custom pre-parsing logic, and better exception handling. ``` from pydantic import BaseModel, ValidationError from aws_lambda_powertools.utilities.parser import parse # Define a Pydantic model for the expected structure of the input class MyEvent(BaseModel): id: int name: str def lambda_handler(event: dict, context): try: # Manually parse the incoming event into MyEvent model parsed_event: MyEvent = parse(model=MyEvent, event=event) return {"statusCode": 200, "body": f"Hello {parsed_event.name}, your ID is {parsed_event.id}"} except ValidationError as e: # Catch validation errors and return a 400 response return {"statusCode": 400, "body": f"Validation error: {str(e)}"} ``` ``` { "id": "12345", "name": "Jane Doe" } ``` #### Key differences between parse and event_parser The `parse()` function offers more flexibility and control: - It allows parsing different parts of an event using multiple models. - You can conditionally handle events before parsing them. - It's useful for integrating with complex workflows where a decorator might not be sufficient. - It provides more control over the validation process and exception handling. The `@event_parser` decorator is ideal for: - Fail-fast scenarios where you want to immediately stop execution if the event payload is invalid. - Simplifying your code by automatically parsing and validating the event at the function entry point. ### Built-in models You can use pre-built models to work with events from AWS services, so you don’t need to create them yourself. We’ve already done that for you! ``` from aws_lambda_powertools.utilities.parser import parse from aws_lambda_powertools.utilities.parser.models import SqsModel from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: dict, context: LambdaContext) -> list: parsed_event = parse(model=SqsModel, event=event) results = [] for record in parsed_event.Records: results.append( { "message_id": record.messageId, "body": record.body, }, ) return results ``` ``` { "Records": [ { "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d", "receiptHandle": "AQEBwJnKyrHigUMZj6rYigCgxlaS3SLy0a...", "body": "Test message hello!", "attributes": { "ApproximateReceiveCount": "1", "SentTimestamp": "1545082649183", "SenderId": "AIDAIENQZJOLO23YVJ4VO", "ApproximateFirstReceiveTimestamp": "1545082649185" }, "messageAttributes": { "testAttr": { "stringValue": "100", "binaryValue": "base64Str", "dataType": "Number" } }, "md5OfBody": "e4e68fb7bd0e697a0ae8f1bb342846b3", "eventSource": "aws:sqs", "eventSourceARN": "arn:aws:sqs:us-east-2:123456789012:my-queue", "awsRegion": "us-east-2" } ] } ``` The example above uses `SqsModel`. Other built-in models can be found below.
| Model name | Description | | --- | --- | | **AlbModel** | Lambda Event Source payload for Amazon Application Load Balancer | | **APIGatewayProxyEventModel** | Lambda Event Source payload for Amazon API Gateway | | **ApiGatewayAuthorizerToken** | Lambda Event Source payload for Amazon API Gateway Lambda Authorizer with Token | | **ApiGatewayAuthorizerRequest** | Lambda Event Source payload for Amazon API Gateway Lambda Authorizer with Request | | **APIGatewayProxyEventV2Model** | Lambda Event Source payload for Amazon API Gateway v2 payload | | **ApiGatewayAuthorizerRequestV2** | Lambda Event Source payload for Amazon API Gateway v2 Lambda Authorizer | | **APIGatewayWebSocketMessageEventModel** | Lambda Event Source payload for Amazon API Gateway WebSocket API message body | | **APIGatewayWebSocketConnectEventModel** | Lambda Event Source payload for Amazon API Gateway WebSocket API $connect message | | **APIGatewayWebSocketDisconnectEventModel** | Lambda Event Source payload for Amazon API Gateway WebSocket API $disconnect message | | **AppSyncResolverEventModel** | Lambda Event Source payload for AWS AppSync Resolver | | **BedrockAgentEventModel** | Lambda Event Source payload for Bedrock Agents | | **CloudFormationCustomResourceCreateModel** | Lambda Event Source payload for AWS CloudFormation `CREATE` operation | | **CloudFormationCustomResourceUpdateModel** | Lambda Event Source payload for AWS CloudFormation `UPDATE` operation | | **CloudFormationCustomResourceDeleteModel** | Lambda Event Source payload for AWS CloudFormation `DELETE` operation | | **CloudwatchLogsModel** | Lambda Event Source payload for Amazon CloudWatch Logs | | **DynamoDBStreamModel** | Lambda Event Source payload for Amazon DynamoDB Streams | | **EventBridgeModel** | Lambda Event Source payload for Amazon EventBridge | | **IoTCoreThingEvent** | Lambda Event Source payload for IoT Core Thing created, updated, or deleted. | | **IoTCoreThingTypeEvent** | Lambda Event Source payload for IoT Core Thing Type events. | | **IoTCoreThingTypeAssociationEvent** | Lambda Event Source payload for IoT Core Thing Type associated or disassociated with a Thing. | | **IoTCoreThingGroupEvent** | Lambda Event Source payload for IoT Core Thing Group created, updated, or deleted. | | **IoTCoreAddOrRemoveFromThingGroupEvent** | Lambda Event Source payload for IoT Core Thing added to or removed from a Thing Group. | | **IoTCoreAddOrDeleteFromThingGroupEvent** | Lambda Event Source payload for IoT Core Thing Group added to or deleted from a Thing Group. | | **KafkaMskEventModel** | Lambda Event Source payload for AWS MSK payload | | **KafkaSelfManagedEventModel** | Lambda Event Source payload for self managed Kafka payload | | **KinesisDataStreamModel** | Lambda Event Source payload for Amazon Kinesis Data Streams | | **KinesisFirehoseModel** | Lambda Event Source payload for Amazon Kinesis Firehose | | **KinesisFirehoseSqsModel** | Lambda Event Source payload for SQS messages wrapped in Kinesis Firehose records | | **LambdaFunctionUrlModel** | Lambda Event Source payload for Lambda Function URL payload | | **S3BatchOperationModel** | Lambda Event Source payload for Amazon S3 Batch Operation | | **S3EventNotificationEventBridgeModel** | Lambda Event Source payload for Amazon S3 Event Notification to EventBridge. 
| | **S3Model** | Lambda Event Source payload for Amazon S3 | | **S3ObjectLambdaEvent** | Lambda Event Source payload for Amazon S3 Object Lambda | | **S3SqsEventNotificationModel** | Lambda Event Source payload for S3 event notifications wrapped in SQS event (S3->SQS) | | **SesModel** | Lambda Event Source payload for Amazon Simple Email Service | | **SnsModel** | Lambda Event Source payload for Amazon Simple Notification Service | | **SqsModel** | Lambda Event Source payload for Amazon SQS | | **TransferFamilyAuthorizer** | Lambda Event Source payload for AWS Transfer Family Lambda authorizer | | **VpcLatticeModel** | Lambda Event Source payload for Amazon VPC Lattice | | **VpcLatticeV2Model** | Lambda Event Source payload for Amazon VPC Lattice v2 payload | #### Extending built-in models You can extend them to include your own models, and yet have all other known fields parsed along the way. Tip For Mypy users, we only allow type override for fields where payload is injected e.g. `detail`, `body`, etc. **Example: custom data model with Amazon EventBridge** Use the model to validate and extract relevant information from the incoming event. This can be useful when you need to handle events with a specific structure or when you want to ensure that the event data conforms to certain rules. ``` from pydantic import Field, ValidationError from aws_lambda_powertools.utilities.parser import parse from aws_lambda_powertools.utilities.parser.models import EventBridgeModel # Define a custom EventBridge model by extending the built-in EventBridgeModel class MyCustomEventBridgeModel(EventBridgeModel): # type: ignore[override] detail_type: str = Field(alias="detail-type") source: str detail: dict def lambda_handler(event: dict, context): try: # Manually parse the incoming event into the custom model parsed_event: MyCustomEventBridgeModel = parse(model=MyCustomEventBridgeModel, event=event) return {"statusCode": 200, "body": f"Event from {parsed_event.source}, type: {parsed_event.detail_type}"} except ValidationError as e: return {"statusCode": 400, "body": f"Validation error: {str(e)}"} ``` ``` { "version": "0", "id": "abcd-1234-efgh-5678", "detail-type": "order.created", "source": "my.order.service", "account": "123456789012", "time": "2023-09-10T12:00:00Z", "region": "us-west-2", "resources": [], "detail": { "orderId": "O-12345", "amount": 100.0 } } ``` ## Advanced ### Envelopes You can use **Envelopes** to extract specific portions of complex, nested JSON structures. This is useful when your actual payload is wrapped around a known structure, for example Lambda Event Sources like **EventBridge**. Envelopes can be used via `envelope` parameter available in both `parse` function and `event_parser` decorator. 
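For instance, the `parse` function accepts the same `envelope` parameter. Here is a minimal sketch, assuming a hypothetical `Order` model for the payload carried in the EventBridge `detail` key:

```
from pydantic import BaseModel

from aws_lambda_powertools.utilities.parser import envelopes, parse
from aws_lambda_powertools.utilities.typing import LambdaContext


class Order(BaseModel):  # hypothetical model for the "detail" payload
    order_id: str
    amount: float


def lambda_handler(event: dict, context: LambdaContext) -> dict:
    # The EventBridge envelope extracts the "detail" key, then validates it against Order
    order: Order = parse(model=Order, event=event, envelope=envelopes.EventBridgeEnvelope)
    return {"statusCode": 200, "body": f"Order {order.order_id} received"}
```

The decorator-based form of the same pattern is shown in the example below.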
``` from pydantic import BaseModel from aws_lambda_powertools.utilities.parser import envelopes, event_parser from aws_lambda_powertools.utilities.typing import LambdaContext class UserModel(BaseModel): username: str parentid_1: str parentid_2: str @event_parser(model=UserModel, envelope=envelopes.EventBridgeEnvelope) def lambda_handler(event: UserModel, context: LambdaContext): if event.parentid_1 != event.parentid_2: return {"statusCode": 400, "body": "Parent ids do not match"} # If parentids match, proceed with user registration return {"statusCode": 200, "body": f"User {event.username} registered successfully"} ``` ``` { "version": "0", "id": "6a7e8feb-b491-4cf7-a9f1-bf3703467718", "detail-type": "CustomerSignedUp", "source": "CustomerService", "account": "111122223333", "time": "2020-10-22T18:43:48Z", "region": "us-west-1", "resources": [ "some_additional_" ], "detail": { "username": "universe", "parentid_1": "12345", "parentid_2": "6789" } } ``` #### Built-in envelopes You can use pre-built envelopes provided by the Parser to extract and parse specific parts of complex event structures. | Envelope name | Behaviour | Return | | --- | --- | --- | | **DynamoDBStreamEnvelope** | 1. Parses data using `DynamoDBStreamModel`. 2. Parses records in `NewImage` and `OldImage` keys using your model. 3. Returns a list with a dictionary containing `NewImage` and `OldImage` keys | `List[Dict[str, Optional[Model]]]` | | **EventBridgeEnvelope** | 1. Parses data using `EventBridgeModel`. 2. Parses `detail` key using your model and returns it. | `Model` | | **SqsEnvelope** | 1. Parses data using `SqsModel`. 2. Parses records in `body` key using your model and returns them in a list. | `List[Model]` | | **CloudWatchLogsEnvelope** | 1. Parses data using `CloudwatchLogsModel` which will base64 decode and decompress it. 2. Parses records in `message` key using your model and returns them in a list. | `List[Model]` | | **KinesisDataStreamEnvelope** | 1. Parses data using `KinesisDataStreamModel` which will base64 decode it. 2. Parses records in `Records` key using your model and returns them in a list. | `List[Model]` | | **KinesisFirehoseEnvelope** | 1. Parses data using `KinesisFirehoseModel` which will base64 decode it. 2. Parses records in `Records` key using your model and returns them in a list. | `List[Model]` | | **SnsEnvelope** | 1. Parses data using `SnsModel`. 2. Parses records in `body` key using your model and returns them in a list. | `List[Model]` | | **SnsSqsEnvelope** | 1. Parses data using `SqsModel`. 2. Parses SNS records in `body` key using `SnsNotificationModel`. 3. Parses data in `Message` key using your model and returns them in a list. | `List[Model]` | | **ApiGatewayV2Envelope** | 1. Parses data using `APIGatewayProxyEventV2Model`. 2. Parses `body` key using your model and returns it. | `Model` | | **ApiGatewayEnvelope** | 1. Parses data using `APIGatewayProxyEventModel`. 2. Parses `body` key using your model and returns it. | `Model` | | **ApiGatewayWebSocketEnvelope** | 1. Parses data using `APIGatewayWebSocketMessageEventModel`. 2. Parses `body` key using your model and returns it. | `Model` | | **LambdaFunctionUrlEnvelope** | 1. Parses data using `LambdaFunctionUrlModel`. 2. Parses `body` key using your model and returns it. | `Model` | | **KafkaEnvelope** | 1. Parses data using `KafkaRecordModel`. 2. Parses `value` key using your model and returns it. | `Model` | | **VpcLatticeEnvelope** | 1. Parses data using `VpcLatticeModel`. 2.
Parses `value` key using your model and returns it. | `Model` | | **BedrockAgentEnvelope** | 1. Parses data using `BedrockAgentEventModel`. 2. Parses `inputText` key using your model and returns it. | `Model` | #### Bringing your own envelope You can create your own Envelope model and logic by inheriting from `BaseEnvelope`, and implementing the `parse` method. Here's a snippet of how the EventBridge envelope we demonstrated previously is implemented. ``` import json from typing import Any, Dict, Optional, Type, TypeVar, Union from pydantic import BaseModel from aws_lambda_powertools.utilities.parser import BaseEnvelope, event_parser from aws_lambda_powertools.utilities.parser.models import EventBridgeModel from aws_lambda_powertools.utilities.typing import LambdaContext Model = TypeVar("Model", bound=BaseModel) class EventBridgeEnvelope(BaseEnvelope): def parse(self, data: Optional[Union[Dict[str, Any], Any]], model: Type[Model]) -> Optional[Model]: if data is None: return None parsed_envelope = EventBridgeModel.model_validate(data) return self._parse(data=parsed_envelope.detail, model=model) class OrderDetail(BaseModel): order_id: str amount: float customer_id: str @event_parser(model=OrderDetail, envelope=EventBridgeEnvelope) def lambda_handler(event: OrderDetail, context: LambdaContext): try: # Process the order print(f"Processing order {event.order_id} for customer {event.customer_id}") print(f"Order amount: ${event.amount:.2f}") # Your business logic here # For example, you might save the order to a database or trigger a payment process return { "statusCode": 200, "body": json.dumps( { "message": f"Order {event.order_id} processed successfully", "order_id": event.order_id, "amount": event.amount, "customer_id": event.customer_id, }, ), } except Exception as e: print(f"Error processing order: {str(e)}") return {"statusCode": 500, "body": json.dumps({"error": "Internal server error"})} ``` ``` { "version": "0", "id": "12345678-1234-1234-1234-123456789012", "detail-type": "Order Placed", "source": "com.mycompany.orders", "account": "123456789012", "time": "2023-05-03T12:00:00Z", "region": "us-west-2", "resources": [], "detail": { "order_id": "ORD-12345", "amount": 99.99, "customer_id": "CUST-6789" } } ``` **What's going on here, you might ask**: - **EventBridgeEnvelope**: extracts the detail field from EventBridge events. - **OrderDetail Model**: defines and validates the structure of order data. - **@event_parser**: decorator automates parsing and validation of incoming events using the specified model and envelope. ### Data model validation Warning This is radically different from the **Validator utility** which validates events against JSON Schema. You can use Pydantic's validator for deep inspection of object values and complex relationships. There are two types of class method decorators you can use: - **`field_validator`** - Useful to quickly validate an individual field and its value - **`model_validator`** - Useful to validate the entire model's data Keep the following in mind regardless of which decorator you end up using: - You must raise either `ValueError`, `TypeError`, or `AssertionError` when the value is not compliant - You must return the value(s) itself if compliant #### Field Validator Quick validation using the `field_validator` decorator to verify whether the field `message` has the value of `hello world`.
``` from pydantic import BaseModel, field_validator from aws_lambda_powertools.utilities.parser import parse from aws_lambda_powertools.utilities.typing import LambdaContext class HelloWorldModel(BaseModel): message: str @field_validator("message") def is_hello_world(cls, v): if v != "hello world": raise ValueError("Message must be hello world!") return v def lambda_handler(event: dict, context: LambdaContext): try: parsed_event = parse(model=HelloWorldModel, event=event) return {"statusCode": 200, "body": f"Received message: {parsed_event.message}"} except ValueError as e: return {"statusCode": 400, "body": str(e)} ``` If you run this using a test event `{"message": "hello universe"}` you should expect the following error with the message we provided in our exception: ``` Message must be hello world! (type=value_error) ``` #### Model validator `model_validator` can help when you have a complex validation mechanism. For example, finding whether data has been omitted or comparing field values. If you are still using the deprecated `root_validator` function, switch to `model_validator` for the latest Pydantic functionality. ``` from pydantic import BaseModel, model_validator from aws_lambda_powertools.utilities.parser import parse from aws_lambda_powertools.utilities.typing import LambdaContext class UserModel(BaseModel): username: str parentid_1: str parentid_2: str @model_validator(mode="after") # (1)! def check_parents_match(self): if self.parentid_1 != self.parentid_2: raise ValueError("Parent ids do not match") return self def lambda_handler(event: dict, context: LambdaContext): try: parsed_event = parse(model=UserModel, event=event) return { "statusCode": 200, "body": f"Received parent id from: {parsed_event.username}", } except ValueError as e: return { "statusCode": 400, "body": str(e), } ``` 1. The keyword argument `mode='after'` will cause the validator to be called after all field-level validation and parsing has been completed; it runs as an instance method and receives the validated model, which it must return. Info You can read more about validating list items, reusing validators, validating raw inputs, and a lot more in [Pydantic's documentation](https://pydantic-docs.helpmanual.io/usage/validators/). #### String fields that contain JSON data Wrap these fields with [Pydantic's Json Type](https://pydantic-docs.helpmanual.io/usage/types/#json-type). This approach allows Pydantic to properly parse and validate the JSON content, ensuring type safety and data integrity.
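As a minimal illustration (the model and field names below are hypothetical), a field annotated with `Json[...]` is decoded from a JSON string and validated in a single step:

```
from pydantic import BaseModel, Json


class OrderEvent(BaseModel):  # hypothetical model
    order_id: int
    metadata: Json[dict]  # arrives as a JSON-encoded string, parsed into a dict


event = OrderEvent.model_validate({"order_id": 1, "metadata": '{"source": "web"}'})
assert event.metadata == {"source": "web"}
```

The example below applies the same idea to a Lambda event whose `body` is a JSON string, combined with a custom envelope.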
``` from __future__ import annotations from typing import TYPE_CHECKING, Any from pydantic import BaseModel, Json from aws_lambda_powertools.utilities.parser import BaseEnvelope, event_parser from aws_lambda_powertools.utilities.parser.functions import ( _parse_and_validate_event, _retrieve_or_set_model_from_cache, ) from aws_lambda_powertools.utilities.typing import LambdaContext if TYPE_CHECKING: from aws_lambda_powertools.utilities.parser.types import T class CancelOrder(BaseModel): order_id: int reason: str class CancelOrderModel(BaseModel): body: Json[CancelOrder] class CustomEnvelope(BaseEnvelope): def parse(self, data: dict[str, Any] | Any | None, model: type[T]): adapter = _retrieve_or_set_model_from_cache(model=model) return _parse_and_validate_event(data=data, adapter=adapter) @event_parser(model=CancelOrderModel, envelope=CustomEnvelope) def lambda_handler(event: CancelOrderModel, context: LambdaContext): cancel_order: CancelOrder = event.body assert cancel_order.order_id is not None # Process the cancel order request print(f"Cancelling order {cancel_order.order_id} for reason: {cancel_order.reason}") return { "statusCode": 200, "body": f"Order {cancel_order.order_id} cancelled successfully", } ``` ``` { "body": "{\"order_id\": 12345, \"reason\": \"Changed my mind\"}" } ``` ### Serialization Models in Pydantic offer more than direct attribute access. They can be transformed, serialized, and exported in various formats. Pydantic's definition of *serialization* is broader than usual. It includes converting structured objects to simpler Python types, not just data to strings or bytes. This reflects the close relationship between these processes in Pydantic. Read more at [Serialization for Pydantic documentation](https://docs.pydantic.dev/latest/concepts/serialization/#model_copy). ``` from pydantic import BaseModel from aws_lambda_powertools.logging import Logger from aws_lambda_powertools.utilities.parser import parse from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() class UserModel(BaseModel): username: str parentid_1: str parentid_2: str def validate_user(event): try: user = parse(model=UserModel, event=event) return {"statusCode": 200, "body": user.model_dump_json()} except Exception as e: logger.exception("Validation error") return {"statusCode": 400, "body": str(e)} @logger.inject_lambda_context def lambda_handler(event: dict, context: LambdaContext) -> dict: logger.info("Received event", extra={"event": event}) result = validate_user(event) if result["statusCode"] == 200: user = UserModel.model_validate_json(result["body"]) logger.info("User validated successfully", extra={"username": user.username}) # Example of serialization user_dict = user.model_dump() user_json = user.model_dump_json() logger.debug("User serializations", extra={"dict": user_dict, "json": user_json}) return result ``` Info There are number of advanced use cases well documented in Pydantic's doc such as creating [immutable models](https://pydantic-docs.helpmanual.io/usage/models/#faux-immutability), [declaring fields with dynamic values](https://pydantic-docs.helpmanual.io/usage/models/#field-with-dynamic-default-value). ## FAQ **When should I use parser vs data_classes utility?** Use data classes utility when you're after autocomplete, self-documented attributes and helpers to extract data from common event sources. 
Parser is best suited for those who want deep validation, parsing, and autocomplete from their own model definitions, in exchange for bringing in an additional dependency. **How do I import X from Pydantic?** We recommend importing directly from Pydantic to access all features and stay up-to-date with the latest Pydantic updates. For example: ``` from pydantic import BaseModel, Field, ValidationError ``` While we export some common Pydantic classes and utilities through the parser for convenience (e.g., `from aws_lambda_powertools.utilities.parser import BaseModel`), importing directly from Pydantic ensures you have access to all features and the most recent updates. The streaming utility handles datasets larger than the available memory as streaming data. ## Key features - Stream Amazon S3 objects with a file-like interface with minimal memory consumption - Built-in popular data transformations to decompress and deserialize (gzip, CSV, and ZIP) - Build your own data transformation and add it to the pipeline ## Background Within Lambda, processing S3 objects larger than the allocated amount of memory can lead to out of memory or timeout situations. For cost efficiency, your S3 objects may be encoded and compressed in various formats (*gzip, CSV, zip files, etc*), increasing the amount of non-business logic and reliability risks. The Streaming utility makes this process easier by fetching parts of your data as you consume it, and transparently applying data transformations to the data stream. This allows you to process one, a few, or all rows of your large dataset while consuming only a few MBs. ## Getting started ### Streaming from an S3 object With `S3Object`, you'll need the bucket, object key, and optionally a version ID to stream its content. We will fetch parts of your data from S3 as you process each line, consuming only the absolute minimal amount of memory. ``` from typing import Dict from aws_lambda_powertools.utilities.streaming.s3_object import S3Object from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: Dict[str, str], context: LambdaContext): s3 = S3Object(bucket=event["bucket"], key=event["key"]) for line in s3: print(line) ``` ``` from typing import Dict from aws_lambda_powertools.utilities.streaming.s3_object import S3Object from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: Dict[str, str], context: LambdaContext): s3 = S3Object(bucket=event["bucket"], key=event["key"], version_id=event["version_id"]) for line in s3: print(line) ``` ### Data transformations Think of data transformations like a data processing pipeline - apply one or more in order. As data is streamed, you can apply transformations to your data like decompressing gzip content and deserializing a CSV into a dictionary. For popular data transformations like CSV or Gzip, you can quickly enable them at the constructor level: ``` from typing import Dict from aws_lambda_powertools.utilities.streaming.s3_object import S3Object from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: Dict[str, str], context: LambdaContext): s3 = S3Object(bucket=event["bucket"], key=event["key"], is_gzip=True, is_csv=True) for line in s3: print(line) ``` Alternatively, you can apply transformations later via the `transform` method. By default, it will return the transformed stream you can use to read its contents. If you prefer in-place modifications, use `in_place=True`. When is this useful?
In scenarios where you might have a reusable logic to apply common transformations. This might be a function or a class that receives an instance of `S3Object`. ``` from typing import Dict from aws_lambda_powertools.utilities.streaming.s3_object import S3Object from aws_lambda_powertools.utilities.streaming.transformations import ( CsvTransform, GzipTransform, ) from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: Dict[str, str], context: LambdaContext): s3 = S3Object(bucket=event["bucket"], key=event["key"]) data = s3.transform([GzipTransform(), CsvTransform()]) for line in data: print(line) # returns a dict ``` Note that when using `in_place=True`, there is no return (`None`). ``` from typing import Dict from aws_lambda_powertools.utilities.streaming.s3_object import S3Object from aws_lambda_powertools.utilities.streaming.transformations import ( CsvTransform, GzipTransform, ) from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: Dict[str, str], context: LambdaContext): s3 = S3Object(bucket=event["bucket"], key=event["key"]) s3.transform([GzipTransform(), CsvTransform()], in_place=True) for line in s3: print(line) # returns a dict ``` #### Handling ZIP files `ZipTransform` doesn't support combining other transformations. This is because a Zip file contains multiple files while transformations apply to a single stream. That said, you can still open a specific file as a stream, reading only the necessary bytes to extract it: ``` from aws_lambda_powertools.utilities.streaming import S3Object from aws_lambda_powertools.utilities.streaming.transformations import ZipTransform s3object = S3Object(bucket="bucket", key="key") zip_reader = s3object.transform(ZipTransform()) with zip_reader.open("filename.txt") as f: for line in f: print(line) ``` #### Built-in data transformations We provide popular built-in transformations that you can apply against your streaming data. | Name | Description | Class name | | --- | --- | --- | | **Gzip** | Gunzips the stream of data using the [gzip library](https://docs.python.org/3/library/gzip.html) | GzipTransform | | **Zip** | Exposes the stream as a [ZipFile object](https://docs.python.org/3/library/zipfile.html) | ZipTransform | | **CSV** | Parses each CSV line as a CSV object, returning dictionary objects | CsvTransform | ## Advanced ### Skipping or reading backwards `S3Object` implements [Python I/O interface](https://docs.python.org/3/tutorial/inputoutput.html). This means you can use `seek` to start reading contents of your file from any particular position, saving you processing time. #### Reading backwards For example, let's imagine you have a large CSV file, each row has a non-uniform size (bytes), and you want to read and process the last row only. ``` id,name,location 1,Ruben Fonseca, Denmark 2,Heitor Lessa, Netherlands 3,Leandro Damascena, Portugal ``` You found out the last row has exactly 30 bytes. We can use `seek()` to skip to the end of the file, read 30 bytes, then transform to CSV. 
``` import io from typing import Dict from aws_lambda_powertools.utilities.streaming.s3_object import S3Object from aws_lambda_powertools.utilities.streaming.transformations import CsvTransform from aws_lambda_powertools.utilities.typing import LambdaContext LAST_ROW_SIZE = 30 CSV_HEADERS = ["id", "name", "location"] def lambda_handler(event: Dict[str, str], context: LambdaContext): sample_csv = S3Object(bucket=event["bucket"], key="sample.csv") # From the end of the file, jump exactly 30 bytes backwards sample_csv.seek(-LAST_ROW_SIZE, io.SEEK_END) # Transform portion of data into CSV with our headers sample_csv.transform(CsvTransform(fieldnames=CSV_HEADERS), in_place=True) # We will only read the last portion of the file from S3 # as we're only interested in the last 'location' from our dataset for last_row in sample_csv: print(last_row["location"]) ``` #### Skipping What if we want to skip the first N rows? You can also solve this with `seek`, but let's take a large uniform CSV file to make this easier to grasp. ``` reading,position,type 21.3,5,+ 23.4,4,+ 21.3,0,- ``` You found out that each row has 8 bytes, the header line has 21 bytes, and every newline is 1 byte. You want to skip the first 100 lines. ``` import io from typing import Dict from aws_lambda_powertools.utilities.streaming.s3_object import S3Object from aws_lambda_powertools.utilities.streaming.transformations import CsvTransform from aws_lambda_powertools.utilities.typing import LambdaContext """ Assuming every row after the CSV header always has 8 bytes of data + a 1 byte newline: reading,position,type 21.3,5,+ 23.4,4,+ 21.3,0,- ... """ CSV_HEADERS = ["reading", "position", "type"] ROW_SIZE = 8 + 1 # 1 byte newline HEADER_SIZE = 21 + 1 # 1 byte newline LINES_TO_JUMP = 100 def lambda_handler(event: Dict[str, str], context: LambdaContext): sample_csv = S3Object(bucket=event["bucket"], key=event["key"]) # Skip the header line sample_csv.seek(HEADER_SIZE, io.SEEK_SET) # Jump 100 lines of 9 bytes each (8 bytes of data + 1 byte newline) sample_csv.seek(LINES_TO_JUMP * ROW_SIZE, io.SEEK_CUR) sample_csv.transform(CsvTransform(), in_place=True) for row in sample_csv: print(row["reading"]) ``` ### Custom options for data transformations We will propagate additional options to the underlying implementation for each transform class. | Name | Available options | | --- | --- | | **GzipTransform** | [GzipFile constructor](https://docs.python.org/3/library/gzip.html#gzip.GzipFile) | | **ZipTransform** | [ZipFile constructor](https://docs.python.org/3/library/zipfile.html#zipfile.ZipFile) | | **CsvTransform** | [DictReader constructor](https://docs.python.org/3/library/csv.html#csv.DictReader) | For instance, take `ZipTransform`. You can use the `compression` parameter if you want to unzip an S3 object compressed with `LZMA`.
``` import zipfile from typing import Dict from aws_lambda_powertools.utilities.streaming.s3_object import S3Object from aws_lambda_powertools.utilities.streaming.transformations import ZipTransform from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: Dict[str, str], context: LambdaContext): s3 = S3Object(bucket=event["bucket"], key=event["key"]) zf = s3.transform(ZipTransform(compression=zipfile.ZIP_LZMA)) print(zf.namelist()) zf.extract(zf.namelist()[0], "/tmp") ``` Or, if you want to load a tab-separated file (TSV), you can use the `delimiter` parameter in the `CsvTransform`: ``` from typing import Dict from aws_lambda_powertools.utilities.streaming.s3_object import S3Object from aws_lambda_powertools.utilities.streaming.transformations import CsvTransform from aws_lambda_powertools.utilities.typing import LambdaContext def lambda_handler(event: Dict[str, str], context: LambdaContext): s3 = S3Object(bucket=event["bucket"], key=event["key"]) tsv_stream = s3.transform(CsvTransform(delimiter="\t")) for obj in tsv_stream: print(obj) ``` ### Building your own data transformation You can build your own custom data transformation by extending the `BaseTransform` class. The `transform` method receives an `IO[bytes]` object, and you are responsible for returning an `IO[bytes]` object. ``` import io from typing import IO, Optional import ijson from aws_lambda_powertools.utilities.streaming.transformations import BaseTransform # Using io.RawIOBase gets us default implementations of many of the common IO methods class JsonDeserializer(io.RawIOBase): def __init__(self, input_stream: IO[bytes]): self.input = ijson.items(input_stream, "", multiple_values=True) def read(self, size: int = -1) -> Optional[bytes]: raise NotImplementedError(f"{__name__} does not implement read") def readline(self, size: Optional[int] = None) -> bytes: raise NotImplementedError(f"{__name__} does not implement readline") def read_object(self) -> dict: return self.input.__next__() def __next__(self): return self.read_object() class JsonTransform(BaseTransform): def transform(self, input_stream: IO[bytes]) -> JsonDeserializer: return JsonDeserializer(input_stream=input_stream) ``` ## Testing your code ### Asserting data transformations Create an input payload using `io.BytesIO` and assert the response of the transformation: ``` import io import boto3 from assert_transformation_module import UpperTransform from botocore import stub from aws_lambda_powertools.utilities.streaming import S3Object from aws_lambda_powertools.utilities.streaming.compat import PowertoolsStreamingBody def test_upper_transform(): # GIVEN data_stream = io.BytesIO(b"hello world") # WHEN data_stream = UpperTransform().transform(data_stream) # THEN assert data_stream.read() == b"HELLO WORLD" def test_s3_object_with_upper_transform(): # GIVEN payload = b"hello world" s3_client = boto3.client("s3") s3_stub = stub.Stubber(s3_client) s3_stub.add_response( "get_object", {"Body": PowertoolsStreamingBody(raw_stream=io.BytesIO(payload), content_length=len(payload))}, ) s3_stub.activate() # WHEN data_stream = S3Object(bucket="bucket", key="key", boto3_client=s3_client) data_stream.transform(UpperTransform(), in_place=True) # THEN assert data_stream.read() == b"HELLO WORLD" ``` ``` import io from typing import IO, Optional from aws_lambda_powertools.utilities.streaming.transformations import BaseTransform class UpperIO(io.RawIOBase): def __init__(self, input_stream: IO[bytes], encoding: str): self.encoding = encoding
self.input_stream = io.TextIOWrapper(input_stream, encoding=encoding) def read(self, size: int = -1) -> Optional[bytes]: data = self.input_stream.read(size) return data.upper().encode(self.encoding) class UpperTransform(BaseTransform): def transform(self, input_stream: IO[bytes]) -> UpperIO: return UpperIO(input_stream=input_stream, encoding="utf-8") ``` ## Known limitations ### AWS X-Ray segment size limit We make multiple API calls to S3 as you read chunks from your S3 object. If your function is decorated with [Tracer](../../core/tracer/), you can easily hit [AWS X-Ray 64K segment size](https://docs.aws.amazon.com/general/latest/gr/xray.html#limits_xray) when processing large files. Use tracer decorators in parts where you don't read your `S3Object` instead. This typing utility provides static typing classes that can be used to ease the development by providing the IDE type hints. ## Key features - Add static typing classes - Ease the development by leveraging your IDE's type hints - Avoid common typing mistakes in Python ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). We provide static typing for any context methods or properties implemented by [Lambda context object](https://docs.aws.amazon.com/lambda/latest/dg/python-context.html). ## LambdaContext The `LambdaContext` typing is typically used in the handler method for the Lambda function. ``` from aws_lambda_powertools.utilities.typing import LambdaContext def handler(event: dict, context: LambdaContext) -> dict: # Insert business logic return event ``` ## Working with context methods and properties Using `LambdaContext` typing makes it possible to access information and hints of all properties and methods implemented by Lambda context object. ``` from time import sleep import requests from aws_lambda_powertools import Logger from aws_lambda_powertools.utilities.typing import LambdaContext logger = Logger() def lambda_handler(event, context: LambdaContext) -> dict: limit_execution: int = 1000 # milliseconds # scrape website and exit before lambda timeout while context.get_remaining_time_in_millis() > limit_execution: comments: requests.Response = requests.get("https://jsonplaceholder.typicode.com/comments") # add logic here and save the results of the request to an S3 bucket, for example. logger.info( { "operation": "scrape_website", "request_id": context.aws_request_id, "remaining_time": context.get_remaining_time_in_millis(), "comments": comments.json()[:2], }, ) sleep(1) return {"message": "Success"} ``` This utility provides JSON Schema validation for events and responses, including JMESPath support to unwrap events before validation. ## Key features - Validate incoming event and response - JMESPath support to unwrap events before validation applies - Built-in envelopes to unwrap popular event sources payloads ## Getting started Tip All examples shared in this documentation are available within the [project repository](https://github.com/aws-powertools/powertools-lambda-python/tree/develop/examples). You can validate inbound and outbound events using [`validator` decorator](#validator-decorator). You can also use the standalone `validate` function, if you want more control over the validation process such as handling a validation error. Tip: Using JSON Schemas for the first time? 
Check this [step-by-step tour in the official JSON Schema website](https://json-schema.org/learn/getting-started-step-by-step.html). We support any JSONSchema draft supported by [fastjsonschema](https://horejsek.github.io/python-fastjsonschema/) library. Warning Both `validator` decorator and `validate` standalone function expects your JSON Schema to be a **dictionary**, not a filename. ### Install This is not necessary if you're installing Powertools for AWS Lambda (Python) via [Lambda Layer/SAR](../../#lambda-layer) Add `aws-lambda-powertools[validation]` as a dependency in your preferred tool: *e.g.*, *requirements.txt*, *pyproject.toml*. This will ensure you have the required dependencies before using Validation. ### Validator decorator **Validator** decorator is typically used to validate either inbound or functions' response. It will fail fast with `SchemaValidationError` exception if event or response doesn't conform with given JSON Schema. ``` from dataclasses import dataclass, field from uuid import uuid4 import getting_started_validator_decorator_schema as schemas from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import validator # we can get list of allowed IPs from AWS Parameter Store using Parameters Utility # See: https://docs.powertools.aws.dev/lambda/python/latest/utilities/parameters/ ALLOWED_IPS = parameters.get_parameter("/lambda-powertools/allowed_ips") class UserPermissionsError(Exception): ... @dataclass class User: ip: str permissions: list user_id: str = field(default_factory=lambda: f"{uuid4()}") name: str = "Project Lambda Powertools" # using a decorator to validate input and output data @validator(inbound_schema=schemas.INPUT, outbound_schema=schemas.OUTPUT) def lambda_handler(event, context: LambdaContext) -> dict: try: user_details: dict = {} # get permissions by user_id and project if ( event.get("user_id") == "0d44b083-8206-4a3a-aa95-5d392a99be4a" and event.get("project") == "powertools" and event.get("ip") in ALLOWED_IPS ): user_details = User(ip=event.get("ip"), permissions=["read", "write"]).__dict__ # the body must be an object because must match OUTPUT schema, otherwise it fails return {"body": user_details or None, "statusCode": 200 if user_details else 204} except Exception as e: raise UserPermissionsError(str(e)) ``` ``` INPUT = { "$schema": "http://json-schema.org/draft-07/schema", "$id": "http://example.com/example.json", "type": "object", "title": "Sample schema", "description": "The root schema comprises the entire JSON document.", "examples": [{"user_id": "0d44b083-8206-4a3a-aa95-5d392a99be4a", "project": "powertools", "ip": "192.168.0.1"}], "required": ["user_id", "project", "ip"], "properties": { "user_id": { "$id": "#/properties/user_id", "type": "string", "title": "The user_id", "examples": ["0d44b083-8206-4a3a-aa95-5d392a99be4a"], "maxLength": 50, }, "project": { "$id": "#/properties/project", "type": "string", "title": "The project", "examples": ["powertools"], "maxLength": 30, }, "ip": { "$id": "#/properties/ip", "type": "string", "title": "The ip", "format": "ipv4", "examples": ["192.168.0.1"], "maxLength": 30, }, }, } OUTPUT = { "$schema": "http://json-schema.org/draft-07/schema", "$id": "http://example.com/example.json", "type": "object", "title": "Sample outgoing schema", "description": "The root schema comprises the entire JSON document.", "examples": [{"statusCode": 200, "body": {}}], "required": ["statusCode", 
"body"], "properties": { "statusCode": { "$id": "#/properties/statusCode", "type": "integer", "title": "The statusCode", "examples": [200], "maxLength": 3, }, "body": { "$id": "#/properties/body", "type": "object", "title": "The body", "examples": [ '{"ip": "192.168.0.1", "permissions": ["read", "write"], "user_id": "7576b683-295e-4f69-b558-70e789de1b18", "name": "Project Lambda Powertools"}' # noqa E501 ], }, }, } ``` ``` { "user_id": "0d44b083-8206-4a3a-aa95-5d392a99be4a", "project": "powertools", "ip": "192.168.0.1" } ``` Note It's not a requirement to validate both inbound and outbound schemas - You can either use one, or both. ### Validate function **Validate** standalone function is typically used within the Lambda handler, or any other methods that perform data validation. Info This function returns the validated event as a JSON object. If the schema specifies `default` values for omitted fields, those default values will be included in the response. You can also gracefully handle schema validation errors by catching `SchemaValidationError` exception. ``` import getting_started_validator_standalone_schema as schemas from aws_lambda_powertools.utilities import parameters from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import SchemaValidationError, validate # we can get list of allowed IPs from AWS Parameter Store using Parameters Utility # See: https://docs.powertools.aws.dev/lambda/python/latest/utilities/parameters/ ALLOWED_IPS = parameters.get_parameter("/lambda-powertools/allowed_ips") def lambda_handler(event, context: LambdaContext) -> dict: try: user_authenticated: str = "" # using standalone function to validate input data only validate(event=event, schema=schemas.INPUT) if ( event.get("user_id") == "0d44b083-8206-4a3a-aa95-5d392a99be4a" and event.get("project") == "powertools" and event.get("ip") in ALLOWED_IPS ): user_authenticated = "Allowed" # in this example the body can be of any type because we are not validating the OUTPUT return {"body": user_authenticated, "statusCode": 200 if user_authenticated else 204} except SchemaValidationError as exception: # SchemaValidationError indicates where a data mismatch is return {"body": str(exception), "statusCode": 400} ``` ``` INPUT = { "$schema": "http://json-schema.org/draft-07/schema", "$id": "http://example.com/example.json", "type": "object", "title": "Sample schema", "description": "The root schema comprises the entire JSON document.", "examples": [{"user_id": "0d44b083-8206-4a3a-aa95-5d392a99be4a", "powertools": "lessa", "ip": "192.168.0.1"}], "required": ["user_id", "project", "ip"], "properties": { "user_id": { "$id": "#/properties/user_id", "type": "string", "title": "The user_id", "examples": ["0d44b083-8206-4a3a-aa95-5d392a99be4a"], "maxLength": 50, }, "project": { "$id": "#/properties/project", "type": "string", "title": "The project", "examples": ["powertools"], "maxLength": 30, }, "ip": { "$id": "#/properties/ip", "type": "string", "title": "The ip", "format": "ipv4", "examples": ["192.168.0.1"], "maxLength": 30, }, }, } ``` ``` { "user_id": "0d44b083-8206-4a3a-aa95-5d392a99be4a", "project": "powertools", "ip": "192.168.0.1" } ``` ### Unwrapping events prior to validation You might want to validate only a portion of your event - This is what the `envelope` parameter is for. Envelopes are [JMESPath expressions](https://jmespath.org/tutorial.html) to extract a portion of JSON you want before applying JSON Schema validation. 
Here is a sample custom EventBridge event, where we only validate what's inside the `detail` key: ``` import boto3 import getting_started_validator_unwrapping_schema as schemas from aws_lambda_powertools.utilities.data_classes.event_bridge_event import ( EventBridgeEvent, ) from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import validator s3_client = boto3.resource("s3") # we use the 'envelope' parameter to extract the payload inside the 'detail' key before validating @validator(inbound_schema=schemas.INPUT, envelope="detail") def lambda_handler(event: dict, context: LambdaContext) -> dict: my_event = EventBridgeEvent(event) data = my_event.detail.get("data", {}) s3_bucket, s3_key = data.get("s3_bucket"), data.get("s3_key") try: s3_object = s3_client.Object(bucket_name=s3_bucket, key=s3_key) payload = s3_object.get()["Body"] content = payload.read().decode("utf-8") return {"message": process_data_object(content), "success": True} except s3_client.meta.client.exceptions.NoSuchBucket as exception: return return_error_message(str(exception)) except s3_client.meta.client.exceptions.NoSuchKey as exception: return return_error_message(str(exception)) def return_error_message(message: str) -> dict: return {"message": message, "success": False} def process_data_object(content: str) -> str: # insert logic here return "Data OK" ``` ``` INPUT = { "$schema": "http://json-schema.org/draft-07/schema#", "$id": "https://example.com/object1660222326.json", "type": "object", "title": "Sample schema", "description": "The root schema comprises the entire JSON document.", "examples": [ { "data": { "s3_bucket": "aws-lambda-powertools", "s3_key": "event.txt", "file_size": 200, "file_type": "text/plain", }, }, ], "required": ["data"], "properties": { "data": { "$id": "#root/data", "title": "Root", "type": "object", "required": ["s3_bucket", "s3_key", "file_size", "file_type"], "properties": { "s3_bucket": { "$id": "#root/data/s3_bucket", "title": "The S3 Bucker", "type": "string", "default": "", "examples": ["aws-lambda-powertools"], "pattern": "^.*$", }, "s3_key": { "$id": "#root/data/s3_key", "title": "The S3 Key", "type": "string", "default": "", "examples": ["folder/event.txt"], "pattern": "^.*$", }, "file_size": { "$id": "#root/data/file_size", "title": "The file size", "type": "integer", "examples": [200], "default": 0, }, "file_type": { "$id": "#root/data/file_type", "title": "The file type", "type": "string", "default": "", "examples": ["text/plain"], "pattern": "^.*$", }, }, }, }, } ``` ``` { "id": "cdc73f9d-aea9-11e3-9d5a-835b769c0d9c", "detail-type": "CustomEvent", "source": "mycompany.service", "account": "123456789012", "time": "1970-01-01T00:00:00Z", "region": "us-east-1", "resources": [], "detail": { "data": { "s3_bucket": "aws-lambda-powertools", "s3_key": "folder/event.txt", "file_size": 200, "file_type": "text/plain" } } } ``` This is quite powerful because you can use JMESPath Query language to extract records from [arrays](https://jmespath.org/tutorial.html#list-and-slice-projections), combine [pipe](https://jmespath.org/tutorial.html#pipe-expressions) and [function expressions](https://jmespath.org/tutorial.html#functions). When combined, these features allow you to extract what you need before validating the actual payload. ### Built-in envelopes We provide built-in envelopes to easily extract the payload from popular event sources. 
``` import boto3 import unwrapping_popular_event_source_schema as schemas from botocore.exceptions import ClientError from aws_lambda_powertools.utilities.data_classes.event_bridge_event import ( EventBridgeEvent, ) from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import envelopes, validator # extracting detail from EventBridge custom event # see: https://docs.powertools.aws.dev/lambda/python/latest/utilities/jmespath_functions/#built-in-envelopes @validator(inbound_schema=schemas.INPUT, envelope=envelopes.EVENTBRIDGE) def lambda_handler(event: dict, context: LambdaContext) -> dict: my_event = EventBridgeEvent(event) ec2_client = boto3.resource("ec2", region_name=my_event.region) try: instance_id = my_event.detail.get("instance_id") instance = ec2_client.Instance(instance_id) instance.stop() return {"message": f"Successfully stopped {instance_id}", "success": True} except ClientError as exception: return {"message": str(exception), "success": False} ``` ``` INPUT = { "definitions": {}, "$schema": "http://json-schema.org/draft-07/schema#", "$id": "https://example.com/object1660233148.json", "title": "Root", "type": "object", "required": ["instance_id", "region"], "properties": { "instance_id": { "$id": "#root/instance_id", "title": "Instance_id", "type": "string", "default": "", "examples": ["i-042dd005362091826"], "pattern": "^.*$", }, "region": { "$id": "#root/region", "title": "Region", "type": "string", "default": "", "examples": ["us-east-1"], "pattern": "^.*$", }, }, } ``` ``` { "id": "cdc73f9d-aea9-11e3-9d5a-835b769c0d9c", "detail-type": "Scheduled Event", "source": "aws.events", "account": "123456789012", "time": "1970-01-01T00:00:00Z", "region": "us-east-1", "resources": [ "arn:aws:events:us-east-1:123456789012:rule/ExampleRule" ], "detail": { "instance_id": "i-042dd005362091826", "region": "us-east-2" } } ``` Here is a handy table with built-in envelopes along with their JMESPath expressions in case you want to build your own. | Envelope | JMESPath expression | | --- | --- | | **`API_GATEWAY_HTTP`** | `powertools_json(body)` | | **`API_GATEWAY_REST`** | `powertools_json(body)` | | **`CLOUDWATCH_EVENTS_SCHEDULED`** | `detail` | | **`CLOUDWATCH_LOGS`** | `awslogs.powertools_base64_gzip(data)` or `powertools_json(@).logEvents[*]` | | **`EVENTBRIDGE`** | `detail` | | **`KINESIS_DATA_STREAM`** | `Records[*].kinesis.powertools_json(powertools_base64(data))` | | **`SNS`** | `Records[0].Sns.Message` or `powertools_json(@)` | | **`SQS`** | `Records[*].powertools_json(body)` | ## Advanced ### Validating custom formats Note JSON Schema DRAFT 7 [has many new built-in formats](https://json-schema.org/understanding-json-schema/reference/string.html#format) such as date, time, and specifically a regex format which might be a better replacement for a custom format, if you do have control over the schema. JSON Schemas with custom formats like `awsaccountid` will fail validation. If you have these, you can pass them using `formats` parameter: ``` { "accountid": { "format": "awsaccountid", "type": "string" } } ``` For each format defined in a dictionary key, you must use a regex, or a function that returns a boolean to instruct the validator on how to proceed when encountering that type. 
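For example, both accepted forms side by side, with hypothetical format names (the prose above confirms a regex or a boolean-returning function is expected):

```
import re

# Values can be a regex string or a callable returning a boolean (hypothetical format names)
custom_formats = {
    "awsaccountid": r"^\d{12}$",  # regex string form
    "positiveinteger": lambda value: re.match(r"^[1-9]\d*$", value) is not None,  # callable form
}

# Later passed to the validator, e.g. validate(event=event, schema=SCHEMA, formats=custom_formats)
```

The full example below uses the callable form together with the standalone `validate` function.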
``` import json import re import boto3 import custom_format_schema as schemas from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import SchemaValidationError, validate # awsaccountid must have 12 digits custom_format = {"awsaccountid": lambda value: re.match(r"^(\d{12})$", value)} def lambda_handler(event, context: LambdaContext) -> dict: try: # validate input using custom json format validate(event=event, schema=schemas.INPUT, formats=custom_format) client_organization = boto3.client("organizations", region_name=event.get("region")) account_data = client_organization.describe_account(AccountId=event.get("accountid")) return { "account": json.dumps(account_data.get("Account"), default=str), "message": "Success", "statusCode": 200, } except SchemaValidationError as exception: return return_error_message(str(exception)) except Exception as exception: return return_error_message(str(exception)) def return_error_message(message: str) -> dict: return {"account": None, "message": message, "statusCode": 400} ``` ``` INPUT = { "definitions": {}, "$schema": "http://json-schema.org/draft-07/schema#", "$id": "https://example.com/object1660245931.json", "title": "Root", "type": "object", "required": ["accountid", "region"], "properties": { "accountid": { "$id": "#root/accountid", "title": "The accountid", "type": "string", "format": "awsaccountid", "default": "", "examples": ["123456789012"], }, "region": { "$id": "#root/region", "title": "The region", "type": "string", "default": "", "examples": ["us-east-1"], "pattern": "^.*$", }, }, } ``` ``` { "accountid": "200984112386", "region": "us-east-1" } ``` ### Built-in JMESPath functions You might have events or responses that contain JSON as encoded strings (JSON string, base64, or gzip), which you need to decode before validating them. You can use our built-in [JMESPath functions](../jmespath_functions/) within your expressions to do exactly that to [deserialize JSON Strings](../jmespath_functions/#powertools_json-function), [decode base64](../jmespath_functions/#powertools_base64-function), and [decompress gzip data](../jmespath_functions/#powertools_base64_gzip-function). Info We use these for [built-in envelopes](#built-in-envelopes) to easily decode and unwrap events from sources like Kinesis, CloudWatch Logs, etc. ### Validating with external references JSON Schema [allows schemas to reference other schemas](https://json-schema.org/understanding-json-schema/structuring#dollarref) using the `$ref` keyword with a URI value. By default, `fastjsonschema` will make an HTTP request to resolve this URI. You can use the `handlers` parameter to have full control over how referenced schemas are fetched. This is useful when you want to optimize caching, reduce HTTP calls, or fetch schemas from non-HTTP endpoints.
``` from custom_handlers_schema import CHILD_SCHEMA, PARENT_SCHEMA from aws_lambda_powertools.utilities.typing import LambdaContext from aws_lambda_powertools.utilities.validation import validator # Function to return the child schema def get_child_schema(uri: str): return CHILD_SCHEMA @validator(inbound_schema=PARENT_SCHEMA, inbound_handlers={"https": get_child_schema}) def lambda_handler(event, context: LambdaContext) -> dict: return event ``` ``` PARENT_SCHEMA = { "$schema": "http://json-schema.org/draft-07/schema#", "$id": "https://example.com/schemas/parent.json", "type": "object", "properties": { "ParentSchema": { "$ref": "https://SCHEMA", }, }, } CHILD_SCHEMA = { "$schema": "http://json-schema.org/draft-07/schema#", "$id": "https://example.com/schemas/child.json", "type": "object", "properties": { "project": { "type": "string", }, }, "required": ["project"], } ``` ``` { "ParentSchema": { "project": "powertools" } } ``` # Tutorial This tutorial progressively introduces Powertools for AWS Lambda (Python) core utilities by using one feature at a time. ## Requirements - [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html) and [configured with your credentials](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-getting-started-set-up-credentials.html). - [AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html) installed. ## Getting started Let's clone our sample project before we add one feature at a time. Tip: Want to skip to the final project? Bootstrap directly via SAM CLI: ``` sam init --app-template hello-world-powertools-python --name sam-app --package-type Zip --runtime python3.13 --no-tracing ``` ``` sam init --runtime python3.13 --dependency-manager pip --app-template hello-world --name powertools-quickstart ``` ### Project structure As we move forward, we will modify the following files within the `powertools-quickstart` folder: - **app.py** - Application code. - **template.yaml** - AWS infrastructure configuration using SAM. - **requirements.txt** - List of extra Python packages needed. ### Code example Let's configure our base application to look like the following code snippet.
```
import json


def hello():
    return {"statusCode": 200, "body": json.dumps({"message": "hello unknown!"})}


def lambda_handler(event, context):
    return hello()
```

```
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Sample SAM Template for powertools-quickstart
Globals:
  Function:
    Timeout: 3
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/
      Handler: app.lambda_handler
      Runtime: python3.9
      Architectures:
        - x86_64
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /hello
            Method: get
Outputs:
  HelloWorldApi:
    Description: "API Gateway endpoint URL for Prod stage for Hello World function"
    Value: !Sub "https://${ServerlessRestApi}.execute-api.${AWS::Region}.amazonaws.com/Prod/hello/"
```

Our Lambda code consists of an entry point function named `lambda_handler`, and a `hello` function. When API Gateway receives an HTTP GET request on the `/hello` route, Lambda will call our `lambda_handler` function, which subsequently calls the `hello` function. API Gateway will use this response to return the correct HTTP status code and payload back to the caller.

Warning
For simplicity, we do not set up authentication and authorization! You can find more information on how to implement it in the [AWS SAM documentation](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-controlling-access-to-apis.html).

### Run your code

At each point, you have two ways to run your code: locally and within your AWS account.

#### Local test

AWS SAM allows you to execute a serverless application locally by running `sam build && sam local start-api` in your preferred shell.

```
> sam build && sam local start-api
...
2021-11-26 17:43:08  * Running on http://127.0.0.1:3000/ (Press CTRL+C to quit)
```

As a result, a local API endpoint will be exposed and you can invoke it using your browser, or your preferred HTTP API client, e.g., [Postman](https://www.postman.com/downloads/), [httpie](https://httpie.io/), etc.

```
> curl http://127.0.0.1:3000/hello
{"message": "hello unknown!"}
```

Info
To learn more about local testing, please visit the [AWS SAM CLI local testing](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-local-start-api.html) documentation.

#### Live test

First, you need to deploy your application into your AWS account by issuing the `sam build && sam deploy --guided` command. This command builds a ZIP package of your source code and deploys it to your AWS account.

```
> sam build && sam deploy --guided
...
CloudFormation outputs from deployed stack
---------------------------------------------------------------------------------------------------------------------------
Outputs
---------------------------------------------------------------------------------------------------------------------------
Key                 HelloWorldFunctionIamRole
Description         Implicit IAM Role created for Hello World function
Value               arn:aws:iam::123456789012:role/sam-app-HelloWorldFunctionRole-1T2W3H9LZHGGV

Key                 HelloWorldApi
Description         API Gateway endpoint URL for Prod stage for Hello World function
Value               https://1234567890.execute-api.eu-central-1.amazonaws.com/Prod/hello/

Key                 HelloWorldFunction
Description         Hello World Lambda Function ARN
Value               arn:aws:lambda:eu-central-1:123456789012:function:sam-app-HelloWorldFunction-dOcfAtYoEiGo
---------------------------------------------------------------------------------------------------------------------------

Successfully created/updated stack - sam-app in eu-central-1
```

At the end of the deployment, you will find the API endpoint URL within the `Outputs` section. You can use this URL to test your serverless application.

```
> curl https://1234567890.execute-api.eu-central-1.amazonaws.com/Prod/hello
{"message": "hello unknown!"}%
```

Info
For more details on the AWS SAM deployment mechanism, see the [SAM Deploy reference docs](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-deploy.html).

## Routing

### Adding a new route

Let's expand our application with a new route - `/hello/{name}`. It will accept a username as a path input and return it in the response.

For this to work, we could create a new Lambda function to handle incoming requests for `/hello/{name}` - it'd look like this:

```
import json


def hello_name(name):
    return {"statusCode": 200, "body": json.dumps({"message": f"hello {name}!"})}


def lambda_handler(event, context):
    name = event["pathParameters"]["name"]
    return hello_name(name)
```

```
AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31
Description: Sample SAM Template for powertools-quickstart
Globals:
  Function:
    Timeout: 3
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/
      Handler: app.lambda_handler
      Runtime: python3.9
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /hello
            Method: get
  HelloWorldByNameFunctionName:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/
      Handler: hello_by_name.lambda_handler
      Runtime: python3.9
      Events:
        HelloWorldName:
          Type: Api
          Properties:
            Path: /hello/{name}
            Method: get
Outputs:
  HelloWorldApi:
    Description: "API Gateway endpoint URL for Prod stage for Hello World function"
    Value: !Sub "https://${ServerlessRestApi}.execute-api.${AWS::Region}.amazonaws.com/Prod/hello/"
```

Question
But what happens if your application gets bigger and we need to cover numerous URL paths and HTTP methods for them?

**This would quickly become non-trivial to maintain**. Adding a new Lambda function for each path, or multiple if/else blocks to handle several routes and HTTP methods, can be error prone.

### Creating our own router

Question
What if we create a simple router to reduce boilerplate?

We could group similar routes and intents, and separate read and write operations, resulting in fewer functions. It doesn't address the boilerplate routing code, but maybe it will make it easier to add additional URLs.
Info: You might already be asking yourself about mono vs micro-functions
If you want a more detailed explanation of these two approaches, head over to the [trade-offs on each approach](../core/event_handler/api_gateway/#considerations) later.

A first attempt at the routing logic might look similar to the following code snippet.

```
import json


def hello_name(event, **kwargs):
    username = event["pathParameters"]["name"]
    return {"statusCode": 200, "body": json.dumps({"message": f"hello {username}!"})}


def hello(**kwargs):
    return {"statusCode": 200, "body": json.dumps({"message": "hello unknown!"})}


class Router:
    def __init__(self):
        self.routes = {}

    def set(self, path, method, handler):
        self.routes[f"{path}-{method}"] = handler

    def get(self, path, method):
        try:
            route = self.routes[f"{path}-{method}"]
        except KeyError:
            raise RuntimeError(f"Cannot route request to the correct method. path={path}, method={method}")
        return route

router = Router()
router.set(path="/hello", method="GET", handler=hello)
router.set(path="/hello/{name}", method="GET", handler=hello_name)


def lambda_handler(event, context):
    path = event["resource"]
    http_method = event["httpMethod"]
    method = router.get(path=path, method=http_method)
    return method(event=event)
```

```
AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31
Description: Sample SAM Template for powertools-quickstart
Globals:
  Function:
    Timeout: 3
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/
      Handler: app.lambda_handler
      Runtime: python3.9
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /hello
            Method: get
        HelloWorldName:
          Type: Api
          Properties:
            Path: /hello/{name}
            Method: get
Outputs:
  HelloWorldApi:
    Description: "API Gateway endpoint URL for Prod stage for Hello World function"
    Value: !Sub "https://${ServerlessRestApi}.execute-api.${AWS::Region}.amazonaws.com/Prod/hello/"
```

Let's break this down:

- **L4,9**: We defined two functions, `hello_name` and `hello`, to handle the `/hello/{name}` and `/hello` routes.
- **L13**: We added a `Router` class to map a path, a method, and the function to call.
- **L27-29**: We create a `Router` instance and map both `/hello` and `/hello/{name}`.
- **L35**: We use Router's `get` method to retrieve a reference to the processing function (`hello` or `hello_name`).
- **L36**: Finally, we run this function and send the results back to API Gateway.

This approach simplifies the configuration of our infrastructure since we have added all API Gateway paths in the `HelloWorldFunction` event section. However, it forces us to understand the internal structure of API Gateway request events and responses, and it could lead to other problems such as CORS not being handled properly, ad-hoc error handling, etc.

### Simplifying with Event Handler

We can massively simplify cross-cutting concerns while keeping it lightweight by using [Event Handler](../core/event_handler/api_gateway/).

Tip
This is available for both [REST API (API Gateway, ALB)](../core/event_handler/api_gateway/) and [GraphQL API (AppSync)](../core/event_handler/appsync/).

Let's include Powertools for AWS Lambda (Python) as a dependency in `requirements.txt`, and use Event Handler to refactor our previous example.
```
from aws_lambda_powertools.event_handler import APIGatewayRestResolver

app = APIGatewayRestResolver()


@app.get("/hello/<name>")
def hello_name(name):
    return {"message": f"hello {name}!"}


@app.get("/hello")
def hello():
    return {"message": "hello unknown!"}


def lambda_handler(event, context):
    return app.resolve(event, context)
```

```
aws-lambda-powertools[tracer]  # Tracer requires AWS X-Ray SDK dependency
```

Use `sam build && sam local start-api` and try running it locally again.

Note
If you're coming from [Flask](https://flask.palletsprojects.com/en/2.0.x/), you will be familiar with this experience already. [Event Handler for API Gateway](../core/event_handler/api_gateway/) uses `APIGatewayRestResolver` to give a Flask-like experience while staying true to our tenet `Keep it lean`.

We have added the route annotation as the decorator for our methods. It enables us to use the parameters passed in the request directly, and our responses are simply dictionaries.

Lastly, we used `return app.resolve(event, context)` so Event Handler can resolve routes, inject the current request, handle serialization, route validation, etc.

From here, we could handle [404 routes](../core/event_handler/api_gateway/#handling-not-found-routes), [error handling](../core/event_handler/api_gateway/#exception-handling), [access query strings, payload](../core/event_handler/api_gateway/#accessing-request-details), etc.

Tip
If you'd like to learn how Python decorators work under the hood, you can follow [Real Python](https://realpython.com/primer-on-python-decorators/)'s article.

## Structured Logging

Over time, you realize that searching logs as text results in poor observability: it's hard to create metrics from them, enumerate common exceptions, etc. So you decide to add production-quality logging capabilities to your Lambda code.

You find out that by having logs as `JSON` you can [structure them](https://docs.aws.amazon.com/lambda/latest/operatorguide/parse-logs.html), so that you can use any log analytics tool out there to quickly analyze them. This helps not only in searching, but also produces consistent logs containing enough context and data to ask arbitrary questions about the status of your system. We can take advantage of CloudWatch Logs and CloudWatch Logs Insights for this purpose.

### JSON as output

The first option could be to use the standard Python Logger, and a specialized library like `pythonjsonlogger` to create a JSON formatter.

```
import logging
import os

from pythonjsonlogger import jsonlogger
from aws_lambda_powertools.event_handler import APIGatewayRestResolver

logger = logging.getLogger("APP")
logHandler = logging.StreamHandler()
formatter = jsonlogger.JsonFormatter(fmt="%(asctime)s %(levelname)s %(name)s %(message)s")
logHandler.setFormatter(formatter)
logger.addHandler(logHandler)
logger.setLevel(os.getenv("POWERTOOLS_LOG_LEVEL", "INFO"))

app = APIGatewayRestResolver()


@app.get("/hello/<name>")
def hello_name(name):
    logger.info(f"Request from {name} received")
    return {"message": f"hello {name}!"}


@app.get("/hello")
def hello():
    logger.info("Request from unknown received")
    return {"message": "hello unknown!"}


def lambda_handler(event, context):
    logger.debug(event)
    return app.resolve(event, context)
```

```
aws-lambda-powertools
python-json-logger
```

With just a few lines, our logs will now be output in `JSON` format. We've taken the following steps to make that work:

- **L7**: Creates an application logger named `APP`.
- **L8-11**: Configures handler and formatter.
- **L12**: Sets the logging level to the value of the `POWERTOOLS_LOG_LEVEL` environment variable, falling back to `INFO` if it isn't set.

After that, we use this logger in our application code to record the required information. We then see logs structured like this:

```
{
    "asctime": "2021-11-22 15:32:02,145",
    "levelname": "INFO",
    "name": "APP",
    "message": "Request from unknown received"
}
```

```
[INFO] 2021-11-22T15:32:02.145Z ba3bea3d-fe3a-45db-a2ce-72e813d55b91 Request from unknown received
```

So far, so good! We can take it a step further now by adding additional context to the logs.

We could start by creating a dictionary with Lambda context information or something from the incoming event, which should always be logged. Additional attributes could be added on every `logger.info` call using the `extra` keyword, like in any standard Python logger.

### Simplifying with Logger

Surely this could be easier, right? Yes! Powertools for AWS Lambda (Python) Logger to the rescue :-)

As we already have Powertools for AWS Lambda (Python) as a dependency, we can simply import [Logger](../core/logger/).

```
from aws_lambda_powertools import Logger
from aws_lambda_powertools.event_handler import APIGatewayRestResolver
from aws_lambda_powertools.logging import correlation_paths

logger = Logger(service="APP")
app = APIGatewayRestResolver()


@app.get("/hello/<name>")
def hello_name(name):
    logger.info(f"Request from {name} received")
    return {"message": f"hello {name}!"}


@app.get("/hello")
def hello():
    logger.info("Request from unknown received")
    return {"message": "hello unknown!"}


@logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST, log_event=True)
def lambda_handler(event, context):
    return app.resolve(event, context)
```

Let's break this down:

- **L5**: We add the Powertools for AWS Lambda (Python) Logger; the boilerplate is now done for you. By default, we set `INFO` as the logging level if the `POWERTOOLS_LOG_LEVEL` env var isn't set.
- **L22**: We use the `logger.inject_lambda_context` decorator to inject key information from the Lambda context into every log.
- **L22**: We also instruct Logger to use the incoming API Gateway request ID as a [correlation id](../core/logger/#set_correlation_id-method) automatically.
- **L22**: Since we're in dev, we also use `log_event=True` to automatically log each incoming request for debugging. This can also be set via [environment variables](./#environment-variables).

This is what the logs look like now:

```
{
    "level": "INFO",
    "location": "hello:17",
    "message": "Request from unknown received",
    "timestamp": "2021-10-22 16:29:58,367+0000",
    "service": "APP",
    "cold_start": true,
    "function_name": "HelloWorldFunction",
    "function_memory_size": "256",
    "function_arn": "arn:aws:lambda:us-east-1:123456789012:function:HelloWorldFunction",
    "function_request_id": "d50bb07a-7712-4b2d-9f5d-c837302221a2",
    "correlation_id": "bf9b584c-e5d9-4ad5-af3d-db953f2b10dc"
}
```

We can now search our logs by the request ID to find a specific operation. We can also search our logs for the function name, Lambda request ID, Lambda function ARN, find out whether an operation was a cold start, etc.

From here, we could [set specific keys](../core/logger/#append_keys-method) to add additional contextual information about a given operation, [log exceptions](../core/logger/#logging-exceptions) to easily enumerate them later, [sample debug logs](../core/logger/#sampling-debug-logs), etc. A minimal sketch of the first two follows.
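The snippet below is only an illustrative sketch of appending keys and logging exceptions with Logger; `process_order` and `order_id` are hypothetical names, not part of the tutorial application.

```
from aws_lambda_powertools import Logger

logger = Logger(service="APP")


def process_order(order_id: str):
    # append_keys adds order_id to every subsequent log entry in this invocation
    logger.append_keys(order_id=order_id)
    logger.info("Processing order")

    try:
        ...  # business logic would go here
    except Exception:
        # logger.exception records the message plus the full stack trace as structured JSON
        logger.exception("Order processing failed")
        raise
```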
By having structured logs like this, we can easily search and analyse them in [CloudWatch Logs Insights](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/AnalyzingLogData.html).

## Tracing

Note
You won't see any traces in AWS X-Ray when executing your function locally.

The next improvement is to add distributed tracing to your stack. Traces help you visualize end-to-end transactions, or parts of them, to easily debug upstream/downstream anomalies. Combined with structured logs, this is an important step towards observing how your application runs in production.

### Generating traces

[AWS X-Ray](https://aws.amazon.com/xray/) is the distributed tracing service we're going to use. But how do we generate application traces in the first place?

It's a [two-step process](https://docs.aws.amazon.com/lambda/latest/dg/services-xray.html):

1. Enable tracing in your Lambda function.
1. Instrument your application code.

Let's explore how we can instrument our code with the [AWS X-Ray SDK](https://docs.aws.amazon.com/xray-sdk-for-python/latest/reference/index.html), and then simplify it with the [Powertools for AWS Lambda (Python) Tracer](../core/tracer/) feature.

```
from aws_xray_sdk.core import xray_recorder

from aws_lambda_powertools import Logger
from aws_lambda_powertools.event_handler import APIGatewayRestResolver
from aws_lambda_powertools.logging import correlation_paths

logger = Logger(service="APP")

app = APIGatewayRestResolver()


@app.get("/hello/<name>")
@xray_recorder.capture('hello_name')
def hello_name(name):
    logger.info(f"Request from {name} received")
    return {"message": f"hello {name}!"}


@app.get("/hello")
@xray_recorder.capture('hello')
def hello():
    logger.info("Request from unknown received")
    return {"message": "hello unknown!"}


@logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST, log_event=True)
@xray_recorder.capture('handler')
def lambda_handler(event, context):
    return app.resolve(event, context)
```

```
AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31
Description: Sample SAM Template for powertools-quickstart
Globals:
  Function:
    Timeout: 3
  Api:
    TracingEnabled: true
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/
      Handler: app.lambda_handler
      Runtime: python3.9
      Tracing: Active
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /hello
            Method: get
        HelloWorldName:
          Type: Api
          Properties:
            Path: /hello/{name}
            Method: get
Outputs:
  HelloWorldApi:
    Description: "API Gateway endpoint URL for Prod stage for Hello World function"
    Value: !Sub "https://${ServerlessRestApi}.execute-api.${AWS::Region}.amazonaws.com/Prod/hello/"
```

```
aws-lambda-powertools
aws-xray-sdk
```

Let's break it down:

- **L1**: First, we import the AWS X-Ray SDK. `xray_recorder` records blocks of code being traced ([subsegments](https://docs.aws.amazon.com/xray/latest/devguide/xray-concepts.html#xray-concepts-subsegments)). It also sends generated traces to the AWS X-Ray daemon running in the Lambda service, which subsequently forwards them to the AWS X-Ray service.
- **L13,20,27**: We decorate our functions so the SDK traces their end-to-end execution; the argument names the generated block being traced.

Question
But how do I enable tracing for the Lambda function and what permissions do I need?

We've made the following changes in `template.yaml` for this to work seamlessly:

- **L7-8**: Enables tracing for Amazon API Gateway.
- **L16**: Enables tracing for our Serverless Function.
This will also add a managed IAM policy named [AWSXRayDaemonWriteAccess](https://console.aws.amazon.com/iam/home#/policies/arn:aws:iam::aws:policy/AWSXRayDaemonWriteAccess) to allow Lambda to send traces to AWS X-Ray.

You can now build and deploy our updates with `sam build && sam deploy`. Once deployed, try invoking the application via the API endpoint, and visit the [AWS X-Ray Console](https://console.aws.amazon.com/xray/home#/traces/) to see how much progress we've made so far!!

### Enriching our generated traces

What we've done helps bring an initial visibility, but we can do so much more.

Question
You're probably asking yourself at least the following questions:

- What if I want to search traces by customer name?
- What about grouping traces with cold starts?
- Better yet, what if we want to include the request or response of our functions as part of the trace?

Within AWS X-Ray, we can answer these questions by using two features: tracing **Annotations** and **Metadata**.

**Annotations** are simple key-value pairs that are indexed for use with [filter expressions](https://docs.aws.amazon.com/xray/latest/devguide/xray-console-filters.html). **Metadata** are key-value pairs with values of any type, including objects and lists, but they are not indexed.

Let's put them into action.

```
from aws_xray_sdk.core import patch_all, xray_recorder

from aws_lambda_powertools import Logger
from aws_lambda_powertools.event_handler import APIGatewayRestResolver
from aws_lambda_powertools.logging import correlation_paths

logger = Logger(service="APP")
app = APIGatewayRestResolver()

cold_start = True
patch_all()


@app.get("/hello/<name>")
@xray_recorder.capture('hello_name')
def hello_name(name):
    subsegment = xray_recorder.current_subsegment()
    subsegment.put_annotation(key="User", value=name)
    logger.info(f"Request from {name} received")
    return {"message": f"hello {name}!"}


@app.get("/hello")
@xray_recorder.capture('hello')
def hello():
    subsegment = xray_recorder.current_subsegment()
    subsegment.put_annotation(key="User", value="unknown")
    logger.info("Request from unknown received")
    return {"message": "hello unknown!"}


@logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST, log_event=True)
@xray_recorder.capture('handler')
def lambda_handler(event, context):
    global cold_start

    subsegment = xray_recorder.current_subsegment()
    subsegment.put_annotation(key="ColdStart", value=cold_start)

    if cold_start:
        cold_start = False

    result = app.resolve(event, context)
    subsegment.put_metadata("response", result)

    return result
```

Let's break it down:

- **L10**: We track Lambda cold starts by setting a global variable outside the handler; this code runs once per execution environment (sandbox) that Lambda creates. This information provides an overview of how often the sandbox is reused by Lambda, which directly impacts the performance of each transaction.
- **L17-18**: We use the AWS X-Ray SDK to add a `User` annotation on the `hello_name` subsegment. This will allow us to filter traces using the `User` value (see the example filter expression after this list).
- **L26-27**: We repeat what we did in L17-18, except we use the value `unknown` since we don't have that information.
- **L35**: We use `global` to modify our global variable defined in the outer scope.
- **L37-41**: We add the `ColdStart` annotation and set the `cold_start` variable to `False`, so that subsequent requests annotate the value `False` when the sandbox is reused.
- **L44**: We include the final response under the `response` key as part of the `handler` subsegment.
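Once deployed, those annotations can be used in an X-Ray [filter expression](https://docs.aws.amazon.com/xray/latest/devguide/xray-console-filters.html) to narrow down traces. For instance, a query along these lines would find cold-start invocations for a given user; the user value here is only a placeholder:

```
annotation.ColdStart = true AND annotation.User = "some-user"
```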
Info
If you want to understand how the Lambda execution environment (sandbox) works and why cold starts can occur, see this [blog series on Lambda performance](https://aws.amazon.com/blogs/compute/operating-lambda-performance-optimization-part-1/).

Repeat the process of building, deploying, and invoking your application via the API endpoint. Within the [AWS X-Ray Console](https://console.aws.amazon.com/xray/home#/traces/), you should now be able to group traces by the `User` and `ColdStart` annotations.

If you choose any of the available traces, try opening the `handler` subsegment and you should see the response of your Lambda function under the `Metadata` tab.

### Simplifying with Tracer

Cross-cutting concerns like filtering traces by cold start, or including responses and exceptions as tracing metadata, can take a considerable amount of boilerplate. We can simplify our previous patterns by using [Powertools for AWS Lambda (Python) Tracer](../core/tracer/); a thin wrapper on top of the X-Ray SDK.

Note
You can now safely remove `aws-xray-sdk` from `requirements.txt`; keep `aws-lambda-powertools[tracer]` only.

```
from aws_lambda_powertools import Logger, Tracer
from aws_lambda_powertools.event_handler import APIGatewayRestResolver
from aws_lambda_powertools.logging import correlation_paths

logger = Logger(service="APP")
tracer = Tracer(service="APP")
app = APIGatewayRestResolver()


@app.get("/hello/<name>")
@tracer.capture_method
def hello_name(name):
    tracer.put_annotation(key="User", value=name)
    logger.info(f"Request from {name} received")
    return {"message": f"hello {name}!"}


@app.get("/hello")
@tracer.capture_method
def hello():
    tracer.put_annotation(key="User", value="unknown")
    logger.info("Request from unknown received")
    return {"message": "hello unknown!"}


@logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST, log_event=True)
@tracer.capture_lambda_handler
def lambda_handler(event, context):
    return app.resolve(event, context)
```

Decorators, annotations and metadata are largely the same, except we now have much cleaner code as the boilerplate is gone. Here's what's changed compared to the AWS X-Ray SDK approach:

- **L6**: We initialize `Tracer` and define the name of our service (`APP`). We automatically run `patch_all` from the AWS X-Ray SDK on your behalf. Any previously patched or non-imported library is simply ignored.
- **L11**: We use the `@tracer.capture_method` decorator instead of `xray_recorder.capture`. We automatically create a subsegment named after the function name (`## hello_name`), and add the response/exception as tracing metadata.
- **L13**: Putting annotations remains exactly the same UX.
- **L27**: We use `@tracer.capture_lambda_handler` so we automatically add the `ColdStart` annotation within Tracer itself. We also add a new `Service` annotation using the value of `Tracer(service="APP")`, so that you can filter traces by the service your function(s) represent.

Another subtle difference is that you can now run your Lambda functions and unit test them locally without having to explicitly disable Tracer.

Powertools for AWS Lambda (Python) optimizes for the Lambda compute environment. As such, we add these and other common approaches to accelerate your development, so you don't have to worry about implementing every cross-cutting concern.

Tip
You can [opt out of some of these behaviours](../core/tracer/#advanced), like disabling response capturing, explicitly patching only certain modules, etc.

Repeat the process of building, deploying, and invoking your application via the API endpoint.
Within the [AWS X-Ray Console](https://console.aws.amazon.com/xray/home#/traces/), you should see a similar view:

Tip
Consider using the [Amazon CloudWatch ServiceLens view](https://console.aws.amazon.com/cloudwatch/home#servicelens:service-map/map), as it aggregates AWS X-Ray traces and CloudWatch metrics and logs in one view. From here, you can browse to specific logs in CloudWatch Logs Insights, the Metrics dashboard, or AWS X-Ray traces.

Info
For more information on Amazon CloudWatch ServiceLens, please visit this [link](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/ServiceLens.html).

## Custom Metrics

### Creating metrics

Let's add custom metrics to better understand our application and business behavior (e.g., number of reservations, etc.).

By default, AWS Lambda adds [invocation and performance metrics](https://docs.aws.amazon.com/lambda/latest/dg/monitoring-metrics.html#monitoring-metrics-types), and Amazon API Gateway adds [latency and some HTTP metrics](https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-metrics-and-dimensions.html#api-gateway-metrics).

Tip
You can [optionally enable detailed metrics](https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-metrics-and-dimensions.html#api-gateway-metricdimensions) per API route, stage, and method in API Gateway.

Let's expand our application with custom metrics using the AWS SDK to see how it works, then let's upgrade it with Powertools for AWS Lambda (Python) :-)

```
import os

import boto3

from aws_lambda_powertools import Logger, Tracer
from aws_lambda_powertools.event_handler import APIGatewayRestResolver
from aws_lambda_powertools.logging import correlation_paths

cold_start = True
metric_namespace = "MyApp"

logger = Logger(service="APP")
tracer = Tracer(service="APP")
metrics = boto3.client("cloudwatch")
app = APIGatewayRestResolver()


@tracer.capture_method
def add_greeting_metric(service: str = "APP"):
    function_name = os.getenv("AWS_LAMBDA_FUNCTION_NAME", "undefined")
    service_dimension = {"Name": "service", "Value": service}
    function_dimension = {"Name": "function_name", "Value": function_name}
    is_cold_start = True

    global cold_start
    if cold_start:
        cold_start = False
    else:
        is_cold_start = False

    return metrics.put_metric_data(
        MetricData=[
            {
                "MetricName": "SuccessfulGreetings",
                "Dimensions": [service_dimension],
                "Unit": "Count",
                "Value": 1,
            },
            {
                "MetricName": "ColdStart",
                "Dimensions": [service_dimension, function_dimension],
                "Unit": "Count",
                "Value": int(is_cold_start)
            }
        ],
        Namespace=metric_namespace,
    )


@app.get("/hello/<name>")
@tracer.capture_method
def hello_name(name):
    tracer.put_annotation(key="User", value=name)
    logger.info(f"Request from {name} received")
    add_greeting_metric()
    return {"message": f"hello {name}!"}


@app.get("/hello")
@tracer.capture_method
def hello():
    tracer.put_annotation(key="User", value="unknown")
    logger.info("Request from unknown received")
    add_greeting_metric()
    return {"message": "hello unknown!"}


@logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST, log_event=True)
@tracer.capture_lambda_handler
def lambda_handler(event, context):
    return app.resolve(event, context)
```

```
AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31
Description: Sample SAM Template for powertools-quickstart
Globals:
  Function:
    Timeout: 3
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/
      Handler: app.lambda_handler
      Runtime: python3.9
      Tracing: Active
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /hello
            Method: get
        HelloWorldName:
          Type: Api
          Properties:
            Path: /hello/{name}
            Method: get
      Policies:
        - CloudWatchPutMetricPolicy: {}
Outputs:
  HelloWorldApi:
    Description: "API Gateway endpoint URL for Prod stage for Hello World function"
    Value: !Sub "https://${ServerlessRestApi}.execute-api.${AWS::Region}.amazonaws.com/Prod/hello/"
```

There's a lot going on, so let's break this down:

- **L10**: We define a container where all of our application metrics will live, `MyApp`, a.k.a. a [metrics namespace](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/cloudwatch_concepts.html).
- **L14**: We initialize a CloudWatch client to send metrics later.
- **L19-47**: We create a custom function to prepare and send `ColdStart` and `SuccessfulGreetings` metrics using the data structure CloudWatch expects. We also set the [dimensions](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/cloudwatch_concepts.html#Dimension) of these metrics.
    - Think of dimensions as metadata you define to slice and dice metrics later; a unique metric is the combination of a metric name and its dimension(s).
- **L55,64**: We call our custom function to create metrics for every greeting received.

Question
But what permissions do I need to send metrics to CloudWatch?

Within `template.yaml`, we add the [CloudWatchPutMetricPolicy](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-policy-template-list.html#cloudwatch-put-metric-policy) policy in SAM.

Adding metrics via the AWS SDK gives a lot of flexibility, at a cost: `put_metric_data` is a synchronous call to the CloudWatch Metrics API. This means establishing a connection to the CloudWatch endpoint, sending the metrics payload, and waiting for a response. It will be visible in your AWS X-Ray traces as an additional external call. Depending on your architecture's scale, this approach can lead to disadvantages such as increased data collection cost and increased Lambda latency.

### Simplifying with Metrics

[Powertools for AWS Lambda (Python) Metrics](../core/metrics/) uses the [Amazon CloudWatch Embedded Metric Format (EMF)](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Embedded_Metric_Format.html) to create custom metrics **asynchronously** via a native integration with Lambda.

In general terms, EMF is a specification that expects metrics in a JSON payload within CloudWatch Logs. Lambda ingests all logs emitted by a given function into CloudWatch Logs. CloudWatch automatically looks for log entries that follow the EMF format and transforms them into CloudWatch metrics.

Info
If you are interested in the details of the EMF mechanism, read this [blog post](https://aws.amazon.com/blogs/mt/enhancing-workload-observability-using-amazon-cloudwatch-embedded-metric-format/).
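To make that specification concrete, here is a minimal, hand-rolled sketch of emitting an EMF payload yourself; the namespace, service, and metric names simply mirror the earlier example, and this is only to illustrate the kind of JSON document the Metrics utility generates for you.

```
import json
import time


def emit_emf_metric(name: str, value: float, namespace: str = "MyApp", service: str = "APP") -> None:
    # Printing this JSON document to stdout is enough: Lambda ships it to CloudWatch Logs,
    # and CloudWatch converts it into a metric asynchronously - no API call, no added latency.
    payload = {
        "_aws": {
            "Timestamp": int(time.time() * 1000),
            "CloudWatchMetrics": [
                {
                    "Namespace": namespace,
                    "Dimensions": [["service"]],
                    "Metrics": [{"Name": name, "Unit": "Count"}],
                }
            ],
        },
        "service": service,
        name: value,
    }
    print(json.dumps(payload))


emit_emf_metric("SuccessfulGreetings", 1)
```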
Let's implement that using [Metrics](../core/metrics/):

```
from aws_lambda_powertools import Logger, Tracer, Metrics
from aws_lambda_powertools.event_handler import APIGatewayRestResolver
from aws_lambda_powertools.logging import correlation_paths
from aws_lambda_powertools.metrics import MetricUnit

logger = Logger(service="APP")
tracer = Tracer(service="APP")

metrics = Metrics(namespace="MyApp", service="APP")
app = APIGatewayRestResolver()


@app.get("/hello/<name>")
@tracer.capture_method
def hello_name(name):
    tracer.put_annotation(key="User", value=name)
    logger.info(f"Request from {name} received")
    metrics.add_metric(name="SuccessfulGreetings", unit=MetricUnit.Count, value=1)
    return {"message": f"hello {name}!"}


@app.get("/hello")
@tracer.capture_method
def hello():
    tracer.put_annotation(key="User", value="unknown")
    logger.info("Request from unknown received")
    metrics.add_metric(name="SuccessfulGreetings", unit=MetricUnit.Count, value=1)
    return {"message": "hello unknown!"}


@tracer.capture_lambda_handler
@logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST, log_event=True)
@metrics.log_metrics(capture_cold_start_metric=True)
def lambda_handler(event, context):
    try:
        return app.resolve(event, context)
    except Exception as e:
        logger.exception(e)
        raise
```

That's a lot less boilerplate code! Let's break this down:

- **L9**: We initialize `Metrics` with our service name (`APP`) and metrics namespace (`MyApp`), reducing the need to add the `service` dimension for every metric and to set the namespace later.
- **L18,27**: We use `add_metric` similarly to our custom function, except we now have the `MetricUnit` enum to help us understand which metric units we have at our disposal.
- **L33**: We use the `@metrics.log_metrics` decorator to ensure that our metrics are aligned with the EMF output and validated beforehand, e.g., in case we forget to set a namespace, or accidentally use a metric unit as a string that doesn't exist in CloudWatch.
- **L33**: We also use `capture_cold_start_metric=True` so we don't have to handle that logic either. Note that [Metrics](../core/metrics/) does not publish a warm invocation metric (ColdStart=0) for cost reasons. As such, treat the absence of the metric (a sparse metric) as a non-cold-start invocation.

Repeat the process of building, deploying, and invoking your application via the API endpoint a few times to generate metrics - [Artillery](https://www.artillery.io/) and [K6.io](https://k6.io/open-source) are quick ways to generate some load.

Within the CloudWatch Metrics view, you should see the `MyApp` custom namespace with your custom metrics, and `SuccessfulGreetings` available to graph.

If you're curious about how the EMF portion of your function logs looks, you can quickly go to the [CloudWatch ServiceLens view](https://console.aws.amazon.com/cloudwatch/home#servicelens:service-map/map), choose your function, and open its logs. You will see an entry similar to this:

```
{
    "_aws": {
        "Timestamp": 1638115724269,
        "CloudWatchMetrics": [
            {
                "Namespace": "CustomMetrics",
                "Dimensions": [
                    [
                        "method",
                        "service"
                    ]
                ],
                "Metrics": [
                    {
                        "Name": "AppMethodsInvocations",
                        "Unit": "Count"
                    }
                ]
            }
        ]
    },
    "method": "/hello/",
    "service": "APP",
    "AppMethodsInvocations": [
        1
    ]
}
```

## Final considerations

We covered a lot of ground here, and we only scratched the surface of the feature set available within Powertools for AWS Lambda (Python).
When it comes to the observability features ([Tracer](../core/tracer/), [Metrics](../core/metrics/), [Logging](../core/logger/)), don't stop there! The goal here is to ensure you can ask arbitrary questions to assess your system's health; these features are only part of the wider story!

This requires a change in mindset to ensure operational excellence is part of the software development lifecycle.

Tip
You can find more details on other leading practices described in the [Well-Architected Serverless Lens](https://aws.amazon.com/blogs/aws/new-serverless-lens-in-aws-well-architected-tool/). Powertools for AWS Lambda (Python) is largely designed to make some of these practices easier to adopt from day 1.

Have ideas for other tutorials? You can open a [documentation issue](https://github.com/aws-powertools/powertools-lambda-python/issues/new?assignees=&labels=documentation&template=documentation-improvements.md&title=Tutorial%20Suggestion), or e-mail us at [aws-powertools-maintainers@amazon.com](mailto:aws-powertools-maintainers@amazon.com).