- Type hinting and code completion for common event types
- Helper functions for decoding/deserializing nested fields
- Docstrings for fields contained in event schemas
Background
When authoring Lambda functions, you often need to understand the schema of the event dictionary which is passed to the
handler. There are several common event types which follow a specific schema, depending on the service triggering the
Lambda function.
The classes are initialized by passing the Lambda event object into the constructor of the appropriate data class, or
by using the `event_source` decorator.
For example, if your Lambda function is being triggered by an API Gateway proxy integration, you can use the
`APIGatewayProxyEvent` class.
The examples provided below are far from exhaustive; the data classes themselves are designed to be self-documenting
(via autocompletion, types and docstrings).
In this example, we also use the new Logger `correlation_id` and built-in `correlation_paths` to extract, if available, the X-Ray Trace ID from the AppSync request headers:
```python
from aws_lambda_powertools.logging import Logger, correlation_paths
from aws_lambda_powertools.utilities.data_classes.appsync_resolver_event import (
    AppSyncResolverEvent,
    AppSyncIdentityCognito,
)

logger = Logger()


def get_locations(name: str = None, size: int = 0, page: int = 0):
    """Your resolver logic here"""


@logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER)
def lambda_handler(event, context):
    event: AppSyncResolverEvent = AppSyncResolverEvent(event)

    # Case insensitive look up of request headers
    x_forwarded_for = event.get_header_value("x-forwarded-for")

    # Support for AppSyncIdentityCognito or AppSyncIdentityIAM identity types
    assert isinstance(event.identity, AppSyncIdentityCognito)
    identity: AppSyncIdentityCognito = event.identity

    # Logging with correlation_id
    logger.debug({"x-forwarded-for": x_forwarded_for, "username": identity.username})

    if event.type_name == "Merchant" and event.field_name == "locations":
        return get_locations(**event.arguments)

    raise ValueError(f"Unsupported field resolver: {event.field_name}")
```
CloudWatch Logs events by default are compressed and base64 encoded. You can use the helper function provided to decode,
decompress and parse JSON data from the event.
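Under the hood, the payload lives at `awslogs.data` as base64-encoded, gzip-compressed JSON. As a stdlib-only sketch of what the helper does (the function name `decode_cloudwatch_logs` is ours, not the library's):

```python
import base64
import gzip
import json


def decode_cloudwatch_logs(event: dict) -> dict:
    """Decode the base64-encoded, gzip-compressed CloudWatch Logs payload."""
    payload = base64.b64decode(event["awslogs"]["data"])
    return json.loads(gzip.decompress(payload))
```

With the data classes utility, the `CloudWatchLogsEvent` class wraps this same decoding in a helper method, so you rarely need to do it by hand.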
The `CodePipelineJobEvent` class provides helpers for CodePipeline job events, such as decoding user parameters and fetching pipeline artifacts:

```python
from aws_lambda_powertools import Logger
from aws_lambda_powertools.utilities.data_classes import event_source, CodePipelineJobEvent

logger = Logger()


@event_source(data_class=CodePipelineJobEvent)
def lambda_handler(event, context):
    """The Lambda function handler

    If a continuing job then checks the CloudFormation stack status
    and updates the job accordingly.

    If a new job then kick off an update or creation of the target
    CloudFormation stack.
    """
    # Extract the Job ID
    job_id = event.get_id

    # Extract the params
    params: dict = event.decoded_user_parameters
    stack = params["stack"]
    artifact_name = params["artifact"]
    template_file = params["file"]

    try:
        if event.data.continuation_token:
            # If we're continuing then the create/update has already been triggered
            # we just need to check if it has finished.
            check_stack_update_status(job_id, stack)
        else:
            template = event.get_artifact(artifact_name, template_file)
            # Kick off a stack update or create
            start_update_or_create(job_id, stack, template)
    except Exception as e:
        # If any other exceptions which we didn't expect are raised
        # then fail the job and log the exception message.
        logger.exception("Function failed due to exception.")
        put_job_failure(job_id, "Function exception: " + str(e))

    logger.debug("Function complete.")
    return "Complete."
```
Cognito User Pools have several different Lambda trigger sources, all of which map to a different data class. These
can be imported from `aws_lambda_powertools.utilities.data_classes.cognito_user_pool_event`:
The DynamoDB data class utility provides the base class for `DynamoDBStreamEvent`, a typed class for
attribute values (`AttributeValue`), as well as enums for stream view type (`StreamViewType`) and event type
(`DynamoDBRecordEventName`).
```python
from aws_lambda_powertools.utilities.data_classes.dynamo_db_stream_event import (
    DynamoDBStreamEvent,
    DynamoDBRecordEventName,
)


def lambda_handler(event, context):
    event: DynamoDBStreamEvent = DynamoDBStreamEvent(event)

    # Multiple records can be delivered in a single event
    for record in event.records:
        if record.event_name == DynamoDBRecordEventName.MODIFY:
            do_something_with(record.dynamodb.new_image)
            do_something_with(record.dynamodb.old_image)
```
Kinesis events by default contain base64 encoded data. You can use the helper function to access the data either as JSON
or plain text, depending on the original payload.
```python
from aws_lambda_powertools.utilities.data_classes import event_source, KinesisStreamEvent


@event_source(data_class=KinesisStreamEvent)
def lambda_handler(event: KinesisStreamEvent, context):
    kinesis_record = next(event.records).kinesis

    # if data was delivered as text
    data = kinesis_record.data_as_text()

    # if data was delivered as json
    data = kinesis_record.data_as_json()

    do_something_with(data)
```
S3 events include the bucket and object details for each record. Object keys are URL-encoded, so decode them with `unquote_plus`:

```python
from urllib.parse import unquote_plus
from aws_lambda_powertools.utilities.data_classes import event_source, S3Event


@event_source(data_class=S3Event)
def lambda_handler(event: S3Event, context):
    bucket_name = event.bucket_name

    # Multiple records can be delivered in a single event
    for record in event.records:
        object_key = unquote_plus(record.s3.get_object.key)

        do_something_with(f"{bucket_name}/{object_key}")
```
For S3 Object Lambda, the `S3ObjectLambdaEvent` class exposes the presigned input URL along with the route and token needed to write the transformed object back:

```python
import boto3
import requests

from aws_lambda_powertools import Logger
from aws_lambda_powertools.logging.correlation_paths import S3_OBJECT_LAMBDA
from aws_lambda_powertools.utilities.data_classes.s3_object_event import S3ObjectLambdaEvent

logger = Logger()
session = boto3.Session()
s3 = session.client("s3")


@logger.inject_lambda_context(correlation_id_path=S3_OBJECT_LAMBDA, log_event=True)
def lambda_handler(event, context):
    event = S3ObjectLambdaEvent(event)

    # Get object from S3
    response = requests.get(event.input_s3_url)
    original_object = response.content.decode("utf-8")

    # Make changes to the object about to be returned
    transformed_object = original_object.upper()

    # Write object back to S3 Object Lambda
    s3.write_get_object_response(
        Body=transformed_object,
        RequestRoute=event.request_route,
        RequestToken=event.request_token,
    )

    return {"status_code": 200}
```
SES events expose the inbound mail and its parsed common headers for each record:

```python
from aws_lambda_powertools.utilities.data_classes import event_source, SESEvent


@event_source(data_class=SESEvent)
def lambda_handler(event: SESEvent, context):
    # Multiple records can be delivered in a single event
    for record in event.records:
        mail = record.ses.mail
        common_headers = mail.common_headers

        do_something_with(common_headers.to, common_headers.subject)
```
SNS events expose the message and subject for each record:

```python
from aws_lambda_powertools.utilities.data_classes import event_source, SNSEvent


@event_source(data_class=SNSEvent)
def lambda_handler(event: SNSEvent, context):
    # Multiple records can be delivered in a single event
    for record in event.records:
        message = record.sns.message
        subject = record.sns.subject

        do_something_with(subject, message)
```
SQS events expose each message body via `record.body`:

```python
from aws_lambda_powertools.utilities.data_classes import event_source, SQSEvent


@event_source(data_class=SQSEvent)
def lambda_handler(event: SQSEvent, context):
    # Multiple records can be delivered in a single event
    for record in event.records:
        do_something_with(record.body)
```