- Type hinting and code completion for common event types
- Helper functions for decoding/deserializing nested fields
- Docstrings for fields contained in event schemas
Background
When authoring Lambda functions, you often need to understand the schema of the event dictionary that is passed to the
handler. Several common event types follow a specific schema, depending on the service triggering the Lambda function.
The examples provided below are far from exhaustive - the data classes themselves are designed to be self-documenting
(via autocompletion, types, and docstrings).
In this example, we also use the new Logger correlation_id and built-in correlation_paths to extract, if available, the X-Ray Trace ID from the AppSync request headers:
```python
from aws_lambda_powertools.logging import Logger, correlation_paths
from aws_lambda_powertools.utilities.data_classes.appsync_resolver_event import (
    AppSyncResolverEvent,
    AppSyncIdentityCognito,
)

logger = Logger()

def get_locations(name: str = None, size: int = 0, page: int = 0):
    """Your resolver logic here"""

@logger.inject_lambda_context(correlation_id_path=correlation_paths.APPSYNC_RESOLVER)
def lambda_handler(event, context):
    event: AppSyncResolverEvent = AppSyncResolverEvent(event)

    # Case insensitive look up of request headers
    x_forwarded_for = event.get_header_value("x-forwarded-for")

    # Support for AppSyncIdentityCognito or AppSyncIdentityIAM identity types
    assert isinstance(event.identity, AppSyncIdentityCognito)
    identity: AppSyncIdentityCognito = event.identity

    # Logging with correlation_id
    logger.debug({"x-forwarded-for": x_forwarded_for, "username": identity.username})

    if event.type_name == "Merchant" and event.field_name == "locations":
        return get_locations(**event.arguments)

    raise ValueError(f"Unsupported field resolver: {event.field_name}")
```
CloudWatch Logs events are compressed and base64 encoded by default. You can use the provided helper function to decode,
decompress, and parse the JSON data from the event.
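As a sketch of what that decoding involves, the payload can be unpacked with the standard library alone; the field names below follow the CloudWatch Logs subscription delivery format, and the sample payload is illustrative:

```python
import base64
import gzip
import json

def decode_cloudwatch_logs(event: dict) -> dict:
    """Base64-decode and gunzip the awslogs payload, then parse it as JSON."""
    raw = base64.b64decode(event["awslogs"]["data"])
    return json.loads(gzip.decompress(raw))

# Build a sample event shaped like a CloudWatch Logs subscription delivery
payload = {"logGroup": "/aws/lambda/example", "logEvents": [{"message": "hello"}]}
encoded = base64.b64encode(gzip.compress(json.dumps(payload).encode())).decode()

decoded = decode_cloudwatch_logs({"awslogs": {"data": encoded}})
print(decoded["logEvents"][0]["message"])  # hello
```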
Cognito User Pools have several different Lambda trigger sources, each of which maps to a different data class that
can be imported from aws_lambda_powertools.utilities.data_classes.cognito_user_pool_event:
The DynamoDB data class utility provides the base class DynamoDBStreamEvent, a typed class for
attribute values (AttributeValue), as well as enums for the stream view type (StreamViewType) and event type
(DynamoDBRecordEventName).
```python
from aws_lambda_powertools.utilities.data_classes.dynamo_db_stream_event import (
    DynamoDBStreamEvent,
    DynamoDBRecordEventName,
)

def lambda_handler(event, context):
    event: DynamoDBStreamEvent = DynamoDBStreamEvent(event)

    # Multiple records can be delivered in a single event
    for record in event.records:
        if record.event_name == DynamoDBRecordEventName.MODIFY:
            do_something_with(record.dynamodb.new_image)
            do_something_with(record.dynamodb.old_image)
```
Kinesis events contain base64 encoded data by default. You can use the helper functions to access the data as either JSON
or plain text, depending on the original payload.
```python
from aws_lambda_powertools.utilities.data_classes import KinesisStreamEvent

def lambda_handler(event, context):
    event: KinesisStreamEvent = KinesisStreamEvent(event)
    kinesis_record = next(event.records).kinesis

    # if data was delivered as text
    data = kinesis_record.data_as_text()
    # if data was delivered as json
    data = kinesis_record.data_as_json()

    do_something_with(data)
```
```python
from urllib.parse import unquote_plus
from aws_lambda_powertools.utilities.data_classes import S3Event

def lambda_handler(event, context):
    event: S3Event = S3Event(event)
    bucket_name = event.bucket_name

    # Multiple records can be delivered in a single event
    for record in event.records:
        object_key = unquote_plus(record.s3.get_object.key)

        do_something_with(f"{bucket_name}/{object_key}")
```
```python
import boto3
import requests

from aws_lambda_powertools import Logger
from aws_lambda_powertools.logging.correlation_paths import S3_OBJECT_LAMBDA
from aws_lambda_powertools.utilities.data_classes.s3_object_event import S3ObjectLambdaEvent

logger = Logger()
session = boto3.Session()
s3 = session.client("s3")

@logger.inject_lambda_context(correlation_id_path=S3_OBJECT_LAMBDA, log_event=True)
def lambda_handler(event, context):
    event = S3ObjectLambdaEvent(event)

    # Get object from S3
    response = requests.get(event.input_s3_url)
    original_object = response.content.decode("utf-8")

    # Make changes to the object about to be returned
    transformed_object = original_object.upper()

    # Write object back to S3 Object Lambda
    s3.write_get_object_response(
        Body=transformed_object,
        RequestRoute=event.request_route,
        RequestToken=event.request_token,
    )

    return {"status_code": 200}
```
```python
from aws_lambda_powertools.utilities.data_classes import SESEvent

def lambda_handler(event, context):
    event: SESEvent = SESEvent(event)

    # Multiple records can be delivered in a single event
    for record in event.records:
        mail = record.ses.mail
        common_headers = mail.common_headers

        do_something_with(common_headers.to, common_headers.subject)
```
```python
from aws_lambda_powertools.utilities.data_classes import SNSEvent

def lambda_handler(event, context):
    event: SNSEvent = SNSEvent(event)

    # Multiple records can be delivered in a single event
    for record in event.records:
        message = record.sns.message
        subject = record.sns.subject

        do_something_with(subject, message)
```
```python
from aws_lambda_powertools.utilities.data_classes import SQSEvent

def lambda_handler(event, context):
    event: SQSEvent = SQSEvent(event)

    # Multiple records can be delivered in a single event
    for record in event.records:
        do_something_with(record.body)
```