Deploying AWS Lambda Function with Terraform + Custom Dependencies

Deploying Lambda Functions written in Python that use custom dependencies not included in the AWS Lambda Python runtime, using Terraform as Infrastructure as Code (IaC).

AWS Lambda Functions

In this article, we will learn:

  • How to deploy a Lambda function on Amazon Web Services.
  • How to use Terraform to deploy our project infrastructure.
  • How to integrate S3 notifications with our Lambda Function.
  • How to package our Lambda Function with all of its dependencies and deploy it to AWS.

The purpose of this example project is to create a Lambda Function that will be triggered by the AWS S3 event thrown when a video in “.mp4” format is copied into an S3 Bucket. The Lambda Function will notify us by calling a mocked REST API endpoint.

Project structure:





Terraform allows us to define infrastructure as code and manage the full lifecycle of our resources: create new resources, manage existing ones, and destroy those no longer needed.

To install Terraform on your system, go through:

Writing the Lambda Function

This Lambda Function has a few behaviors: first, to exercise an external package, we use “json_checker” to validate a dictionary against a schema; then we read the file references sent in the S3 event notification; and finally we send the data to a REST API endpoint using “requests”:


import json
import requests

from json_checker import Checker

def lambda_handler(event, context):
    current_data = {'first_key': 1, 'second_key': '2'}
    expected_schema = {'first_key': int, 'second_key': str}
    checker = Checker(expected_schema)

    s3_events = []
    for record in event['Records']:
        s3_events.append({
            "bucket": record['s3']['bucket']['name'],
            "file": record['s3']['object']['key']
        })

    data = {
        "result": checker.validate(current_data),
        "s3_events": s3_events
    }

    try:
        # Replace the placeholder with your specific mock endpoint URL.
        response = requests.post('<YOUR_MOCK_ENDPOINT_URL>', json=data)
        print('Response status code: ', response.status_code)
    except Exception as error:
        print('Error in request: ', str(error))

    return {
        "statusCode": 200,
        "body": json.dumps({
            "result": checker.validate(current_data),
            "s3_events": s3_events
        })
    }
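Before wiring everything up in AWS, the handler's record-extraction step can be checked locally with a hand-built event. This sketch (bucket and file names taken from this article's example) reproduces the loop over `event['Records']`:

```python
# Minimal shape of the S3 event payload that Lambda receives.
sample_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "aws-lambda-function-read-videos"},
                "object": {"key": "test_video.mp4"},
            }
        }
    ]
}

def extract_s3_events(event):
    # Same extraction as in lambda_handler: collect bucket/file pairs.
    return [
        {
            "bucket": record["s3"]["bucket"]["name"],
            "file": record["s3"]["object"]["key"],
        }
        for record in event["Records"]
    ]

print(extract_s3_events(sample_event))
# → [{'bucket': 'aws-lambda-function-read-videos', 'file': 'test_video.mp4'}]
```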



Defining AWS resources with Terraform

Some files will be written in order to define all the resources and relations needed by our project. First, we'll define some variables so we can reuse them across all our resources.

variable "path_source_code" {
  default = "lambda_function"
}

variable "function_name" {
  default = "aws_lambda_test"
}

variable "runtime" {
  default = "python3.7"
}

variable "output_path" {
  description = "Path to function's deployment package into local filesystem. eg: /path/"
  default = ""
}

variable "distribution_pkg_folder" {
  description = "Folder name to create distribution files..."
  default = "lambda_dist_pkg"
}

variable "bucket_for_videos" {
  description = "Bucket name to put videos to process..."
  default = "aws-lambda-function-read-videos"
}

We need to define the role that will execute the Lambda Function, then define the security policy and attach it to the role.

resource "aws_iam_role" "lambda_exec_role" {
  name = "lambda_exec_role"
  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      }
    }
  ]
}
EOF
}

data "aws_iam_policy_document" "lambda_policy_doc" {
  # The resource/action lists below are standard values assumed from each
  # statement's sid; adjust them to your account's needs.
  statement {
    sid = "AllowInvokingLambdas"
    effect = "Allow"

    resources = [
      "arn:aws:lambda:*:*:function:*"
    ]

    actions = [
      "lambda:InvokeFunction"
    ]
  }

  statement {
    sid = "AllowCreatingLogGroups"
    effect = "Allow"

    resources = [
      "arn:aws:logs:*:*:*"
    ]

    actions = [
      "logs:CreateLogGroup"
    ]
  }

  statement {
    sid = "AllowWritingLogs"
    effect = "Allow"

    resources = [
      "arn:aws:logs:*:*:log-group:*:*"
    ]

    actions = [
      "logs:CreateLogStream",
      "logs:PutLogEvents"
    ]
  }
}

resource "aws_iam_policy" "lambda_iam_policy" {
  name = "lambda_iam_policy"
  policy = data.aws_iam_policy_document.lambda_policy_doc.json
}

resource "aws_iam_role_policy_attachment" "lambda_policy_attachment" {
  policy_arn = aws_iam_policy.lambda_iam_policy.arn
  role = aws_iam_role.lambda_exec_role.name
}

Later, we need to define the bucket involved in this process. We will also define its relation with the Lambda Function (don't worry about the files where the relations are defined; Terraform manages them properly).

resource "aws_s3_bucket" "bucket_read_videos" {
  bucket = var.bucket_for_videos
}

resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket = aws_s3_bucket.bucket_read_videos.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.aws_lambda_test.arn
    events = ["s3:ObjectCreated:*"]
    filter_suffix = ".mp4"
  }
}
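With `filter_suffix = ".mp4"`, S3 only invokes the function for object keys ending in “.mp4”. The selection behaves like a simple suffix check on the full key:

```python
def notification_fires(key, suffix=".mp4"):
    # S3 compares the suffix filter against the full object key.
    return key.endswith(suffix)

print(notification_fires("videos/test_video.mp4"))  # True
print(notification_fires("videos/readme.txt"))      # False
```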

Finally, we will define the Lambda Function and the intermediate processes in charge of packaging our application.

resource "null_resource" "install_python_dependencies" {
  provisioner "local-exec" {
    command = "bash ${path.module}/scripts/"

    environment = {
      source_code_path = var.path_source_code
      function_name = var.function_name
      path_module = path.module
      runtime = var.runtime
      path_cwd = path.cwd
      # Assumed: the packaging script below reads $dir_name.
      dir_name = var.distribution_pkg_folder
    }
  }
}

data "archive_file" "create_dist_pkg" {
  depends_on = [null_resource.install_python_dependencies]
  source_dir = "${path.cwd}/lambda_dist_pkg/"
  output_path = var.output_path
  type = "zip"
}

resource "aws_lambda_function" "aws_lambda_test" {
  function_name = var.function_name
  description = "Process video and does face recognition..."
  handler = "lambda_function.lambda.lambda_handler"
  runtime = var.runtime

  role = aws_iam_role.lambda_exec_role.arn
  memory_size = 128
  timeout = 300

  depends_on = [null_resource.install_python_dependencies]
  source_code_hash = data.archive_file.create_dist_pkg.output_base64sha256
  filename = data.archive_file.create_dist_pkg.output_path
}
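`source_code_hash` lets Terraform detect when the deployment package changed: `output_base64sha256` is the base64-encoded SHA-256 digest of the zip's raw bytes. A small sketch of the same computation:

```python
import base64
import hashlib

def base64sha256(path):
    # Base64-encoded SHA-256 of a file's raw bytes, matching what
    # Terraform's archive_file exposes as output_base64sha256.
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).digest()
    return base64.b64encode(digest).decode("ascii")
```

When this hash differs from the one recorded in state, `terraform apply` re-uploads the package.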

resource "aws_lambda_permission" "allow_bucket" {
  function_name = aws_lambda_function.aws_lambda_test.arn
  source_arn = aws_s3_bucket.bucket_read_videos.arn
  statement_id = "AllowExecutionFromS3Bucket"
  action = "lambda:InvokeFunction"
  principal = "s3.amazonaws.com"
}

As you can see, in the first definition block we are calling the following script from the “/scripts/” folder:


#!/bin/bash

echo "Executing..."

cd $path_cwd
mkdir -p $dir_name

# Create and activate virtual environment...
virtualenv -p $runtime env_$function_name
source $path_cwd/env_$function_name/bin/activate

# Installing python dependencies...
# NOTE: the requirements file location is assumed; adjust it to your layout.
FILE=$path_cwd/requirements.txt

if [ -f "$FILE" ]; then
  echo "Installing dependencies..."
  echo "From: requirements.txt file exists..."
  pip install -r "$FILE"
else
  echo "Error: requirements.txt does not exist!"
fi

# Deactivate virtual environment...
deactivate

# Create deployment package...
echo "Creating deployment package..."
cd env_$function_name/lib/$runtime/site-packages/
cp -r . $path_cwd/$dir_name
cp -r $path_cwd/lambda_function/ $path_cwd/$dir_name

# Removing virtual environment folder...
echo "Removing virtual environment folder..."
rm -rf $path_cwd/env_$function_name

echo "Finished script execution!"
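The script's packaging flow (stage the installed packages plus the function source, then zip the staging folder) can be mimicked in plain Python for local experimentation; the directory names here are illustrative:

```python
import pathlib
import shutil
import zipfile

def build_dist_pkg(src_dir, dist_dir, zip_path):
    # Stage the source tree, then zip it, mirroring what the bash
    # script plus the archive_file data source do together.
    dist = pathlib.Path(dist_dir)
    if dist.exists():
        shutil.rmtree(dist)
    shutil.copytree(src_dir, dist)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(dist.rglob("*")):
            zf.write(path, path.relative_to(dist))
    return zip_path
```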

At this point, we are ready to deploy our Lambda Function. To do that, we need to execute the following commands:

$ terraform init
$ terraform apply

After that, we need to configure our mock endpoint (for example, using “python-lambda” as the endpoint name).
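If you would rather test against a local stand-in than a hosted mock service, a throwaway receiver can be spun up with the standard library alone (the port is chosen by the OS; all names here are illustrative):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

received = []  # bodies posted by the Lambda-style client

class MockEndpoint(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and remember the JSON body, then acknowledge with 200.
        length = int(self.headers.get("Content-Length", 0))
        received.append(json.loads(self.rfile.read(length)))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the console quiet

server = HTTPServer(("127.0.0.1", 0), MockEndpoint)
threading.Thread(target=server.serve_forever, daemon=True).start()
print("Mock endpoint listening on port", server.server_address[1])
```

Point the `requests.post(...)` call in the handler at `http://127.0.0.1:<port>/python-lambda` to watch the payload arrive.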

Important: remember to update your Lambda Function code with your specific endpoint URL.


Finally, copy a video file to your Bucket with the following command (remember that you will most likely be using a different Bucket name):

aws s3 cp test_video.mp4 s3://aws-lambda-function-read-videos/test_video.mp4

After the file copy finishes, the S3 event will be triggered and your endpoint will receive the information!

Don’t forget to delete your unused resources:

$ terraform destroy

Well, that’s all. I hope this information will be useful the next time you code!

Code repository:


Data Architect | Python Developer