You may often face the requirement to implement a generic way to send emails from all of your applications or microservices. This post describes a possible serverless architecture that provides a mechanism for sending emails through an API endpoint on Amazon Web Services.

Application Requirements

  • Serverless.
  • Fault-tolerant.
  • Support for attachments.


Using AWS API Gateway as the entry point to a Lambda Function is a common pattern in serverless applications: the Lambda Function handles the logic and communicates with other resources within AWS or with third-party services. …
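To cover the attachments requirement listed above, the Lambda Function needs to build a raw MIME message before handing it to SES. The following is a minimal sketch using only Python's standard library; the function name and parameters are illustrative, not taken from the original project.

```python
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText


def build_raw_email(sender, recipient, subject, body,
                    attachment_name=None, attachment_bytes=None):
    """Build a raw MIME message, optionally with one attachment."""
    msg = MIMEMultipart()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.attach(MIMEText(body, "plain"))

    if attachment_bytes is not None:
        part = MIMEApplication(attachment_bytes)
        part.add_header("Content-Disposition", "attachment",
                        filename=attachment_name)
        msg.attach(part)

    return msg.as_string()
```

Inside the Lambda handler, the resulting string could then be sent with the boto3 SES client, e.g. `ses.send_raw_email(RawMessage={"Data": raw})`.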

There are many use cases where a serverless architecture has a lot of benefits, and an API service is no exception. In a previous post, we learned how to deploy a Flask application on EC2 using a Load Balancer and Auto Scaling Groups. Of course, you could also deploy it to the AWS Elastic Container Service, but what if you don’t want to manage instances or clusters?

The first piece of advice we can give you is not to architect differently for serverless: you should be able to deploy your service in different environments…

Dataproc is a managed Apache Spark / Apache Hadoop service that lets you take advantage of those open source data tools for batch processing, querying, streaming, and machine learning.

This project explains how you can run an Apache Spark script on Google Cloud Dataproc. For this purpose, we will create a script that reads an Apache Server log file, extracts the host, datetime, method, endpoint, protocol and status code, and saves the information into BigQuery.
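The extraction step described above boils down to a regular expression over each log line. As a sketch (the function name and the exact pattern are assumptions, based on the standard Apache Combined/Common Log Format rather than the original script), this is the kind of parser the Spark job would map over the raw lines before writing to BigQuery:

```python
import re

# Matches a Common Log Format line, e.g.:
# 127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
LOG_PATTERN = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[(?P<datetime>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<endpoint>\S+) (?P<protocol>[^"]+)" (?P<status>\d{3})'
)


def parse_log_line(line):
    """Extract host, datetime, method, endpoint, protocol and status code."""
    match = LOG_PATTERN.match(line)
    if match is None:
        return None  # malformed lines can be filtered out downstream
    row = match.groupdict()
    row["status"] = int(row["status"])
    return row
```

In the Spark script this would typically be applied with something like `lines.map(parse_log_line).filter(lambda r: r is not None)` before the BigQuery write.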

In this project, we will use the Google Cloud SDK to manage or create the needed resources. …

Even though we already have tools such as Apache Hadoop, Apache Spark and Apache Flink, there are situations where we need a tool that provides a layer of abstraction between our application logic and the big data ecosystem, or even a unified model for both stream and batch processing. That is where Apache Beam comes onto the scene.

In this post, we’ll cover two simple examples, running our pipelines both in a local environment and on Google Cloud Dataflow.

Apache Beam

Apache Beam is an open source unified programming model to define and execute data processing…
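To give a feel for the model without requiring the `apache-beam` package, here are the per-element functions a classic word-count pipeline would use; the comments note which Beam transform each plain-Python step corresponds to. The function names are illustrative assumptions, not from the original examples.

```python
import re
from collections import Counter


def tokenize(line):
    # In a Beam pipeline this would be applied with beam.FlatMap(tokenize),
    # emitting zero or more words per input line.
    return re.findall(r"[a-z']+", line.lower())


def count_words(lines):
    # Equivalent to flattening the tokens and applying
    # beam.combiners.Count.PerElement() to the resulting PCollection.
    counts = Counter()
    for line in lines:
        counts.update(tokenize(line))
    return dict(counts)
```

The appeal of Beam's unified model is that the same pipeline built from these steps can run unchanged on a local DirectRunner or on Google Cloud Dataflow.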

Building serverless ETL pipelines using AWS Glue.

(Image taken from: https://aws.amazon.com/glue/)

AWS Glue automatically discovers and categorizes your data, making it immediately searchable and queryable with Amazon Athena, Amazon Redshift or Amazon EMR.

In this example project you’ll learn how to use AWS Glue to transform data stored in S3 buckets and query it using Athena.

Create required resources

We will use Terraform to create the IAM role with the required permissions. Of course, you could create all these resources manually, but if you feel comfortable using Terraform you can use the following definitions.

// resources.tf
provider "aws" {
  region = "us-east-1"
}

resource "aws_iam_role" "aws_iam_glue_role" …

Sometimes a serverless approach using Lambda Functions won’t be the best solution for your project’s needs. If you need to create your own EC2 instances, a recommended solution is to use an Elastic Load Balancer (ELB) and an Auto Scaling Group (ASG) to respond effectively to an increasing or decreasing workload.

In this demo project, we’ll learn:

  • How to create our own Amazon Machine Image (AMI).
  • How to use Terraform to deploy our Infrastructure as Code (IaC).
  • How to create and deploy a Flask application using Gunicorn and Nginx.
  • How to use an Elastic Load Balancer, an Auto Scaling Group and EC2 instances for a…
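The Flask application itself is out of scope here, but the shape of what Gunicorn serves behind Nginx is just a WSGI callable. As a standard-library-only sketch (the response text and callable name are illustrative assumptions):

```python
def application(environ, start_response):
    """Minimal WSGI app; Gunicorn can serve any callable with this signature."""
    body = b"Hello from behind the Load Balancer!\n"
    headers = [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ]
    start_response("200 OK", headers)
    return [body]
```

With this saved as `app.py`, Gunicorn would serve it with `gunicorn app:application`, and Nginx would `proxy_pass` to the Gunicorn socket; the ELB then distributes traffic across the ASG's instances running this stack.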

Deploying Lambda Functions written in Python that use custom dependencies not included in the AWS Lambda Runtime for Python, with Terraform as Infrastructure as Code (IaC).

AWS Lambda Functions

In this article, we will learn:

  • How to deploy a Lambda function on Amazon Web Services.
  • How to use Terraform to deploy our project infrastructure.
  • How to integrate S3 notifications with our Lambda Function.
  • How to package our Lambda Function with all our dependencies and deploy it to AWS.

The purpose of this example project is to create a Lambda Function that will be triggered by the AWS S3 event thrown when a video in “.mp4” format is copied…
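The handler that receives such a notification unpacks the S3 event payload and filters for “.mp4” keys. A minimal sketch of that entry point (the processing step itself is elided; the handler and field access follow the documented S3 event notification structure, but the return shape is an assumption):

```python
import urllib.parse


def lambda_handler(event, context):
    """Entry point invoked by the S3 ObjectCreated notification."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        if key.endswith(".mp4"):
            # ... hand the video off to the actual processing step here ...
            processed.append((bucket, key))
    return {"processed": processed}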

Alejandro Cora González

Data Architect | Python Developer
