You may often face the requirement to implement a generic approach to sending emails from all your applications or microservices. This post describes a possible serverless architecture that provides a mechanism to send emails through an API endpoint within Amazon Web Services.

Application Requirements

  • Serverless.
  • Fault-tolerant.
  • Support for attachments.

Architecture

The use of an API Gateway as an entry point to a Lambda Function is probably a common pattern in serverless applications, using the Lambda Function to handle the logic and communicate with other resources within AWS or third-party services. …
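As a rough illustration of this pattern, the handler below is a minimal sketch, assuming the request arrives through API Gateway as a JSON body and that Amazon SES is the delivery service; the field names and the sender address are illustrative assumptions, not taken from the original project.

import json
import boto3

# Assumed sketch: API Gateway-triggered Lambda that relays the request body
# to Amazon SES. Field names (to, subject, body) and the verified sender
# address are hypothetical placeholders.
ses = boto3.client("ses")

def handler(event, context):
    payload = json.loads(event.get("body") or "{}")
    response = ses.send_email(
        Source="no-reply@example.com",  # must be a verified SES identity
        Destination={"ToAddresses": [payload["to"]]},
        Message={
            "Subject": {"Data": payload["subject"]},
            "Body": {"Text": {"Data": payload["body"]}},
        },
    )
    return {
        "statusCode": 200,
        "body": json.dumps({"messageId": response["MessageId"]}),
    }

Since attachment support is one of the requirements, a fuller implementation would build a MIME message and use SES send_raw_email instead of send_email.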


Dataproc is a managed Apache Spark / Apache Hadoop service that lets you take advantage of these open source data tools for batch processing, querying, streaming, and machine learning.

This project is intended to explain how you can run a script on Dataproc. For this purpose, we will create a script that reads a log file, extracts the host, datetime, method, endpoint, protocol, and status code, and saves the extracted information.
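A minimal sketch of such a script, assuming PySpark and an Apache-style access log; the input path and output location are hypothetical placeholders, not the ones used in the original project.

import re
from pyspark.sql import SparkSession
from pyspark.sql.functions import regexp_extract

# Assumed sketch: parse an Apache-style access log with PySpark on Dataproc.
LOG_PATTERN = r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) (\S+)" (\d{3})'

spark = SparkSession.builder.appName("log-parser").getOrCreate()
raw = spark.read.text("gs://my-bucket/access.log")  # hypothetical input path

parsed = raw.select(
    regexp_extract("value", LOG_PATTERN, 1).alias("host"),
    regexp_extract("value", LOG_PATTERN, 2).alias("datetime"),
    regexp_extract("value", LOG_PATTERN, 3).alias("method"),
    regexp_extract("value", LOG_PATTERN, 4).alias("endpoint"),
    regexp_extract("value", LOG_PATTERN, 5).alias("protocol"),
    regexp_extract("value", LOG_PATTERN, 6).cast("int").alias("status_code"),
)
parsed.write.mode("overwrite").parquet("gs://my-bucket/output/")  # hypothetical destination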

In this project, we will also cover how to create and manage the needed resources. …


Even though we already have mature data processing tools, there are situations where we need one that provides a level of abstraction between our application logic and the big data ecosystem, or even a unified model for both stream and batch processing. That is where Apache Beam comes on the scene.

In this post, we’ll cover two simple examples, running our pipelines on a local environment as well as on Google Cloud Dataflow.

Apache Beam

Apache Beam is an open source, unified programming model to define and execute data processing…
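To give a feel for the model, here is a minimal word-count style pipeline run locally with the default DirectRunner; the input and output file names are placeholders and this is not the exact example from the post.

import apache_beam as beam

# Assumed sketch: a simple Beam pipeline executed locally (DirectRunner).
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read lines" >> beam.io.ReadFromText("input.txt")
        | "Split words" >> beam.FlatMap(lambda line: line.split())
        | "Pair with 1" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda word, count: f"{word}: {count}")
        | "Write" >> beam.io.WriteToText("counts")
    )

Running the same pipeline on a managed runner only requires passing different pipeline options; the transforms stay unchanged, which is the point of the unified model.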


Building serverless ETL pipelines using AWS Glue.


AWS Glue automatically discovers and categorises your data and makes it immediately searchable and queryable using Amazon Athena, Amazon EMR, or Amazon Redshift Spectrum.

In this example project, you’ll learn how to use AWS Glue to transform your data stored in S3 buckets and query it using Athena.
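Once the transformed data has been catalogued, it can also be queried from code. A minimal sketch using the boto3 Athena client; the database, table, and S3 output location are hypothetical placeholders.

import boto3

# Assumed sketch: run a query against a Glue-catalogued table with Athena.
athena = boto3.client("athena", region_name="us-east-1")

response = athena.start_query_execution(
    QueryString="SELECT status_code, COUNT(*) AS hits FROM access_logs GROUP BY status_code",
    QueryExecutionContext={"Database": "my_glue_database"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print(response["QueryExecutionId"])  # poll get_query_execution / get_query_results for the output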

Create required resources

We will use Terraform to create the role with the required permissions. Of course, you could create all these resources manually, but if you feel comfortable using Terraform, you can use the following definitions.

// resources.tf
provider "aws" {
  region = "us-east-1"
}

resource "aws_iam_role" "aws_iam_glue_role" …


Sometimes a serverless approach using Lambda Functions won’t be the best solution for our project’s needs. If you need to manage your own EC2 instances, a recommended approach is to use an Elastic Load Balancer (ELB) and an Auto Scaling Group (ASG) so the infrastructure can respond effectively to increasing or decreasing workloads.

In this demo project, we’ll learn:

  • How to create our Amazon Machine Image (AMI).
  • How to use Terraform to deploy our Infrastructure as Code (IaC).
  • Create and deploy an application.
  • How to use Elastic Load Balancer, Auto Scaling Group and EC2 instances for a…
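The scaling behaviour described above can be expressed as a target-tracking policy on the Auto Scaling Group. The project provisions this with Terraform; purely as an illustration in Python, the equivalent could look roughly like the boto3 call below, where the ASG name and the 50% CPU target are hypothetical.

import boto3

# Assumed sketch: attach a target-tracking scaling policy to an existing ASG
# so it scales in and out on average CPU utilization.
autoscaling = boto3.client("autoscaling", region_name="us-east-1")

autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-app-asg",        # hypothetical ASG name
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization",
        },
        "TargetValue": 50.0,
    },
)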


Deploying Lambda Functions written in Python that use custom dependencies not included in the AWS Lambda Runtime for Python, using Terraform as Infrastructure as Code (IaC).

AWS Lambda Functions

In this article, we will learn:

  • How to deploy a Lambda Function on Amazon Web Services.
  • How to use Terraform to deploy our project infrastructure.
  • How to integrate S3 notifications with our Lambda Function.
  • How to package our Lambda Function with all our dependencies and deploy it to AWS.

The purpose of this example project is to create a Lambda Function that will be activated by an AWS S3 event thrown when a video in “.mp4” format is copied…
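The handler shape for such an S3-triggered function looks roughly like the sketch below: it iterates over the event records and extracts the bucket and the “.mp4” object key, with the actual video processing left as a placeholder since it depends on the bundled custom dependencies.

import urllib.parse

# Assumed sketch: Lambda handler invoked by an S3 ObjectCreated notification.
def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        if key.endswith(".mp4"):
            print(f"New video uploaded: s3://{bucket}/{key}")
            # process the video here, e.g. with a custom library packaged
            # alongside the function code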

Alejandro Cora González

Data Architect | Python Developer
