Automating repetitive tasks is a cornerstone of efficiency and operational excellence at scale. Coding your cloud infrastructure streamlines the development process and makes iterative development of your applications easier to manage. There are a number of sophisticated Infrastructure as Code (IaC) tools at your disposal. Each has its own rules (usually a Domain Specific Language, or DSL) and implementation concepts, so you need to spend some time understanding how a tool works before you can use it efficiently.
Pulumi is a platform for creating, deploying, and managing cloud infrastructure, and like most other IaC tools it offers multi-cloud support. What distinguishes Pulumi is that you program your infrastructure in a language you are already familiar with, which means you can start coding your cloud deployments without spending any time learning a new one.
How Pulumi Works:
This IaC tool operates using a state-driven approach. Pulumi leverages existing, familiar programming languages, including TypeScript, JavaScript, Python, Go, and .NET, and their native tools, libraries, and package managers. Following a declarative approach using the SDK, you allocate a resource object whose properties and attributes indicate the desired state of your infrastructure.
When a Pulumi program executes, the flow is as follows:
- A language host computes the desired state for a stack’s infrastructure. It sets up an environment where it can register resources with the deployment engine.
- The deployment engine compares this desired state with the stack’s current state and determines which resources need to be created, updated, or deleted.
- The engine uses a set of resource providers (AWS, Azure, etc.) to manage the individual resources.
Getting Started:
Let’s explore how Pulumi works by building an AWS Lambda function that translates non-English content to English whenever a text file is uploaded to an S3 bucket.
After you have created a Pulumi account, the first requirement is to install the CLI. Depending on your OS, you can install it by running:
- brew install pulumi — macOS
- choco install pulumi — Windows
- curl -fsSL https://get.pulumi.com | sh — Linux
Next, configure your cloud provider’s credentials so that Pulumi can carry out the deployment operations. Since we are using AWS, either export the AWS_PROFILE name if multiple profiles are configured (if the default profile is to be used, you can skip this step), or, after creating a Pulumi project (next step), run:
pulumi config set aws:profile <profilename>
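For example, assuming a profile named dev was configured earlier with aws configure --profile dev (the profile name here is hypothetical):

```shell
# Assumption: a profile named "dev" was created earlier with `aws configure --profile dev`.
# Exporting the variable makes Pulumi (and the AWS SDK) use that profile's credentials.
export AWS_PROFILE=dev
```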
Creating a Project & Deploying:
JavaScript is the language we’ll work with for this project. Inside an empty directory, run:
pulumi new aws-javascript
The first time you work with the CLI you will be prompted to log in to Pulumi. After login, the CLI creates a project from your inputs (tip: leave us-east-1 as the default region, since certain AWS services are not available in all regions) and automatically installs the base dependencies.
A typical JS Pulumi project will consist of the following files.
- Pulumi.yaml — Defines the project.
- Pulumi.dev.yaml — Contains configuration values for the stack we initialized.
- index.js — The Pulumi program that defines our stack resources.
Let’s begin adding in the code required to create our serverless function for text translation. There are two approaches to creating and deploying the function.
Manual function:
Replace the contents of index.js with the following code.
"use strict";
const pulumi = require("@pulumi/pulumi");
const aws = require("@pulumi/aws");
const awsx = require("@pulumi/awsx");

// Create an AWS resource (S3 Bucket)
const bucket = new aws.s3.Bucket("text-slsguru");

// Create a DynamoDB table
const ddb_table = new aws.dynamodb.Table("translatedtext", {
  name: "translatedtext",
  attributes: [
    {
      name: "Id",
      type: "S",
    },
  ],
  hashKey: "Id",
  billingMode: "PAY_PER_REQUEST",
});

// Create IAM role for the function
const lambdaRole = new aws.iam.Role("slsgurufunctiontext", {
  assumeRolePolicy: `{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Action": "sts:AssumeRole",
        "Principal": {
          "Service": "lambda.amazonaws.com"
        },
        "Effect": "Allow",
        "Sid": ""
      }
    ]
  }`,
});

// Set up policies for the role
const lambdaPolicy = new aws.iam.RolePolicy("lambdaPolicy", {
  role: lambdaRole.id,
  policy: `{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Action": [
          "dynamodb:*",
          "logs:*",
          "s3:*",
          "translate:*",
          "comprehend:*"
        ],
        "Effect": "Allow",
        "Resource": "*"
      }
    ]
  }`,
});

// Create the Lambda function
const lambda = new aws.lambda.Function("slsgurutexttranslator", {
  role: lambdaRole.arn,
  runtime: "nodejs12.x",
  code: new pulumi.asset.FileArchive("./translator/translator.zip"),
  handler: "lambdatranslate.handler",
  timeout: 30,
});

// Give the bucket permission to invoke the Lambda
const allowBucket = new aws.lambda.Permission("allowBucket", {
  action: "lambda:InvokeFunction",
  function: lambda.arn,
  principal: "s3.amazonaws.com",
  sourceArn: bucket.arn,
});

// Bucket notification to trigger the function
const bucketNotification = new aws.s3.BucketNotification("bucketNotification", {
  bucket: bucket.id,
  lambdaFunctions: [
    {
      lambdaFunctionArn: lambda.arn,
      events: ["s3:ObjectCreated:*"],
      filterSuffix: ".txt",
    },
  ],
}, {
  dependsOn: [allowBucket],
});

// Export the name of the bucket
exports.bucketName = bucket.id;
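As an aside, the inline JSON policy strings above are easy to get wrong (a stray comma or quote breaks the deployment). A sketch of building the same trust policy as a plain object and serializing it with JSON.stringify:

```javascript
// Sketch: construct the trust policy as a JS object, then serialize it.
// This produces the same document as the inline template string above.
const assumeRolePolicy = JSON.stringify({
  Version: "2012-10-17",
  Statement: [
    {
      Action: "sts:AssumeRole",
      Principal: { Service: "lambda.amazonaws.com" },
      Effect: "Allow",
      Sid: "",
    },
  ],
});
console.log(assumeRolePolicy);
```

Either form works; the object form simply lets the language catch syntax mistakes for you.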
Create a folder named translator at the root of the project directory and add the following code to a file named lambdatranslate.js (matching the handler configured on the function above).
const aws = require("aws-sdk");
// To be installed
const { uuid } = require("uuidv4");

const s3 = new aws.S3();
const dynamodb = new aws.DynamoDB.DocumentClient();
const translate = new aws.Translate();

exports.handler = async (event, context) => {
  // Declared per invocation so warm containers don't leak state between runs
  let translations = "";
  for (const rec of event.Records || []) {
    const [buck, key] = [rec.s3.bucket.name, rec.s3.object.key];
    console.log(`Reading ${buck}/${key}`);
    const data = await s3.getObject({ Bucket: buck, Key: key }).promise();
    const source_text = data.Body.toString("utf-8");
    console.log(source_text);
    const params = {
      SourceLanguageCode: "auto",
      TargetLanguageCode: "en",
      Text: source_text,
    };
    const result = await translate.translateText(params).promise();
    console.log(result);
    translations = result.TranslatedText;
    await dynamodb
      .put({
        Item: {
          Id: uuid(),
          filename: key,
          translated_text: translations,
        },
        TableName: "translatedtext",
      })
      .promise();
  }
  return translations;
};
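For reference, a minimal sketch of the S3 event shape the handler loops over (the bucket name and key here are made up for illustration):

```javascript
// Hypothetical S3 event payload (shape only); this is what the
// `for (const rec of event.Records)` loop in the handler reads.
const sampleEvent = {
  Records: [
    {
      s3: {
        bucket: { name: "text-slsguru-4c7d9a2" },
        object: { key: "sample.txt" },
      },
    },
  ],
};

// Same destructuring as the handler uses to extract bucket and key.
const read = [];
for (const rec of sampleEvent.Records || []) {
  const [buck, key] = [rec.s3.bucket.name, rec.s3.object.key];
  read.push(`${buck}/${key}`);
}
console.log(read); // → [ 'text-slsguru-4c7d9a2/sample.txt' ]
```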
Looking at the index.js code, you’ll see that the Lambda function code is supplied as a zip archive, which must include the required dependencies. Run npm init inside the translator folder, then npm i uuidv4, and finally zip the contents of that folder into translator.zip.
Your project directory should now contain Pulumi.yaml, Pulumi.dev.yaml, index.js, and the translator folder with its zip archive.
Deploy by running
pulumi up
Resources are created with an auto-generated suffix appended to their names. You can disable this by adding the name attribute to individual resources (see the DynamoDB table creation block above).
In the Pulumi console, you can review all of your created stacks and their related resources.
Upload a sample .txt file containing non-English content; the translated text will be written to the DynamoDB table we created.
Magic Function:
There is another approach to creating serverless functions, which Pulumi calls the magic lambda function. Essentially, the function code is written inline along with the other configuration settings such as role, memorySize, and timeout (use aws.lambda.CallbackFunction to supply these), and Pulumi handles the creation and configuration of the resources.
Replace index.js with the following code.
"use strict";
const pulumi = require("@pulumi/pulumi");
const aws = require("@pulumi/aws");
const awsx = require("@pulumi/awsx");

// Create an AWS resource (S3 Bucket)
const bucket = new aws.s3.Bucket("text-slsguru");

// Create a DynamoDB table
const ddb_table = new aws.dynamodb.Table("translatedtext", {
  name: "translatedtext",
  attributes: [
    {
      name: "Id",
      type: "S",
    },
  ],
  hashKey: "Id",
  billingMode: "PAY_PER_REQUEST",
});

// Create IAM role for the function
const lambdaRole = new aws.iam.Role("slsgurufunctiontext", {
  assumeRolePolicy: `{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Action": "sts:AssumeRole",
        "Principal": {
          "Service": "lambda.amazonaws.com"
        },
        "Effect": "Allow",
        "Sid": ""
      }
    ]
  }`,
});

// Set up policies for the role
const lambdaPolicy = new aws.iam.RolePolicy("lambdaPolicy", {
  role: lambdaRole.id,
  policy: `{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Action": [
          "dynamodb:*",
          "logs:*",
          "s3:*",
          "translate:*",
          "comprehend:*"
        ],
        "Effect": "Allow",
        "Resource": "*"
      }
    ]
  }`,
});
bucket.onObjectCreated(
  "slsgurutranslator",
  new aws.lambda.CallbackFunction("slsgurutranslatorFunc", {
    memorySize: 128,
    timeout: 30,
    role: lambdaRole,
    callback: async (e) => {
      // Requires are resolved once per invocation, outside the record loop
      const aws = require("aws-sdk");
      const { uuid } = require("uuidv4");
      const s3 = new aws.S3();
      const translate = new aws.Translate();
      const dynamodb = new aws.DynamoDB.DocumentClient();
      for (const rec of e.Records || []) {
        const [buck, key] = [rec.s3.bucket.name, rec.s3.object.key];
        console.log(`Reading ${buck}/${key}`);
        const data = await s3.getObject({ Bucket: buck, Key: key }).promise();
        const source_text = data.Body.toString("utf-8");
        console.log(source_text);
        const params = {
          SourceLanguageCode: "auto",
          TargetLanguageCode: "en",
          Text: source_text,
        };
        const result = await translate.translateText(params).promise();
        console.log(result);
        const translations = result.TranslatedText;
        await dynamodb
          .put({
            Item: {
              Id: uuid(),
              filename: key,
              translated_text: translations,
            },
            TableName: "translatedtext",
          })
          .promise();
      }
    },
  }),
  { filterSuffix: ".txt" }
);
// Export the name of the bucket
exports.bucketName = bucket.id;
With the uuidv4 dependency installed (npm i uuidv4), re-deploy the program by running
pulumi up
Pulumi will automatically recognize the changes and make adjustments as needed.
The difference in this Pulumi program is that we call the onObjectCreated method on the bucket resource, signifying that the defined function should be triggered whenever an upload to the bucket occurs. In addition to specifying the function’s attributes, the Lambda function code is authored inline, with Pulumi taking care of everything else. Even the dependencies are automatically included when the function is created on AWS.
Magic functions are an interesting approach to event-driven programming and to managing infrastructure as code. They come from what Pulumi calls “Crosswalk for AWS”, a collection of libraries that supports the creation of the underlying infrastructure (more on this can be found in the Pulumi docs). Both implementations achieve the same result, but the second approach requires less effort and makes the entire process of creating serverless applications feel fluid.
When you are finished, you can clean up:
- pulumi destroy - deletes all of the stack's resources from the cloud.
- pulumi stack rm <stackname> - removes the stack entirely from the Pulumi service, along with all of its update history.
Conclusion:
We have explored the inner workings of Pulumi and seen how simple it is to write and deploy cloud infrastructure in the language of your choice. Unlike with other IaC tools, with Pulumi you don’t have to spend time learning a new language or familiarizing yourself with tool-specific implementation concepts. Even if you have been using other infrastructure or configuration management tools, the transition to Pulumi takes very little effort (refer to the docs to learn more). You can also mix and match, maintaining integrations with other provisioning tools to best suit your needs.