A lot has changed since Terraform’s initial release in 2014: new providers, changes to the CLI, and new built-in functions. But I think one of its biggest improvements is still ahead of us: the Cloud Development Kit (CDK) for Terraform. I won’t assume you are familiar with Terraform, or even with Infrastructure as Code, so I’ll explain both very briefly.

Infrastructure as Code

Working with infrastructure has always been a complicated process: people used to put servers in place, configure them, and set up the operating system, and only then install the dependencies to get an application up and running. Even with documentation and experience, the process was never without its problems, and as a result provisioning was inconsistent, slow, and prone to human error.

Creating the Project

First, you need to install the cdktf CLI to be able to run all the commands, then run cdktf init to initialize the project. The init command creates all the files you need to start writing some code.
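A minimal sketch of those two steps, assuming Node.js and Terraform are already installed; the project name is just one possible choice, and the --local flag keeps the state on disk instead of Terraform Cloud:

```sh
# Install the cdktf CLI
npm install -g cdktf-cli

# Scaffold a TypeScript project
mkdir cdk-tutorial && cd cdk-tutorial
cdktf init --template typescript --local
```

By the end of this tutorial, we will have provisioned the following resources on IBM Cloud: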

  • 2 Subnets.
  • 2 Public Gateways.
  • 1 Public Load Balancer.
  • 1 Security Group.
  • 1 Instance group to handle our instances.

Setup

In the cdk-tutorial folder, open cdktf.json. In this file we declare the providers we are going to use; in this case, the IBM Cloud provider.
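A hedged sketch of what cdktf.json could look like; the provider version shown is a placeholder, so pin whatever release you are targeting:

```json
{
  "language": "typescript",
  "app": "npm run --silent compile && node main.js",
  "terraformProviders": ["ibm-cloud/ibm@~> 1.21"]
}
```

After saving the file, run cdktf get so the CLI downloads the provider and generates the TypeScript bindings (typically under .gen/providers/ibm).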

Configuring the Provider and creating your first resource

With the dependencies in place, your IDE’s autocomplete feature will make your life easier than ever before. Configurations in the CDK are represented as classes, so all we have to do is create instances of the resources we want to work with inside the MyStack class. Open main.ts and let’s declare the provider and a VPC.
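A minimal sketch of main.ts, assuming the bindings were generated with cdktf get; the class names (IbmProvider, IsVpc) follow the usual code-generation pattern for the provider and the ibm_is_vpc resource, and may differ slightly between provider versions:

```typescript
import { Construct } from "constructs";
import { App, TerraformStack } from "cdktf";
import { IbmProvider, IsVpc } from "./.gen/providers/ibm";

class MyStack extends TerraformStack {
  constructor(scope: Construct, name: string) {
    super(scope, name);

    // The API key is picked up from the IC_API_KEY environment variable,
    // so only the region is configured here.
    new IbmProvider(this, "ibm", {
      region: "us-south",
    });

    // Our first resource: the VPC everything else will live in.
    // Kept in a constant so later resources can reference vpc.id.
    const vpc = new IsVpc(this, "vpc", {
      name: "cdk-tutorial-vpc",
    });
  }
}

const app = new App();
new MyStack(app, "cdk-tutorial");
app.synth();
```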

Using Variables

As in HCL, we can use variables here as well; we just need to create a new instance of the TerraformVariable class. This variable will be used in all the resources as a prefix for the name.
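A short sketch, continuing inside the MyStack constructor from the previous example; the VPC name from before now uses the prefix:

```typescript
// Add TerraformVariable to the cdktf import at the top of main.ts.
import { TerraformVariable } from "cdktf";

// A string variable with a sensible default; some older cdktf releases
// expose the value only through `.value` instead of `.stringValue`.
const prefix = new TerraformVariable(this, "prefix", {
  type: "string",
  default: "cdk-tutorial",
  description: "Prefix used in every resource name",
});

// Used as a prefix, e.g. when naming the VPC:
const vpc = new IsVpc(this, "vpc", {
  name: `${prefix.stringValue}-vpc`,
});
```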

Referencing Resources

To create the public gateways we need the ID of the previously declared VPC, so we will create two public gateway instances, one per zone, and assign them the ID of the VPC.
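Continuing the sketch, one public gateway per zone, each referencing the VPC’s ID (IsPublicGateway mirrors ibm_is_public_gateway and is imported from ./.gen/providers/ibm like IsVpc):

```typescript
// Two zones in the us-south region (adjust to your region).
const zones = ["us-south-1", "us-south-2"];

// One public gateway per zone; `vpc: vpc.id` is the resource reference.
const gateways = zones.map(
  (zone, index) =>
    new IsPublicGateway(this, `gateway-${index}`, {
      name: `${prefix.stringValue}-gateway-${index}`,
      vpc: vpc.id,
      zone,
    })
);
```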

Explicit Dependency

We won’t be using the default address prefixes for this article; instead we will create two, one for each zone. But since there is no implicit dependency between the subnets and the address prefixes, we need to declare it explicitly using the dependsOn property. That way we can be sure the subnets are created after the address prefixes.
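A sketch of the address prefixes and subnets; IsVpcAddressPrefix and IsSubnet mirror ibm_is_vpc_address_prefix and ibm_is_subnet, and the CIDRs are placeholders. Note that for manual prefixes the VPC usually needs addressPrefixManagement set to "manual".

```typescript
// One address prefix per zone.
const addressPrefixes = zones.map(
  (zone, index) =>
    new IsVpcAddressPrefix(this, `prefix-${index}`, {
      name: `${prefix.stringValue}-prefix-${index}`,
      vpc: vpc.id,
      zone,
      cidr: `10.240.${index}.0/24`,
    })
);

// One subnet per zone, attached to the matching public gateway.
const subnets = zones.map(
  (zone, index) =>
    new IsSubnet(this, `subnet-${index}`, {
      name: `${prefix.stringValue}-subnet-${index}`,
      vpc: vpc.id,
      zone,
      ipv4CidrBlock: `10.240.${index}.0/24`,
      publicGateway: gateways[index].id,
      // No attribute of the address prefix is referenced here, so the
      // dependency has to be declared explicitly.
      dependsOn: [addressPrefixes[index]],
    })
);
```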

Data Sources

Sometimes, you have resources already created on your Cloud Service Provider account that you might want to reference. Here we will look up an existing SSH key, which is required by the instance template we will declare later.
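A sketch of the data source, assuming a key is already uploaded to the account (DataIbmIsSshKey mirrors the ibm_is_ssh_key data source; the key name is hypothetical):

```typescript
// Look up an existing SSH key by name; its ID is passed to the
// instance template later on.
const sshKey = new DataIbmIsSshKey(this, "ssh-key", {
  name: "my-existing-key", // placeholder: replace with your key's name
});
```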

Load Balancer and Security Groups

For the Load Balancer and Security Groups there’s nothing new to explain in terms of the CDK. Our security group will allow outgoing traffic on ports 80 (HTTP), 443 (HTTPS) and 53 (DNS).
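For completeness, a hedged sketch of the security group, its outbound rules and the public load balancer. IsSecurityGroup, IsSecurityGroupRule and IsLb mirror the corresponding ibm_is_* resources; depending on your bindings version, the tcp block may need to be wrapped in an array.

```typescript
const securityGroup = new IsSecurityGroup(this, "sg", {
  name: `${prefix.stringValue}-sg`,
  vpc: vpc.id,
});

// Outbound rules for HTTP, HTTPS and DNS.
for (const port of [80, 443, 53]) {
  new IsSecurityGroupRule(this, `sg-out-${port}`, {
    group: securityGroup.id,
    direction: "outbound",
    remote: "0.0.0.0/0",
    tcp: { portMin: port, portMax: port },
  });
}

// A public load balancer spanning both subnets.
const lb = new IsLb(this, "lb", {
  name: `${prefix.stringValue}-lb`,
  type: "public",
  subnets: subnets.map((subnet) => subnet.id),
  securityGroups: [securityGroup.id],
});
```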

Built in Functions?

We want to install nginx on the instances at the time they launch. The instance template includes a property called user data, which allows us to execute commands at launch time.
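Instead of reaching for an HCL built-in like file(), we can simply read the script with Node.js at synth time. A small sketch, where user-data.sh is a hypothetical script that installs nginx:

```typescript
import { readFileSync } from "fs";

// Read the startup script at synth time...
const userData = readFileSync("user-data.sh", "utf8");

// ...and hand it to the instance template (ibm_is_instance_template), e.g.:
//   new IsInstanceTemplate(this, "template", { /* ... */ userData });
```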

Output

When everything is created, we might need an easy way to get the load balancer URL. We can do that with output values.
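A short sketch, where lb is the load balancer declared earlier:

```typescript
// Add TerraformOutput to the cdktf import at the top of main.ts.
import { TerraformOutput } from "cdktf";

// Prints the load balancer hostname after the deploy finishes.
new TerraformOutput(this, "lb-hostname", {
  value: lb.hostname,
});
```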

Where’s the backend?

We are almost ready; we just need a backend to keep the current state of the resources handled by Terraform. For demonstration purposes, we will create an etcd database.
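One version-independent way to wire the backend in is the addOverride escape hatch, sketched below inside the stack constructor; the endpoint and credentials are placeholders for the etcd instance you create (newer cdktf releases also ship dedicated backend classes).

```typescript
// Store the Terraform state in etcd v3; the values come from the etcd
// deployment's connection details.
this.addOverride("terraform.backend", {
  etcdv3: {
    endpoints: ["https://<etcd-endpoint>:<port>"], // placeholder
    username: "admin",                             // placeholder
    password: "<password>",                        // placeholder
    prefix: "cdk-tutorial/",
    cacert_path: "certs/etcd-ca.pem",
  },
});
```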

Deploy

Before deploying, you must consider the following:

  • Click Create an IBM Cloud API key.
  • Enter a name and description for your API key.
  • Click Create.
  • Take note of the API Key.
  • To obtain the certificate and its key, we can use openssl to generate a self-signed certificate (see the commands after this list).
  • We can run the workflow in smaller steps by running cdktf synth to generate the configuration file in JSON format, which we can then use to run the plain terraform commands, as shown below.
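A sketch of those last two items; the file names and certificate subject are placeholders, and the synth output path can differ between cdktf versions (older releases write cdktf.out/cdk.tf.json, newer ones a per-stack folder):

```sh
# Generate the self-signed certificate and its key
openssl req -x509 -newkey rsa:4096 -nodes -days 365 \
  -keyout certs/client-key.pem -out certs/client-cert.pem \
  -subj "/CN=cdk-tutorial"

# Synthesize the configuration to JSON and drive Terraform directly
export IC_API_KEY=<your-api-key>   # placeholder
cdktf synth
cd cdktf.out
terraform init
terraform plan
terraform apply
```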

Travis CI

What if... we go to the next step and set up a pipeline? Here is Travis CI to help us; a sketch of the configuration follows the list of steps below.

  • Install npm.
  • Download and install ibmcloud CLI.
  • Log in and install the Cloud Databases (cdb) plugin.
  • Download the etcd CA certificate and generate our self-signed certificate to be able to communicate with the etcd backend.
  • The second job, called terraform apply, will execute on pushes to the master branch; this includes merged pull requests and direct commits. If your expectations were met and you decide to merge development into master, this job will run and apply your changes.
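A hedged sketch of what the .travis.yml could look like under these assumptions; secrets such as IC_API_KEY and the etcd credentials would live in encrypted Travis environment variables, terraform itself must be available in the build image, and the cdktf commands assume cdktf-cli is a dev dependency of the project.

```yaml
language: node_js
node_js:
  - 14

before_install:
  # IBM Cloud CLI plus the Cloud Databases plugin (for the etcd credentials)
  - curl -fsSL https://clis.cloud.ibm.com/install/linux | sh
  - ibmcloud login --apikey "$IC_API_KEY" -r us-south
  - ibmcloud plugin install cloud-databases
  # Download the etcd CA certificate and generate the self-signed client
  # certificate here (commands omitted; see the Deploy section).

install:
  - npm install
  - npx cdktf get

jobs:
  include:
    # First job: plan on pull requests so reviewers can see the changes.
    - stage: plan
      if: type = pull_request
      script:
        - npx cdktf synth
        - cd cdktf.out && terraform init && terraform plan
    # Second job: apply on pushes to master (merged PRs and direct commits).
    - stage: apply
      if: branch = master AND type = push
      script:
        - npx cdktf synth
        - cd cdktf.out && terraform init && terraform apply -auto-approve
```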

Summary

In this post, we walked through the Terraform CDK to create some resources on IBM Cloud and defined a CI/CD pipeline with Travis CI that has two jobs: the first for pull requests and the second for pushes directly to master. The Terraform CDK is still in alpha, so it is probably going to change quite often, but it is a good start.
