Building an “OpenWeatherMap” Dashboard using Terraform

Have you ever thought of building a simple but insightful dashboard that can help someone visualize and interpret the meaning behind data?

You've come to the right spot. That's exactly what we are going to do in this blog. I am going to provide you all the details and steps so that you can implement this project easily. Cost: $0.00. Yes, it's not going to cost you a single penny. Let's get started.

What is this project? It's a weather monitoring system: a simple data pipeline (ETL) project that collects (E, extract) weather data from OpenWeatherMap using their REST API, makes the necessary data transformations (T), and loads (L) it into InfluxDB. But is it really worth just dumping some data into a database without showing a meaningful purpose for it? That's where Grafana comes into the picture: we are going to use Grafana to build an insightful dashboard. The picture below depicts the container architecture and the different layers of abstraction for analytics.

Wait, what? That's a lot of stuff. How are we going to do so many steps? Where are we going to build this data pipeline? Who is going to use it?

These are all great questions, and that's where you should always start any data project. You should at least have some idea of what you are doing and why; then it becomes easy to figure out the how.

Here are some answers:

We are going to use Terraform to build our infrastructure on the AWS cloud platform using Docker images of these services (Node-RED, InfluxDB, Grafana). This can be used by anyone who wants to monitor the weather of any specific city or region. How? Let's begin…

Why Terraform? It's a very popular IaC (Infrastructure as Code) tool that allows easy deployment of the resources you need and makes your code reusable, flexible, and easily maintainable. More benefits of Terraform here:

https://www.opensourceforu.com/

Why Docker? Let's say you started working on your project and, after a few setup steps, you realized you made a lot of wrong choices and you need to start again. Are you going to be happy spending so much time again on all the stuff that you did? I hope your answer is no. That's exactly where Docker helps: it gives us a flexible way to reuse the initial stage of our setup every time we want to start fresh. More about Docker's benefits here.

Why AWS? Because I am familiar with it, and we need a host to launch all the resources, so we are going to use an AWS EC2 server to host our services.

Why InfluxDB? Why Grafana? Why Node-RED? Why the OpenWeatherMap API? Because all of these are freely available, easy to integrate together for our use case, and easy to learn and implement.

Let's start the setup step by step:

Step 1: download or clone my GitHub repository.

Make sure your system has all the basic requirements to run Terraform, AWS, and Docker commands. I would recommend using AWS Cloud9 for a ready-made setup.

Assumptions: you are familiar with Terraform basics and can therefore read and understand the code. As you can observe in the GitHub files, we used Terraform's module concepts while building the infrastructure files.

High-level overview and purpose of each file:

“providers.tf” file — as the name suggests, this file declares the providers (in our case, the Docker provider) that we use in our application.

“variables.tf” file — a central file used to declare all variables and pass their values.

“main.tf” file — this is where you define the overall structure of your application: its business logic. It's very similar to the “main” method defined in many programming languages, if you are coming from that background.

“outputs.tf” file — contains all the details that you want to print to the console, or use as output, after your application has deployed successfully.

“terraform.tfvars” file — contains variable values that you may want to keep separate (for example, values managed by the security team apart from developers). It gives you more flexibility in passing values to variables.

“locals.tf” file — a local value assigns a name to an expression so you can use it multiple times within a module without repeating it. More details here.

The “container”, “image”, and “volume” folders were created to separate the details of each specific part of the application; this is good practice for the easy maintainability of Terraform applications.
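To give a rough feel for how these pieces fit together, here is a minimal sketch of a Terraform configuration using the Docker provider. This is illustrative only: the resource names and module layout are my assumptions, not copied from the repository, so treat it as a sketch rather than the project's actual code.

```hcl
# Hypothetical sketch — names are illustrative, not from the repository.
terraform {
  required_providers {
    docker = {
      source  = "kreuzwerker/docker"
      version = "~> 3.0"
    }
  }
}

provider "docker" {}

# Pull the Grafana image from Docker Hub ...
resource "docker_image" "grafana" {
  name = "grafana/grafana:latest"
}

# ... and run it as a container, exposing port 3000 on the host.
resource "docker_container" "grafana" {
  name  = "grafana"
  image = docker_image.grafana.image_id
  ports {
    internal = 3000
    external = 3000
  }
}
```

In the real repository, pieces like the image and container resources live in their own modules (the “image” and “container” folders), with values wired through “variables.tf” and “terraform.tfvars”.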

Step 2: create a workspace and deploy the Terraform resources.

The commands below can help:

terraform workspace new prod      # if the workspace is not already created
terraform workspace select prod   # if it is already created and you want to use it
terraform plan                    # review everything before deployment
terraform apply --auto-approve    # create and deploy the resources

Notes: as a security best practice, you may restrict access to your selected IP address. Also, the ports that need to be open on your application's host server are: 8086 (InfluxDB), 3000 (Grafana), and 1880 (Node-RED).
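If you want to manage those firewall rules in Terraform as well, a hedged sketch of an AWS security group opening the three ports might look like the following. The resource name and CIDR are assumptions; for better security, replace the open CIDR with your own IP.

```hcl
# Illustrative only — adjust the name and cidr_blocks for your environment.
resource "aws_security_group" "weather_app" {
  name        = "weather-app-sg"
  description = "Allow access to InfluxDB, Grafana and Node-RED"

  # One ingress rule per service port.
  dynamic "ingress" {
    for_each = [8086, 3000, 1880] # InfluxDB, Grafana, Node-RED
    content {
      from_port   = ingress.value
      to_port     = ingress.value
      protocol    = "tcp"
      cidr_blocks = ["0.0.0.0/0"] # better: ["<your-ip>/32"]
    }
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}
```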

Step 3: access our deployed application resources.

Let's sign up and create an account in each service to access it, if not already set up earlier.

Create an OpenWeatherMap account: https://openweathermap.org/

Copy the API key from this account into a separate notepad file for later use.

InfluxDB setup

Since you deployed your code on the AWS EC2 instance used by the Cloud9 service, InfluxDB can be accessed using the EC2 server's public IP address.

In my case: http://54.196.145.142:8086 (open it in a web browser).

Sign up to InfluxDB, and make sure to use the correct values for the organization name and bucket name if you are planning to use my files.

In my case, organization name: “weatherappOrg”

bucket name: “weather”

Then click “Get started” → click the “Load your data” window button → click the “Tokens” tab → click the newly created token for your username → copy the token and save it in a separate notepad file for later use.

Now, the Node-RED setup.

As you can see in the files, the Node-RED instance runs on port 1880; Node-RED can be accessed using the EC2 instance's public IP address.

In my case: http://54.196.145.142:1880/ (open it in a web browser).

As you can see, Node-RED is a service for creating flows. A file has been provided in my GitHub repository to create the flow. Import the “openweathermap-project/others/nodered-flow.json” file, and you will see something like this. Make sure you are in the right flow tab at the top, named “API call”.

As you can see, the “influxdb” node is shown in red; let's fix it.

In order to fix this, we need to install the InfluxDB module in the palette so we can use it in the flow. Click the “three line” button in the top right corner and select “Manage palette”, search for influxdb (typically the node-red-contrib-influxdb package), and click install. This will fix the issue.

Now let's place the OpenWeatherMap API key, which you copied earlier into the notepad file, into our flow:

As you know, we are going to call the REST API; we do this in our Node-RED flow using an “http request” node. You can see two “http request” nodes in our flow.

Double-click each one and replace the “{API-KEY}” part of the URL with the API key you took from your OpenWeatherMap account (https://openweathermap.org/).

Now let's update the InfluxDB details in our Node-RED flow:

Double-click the “influxdb” node in the flow and update the server details with the correct URL, token, organization name, and bucket name. All details should match what you used when creating the InfluxDB environment earlier.

That's it! You're ready to deploy your flow. Click the “Deploy” button at the top.

Now you can run the flow using the “Get” node button, which is the first node of the flow.

This will hit both “http request” nodes, fetch data from the API, and load it into InfluxDB.

Let's head over to InfluxDB, which is running on “http://54.196.145.142:8086”, and check whether the data loaded successfully.

Make sure you can see the “purple” highlighted part in your InfluxDB window, which is nothing but a query that reads:

from the “weather” bucket,

filter criteria “_measurement” = “current”,

filter “_field” = “humidity” and “temperature”,

filter “location” = “Cape Canaveral” and “Houston”.

Also, if there are any missing values, they will be replaced with the mean of that field. These are some of the awesome features of InfluxDB.

Grafana setup

As you know, Grafana is a dashboard creation service; in our case, it is running on “http://54.196.145.142:3000”.

Upon first login, you need to change the default password of the account:

Default username: admin; default password: admin

Then click the “settings” icon, select “Data Sources”, select “InfluxDB”, and choose Flux under Query Language.

Here we need to provide all the InfluxDB resource details so Grafana can access the data:

Provide the URL for InfluxDB: http://54.196.145.142:8086

Organization Name: weatherappOrg

Token: from the notepad file where you saved it earlier

Click “Save and Test” once all the information has been entered. Hopefully you are able to make the connection. Feel free to reach out to me in case of any issues.

Finally, time to create Dashboard:

In this case, the file to create the dashboard is available in the GitHub repository: “openweathermap-project/others/grafana-dashboard.json”

Let's import this file: click the “+” icon in the Grafana window, browse to the file in your download location, and hit the “Import” button.

Awesome! You should be able to see a dashboard like the one below.

I hope you learned something and enjoyed this project.

Time for clean-up:

Run the command below to destroy all your resources; that's how easy it is to clean up your resources using Terraform:

terraform destroy --auto-approve

Make sure to delete the “AWS Cloud9” environment as well.

Conclusion:

As you might have realized, with just one command you can deploy your application, and destroy it once you are done. That's how easy it is to implement applications using Terraform and cloud capabilities.

This is a great example project shown by instructor Derek Morgan while teaching Terraform (https://www.udemy.com/user/derekmorgan/). I thought it would be a great idea to share it with you all, so that many people can benefit from it. Feel free to sign up for his awesome course.

Future challenge: use Terraform to build a similar application using AWS services. For example, you could use DynamoDB for the database work, QuickSight for the dashboard functionality, and Glue or Lambda for designing a flow like Node-RED's. It's always fun to play around with these services and build such amazing use cases.

Please show your love if you enjoyed it.

Data Analytics Engineer, crossing paths in Data Science, Data Engineering and DevOps. Bringing you lots of exciting projects in the form of stories. Enjoy-Love.
