From the dev team: Forecasting Azure costs with Facebook Prophet, Docker, and Terraform


Hi there! In this post, I’ll run you through the latest feature shipped by the ShareGate Overcast development team: a forecasting view for predicting future Azure expenses.

With the help of some exciting tools like Docker, Terraform, and Prophet, we managed to deliver a solid piece of software and value to our users in about two weeks. Here’s how we did it.


The challenge: build better forecasting for our Azure cost management app

Before getting too caught up in the cool tech, let’s start from the beginning.

I’m part of the development team for ShareGate Overcast, a web application that helps administrators understand and reduce their infrastructure cloud costs in Microsoft Azure.

Managing your Azure costs usually involves spending a lot of time in spreadsheets or investing a lot in building custom reports (just ask our pal Nic Fletcher, head of global IT at Tobii).

ShareGate Overcast helps you out with all that by doing the hard work for you and displaying everything in a simple, easy-to-use interface. You can sign up to try it out for free here! If you don’t have an Azure account, you can check out our interactive demo.


A much-requested feature by our users was an improved forecast of their costs. We’ve always offered a basic way to get an idea of future cost trends, but let’s be honest: no one was really impressed with a linear regression.

Forecasting 1.0 in ShareGate Overcast

We felt we could do much better.

Enter: Prophet

The team at our in-house experimentation lab caught wind of our mission and suggested we use Prophet by Facebook. From the official website:

Prophet is a procedure for forecasting time series data based on an additive model where non-linear trends are fit with yearly, weekly, and daily seasonality, plus holiday effects. It works best with time series that have strong seasonal effects and several seasons of historical data. Prophet is robust to missing data and shifts in the trend, and typically handles outliers well.


That does sound like exactly what we were looking for. A quick demo had us convinced that this would be an efficient way to deliver more value to our users when it comes to forecasting cloud costs.

We weren’t looking to predict the future or do anything magical that you wouldn’t be able to do with access to your data and a little math. Our goal was to find a way to quickly show you what your costs are likely to look like in a few months.


Writing the code

To accomplish this, we used Prophet’s Python API. The code we wrote is pretty much exactly what you can find in the example documentation. ShareGate Overcast already allows you to group your Azure resources into projects, so we just send the monthly historical data of those projects’ costs to Prophet, and it outputs a prediction of the current and next month’s costs.
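A minimal sketch of that flow, close to Prophet’s own quick-start docs. The data shape and function names are our own invention for illustration; Prophet only requires a DataFrame with a `ds` (date) column and a `y` (value) column:

```python
import pandas as pd


def prepare_history(monthly_costs):
    """Convert a {date-string: cost} history into the frame Prophet expects.

    Prophet wants a DataFrame with a datetime column `ds` and a numeric
    column `y`, sorted by date.
    """
    df = pd.DataFrame({
        "ds": pd.to_datetime(list(monthly_costs.keys())),
        "y": list(monthly_costs.values()),
    })
    return df.sort_values("ds").reset_index(drop=True)


def forecast_next_months(monthly_costs, periods=2):
    """Fit Prophet on the history and predict the next `periods` month starts."""
    # Older releases ship as `fbprophet`; newer ones as `prophet`.
    from prophet import Prophet

    model = Prophet()
    model.fit(prepare_history(monthly_costs))
    # freq="MS" extends the frame by month-start dates.
    future = model.make_future_dataframe(periods=periods, freq="MS")
    forecast = model.predict(future)
    # yhat is the point prediction; the lower/upper columns bound the uncertainty.
    return forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail(periods)
```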

This is very simple code and Prophet does all the heavy lifting for us.

Using Python was an interesting challenge for us, because the rest of our codebase is entirely .NET Core in C#. Unfortunately, Prophet is not available in .NET.

overcast logo
Get visibility into what’s going on in your Azure environment and how to keep costs down.

Top features


Savings recommendations


Custom views business mapping


Top cost drivers and resource administrators

The only problem was fitting this into our existing ecosystem. We needed somewhere to host it in Azure where it would play nice with our other pieces of code. We figured that if we could run it in its own little environment somewhere and call it from C# over HTTP, we’d be good to go.

We decided that the easiest way to get something up and running in Azure was a Function App. It turns out that Function Apps support Python, albeit in preview at the time.

We installed the Azure Function Python library and added the boilerplate to our Python code.

We get the data from the HTTP request body, crunch the numbers, and return a JSON payload.
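A rough sketch of that handler. The request shape and the `run_forecast` helper are hypothetical stand-ins; the Azure wiring uses the standard `azure-functions` HTTP trigger signature:

```python
import json


def run_forecast(payload):
    """Pure forecasting logic, kept separate from the Azure wiring.

    `payload` is assumed to look like {"history": {"YYYY-MM-DD": cost, ...}};
    the real request shape is an assumption for this sketch.
    """
    history = payload["history"]
    # Placeholder result: the real code hands `history` to Prophet here
    # and returns the predicted costs.
    return {"history_months": len(history)}


# Azure Functions entry point; requires the `azure-functions` package,
# which is only present inside the Functions host or local SDK.
try:
    import azure.functions as func

    def main(req: func.HttpRequest) -> func.HttpResponse:
        payload = req.get_json()
        body = json.dumps(run_forecast(payload))
        return func.HttpResponse(body, mimetype="application/json")
except ImportError:  # keeps the module importable outside the Functions host
    pass
```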

You can also use the Function SDK to test this out locally. Just run the code on localhost and use curl or Postman to send a payload to your Function.

Running into some issues

This is pretty cool, but we’re not running this on Azure yet. Even locally, we’re going to run into a few problems.

Installing Prophet on a Windows machine is a bit problematic at the moment. It’s possible with Anaconda and a virtual environment, but Azure Functions don’t support Anaconda.

As a rule of thumb, we want the local environment to be as close as possible to the production environment. It wouldn’t be ideal to set this up with Anaconda locally, then have a different pipeline to ship to Azure.

As this is an experimental feature, we didn’t want to invest in a complicated local setup for development. If we end up doing more Python in the future, we’ll look into building a proper local environment. But let’s see if we can avoid that for now.

Functions + Docker: have your cake and eat it too

It turns out Azure Functions also support Docker containers. This means you can write your function in practically any language and run it in an Azure Function. The one thing you need is Azure Functions SDK support in the language of your choice.

We already use plenty of Docker containers in development and production, so our ecosystem is already equipped to deal with this. To “containerize” our little Python function, all we have to do is add a Dockerfile to our project.
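A minimal sketch of such a Dockerfile, assuming the Microsoft-provided Azure Functions Python base image (the tag and file layout are illustrative):

```dockerfile
# Base image with the Azure Functions Python runtime (tag is illustrative)
FROM mcr.microsoft.com/azure-functions/python:3.0-python3.8

# Install Prophet and friends first so this layer is cached between builds
COPY requirements.txt /
RUN pip install -r /requirements.txt

# Copy the function code last: code changes only rebuild this final layer
COPY . /home/site/wwwroot
```

Ordering matters here: because the code is copied in the last layer, a code change reuses every cached layer above it.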

We can now build and run our sample with two simple commands: docker build and docker run.

To use it, we simply POST to localhost exactly as we did with the previous sample that didn’t use Docker.

Developing this way is super simple. Since Docker does caching by layer, we don’t have to completely rebuild the image every time we change a piece of code. Our code is the last layer in the image, so when we make a change and run the build command, it takes a few milliseconds. Since the Function App will be running the same sort of container, we have an almost identical environment locally and in Azure. Great!

Deploying infrastructure-as-code

Recently, the Overcast app team had to move a bunch of infrastructure. We took the opportunity to code all our existing infrastructure with Terraform to simplify deployments. It hasn’t been without problems, but this is an example where Terraform really shines.

We wanted to add a few new pieces to our existing infrastructure, and then deploy to it.

To deploy a Function App, we needed the following Azure resources:

  • Resource group: Just to keep this new feature separated from the rest of the application. It’s sort of experimental, after all.
  • Storage account: Our Function App needs this for internal data.
  • Application insights: To track telemetry, traces, exceptions, etc.
  • App Service plan: Basically, the compute on which the Function will run.
  • Container registry: The repository where our Docker image will be hosted.

We also needed to provide our Function with the container registry credentials to go fetch the Docker image. Here’s what we’re adding with Terraform:
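A sketch of that Terraform file. Resource names, SKUs, and variables are illustrative, and the schema matches the azurerm provider of the time (recent provider versions split this into resources like azurerm_linux_function_app):

```hcl
resource "azurerm_resource_group" "forecast" {
  name     = "rg-forecast-${var.environment}"
  location = var.location
}

resource "azurerm_storage_account" "forecast" {
  name                     = "stforecast${var.environment}"
  resource_group_name      = azurerm_resource_group.forecast.name
  location                 = azurerm_resource_group.forecast.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_application_insights" "forecast" {
  name                = "appi-forecast-${var.environment}"
  resource_group_name = azurerm_resource_group.forecast.name
  location            = azurerm_resource_group.forecast.location
  application_type    = "web"
}

resource "azurerm_container_registry" "forecast" {
  name                = "crforecast${var.environment}"
  resource_group_name = azurerm_resource_group.forecast.name
  location            = azurerm_resource_group.forecast.location
  sku                 = "Basic"
  admin_enabled       = true
}

resource "azurerm_app_service_plan" "forecast" {
  name                = "plan-forecast-${var.environment}"
  resource_group_name = azurerm_resource_group.forecast.name
  location            = azurerm_resource_group.forecast.location
  kind                = "Linux"
  reserved            = true

  sku {
    tier = "Basic"
    size = "B1"
  }
}

resource "azurerm_function_app" "forecast" {
  name                      = "func-forecast-${var.environment}"
  resource_group_name       = azurerm_resource_group.forecast.name
  location                  = azurerm_resource_group.forecast.location
  app_service_plan_id       = azurerm_app_service_plan.forecast.id
  storage_connection_string = azurerm_storage_account.forecast.primary_connection_string

  site_config {
    # Points the Function App at the Docker image in our registry
    linux_fx_version = "DOCKER|${azurerm_container_registry.forecast.login_server}/forecast:latest"
  }

  app_settings = {
    APPINSIGHTS_INSTRUMENTATIONKEY  = azurerm_application_insights.forecast.instrumentation_key
    # Registry credentials so the Function App can pull the image
    DOCKER_REGISTRY_SERVER_URL      = "https://${azurerm_container_registry.forecast.login_server}"
    DOCKER_REGISTRY_SERVER_USERNAME = azurerm_container_registry.forecast.admin_username
    DOCKER_REGISTRY_SERVER_PASSWORD = azurerm_container_registry.forecast.admin_password
  }
}
```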

This is the complete file in our existing infrastructure codebase. Some variables are passed in according to the environment (Development/Staging/Production) we’re deploying to.

Putting it all together

The last remaining piece of the puzzle was setting up an Azure DevOps release pipeline to enable the continuous delivery of our Python function to our different environments.

Pretty simple: we built the image and pushed it to our container registry, as we do when we develop locally. Then, we pushed the function to the Function App. Together with another Release Pipeline that applied the changes we made in Terraform to our existing infrastructure, we now had all the moving parts in place to offer Azure cost forecasts to our users!
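For the curious, such a pipeline stage might look roughly like this in Azure Pipelines YAML. Every name here (service connections, registry, app name) is hypothetical:

```yaml
# Illustrative stage: build and push the image, then update the Function App
steps:
  - task: Docker@2
    inputs:
      command: buildAndPush
      repository: forecast
      containerRegistry: overcast-acr        # service connection name (hypothetical)
      tags: $(Build.BuildId)

  - task: AzureFunctionAppContainer@1
    inputs:
      azureSubscription: overcast-subscription   # service connection name (hypothetical)
      appName: func-forecast-production
      imageName: crforecast.azurecr.io/forecast:$(Build.BuildId)
```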

It might not look too different from what we had at the start, but this is our first step in leveraging Prophet to create a more meaningful, more data-driven UI/UX for forecasts. Here it is:

The final result: new and improved forecasting in ShareGate Overcast

I hope you enjoyed this little walkthrough of the development process for ShareGate Overcast’s latest feature. I tried to go through as many different steps as I could, but if you’d like more details on a particular part, let me know in the comments!
