Describe the problem you are trying to solve

Global climate change is the largest challenge of our generation. Increased carbon emissions over the last century have caused extreme weather events at rates never seen before. In Australia, these have manifested as intense fires and floods that are making areas of the country increasingly less habitable. While businesses have been quick to migrate their applications to cloud providers, optimisation has focused purely on finances (cost of goods sold). Carbon emissions are a new dimension in cloud optimisation. Cloud providers with datacentres distributed across the globe are in a unique position to take advantage of geographical differences in energy production and enable greener solutions to be designed. With tools like the Carbon Aware SDK and the Carbon-Optimised Workload Manager, solution architects can shift their application’s workloads geographically and temporally to reduce the carbon their applications emit.

Describe what your project does exactly in layperson's terms

All programs need to run on a computer somewhere. This is typically a server in a datacentre, chosen based on speed, cost, and regulatory requirements. My solution allows companies to use global emissions data to minimise their program’s carbon emissions by shifting it to a region where, or a time when, electricity generation emits less carbon. For example, energy demand is lower during off-peak hours, so more of the supply can come from renewable sources such as wind rather than baseload sources such as coal and gas. This results in fewer carbon emissions.

Describe how it uses the API/SDK

The job routing library uses the Carbon Aware SDK to determine the Azure cloud region with the lowest carbon intensity (grams of CO2 emitted per kilowatt-hour of electricity). The dispatcher uses this information to send the job to a different region for processing. Locally, the CLI can produce synthetic job data and process it with the routing library to estimate carbon emission savings. Run against historical data, this quantifies the potential savings and evaluates the library’s efficacy.
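The routing library itself is written in C#; as an illustrative sketch only, the core selection step amounts to picking the region with the lowest carbon intensity. The region names and intensity values below are made-up stand-ins for data the Carbon Aware SDK would return:

```python
# Pick the region with the lowest carbon intensity (g CO2/kWh).
# The intensities here are hard-coded placeholders for values the
# Carbon Aware SDK would return for a given time window.
def best_region(intensities: dict[str, float]) -> str:
    """Return the region whose electricity currently emits the least CO2."""
    return min(intensities, key=intensities.get)

sample = {
    "australiaeast": 520.0,  # illustrative values only
    "westeurope": 310.0,
    "norwayeast": 25.0,
}
print(best_region(sample))  # -> norwayeast
```

The dispatcher then sends the job to whichever region this step returns.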

Describe how impactful it might be in terms of CO2 reductions and user reach (an example of low reduction but high reach would be targeting website requests, where a solution could make a very small reduction in CO2 emissions per request, but with millions of requests the impact could be significant. An example of high reduction and low reach would be a chlorine factory, where you might only reach one plant, but that would be enough to cut as many emissions as a few million web requests above).

The efficacy of the router was evaluated by applying the algorithm to 10,000 generated jobs of constant duration, each randomly assigned a start time and geographic region. The baseline was determined to be 5,884,285 g CO2/kWh and the geo-shifted workload 3,726,081 g CO2/kWh (carbon intensity summed across all jobs). This is a 36.68% reduction from applying geo-shifting alone.
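The reported percentage can be reproduced directly from the two totals above:

```python
# Totals from the 10,000-job evaluation described above.
baseline = 5_884_285      # summed intensity, jobs in their original regions
geo_shifted = 3_726_081   # summed intensity, jobs routed to the cleanest region

reduction_pct = (baseline - geo_shifted) / baseline * 100
print(f"{reduction_pct:.2f}%")  # -> 36.68%
```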

Time-shifting (deferring workloads using a NotLaterThan concept) was not evaluated but is hypothesised to further increase the reduction. Jobs of varied duration were not considered; the additional temporal element would, however, necessitate time-shifting functionality to produce optimal results.
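A minimal sketch of what time-shifting under a NotLaterThan deadline might look like, assuming hourly forecast data and one-hour job slots (both the forecast values and the slot granularity are illustrative assumptions, not part of the current library):

```python
def best_start_hour(forecast: list[float], not_later_than: int) -> int:
    """Return the hour index with the lowest forecast carbon
    intensity, among start times no later than the deadline."""
    candidates = forecast[: not_later_than + 1]
    return min(range(len(candidates)), key=candidates.__getitem__)

# Hourly g CO2/kWh forecast; the job must start by hour 4.
forecast = [450.0, 430.0, 300.0, 280.0, 350.0, 200.0]
print(best_start_hour(forecast, not_later_than=4))  # -> 3
```

Note that hour 5 has the lowest intensity overall, but the deadline forces the earlier hour 3, which is the cheapest permissible slot.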

The workload manager model can be applied broadly to any solution with discrete units of computation, with some considerations. First is additional data transfer time when the volume of data needed for the compute is high: a remote server may spend longer reading the data than a local server would and thus use more energy, negating the benefit of geo-shifting. Second, the model assumes that compute can be switched off when not in use; an orchestrator could manage a service’s global compute to ensure servers are not sitting idle.

Its generic nature allows it to be adopted by solution architects globally and applied to many projects. This would amplify the realised reductions beyond what any single company could achieve. 

Describe how feasible it is for you to get to a production ready state with a bit of time or prize money; how feasible it would be for others to implement, and how likely it is that they would choose to use your tool if made available

The solution includes almost all the pieces required for the dispatcher to geo-shift workloads.  

It has a: 

  • Object model (Entity Framework Core) 
  • SQL Database (SQLite by default) 
  • Routing library (C#) 
  • Job queue (Azure Service Bus) 
  • Dispatcher (Azure function with SB trigger) 
  • CI/CD pipeline (GitHub Actions) 
  • Bicep file to deploy all the above  

For production readiness, the remaining work includes: 

  • Caching for API responses. Today, two API calls are made per route, but the Carbon Aware SDK can return data for a large time range. One API call could fetch and store this data, then answer subsequent queries intelligently from the local copy. This is quite a complex component but is necessary for production. 
  • Other actions such as forecasting and time-shifting can be added to the routing library and dispatcher with minimal additional work.  
  • Similarly, job metadata can be expanded to include new dimensions such as predicted duration, sovereignty, latency, and data transfer volume requirements. These would enable the router to intelligently decide which requests to optimise, minimising the impact on the user experience and interactive operations. 
  • Unit tests. The components are architected using interfaces and dependency injection and are easy to write tests for. 
  • The mechanism for routing requests to a destination is not defined. This is application-specific, but certain pre-packaged destinations could be added and made available by default. 
  • Simple, comprehensive configuration for all the components needs to be added. 
  • Verify results using CO2 emissions data from cloud providers to confirm that the solution is having a material impact. If the datacentres are running off a separate green grid, then the focus might need to shift. 
  • An orchestration model for turning off compute that is unlikely to be used.

Describe what is your vision for how the solution you have started to build in this hackathon could make a difference to the world.

I would like to see my solution adopted by companies around the globe to make their applications and services more efficient. In my career, I have seen that optimisation far too often comes after functionality, and the only dimension considered is dollars. Making information about carbon emissions more available increases companies’ awareness of the real-world impact their services have. These solutions can then be integrated to help combat the problem and give companies a positive message on their journey to zero emissions.

The URL to the prototype app and/or code (e.g. GitHub) 

Link to the YouTube video. 

Nature icons created by Freepik - Flaticon 

Swap icons created by bqlqn - Flaticon 