
The future of sustainable digital infrastructure

Summarizing our kick-off event with Elia Group, Microsoft and the SDIA to help speed up the transformation towards a sustainable digital infrastructure

Timo Müller
June 22, 2021 | 18 min

With the cloud computing revolution of the last decade, we have somewhat lost sight of the physical aspects of computation. Today, resources are seemingly endless and we can spin up powerful new instances in the cloud in seconds. It's easy to forget that the actual computation still happens on hardware that consumes vast amounts of energy. But who is responsible for the corresponding carbon emissions, and how can we report them accurately? Together with the Elia Group and our friends at the SDIA, we started a long-term collaboration to find answers to these questions. This article summarizes our kick-off workshop from June 18th, 2021.

The case for collaboration: Netflix & the digital infrastructure value chain

Netflix has often been accused of causing high carbon emissions through its streaming service. While many of these accusations have been debunked by the latest research in the field, one would certainly expect keeping their streaming service up and running to be one of the biggest emission factors in Netflix's value chain. But a look into their latest sustainability report tells us otherwise: only 5% of their total footprint is attributed to the operation of their data centers (Scope 3: mainly AWS & Open Connect).

Now you might wonder: "So where is the problem? If one of the world's biggest video-on-demand platforms has such a small footprint, why should we even care?" But the answer is more complex and reveals the challenges of decarbonizing our digital infrastructure: running a CDN to serve content to millions of users is a relatively energy-efficient and thus low-carbon activity. Bringing these vast amounts of data to your device is not. However, that is the task of ISPs (Internet Service Providers) and not directly attributable to the streaming companies.

So is Netflix "greenwashing" its sustainability report? No: by reporting its data center emissions but not its internet transmission emissions, Netflix is following best practices. And even if they wanted to dig deeper, doing so accurately would mean analyzing thousands of ISPs and energy grids across all of their markets. (Note: DIMPACT is working on exactly this and is supported by Netflix and other major streaming companies.)

But is Netflix - or are we - still responsible for finding solutions to these problems? Absolutely. What we can learn from this example is that we have to start looking at infrastructure-related emissions not just through the company lens, but we have to take responsibility for our industry as a whole. And that requires collaboration along the entire energy value chain, including the grid provider, the data centers, the ISPs, and the digital economy.

Loic Tilmann (Elia Group) on the energy markets and greenwashing

To kick off our first workshop on sustainable digital infrastructure, Loic Tilmann from the Elia Group gave us a brief introduction to the status quo of energy markets and the specific challenges that this market architecture presents to everyone interested in decarbonizing it. To get started, have a look at the following fictitious example:

An eco-conscious consumer, Jenny, has subscribed to an energy tariff with 100% RE to power her electric vehicle. The market profile of this provider shows that they add green energy to the grid throughout the day, depending on the availability of sun & wind. At times when sun & wind are low, grey energy makes up for the lack of available RE. But since these are exactly the times when Jenny charges her EV, she ends up consuming entirely grey electricity.

In today's energy markets, GOs (Guarantees of Origin), GCs (Green Certificates), and PPAs (Power Purchasing Agreements) are sold on an annual basis, which makes it cheap and easy to market your energy as "green" without the need to invest in the infrastructure that would actually enable you to provide 24x7 renewable energy. Consequently, incentives are missing for market participants to work on solutions that drive actual change.
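To make the gap concrete, here is a toy Python sketch (all numbers are invented) comparing annual, GO-style matching with hourly, 24x7-style matching for a consumption profile like Jenny's:

```python
# Illustrative only: annual matching can claim "100% green" while
# hourly matching reveals a gap. All figures are made-up toy numbers.

# Hourly consumption (kWh) over one day: the EV charges in the evening
consumption = [2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
               2, 2, 2, 2, 2, 2, 10, 10, 10, 10, 2, 2]
# Hourly renewable production (kWh): a solar-heavy daytime profile
renewable = [0, 0, 0, 0, 0, 2, 4, 6, 9, 10, 10, 10,
             10, 9, 7, 5, 3, 2, 0, 0, 0, 0, 0, 0]

annual_consumption = sum(consumption)  # 80 kWh
annual_renewable = sum(renewable)      # 87 kWh

# Annual (GO-style) matching: totals are compared, timing is ignored
annual_match = min(annual_renewable / annual_consumption, 1.0)

# Hourly (24x7-style) matching: only renewables produced in the same
# hour as the consumption count toward the match
hourly_matched = sum(min(c, r) for c, r in zip(consumption, renewable))
hourly_match = hourly_matched / annual_consumption

print(f"annual matching claims {annual_match:.0%} green")
print(f"hourly matching shows  {hourly_match:.0%} green")
```

With these numbers the annual view reports 100% green, while the hourly view shows only about a third of the consumption actually coincides with renewable production.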

Replacing the annual GO mechanism with a real-time system

For change to happen, we need to rework this outdated GO mechanism, replacing it with a near-realtime bidding system that helps to turn the increasing demand for actual 100% RE from corporates and consumers into a reality in our grids.

Today, companies in the ICT sector account for almost half of all PPAs purchased by corporates, but consumers and policymakers demand more. Corporates are under pressure to show that their systems are not just "greened" with cheap GOs or PPAs but actually run on low-carbon energy 24x7.

A market design that allows for real-time matching of production and consumption would incentivize all players along the value chain to invest in the technology that will enable a truly sustainable digital infrastructure. As the digital industry operating the service layer of this segment, we can drive such change by building vertical alliances and demanding innovation from our direct business partners (Cloud & data center providers).

A new market design would kick off a software revolution

A sustainable digital infrastructure is within reach. We have the hardware and software, and we have the skill set needed to decarbonize our grid. But we need to get out of our own bubble and start the conversation with the companies running the layers that power our SaaS, mobility, and other companies.

The Elia Group is working on a blockchain-based PoC that allows companies to match green electrons to their data and energy consumption. Such a market design would inspire a new wave of sustainable applications, like a "truly green Netflix" or dynamic bitcoin mining revenue for times when the RE quota is high.

As software developers, it does not take much to imagine the plethora of applications and innovations that would follow if such a mechanism were available to mass markets via easy-to-access APIs.

Microsoft's 24x7 carbon matching with Vattenfall in Sweden

"Big Tech" is very much aware of these challenges. The best proof of this is the ambitious attempts of Google and Microsoft to figure out ways to actually run their data centers on 24x7 renewable energy, contrary to the industry best practice of simply matching their energy demands by buying PPAs. Avi Allison from Microsoft shared some insights into Microsoft's activities in that space with us.

In a pilot with Vattenfall, the tech giant has been using Azure's SmartUtility technology to build software that monitors, in real time, the "greenness" of the energy used in some of their Swedish data centers.

Microsoft will make the technology behind the project available to teams all over the world running on Azure in order to speed up this development. Avi outlined 3 major roles that Microsoft internally assigned to its mission with regard to a sustainable digital infrastructure:

  • Providing the developer tools that enable the transition to 100% 24x7 green data centers

  • Being an early adopter of such services by partnering up and driving pilots

  • Publicly advocating for better data to drive the industry forward

Locational Marginal Emissions (LME) - reducing carbon is what matters most

Avi also shared how Microsoft is working on unified measures to quantify the emission reduction potential of new RE plants. This KPI, the LME, can help procure new RE plants with the highest immediate reduction potential.
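As a rough illustration of the idea (not Microsoft's actual model, which was not presented in detail), candidate plants could be ranked by expected output times the local marginal emission rate, i.e. the emissions their output would actually displace at that grid location. All site names and figures below are invented:

```python
# Toy sketch of locational-marginal-emissions-based procurement:
# rank candidate renewable sites by annual avoided emissions.
# All names and numbers are illustrative assumptions.

candidate_sites = [
    # (name, expected annual output in MWh, local marginal emission
    #  rate in tCO2/MWh, i.e. the carbon intensity of the generation
    #  that the new plant's output would displace at that location)
    ("wind_site_A",  50_000, 0.75),   # displaces mostly coal
    ("solar_site_B", 60_000, 0.35),   # displaces mostly gas
    ("wind_site_C",  55_000, 0.05),   # grid already renewable-heavy
]

def avoided_emissions(site):
    _, output_mwh, lme = site
    return output_mwh * lme  # tCO2 avoided per year

ranked = sorted(candidate_sites, key=avoided_emissions, reverse=True)
best = ranked[0]
print(f"highest immediate reduction: {best[0]} "
      f"({avoided_emissions(best):,.0f} tCO2/yr avoided)")
```

The point of the KPI: the plant on the greenest grid (wind_site_C) contributes the least additional decarbonization, even though its output is just as "green" on paper.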

Software needs to be architected in a way that it can respond to the availability of green energy

In his presentation, Max Schulze, co-founder and operating chair of the SDIA, pointed out the need for intelligent software in order to achieve a 100% sustainable digital infrastructure.

A distributed, containerized software architecture gives us the flexibility needed to shift big workloads to data centers running largely on renewable energy. With open data on green electrons available via API, we could, for example, scale a Kubernetes cluster up in a data center close to a wind farm where the wind is expected to blow in the coming hours, or move heavy workloads to sunny hours.
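A minimal sketch of what such carbon-aware placement could look like, assuming a hypothetical renewable-share forecast per data center region (the region names, numbers, and threshold are all invented):

```python
# Hypothetical carbon-aware workload placement: given a renewable-share
# forecast per region (e.g. from an open grid-data API), pick the
# greenest region for a deferrable batch job. Data is invented.

forecast = {
    # region -> forecast renewable share of the grid mix (0.0 - 1.0)
    "eu-north":   0.92,  # close to wind farms, windy night ahead
    "eu-central": 0.41,
    "eu-west":    0.58,
}

def pick_region(forecast, min_share=0.5):
    """Return the region with the highest forecast renewable share,
    or None if no region clears the threshold (then defer the job)."""
    region = max(forecast, key=forecast.get)
    return region if forecast[region] >= min_share else None

target = pick_region(forecast)
print(f"schedule batch job in: {target}")
```

The same decision function could just as well drive a Kubernetes node-affinity rule or a job queue's routing logic; the hard part today is the missing open, granular grid data, not the scheduling code.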

As software developers, we need to realize that we are operating on an actual energy grid, and we need to be responsive to the nature of that underlying grid. While the value chain of digital infrastructure is complex and involves many layers of hardware, software operates at all edges of these layers and is the decisive factor in enabling a smart and sustainable grid of the future. It is on us - as developers - to make this happen.

Advocating open data along the value chain

The SDIA is actively working on creating a true footprint of the digital services we use every day. But to achieve this, they rely on companies along the value chain to share their data. The Elia PoC and cross-industry collaborations like ours are fundamental to bringing transparency into a complicated market.

Kai Schmied (Elia Group) on their green data PoC and new market design

As the final speaker of our kick-off workshop, Kai Schmied from the Elia Group presented their approach to a more granular "close to real-time" market design that would address many of the aforementioned challenges.

Such a market design could be implemented using a blockchain for decentralized verification of new energy blocks being added to the grid. Smart meters at power plants would share real-time data on new energy being added to the grid. Together with the Energy Web Foundation, the team developed a prototype that achieves the above and allows market participants to purchase and sell blocks of energy on a 15-min basis. Such a system would ensure the matching of consumption and production of green energy in real time. Instead of purchasing annual PPAs / GOs / GCs, market participants could match their demands with actual green energy produced at that specific moment. This would alter the market dynamic and translate the high demand for 100% renewable energy from corporate players into actual change in our grids.
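A simplified sketch of what interval-level matching could look like; the matching rule and all figures are illustrative assumptions, not the actual PoC logic:

```python
# Simplified sketch of 15-minute block matching: a consumer's demand
# per interval is matched against verified green-energy blocks produced
# in the same interval; the remainder is grey. Numbers are invented.

def match_blocks(demand, green_supply):
    """For each 15-min interval, match demand (kWh) against green
    blocks produced in that same interval; return matched and grey."""
    matched, grey = [], []
    for d, g in zip(demand, green_supply):
        m = min(d, g)
        matched.append(m)
        grey.append(d - m)
    return matched, grey

# Four 15-min intervals (one hour) of data center demand vs. green blocks
demand       = [100, 100, 120, 120]   # kWh per interval
green_supply = [ 90, 110,  60, 130]   # kWh of verified green blocks

matched, grey = match_blocks(demand, green_supply)
share = sum(matched) / sum(demand)
print(f"green share this hour: {share:.0%}, grey remainder: {grey}")
```

Unlike an annual GO, a ledger of such interval-level matches makes the grey remainder visible in every quarter hour, which is exactly the signal a smart software layer would need to shift load.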

Additionally, the corresponding data would allow us to create a smart software layer achieving a more intelligent distribution of loads, leading to new applications built on top of it.

Kai demoed their PoC in a live session, simulating a market participant that can see in real time how the current energy mix of their data center is composed and react accordingly.

To further advance this design, the team at Elia is looking for companies and data centers interested in working with them to further drive the project towards market readiness.

If you are interested in joining the project, please sign up here:

Questions answered during the Q&A sessions

The kick-off workshop was very much what we hoped it would be: a lively discussion between CTOs, data center architects/operators, and grid providers. Half of the participants stayed on for a full hour after the official end of our 90-min meeting (!). One of the goals of this cross-industry collaboration is to educate our community members and the broader industry about sustainable digital infrastructure and to prepare a solution space for innovation. To share some of the insights with those who could not attend, we captured and transcribed some of the discussions triggered by our speakers.

  • Noam Revach: Is Microsoft still working on underwater cooling to increase efficiency?

  • Avi Allison (Microsoft): It is one of several attempts to drive down PUE. Generally speaking, data center efficiency has improved by several orders of magnitude in recent years.

  • Julia: How do you plan to get rid of the diesel backup generators in data centers?

  • Avi Allison (Microsoft): There are several options, and Microsoft is working on evaluating and implementing them over the upcoming years: a) replacing the fuel in the generators with carbon-neutral fuels (e.g. hydrogen) or powering them with batteries; b) moving data centers into areas with very reliable grids and making systems intelligent enough to prevent the need for such generators at all.

  • Chris Adams (The Green Web Foundation): Is demand/load for the cloud flat or does it follow certain patterns? Is there room to optimize and balance the load based on RE availability?

  • Avi Allison (Microsoft): The systems are built to keep the servers at a relatively stable level all of the time by design. So there is little room for adjustments. Mainly because a) electricity utilities prefer flat curves, and b) hardware is expensive and optimized for little excess load

  • Manon: What are the key takeaways for smaller companies that operate smaller fleets of data centers in-house or on-premise?

  • Max Schulze (SDIA): Participate in projects like that of Elia, thus making sure that the energy consumed is actually green. Plus talk to your data center providers and other market participants: Show your interest/need for sustainable solutions to the market!

  • Manon: How can we make sure that waste heat is used and hardware components are recycled and reused?

  • Max Schulze: We are actively working on this topic. You can be sure that especially with the current chip crisis, this change is already underway

  • Trevor Hinkle: When it comes to Locational Marginal Emissions (Microsoft), how do you weigh short-term emission savings versus long-term emissions savings? Are you working on publishing any of your procurement models to help others learn from this?

  • Avi Allison: These models are still at a very early stage and currently focus only on short-term emission savings. Going forward, there will be lots of learnings, and cooperation with TSOs can further improve the procurement process.

  • Jonathan Evans: Why is immersion cooling in data centers still not mainstream?

  • Max Schulze: The solutions exist, but it is on us, as market participants, to demand these solutions. The cloud works a bit like a curtain between the data center and our digital economy - we have to remind ourselves that we need to demand efficiency in order to get more efficiency. Only then can we speed up the transition.

  • Timo Müller (LFCA): Why is Google faster with adopting a 24x7 strategy for their data centers than Microsoft?

  • Max Schulze (SDIA): Google has a very specific use case for their product-related data centers (search, mail, etc.). For these internal products, they are able to shift heavy loads and work with high redundancy. This is a completely different use case compared to operating Cloud Services for third parties. It is much easier to skip the diesel backup generator if you can live with 99% up-time, you can't do the same for an enterprise-level cloud system where your customers expect 99.95% up-time.

  • Daniel: In the context of variability of carbon intensity relative to location - is this within the same grid (i.e. not across interconnected grids)? I thought that a grid provides a pooling effect within which the 50Hz balancing mechanism satisfies demand “with the speed of light”. In other words - it should not matter if one consumer is closer to some renewable capacity than another because their marginal effect on the frequency propagates instantaneously across the grid and will be matched by whatever capacity can next respond — and not the capacity that is most proximate. If the above is true - then it would not make much sense to assign varying local carbon intensities based on proximity to some renewable capacity

  • Chris Adams (Green Web Foundation): @Daniel as I understand it local capacity for transmission is a problem in some of these cases. So, just like network pipes have a set capacity, we see similarities with the grid for energy itself

  • Loïc Tilman (Elia Group): Agree, except if we consider the risk of congestion that could happen on the grid. In other words, this assumption is correct if we assume a complete copper plate, but in reality we sometimes have congestion on the grid. Therefore, in some cases, location can have an impact.

  • Daniel: On Chris' point - is it then necessary to distinguish between peak and off-peak demand response?

  • Chris Adams: Yes, I think so. I'm happy to chat in more detail about why, but what Avi is saying about locational marginal emissions is a really helpful concept

  • Brian: Hi Daniel - you make good points. Our findings are that location actually matters a great deal because the grid is not a single pool of electrons; it is a series of connected pools limited by transmission congestion, among other factors. This matters for carbon outcomes, in both zonal- and LMP-organized markets.

  • Loïc Tilman: Depends on how you define the peak, as congestion can happen in different situations (wind or solar, massive unforeseen load)

  • Chris Adams: @Daniel, this is a really deep rabbit hole you can go down. This podcast interview is about as accessible as I’ve been able to find it

  • Chris Adams: @Daniel let's chat after - there's some stuff we've been doing with SCION on carbon-aware networking that takes this into account. Branch issue two came out today, and I've outlined some ideas about carbon-aware compute there, for a developer audience:

  • Brian: Max - 100% agree, we need APIs, but currently, in zonal markets, there are no APIs for LME, i.e. CO2 data is zonal. This is not granular enough to maximize the value of RE down to a meaningful, locational level. We would welcome a chance to work further with grid operators on the next steps in innovation.

What's next: a quick rundown on how you can help us

This workshop was a kick-off. It's on us to move things forward and do more than just talk. If you stayed with me through this whole long article, you are most likely interested in and passionate about the topic, and it's people like you who can make a change. Check the list below and email us suggestions to complete it.

For CTOs of companies offering digital services

  1. Evaluate your current digital infrastructure in terms of efficiency and carbon intensity:

    1. can you move services from dedicated servers into the more efficient cloud?

    2. can you make your software more efficient (e.g. by breaking a big monolith down into independently operating microservices or functions)?

    3. does your company home page or blog require a server, or can you use newer architectures like the serverless paradigm / JAMstack (e.g. Gatsby) to host it statically on an efficient CDN?

    4. can you isolate parts of your backend that are rarely used and move them into stateless functions running in the cloud only on demand (FaaS)?

    5. can you move heavy workloads to times when more renewable energy is available at your data center locations?

  2. Use your ability to innovate: How can software help decrease carbon emissions? How can your company's specific product help others decrease emissions? How can you use your superpowers and your domain knowledge to advance the market? Have a conversation with your product team - explain the problem and sit down together to brainstorm solutions. This is a highly technical and complicated matter; don't just wait for your product team to discover these pain points (they are hard to spot).

  3. Talk to partners: Apple has convinced 23 of its suppliers to commit to using 100 percent renewable electricity. In February, BT inserted a clause in a new contract with Huawei, requiring the tech giant to measure and reduce emissions to save an estimated 130,000 tonnes of carbon dioxide.
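Point 1.5 of the checklist above can be sketched as a simple search for the lowest-carbon time window, assuming an hourly carbon-intensity forecast is available (the numbers here are invented; real ones would have to come from a grid-data provider):

```python
# Sketch of deferring a heavy workload to the lowest-carbon window in
# the next 24 hours. The hourly forecast (gCO2/kWh) is invented.

forecast = [420, 400, 390, 380, 310, 220, 150, 120,  # midnight..7am
            110, 130, 180, 250, 300, 340, 360, 380,  # 8am..3pm
            410, 450, 470, 460, 440, 430, 425, 420]  # 4pm..11pm

def best_start_hour(forecast, duration_h):
    """Find the start hour minimizing average carbon intensity over a
    job of `duration_h` contiguous hours within the forecast window."""
    windows = [
        (sum(forecast[h:h + duration_h]) / duration_h, h)
        for h in range(len(forecast) - duration_h + 1)
    ]
    avg, start = min(windows)
    return start, avg

start, avg = best_start_hour(forecast, duration_h=3)
print(f"run the 3h job starting at hour {start} (~{avg:.0f} gCO2/kWh)")
```

Even this naive brute-force search is enough for nightly batch jobs; the missing ingredient is, again, reliable and granular forecast data.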

For data centers:

  1. Join the Elia Group PoC to start consuming actual green energy whenever possible. Advocate for and pioneer these or similar solutions. Share the news and offer your customers the chance to be at the forefront of this revolution.

  2. Start a conversation with your biggest clients and gauge their interest in more energy-efficient solutions. Are they willing to pay a premium for fewer carbon emissions? Companies are under pressure to reduce emissions; if you can offer them ways to do so, you solve pain points and build true USPs.

  3. As long as approaches like the aforementioned PoC are not yet available on mass markets, buy PPAs to cover your energy needs. And when procuring, evaluate offers on additionality and, if possible, LME.



SDIA - Sustainable Digital Infrastructure Alliance:

The Green Web Foundation:

The Green Software Foundation:

Estimates of watching Netflix:

Designing carbon aware digital experiences:

Impact of video streaming:

Why local solar costs less:

Carbon aware kubernetes:

Small plug for a conference paper we wrote a couple years ago: A. James “A Low Carbon Kubernetes Scheduler”: It proposes a Kubernetes based global migration of workload with the sun

Google had a Workshop on "Carbon Aware Computing" the last two days:

The problems with the GHG approach for understanding cloud emissions are outlined pretty well here by David Mytton in this paper:

On another note, reexamining the "master-slave" terminology:

Activity Browser:

Download: Presentation of Max Schulze, SDIA

Download: Presentation of Loic Tilmann, Elia Group

Download: Presentation of Kai Schmied, Elia Group

Stay up to date and help us to build a truly sustainable digital infrastructure: