The Road to Sustainable IT Starts at the App & Microprocessor
We have seen the popularity of Sustainable IT rising in the enterprise. Is it truly a new CIO initiative, or simply a revival of the early 2000s 'Green IT' hit?
Sustainable IT is the practice of designing, using, and disposing of information technology in ways that minimize its environmental impact. Its goal is to reduce the carbon footprint and energy consumption of end-to-end IT operations, which includes everything from manufacturing and using computers and servers to managing data centers and disposing of electronic waste.
The scary thing is that I was deeply involved in the first go-around, even going as far as presenting a global energy-reduction model for IT at COP15 in Copenhagen in 2009. So, what's different now? Will Sustainable IT make a real impact on any corporate ESG goals?
Let's start by looking at what is different today. To begin with, there is an abundance of data on energy consumed at the macro level, e.g. at the server, the server rack, the storage array, and the data center facility (be it on-premises, in a co-location site, or with a cloud service provider). There are metrics such as Power Usage Effectiveness (PUE), the ratio of total facility energy to the energy delivered to the IT equipment, which indicates how efficiently a data center operates (Note: I had a small hand in helping the Green Grid and Jonathan Koomey develop the metric 😊).
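For readers new to the metric, here is a minimal sketch of the PUE calculation; the facility figures are purely illustrative, not drawn from any real site:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT equipment energy.
    A value of 1.0 would mean every kilowatt-hour entering the building reaches IT gear."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 1,500 MWh/year drawn from the grid, 1,000 MWh/year reaching servers
print(pue(1_500_000, 1_000_000))  # -> 1.5
```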
However, in a world where customers are driving toward an 'everything as a service' business model, this granularity of measurement is not fine enough to truly identify what is consuming energy. Simply put, while it is the IT infrastructure and hardware that consume energy and create a carbon footprint, it is the applications that drive energy consumption across the whole infrastructure. And while there is innovation happening to measure detailed energy consumption at the microprocessor level, even this is not detailed enough.
The emissions digital thread for IT needs to start with the software applications!
Pulling on that thread takes the measurement journey from the emissions and consumption attributable to the lines of application code, to the servers and disk storage devices, to the racks that house the hardware, to the data center facility, and finally to the utility supplying the energy. What is missing today is how each of these steps compounds the original application's impact in CO2 emitted and kilowatt-hours consumed. While taking on an application resource optimization project is not easy, it really does bring financial and environmental savings. It is also a cross-functional effort that includes business units, development, procurement, security, facilities management and, of course, the sustainability office. Finally, the results of this initiative will shape the decisions and priorities of any application migration or rehosting strategy.
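To make the compounding concrete, here is a minimal back-of-the-envelope sketch. Every figure in it is hypothetical, chosen only to show how an application's measured energy rolls up through the facility and the utility into a CO2 number:

```python
# Hypothetical roll-up from application energy to facility energy to emissions.
app_kwh_per_year = 12_000        # energy attributed to one application's compute and storage
facility_pue = 1.5               # facility overhead multiplier (cooling, power distribution, ...)
grid_kg_co2_per_kwh = 0.4        # assumed carbon intensity of the supplying utility

facility_kwh = app_kwh_per_year * facility_pue    # what the data center actually draws
emissions_kg = facility_kwh * grid_kg_co2_per_kwh

print(f"{facility_kwh:,.0f} kWh at the meter -> {emissions_kg:,.0f} kg CO2/year")
# 18,000 kWh at the meter -> 7,200 kg CO2/year
```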
To start any application resource optimization, businesses need to be prepared to measure each application's resource consumption, whether it is a legacy standalone application or part of a virtual machine/container environment. By accessing the IT CMDB (configuration management database), they can map memory, compute, and disk storage consumption to each application and use that mapping to validate modernization and migration priorities. Even though the number of applications to be measured can be large in some companies, the 80-20 rule seems to apply: roughly 80% of emissions are accounted for by 20% of the applications.
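As a sketch of what that mapping might look like in practice, the snippet below aggregates a hypothetical CMDB export (the file name and column names are assumptions, not a real schema) and checks how the 80-20 split holds:

```python
import csv
from collections import defaultdict

# Minimal sketch: aggregate a hypothetical CMDB export ("cmdb_export.csv" with
# columns app, host, kwh_per_year) and rank applications by measured energy.
kwh_by_app = defaultdict(float)
with open("cmdb_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        kwh_by_app[row["app"]] += float(row["kwh_per_year"])

ranked = sorted(kwh_by_app.items(), key=lambda kv: kv[1], reverse=True)
total = sum(kwh for _, kwh in ranked)
top_20pct = ranked[: max(1, len(ranked) // 5)]
share = sum(kwh for _, kwh in top_20pct) / total

print(f"Top 20% of applications account for {share:.0%} of measured energy")
```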
Often applications are chosen for migration to the cloud based on their software licensing costs, without anyone knowing their environmental costs. Yet over the life of the asset, energy costs are frequently larger than the hardware and software acquisition costs. By auditing and mapping the carbon footprint of each application, migration and modernization priorities change in ways that lower both emissions and energy costs. This approach also gives application development, testing, and deployment teams a better understanding of how to embed sustainability into their application portfolio.
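To illustrate the point about lifetime energy costs, here is a simple, entirely hypothetical comparison for a single server over a five-year life; whether energy actually overtakes acquisition cost in a given environment depends on the real power draw, PUE, and electricity price:

```python
# Hypothetical figures only: one server hosting an application for five years.
hardware_and_license_cost = 6_000   # one-time acquisition, USD
avg_power_kw = 0.5                  # average draw of the server
facility_pue = 1.6                  # facility overhead (cooling, power distribution)
price_per_kwh = 0.20                # USD per kilowatt-hour
years = 5

lifetime_energy_cost = avg_power_kw * 24 * 365 * years * facility_pue * price_per_kwh
print(f"Acquisition: ${hardware_and_license_cost:,.0f}, "
      f"energy over {years} years: ${lifetime_energy_cost:,.0f}")
# -> Acquisition: $6,000, energy over 5 years: $7,008
```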
Finally, corporate sustainability offices have tended to turn a blind eye to the carbon footprint of infrastructure utilization. Why be bothered accounting for something they believe typically makes up 4-6% of a business's total emissions? IT is often overlooked as an emissions reduction priority. However, with the advent of emerging technology trends such as AI, Digital Twins, and Edge Computing, data center energy demand is set to explode, to the point that within the next five years the total emissions of IT infrastructure will double. This will put pressure on both fossil and renewable energy sources powering data centers, create water-security issues, and compound the IT asset recycling problem.
The solution to a successful Sustainable IT industry lies in two places. Applications need to be optimized for efficiency as well as functionality; the better the application runs, the larger the savings multiplier throughout its value chain. And since applications don't generate emissions themselves, but the compute platforms they run on do, it makes sense to accelerate investment in significantly more efficient computing, starting at the microprocessor.