GPUs driving the future of computing

Written by Hamish Chamberlayne – Head of Global Sustainable Equities and Richard Clode – Technology Equities Portfolio Manager

Graphics processing units (GPUs) are the drivers of the future of computing. Designed for parallel processing, a GPU is a specialized electronic circuit that works in conjunction with a computer’s brain, the central processing unit (CPU), to improve computing performance. In fact, if you are reading this article on an electronic device, a GPU is quite possibly powering your display.

While GPUs were initially used for computer graphics and image processing in personal and commercial computing, their use cases have expanded dramatically as the technology has developed. Moore’s Law – the observation that the number of transistors in an integrated circuit doubles roughly every two years while the cost of computing halves – has democratized GPUs by making them cheaper and more readily available, driving their adoption across multiple industries. Today, high-performance GPUs are fundamental to many different technologies and will form the basis of the next generation of computing platforms.

GPUs are designed to run a large number of workloads simultaneously, increasing computing efficiency and improving overall performance. While this is valuable for end markets such as gaming, where players enjoy high-quality real-time graphics, it can also be applied to more demanding use cases.

GPUs’ ability to process large blocks of data in parallel makes them ideal for training artificial intelligence (AI) and deep learning models, which require hundreds of thousands of neural network computations to be processed simultaneously. The applications of deep learning are broad, from powering web services to advancing autonomous vehicles and medical research.
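As a rough illustration of that parallelism, the sketch below (our addition, assuming the open-source PyTorch library and, where available, a CUDA-capable GPU) runs a single training step of a small neural network; the matrix multiplications inside each layer and the gradient calculations for every parameter are spread across the GPU’s many cores at once.

```python
# Minimal sketch: one training step of a small neural network on a GPU.
# Assumes PyTorch is installed; falls back to the CPU if no CUDA device is found.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A tiny feed-forward network; each layer is a large matrix multiplication,
# which the GPU evaluates across many cores in parallel.
model = nn.Sequential(
    nn.Linear(1024, 512),
    nn.ReLU(),
    nn.Linear(512, 10),
).to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# A batch of 256 synthetic samples processed simultaneously.
inputs = torch.randn(256, 1024, device=device)
targets = torch.randint(0, 10, (256,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()   # gradients for all parameters are computed together
optimizer.step()
print(f"device: {device}, loss: {loss.item():.4f}")
```

Even in this toy example, the whole batch and all of the network’s parameters are processed together rather than one at a time – the same pattern that scales up to the large models described above.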

While GPUs have already had a positive impact on real-world challenges, their potential to shape innovation across industries has yet to be fully explored. Implementation of artificial intelligence and deep learning is essential to creating a successful digital future, and this is already becoming a reality as the digitization trend grows. Importantly, this trend affects all industries, so effective and robust technological capabilities are essential as companies begin their digital transformations.

In terms of its impact, we believe digitization plays a positive role in economic development and social empowerment, and we also see a close alignment between digitization and decarbonization. Digitization is pushing the envelope of traditional analog ways of working, enhancing data transparency and giving businesses and individuals the knowledge they need to make informed decisions about consumption, production and reduction based on their current behaviors. For example, ambitious goals to cut carbon emissions and meet climate targets can benefit from data mining, transformation and analysis to determine the best course of action.

We are already beginning to see digitization break through and advance traditional practices – manufacturers are incorporating technology into industrial processes to improve production, building managers are using smart technology and data analytics to ensure energy is consumed only when necessary, and intelligent transportation systems are analyzing traffic data to reduce congestion, fuel use and emissions. Elsewhere, digital services are beginning to replace more energy-intensive traditional practices – online meetings, for example, cut business travel and the associated carbon emissions.

One of the critical challenges in global digital transformation is the significant power required for high-performance computing. It is important for us to understand the true energy cost of technology, and what can be done to reduce total energy consumption.

There is a misconception that an increase in data center usage equates to an equivalent increase in energy demand. According to the International Energy Agency (IEA), data center energy use has actually remained flat despite rising demand for data center workloads and internet traffic (Chart 1). This decoupling is driven by more efficient systems and processes. GPUs significantly reduce the power burden of high-performance computing in data centers: for AI applications, some GPUs can be up to 42 times more energy efficient than traditional CPUs, while some GPU-based data centers require only 2% of the rack space of comparable CPU-based systems.2 In short, GPUs pack a punch. By enabling smarter use of energy, they are helping to keep data center energy use to a minimum.

Chart 1: Data center energy use remains flat

Source: IEA, Global trends in internet traffic, data centres workloads and data centre energy use, 2010-2020, IEA, Paris. https://www.iea.org/data-and-statistics/charts/global-trends-in-internet-traffic-data-centres-workloads-and-data-centre-energy-use-2010-2020

Like all industries, technology will have to do its part to tackle global climate change and reduce its environmental footprint, with the goal of achieving net zero emissions. In 2020, the IEA released its annual Tracking Clean Energy Progress report, which assesses the key energy technologies and sectors considered critical to slowing global warming. Of the 46 sectors, the IEA ranked data centers and data transmission networks as one of only six that were on track with its Sustainable Development Scenario. However, the increase in global internet use during COVID-19, driven by greater streaming of video, conferencing, online gaming and social networking, saw this rating slip to “more efforts needed” in the 2021 report.3

Despite this setback, we believe that continued efficiency improvements in data center infrastructure are integral to achieving net-zero goals, reinforcing the role that GPUs play in creating a sustainable digital world.

While there are many benefits to widespread adoption of AI, greater use of the technology also carries significant underlying ethical risks.

In cases where AI is cheaper, faster and smarter than human labor, it can be used to replace the existing workforce: chatbots have replaced call center employees thanks to AI’s ability to process natural language, many factory workers have been replaced by automated machinery, and robotic taxis could soon replace human drivers. We are aware of the impact this can have on employment, especially where it is concentrated in particular areas, and believe it is necessary to consider the long-term consequences for society in these cases. However, we also see benefit in letting AI take over some of the more monotonous job roles. By freeing up human capital, it gives individuals the opportunity to take on more fulfilling roles that AI cannot perform – personal training, creative design and teaching. In doing so, we believe that society can be changed for the better.

It is also important to acknowledge the potentially ominous purposes for which the technology can be used. The US government recently restricted the export of Nvidia’s high-end GPU chips to China in a bid to prevent some Chinese companies from purchasing GPUs to enable mass surveillance, particularly of Uyghur Muslims. We fully welcome restrictions aimed at minimizing potential ethical threats to society.

Some companies, including Nvidia, have also adopted ethical frameworks to embed “trustworthy AI” principles across their product ecosystems. We see great value in putting ethical principles at the heart of product design and development to promote positive change and transparency in AI development.

Digitization is the cornerstone of our future. From humble beginnings, the GPU has evolved into one of the most important facilitators of innovation and digital transformation for society. We also believe that the next generation of computing is essential to achieving global sustainability goals. When analyzing individual companies, we believe the shift to a low carbon business model is a sign of long-term success, and we look to technology to enable that change.

Footnotes

1 Nvidia Blog, “World-record-setting DNA sequencing technology helps clinicians quickly diagnose critical care patients”, 2022

2 Nvidia, Corporate Social Responsibility Report, 2021

3 International Energy Agency, Tracking Clean Energy Progress, 2022

Definitions

Deep learning: involves feeding a computer system a large amount of data, which it can use to make decisions about other data. The data is fed through neural networks – logical constructs that ask a series of binary true/false questions of, or extract a numeric value from, every piece of data passing through them, and classify it according to the answers received.
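As a loose, hypothetical illustration of this definition (our addition, written in Python with the NumPy library; the weights are random here rather than learned from training data), the sketch below passes a single data point through two small layers, extracts numeric values at each step, and classifies it according to the largest output score.

```python
# Minimal sketch of the definition above: data flows through a small neural
# network, each layer extracts numeric values, and the input is classified
# according to the largest output score. Weights are random purely for
# illustration; in practice they are learned from training data.
import numpy as np

rng = np.random.default_rng(0)

# A single data point with 4 numeric features (e.g. measurements).
x = np.array([0.2, -1.3, 0.7, 0.5])

# Two layers of weights: 4 features -> 8 hidden values -> 3 classes.
w1 = rng.normal(size=(4, 8))
w2 = rng.normal(size=(8, 3))

hidden = np.maximum(0, x @ w1)   # extract numeric values (ReLU acts like a pass/block gate)
scores = hidden @ w2             # one score per candidate class
predicted_class = int(np.argmax(scores))

print("class scores:", np.round(scores, 2))
print("predicted class:", predicted_class)
```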
