Semiconductor

By Dr. Jacques Bughin

Semiconductors are positioned at the forefront of innovation and digital transformation. With companies like Nvidia leading the charge and the semiconductor sector potentially reaching a staggering valuation of $1 trillion in the next five years, we contemplate the bright future and emerging trends shaping this critical industry. 

The Future Looks Bright for Semiconductors 

For those who may not remember, “Everybody Loves Raymond” ran for nine seasons on CBS in the US, and was (apparently) voted the 35th-best sitcom of all time by Rolling Stone magazine. Borrowing the title for Nvidia ("everybody loves Nvidia") is not far-fetched. As the Financial Times recently reported, Nvidia ranks 5th in terms of the number of hedge funds holding shares and, most importantly, it is the stock that has added the most market value this year. 

This interest is clearly linked to the fact that the digital revolution is finally putting all the pieces together, with the cloud, big data and AI. But for this to run, one needs semiconductors. 


Since their inception, semiconductors have radically changed the course of technology, beginning with the successful demonstration of the first transistor in the 1940s. By 2000, semiconductor lasers had become the standard light source for optical-fibre communications. The role of semiconductors in electronic circuits and lasers demonstrates their undeniable importance in our modern world. As the world moves into the next phase of digitalisation and the Web 3.0 era, semiconductors are once again at a crucial inflection point. This is not just due to geopolitical factors, such as the tensions between Taiwan (home of TSMC) and China, or the supply chain disruptions caused by the COVID-19 pandemic, which led to delays in various industries, including automotive. Rather, it is being driven by the shift towards electric vehicles, the transition to 5G/6G wireless networks and, above all, artificial intelligence.  

The recent surge in the share price of Nvidia testifies to the enthusiasm for semiconductors. The SMH index of 25 industry leaders is up around 25 per cent this year, with a lower beta than most technology and artificial intelligence stocks. These trends point to a bright future for semiconductors, with forecasts suggesting that the sector could reach a valuation of US$1 trillion over the next five years. 

Bubble or beyond Moore? Five trends to consider 

However, this optimism raises the question: is this growth sustainable, or is it simply a bubble? For some players, such as Nvidia, share price performance is closely linked to return on assets (ROA) and return on equity (ROE), both of which have expanded significantly in recent months. Nevertheless, as the semiconductor industry continues to evolve at a rapid pace, it is important to identify and manage new dynamics around at least five emerging trends. 

  • Trend 1: Hello, (generative) AI. How will demand evolve, especially with the emergence of generative AI models driving semiconductor demand? 
  • Trend 2: Product evolution. Is silicon still the reigning champion, or will compounds such as gallium nitride (GaN) dominate the landscape, thanks to their superior electrical properties and energy efficiency? 
  • Trend 3: Dual transformation. Can sustainability and digitalisation coexist harmoniously, or is the energy-hungry nature of digital technologies a stumbling block? 
  • Trend 4: Hyper-competition. How will the competitive landscape evolve as technology giants increasingly design their own chips? 
  • Trend 5: Battle of the platforms. The rise of the ARM architecture is challenging the dominance of the x86 architecture. How will this reshape the semiconductor ecosystem, particularly in terms of chip architectures and supplier dynamics? 

Trend 1: Generative AI 

One of the key drivers of semiconductor demand in recent months has been the development of powerful generative AI models to complement already-burgeoning AI applications such as deep learning, computer vision, robotics, and Internet of Things (IoT). 

For these models to work, a special class of chip – the AI accelerator (or deep learning processor) – is needed to speed up AI computations, making them significantly faster and more energy efficient than general-purpose processors. These AI accelerators often have many cores, focus on low-precision arithmetic, are optimised to process data with reduced precision using new dataflow architectures, and process data efficiently through specialised pipelines. 
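The low-precision trade-off mentioned above can be illustrated with a minimal sketch: symmetric int8 quantisation of a weight matrix (the values and sizes here are purely illustrative, not drawn from any real model) shrinks memory traffic fourfold at the cost of a small, bounded rounding error – exactly the bargain accelerators exploit.

```python
import numpy as np

# Illustrative values only: a small random matrix standing in for a model layer.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.5, size=(256, 256)).astype(np.float32)

# Symmetric int8 quantisation: map the observed float range onto [-127, 127].
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequant = q.astype(np.float32) * scale  # recover approximate float values

print(f"float32 size:  {weights.nbytes} bytes")   # 4 bytes per value
print(f"int8 size:     {q.nbytes} bytes")         # 4x smaller
print(f"max abs error: {np.abs(weights - dequant).max():.4f}")  # <= scale/2
```

The worst-case error per value is half the quantisation step, which is why inference workloads tolerate int8 so well while gaining bandwidth and energy efficiency.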


Key suppliers include Nvidia, with its Tensor Core GPUs, Google, with its Tensor Processing Units (TPUs), and AMD, with its Radeon Instinct accelerators. These specialised chips are optimised for deep learning tasks and are widely used in data centres for AI inference and training. Nvidia’s Tesla GPUs, for example, are powering AI applications in sectors ranging from healthcare to autonomous vehicles, demonstrating the important role of semiconductor companies in meeting the evolving demand for AI. 

The uncertainty does not come from the surge in AI demand; it comes from the fact that there is, as yet, no dominant design, and the evolution of AI in terms of horizontal versus vertical LLMs is not yet settled. 

Trend 2: Is silicon here to stay? The rise of gallium nitride (GaN) 

GaN is a compound semiconductor with superior electrical properties that will usher in a new era of energy-efficient electronics. GaN has a very hard crystalline structure and a wide bandgap, making it more suitable for high-power, high-frequency optoelectronic applications such as blue LEDs, microwave power amplifiers, and space applications (e.g., solar panels on satellites). 

However, it is increasingly being used in power supplies for electronic devices, converting alternating current from the grid into low-voltage direct current. GaN technology can handle larger electric fields in a much smaller form factor than silicon, while offering much faster switching. GaN is becoming indispensable, for example, in power conversion platforms, where silicon has reached its limits, or in the transition from mobile computing to Web 3.0. GaN chips are also easier and faster to manufacture than silicon chips – an advantage after the supply bottlenecks of the recent past – so companies are turning to GaN for smaller, more efficient electronic devices. 
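A back-of-envelope calculation shows why faster switching matters. Using a first-order loss model and assumed, illustrative device parameters (not datasheet values for any real part), a GaN transistor with roughly 10x shorter switching transitions can run at roughly 10x the frequency of a silicon MOSFET for the same switching loss – and a 10x higher frequency means far smaller inductors and capacitors, hence the shrinking chargers and power bricks.

```python
# Hypothetical parameters for illustration only -- not datasheet values.
V_BUS = 400.0        # bus voltage (V)
I_LOAD = 10.0        # load current (A)
F_SW_SI = 100e3      # assumed silicon MOSFET switching frequency (Hz)
F_SW_GAN = 1e6       # GaN assumed to switch 10x faster
T_SW_SI = 50e-9      # assumed combined rise+fall time, silicon (s)
T_SW_GAN = 5e-9      # assumed combined rise+fall time, GaN (s)

def switching_loss(v, i, t_sw, f_sw):
    """First-order estimate: ~0.5*V*I*t_sw joules dissipated per cycle."""
    return 0.5 * v * i * t_sw * f_sw  # watts

p_si = switching_loss(V_BUS, I_LOAD, T_SW_SI, F_SW_SI)
p_gan = switching_loss(V_BUS, I_LOAD, T_SW_GAN, F_SW_GAN)
print(f"Si  @ 100 kHz: {p_si:.1f} W switching loss")
print(f"GaN @ 1 MHz:   {p_gan:.1f} W switching loss")  # same loss, 10x frequency
```

Under these assumptions both devices dissipate the same switching power, but the GaN design gets a tenfold frequency budget to spend on miniaturisation.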

Trend 3: The promise of dual transformation 

Dual transformation is the hope that sustainability and digitalisation are highly complementary. For example, digital technologies can enable people to work efficiently from home, reducing the environmental impact of commuting. At present, however, the vision of dual transformation is not yet realised, because digitalisation is an energy-intensive process. A single semiconductor factory can consume up to 1 TWh of energy per year and 2 to 4 million gallons of ultra-pure water per day. Semiconductor manufacturers have understood the challenge and, like native digital players, are unveiling their sustainable development initiatives. These include moving cloud workloads to data centres with access to renewable energy, adopting GaN-based power electronics, and improving semiconductor design. The move to GaN in particular is a game changer, because it radically reduces energy consumption. 
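To put those consumption figures in perspective, a quick unit conversion (using rounded, assumed inputs) shows what 1 TWh per year and millions of gallons per day actually mean in continuous terms:

```python
# Back-of-envelope scale check for the fab figures above (rounded inputs).
TWH_IN_WH = 1e12           # 1 TWh expressed in Wh
HOURS_PER_YEAR = 8760
LITRES_PER_GALLON = 3.785  # US gallon

# 1 TWh/year as an average continuous power draw, in megawatts.
avg_power_mw = TWH_IN_WH / HOURS_PER_YEAR / 1e6

# Midpoint of the 2-4 million gallons/day range, in cubic metres per day.
water_m3_per_day = 3e6 * LITRES_PER_GALLON / 1000

print(f"1 TWh/year ≈ {avg_power_mw:.0f} MW of continuous power")
print(f"~3M gallons/day ≈ {water_m3_per_day:,.0f} m³ of water per day")
```

Roughly 114 MW of round-the-clock power is on the order of a small power station dedicated to a single fab, which is why access to renewable energy features so prominently in manufacturers' sustainability plans.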

Trend 4: Hyper-competition? 

Until now, the AI mega-users (e.g., Google, etc.) have outsourced the value chain, buying chips from third parties. But this is changing. Many tech giants, such as Apple, Tesla, Google, and Amazon, are now making their own chips, designed specifically for their products. Google has just unveiled its new Pixel 6 and Pixel 6 Pro phones, which use Tensor, the first chip designed by Google to bring AI capabilities to its range of mobile phones. Apple’s 2021 MacBook Pros are based on the company’s own M1-family chips. Importantly, this development could challenge the current horizontal model of AI players such as Nvidia. 

This move towards in-house chip development could challenge the current model of outsourcing semiconductor production to third-party manufacturers. In particular, the trend could disrupt the dominance of companies such as Nvidia in the horizontal AI accelerator market, as technology giants seek greater control over their semiconductor supply chains. 

Trend 5: Platform battle: chip architectures 

The x86 architecture has dominated the microprocessor industry for more than 50 years. However, this is changing with the growing popularity of the ARM architecture. While this architecture was born out of the need for low-power chips for vertical applications, it is beginning to establish itself not only as a low-power solution, but also as a high-performance competitor to the established x86 players. 

Google and AWS have decided to build their own chips, choosing the ARM architecture for its performance and low power consumption, which has become so important for power-hungry data centres, consumer products, and sustainability efforts. This growing shift to the ARM architecture is changing the dynamics of the semiconductor ecosystem. Unlike the x86 platform, where companies can buy from one or two suppliers, ARM has become a broker, making its intellectual property available to multiple companies. 

About the Author

Jacques Bughin

Dr. Jacques Bughin is the CEO of Machaon Advisory and a professor of Management. He retired from McKinsey as senior partner and director of the McKinsey Global Institute. He advises Antler and Fortino Capital, two major VC/PE firms, and serves on the board of multiple companies.
