$AMD Q4 2023 AI-Generated Earnings Call Transcript Summary

AMD

Jan 31, 2024

The operator welcomes participants to the AMD Fourth Quarter and Full Year 2023 Conference Call and introduces the Vice President of Investor Relations, Mitch Haws. He reminds participants that the call is being recorded and introduces the speakers, CEO Dr. Lisa Su and CFO Jean Hu. The call will primarily focus on non-GAAP financial measures, and the full reconciliations are available on the company's website. The speakers will be attending upcoming conferences, and the discussion may contain forward-looking statements that involve risks and uncertainties.

Lisa Su, CEO of the company, reported on the strong performance of the Data Center segment in the fourth quarter, with record revenue and significant growth driven by the ramp of Instinct AI accelerators and demand for EPYC server CPUs. Overall, the company saw a 10% increase in revenue in the fourth quarter and a 4% decline in annual revenue, with Data Center and Embedded segments accounting for more than 50% of the revenue. The Data Center segment saw a 38% year-over-year growth and set quarterly and annual revenue records, with significant gains in server CPU revenue share and demand for 3rd and 4th Gen EPYC processors.

In the Cloud market, there was an increase in server CPU revenue due to the expansion of 4th Gen EPYC Processor deployments by North American hyperscalers. In the Enterprise market, sales grew significantly with multiple wins from large companies in various industries. Customers are also excited about the upcoming Turin family of EPYC processors, which will offer improved performance, efficiency, and TCO with new features such as the next-gen Zen 5 core and higher core counts. Turin is on track to deliver overall performance and efficiency leadership when it launches later this year.

The Data Center GPU business had a strong quarter, with revenue exceeding expectations due to the faster ramp for MI300X with AI customers. The MI300 accelerator family, launched in December, has received positive feedback from customers and is being aggressively ramped up for production. AMD is working closely with major cloud providers, enterprise customers, and supercomputing companies to deploy Instinct accelerators. The El Capitan supercomputer, powered by AMD Instinct MI300A accelerators, is expected to be the world's fastest when it comes online later this year. AMD also closed new Instinct GPU wins, including at the German High Performance Computing Center and for energy company Eni. Progress was also made in expanding the ecosystem of AI developers working on AMD platforms with the release of the ROCm 6 software suite.

The ROCm 6 stack has improved performance for AI workloads and has gained support from the open-source AI software community. Large companies like Microsoft and Hugging Face are using AMD Instinct accelerators for their advanced language models. Data Center GPU revenue is expected to grow significantly. In the Client segment, revenue increased 62%, and the newly launched Ryzen 8000 series processors offer improved AI performance and energy efficiency compared to the previous generation.

In February, major PC manufacturers such as Acer, ASUS, HP, Lenovo, and MSI will start selling notebooks powered by AMD's Ryzen 8000 series processors. The company has also launched the Ryzen 8000 G-series processors, which are the first desktop CPUs with an integrated AI engine. AMD is working with Microsoft and other partners to enable the next generation of AI PCs. They are also planning to release the next-gen Strix processors in 2024, which are expected to have three times the AI performance of the Ryzen 7040 series. In the Gaming segment, revenue declined due to lower semi-custom sales, but AMD expects it to pick up in the second half of 2024 as supply catches up with demand.

Revenue in the Gaming Graphics segment increased due to high demand for Radeon GPUs and the launch of new products. The Embedded segment saw a decrease in revenue due to inventory reduction by customers, but new solutions were launched for key markets. In 2024, the demand for Embedded products is expected to remain soft, but long-term growth is expected. Overall, the company had a strong fourth quarter and full year, but the demand environment for 2024 is expected to be mixed.

AMD believes they will experience strong growth in revenue and gross margin due to the success of their Instinct, EPYC, and Ryzen products. They see AI as a major transition that will impact the computing market and plan to capitalize on this by delivering AI solutions across their portfolio. They expect the Data Center AI accelerator market to reach $400 billion by 2027 and are seeing rapid adoption of their Instinct GPUs. In the PC market, they are focused on incorporating AI capabilities into their Ryzen processors. They are also working on integrating AI compute capabilities into their embedded product portfolio. AMD is confident in their position to achieve significant revenue and earnings growth in the next few years.

Jean Hu, CFO, reviews AMD's financial results for 2023 and provides an outlook for the first quarter of fiscal 2024. Despite a mixed market demand environment, AMD saw revenue growth in its Embedded and Data Center segments, with successful launches of new products positioning them for further growth in the AI market. In the fourth quarter of 2023, revenue grew 10% year-over-year, driven by the ramp of AMD Instinct GPUs and higher EPYC server processor revenue. Operating expenses increased as the company invested in R&D and marketing for AI growth opportunities. The Data Center segment saw significant growth in revenue and operating income, driven by higher sales of both AMD Instinct GPUs and Fourth Generation EPYC CPUs.

In the fourth quarter of 2023, AMD's Client segment revenue increased by 62% due to Ryzen 7000 Series CPU sales. Gaming segment revenue decreased by 17% year-over-year and 9% sequentially, while Embedded segment revenue decreased by 24% year-over-year and 15% sequentially. AMD generated $381 million in cash from operations and $242 million in free cash flow. They also repurchased 2 million shares and returned $233 million to shareholders in the fourth quarter. For the first quarter of 2024, AMD expects revenue to be approximately $5.4 billion, with roughly flat Data Center segment revenue and declines in Embedded, Client, and Gaming segment revenue.

For the full year, the company expects strong revenue growth in the Data Center and Client segments, while the Embedded and Gaming segments are expected to decline. For the first quarter, gross margin is expected to be approximately 52%, with operating expenses of about $1.73 billion and an effective tax rate of 13%. The company is not providing specific full-year guidance but expects growth in both revenue and gross margin for the year. They plan to invest in AI opportunities and aim for earnings to grow faster than revenue. The company believes they are well positioned for strong financial performance in the future. The Q&A session then began.
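As a rough, illustrative calculation (not figures AMD provided), the short Python sketch below combines the stated Q1 guidance figures — roughly $5.4 billion of revenue, a 52% non-GAAP gross margin, $1.73 billion of operating expenses, and a 13% effective tax rate — into implied gross profit, operating income, and net income, under the simplifying assumption that other income, interest, and share count are ignored.

```python
# Illustrative arithmetic from the Q1 2024 non-GAAP guidance figures quoted above.
# Simplifying assumptions: other income/expense, interest, and share count are
# ignored; these derived amounts are not AMD-reported figures.

revenue_b = 5.4          # ~$5.4 billion revenue guidance
gross_margin = 0.52      # ~52% non-GAAP gross margin
opex_b = 1.73            # ~$1.73 billion non-GAAP operating expenses
tax_rate = 0.13          # ~13% effective tax rate

gross_profit_b = revenue_b * gross_margin           # ~$2.81B implied gross profit
operating_income_b = gross_profit_b - opex_b        # ~$1.08B implied operating income
net_income_b = operating_income_b * (1 - tax_rate)  # ~$0.94B implied net income

print(f"Implied gross profit:     ${gross_profit_b:.2f}B")
print(f"Implied operating income: ${operating_income_b:.2f}B")
print(f"Implied net income:       ${net_income_b:.2f}B")
```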

Lisa Su and Jean Hu of AMD discuss the company's Data Center GPU revenue in the fourth quarter of 2023 and provide guidance for the first quarter of 2024. They were pleased with the performance of the Data Center GPU business in Q4 and exceeded their expectations for AI product demand. Going into Q1, they expect seasonal sequential declines in the high-single-digit to low-double-digit percentage range for server and client, and a low-double-digit sequential decline in the Embedded business. They also note that the supply chain is operating well and customer demand for MI300X is strong.

Lisa Su, during her prepared remarks, mentioned that the company is currently in the fifth year of the gaming console product cycle and has inventory at customers. As a result, they expect a sequential decline of more than 30% in Q1 for gaming. In terms of traditional server demand, the company saw strong progress in the second half of the year, but they anticipate a mixed market in 2024, with some cloud optimization and enterprise caution. However, they are confident in their strong portfolio and expect to continue to grow share in the traditional server business, with the adoption of Genoa, Bergamo, Siena, and the upcoming Turin product. The next question came from Timothy Arcuri.

Lisa Su, CEO of AMD, is confident in the progress and potential of their Data Center GPU business, which has seen significant customer traction and growth in the past few months. She believes that the AI market is growing quickly and that AMD has the ability to gain market share in this space. While it is too early to make specific projections, Su sees significant growth potential for the business in the coming years. When asked about a specific target for 2024, CFO Jean Hu declined to provide guidance but hinted at a possible 20% growth for the company.

The question is about AMD's progress with the MI300 and their ability to ramp up production and attract customers. The analyst also wants to know about the company's plans for their traditional server business.

Lisa Su, CEO of AMD, discusses the roadmap for their MI accelerator family and their plans for future generations. They are leveraging chiplet technologies to differentiate themselves and plan to release multiple generations in sequence. Due to high demand for compute, they are accelerating their roadmap and will have a competitive offering for both training and inference in the next few years. As for the $400 billion TAM (total addressable market) for 2027, Su did not provide specifics on how they arrived at that number, but mentioned that they are working closely with customers to achieve it.

Lisa Su, CEO of AMD, has predicted that the generative AI market will reach approximately $400 billion in the next three years. This estimate takes into account the expected growth in units, as well as the increasing importance of memory and content in the market. Su also mentions that inference, or the use of AI models in real-world applications, is expected to surpass training in terms of market size. The TAM, or total addressable market, for this sector includes both GPUs and ASICs as accelerators for AI tasks. Different types of models will require different types of silicon, from smaller models to large language models.

The speaker discusses the use of GPUs for training and inferencing on large language models. They mention the success of the MI300 chip and the progress made in optimizing software for specific workloads. They also highlight the importance of working directly with large customers and the support from the open-source community, specifically mentioning the work of Hugging Face.

AMD's partnership with OpenAI on Triton and their work on open-source models have helped them make rapid progress in real-time optimization on their hardware. They raised their revenue expectation from $2 billion to $3.5 billion, reflecting improved visibility into their supply chain. They have a strong order book and have worked with their supply chain partners to secure significant capacity. They will continue to work with customers on deployments and will update the number as the year progresses. The next question was about the MI300.

In response to a question about the trajectory of the company's revenue and customer adoption of the MI300, Lisa Su stated that revenue is expected to increase each quarter but be more second half weighted. She also mentioned that the company is engaged with all of its large customers and that adoption cycles vary, but it is still early in the process. She emphasized that the company's long-term roadmap and engagement with AI-specific companies are also important factors.

Toshiya Hari asks Jean Hu about the trajectory of gross margin for the first quarter and beyond. Hu explains that Q1 gross margin is expected to increase by about 120 basis points, as higher Data Center contribution offsets the decline in the Embedded business. Going forward, the Data Center business will continue to grow and contribute to gross margin expansion, with the potential tailwind of the Embedded business coming back in the second half. However, the first half may see a headwind due to the decline in the Embedded business.

The speaker discusses the competitive environment for the company, stating that it is always competitive. They mention their success in the Instinct and server CPU markets, with plans to continue gaining share in the future. They also point to the upcoming Zen 5-based Turin family, which they believe will be highly competitive.

Lisa Su discusses the growth of the Data Center GPU side and the potential impact of cloud digestion on the EPYC business. She expects the CPU business to continue growing in 2024, driven by refresh cycles and the energy efficiency of newer generations. The MI300 revenue outlook has been raised from $2 billion to over $3.5 billion, with potential for further growth. The launch of a competitor's B100 later in the year may affect the competitive landscape.

The speaker discusses the factors that have contributed to AMD's success, including customer demand, strong partnerships with suppliers, and a competitive roadmap. They also address the potential impact of Intel's adoption of chiplet technology and advancements in manufacturing.

AMD is focused on innovation in architecture and design, and believes they have a strong roadmap for both CPU and GPU development. The value proposition for their MI300 accelerator includes a performance per dollar benefit and more bandwidth and memory capacity compared to the competition. Customers are using MI300 for both inferencing and training, with some using it for large language model workloads.

The speaker discusses the importance of being a strong partner when implementing AI systems, which can be used for both training and inference. They then answer a question about the potential for $1.5 billion in revenue in 2024, stating that it is possible but their focus is on executing their current $3.5 billion plan. The next question is about the ramp-up of the MI300, to which the speaker responds that it is not a pull-forward of demand, but a result of customers gaining confidence in deploying a significant number of MI300 this year.

Lisa Su, CEO of AMD, is asked about the $400 billion number that has been mentioned and clarifies that it is for accelerator chips and does not include systems. She also mentions that memory will be a significant part of this number. When asked about the customer concentration for the MI300 revenue, she states that she doesn't see one or two customers dominating the revenue and expects broad adoption. In response to a question about Intel's roadmap compared to TSMC's, she believes that Intel may close the gap in the next couple of years, but AMD will maintain its lead.

Lisa Su, CEO of AMD, is confident in their partnership with TSMC and their ability to execute well. She believes that even if they are not ahead in the process race, their architectural roadmap and diverse portfolio of products will allow them to continue solving problems for their customers. The call ended with no further comments from the AMD team.

This summary was generated with AI and may contain some inaccuracies.