Hewlett Packard Enterprise Company (HPE) Q3 2024 Earnings Call Transcript
Paul Glaser - Head, Investor Relations
Antonio Neri - President & Chief Executive Officer
Marie Myers - Chief Financial Officer

Operator
Good afternoon, and welcome to the Third Quarter Fiscal 2024 Hewlett Packard Enterprise Earnings Conference Call. My name is Gary, and I'll be your conference moderator for today's call. At this time, all participants will be in listen-only mode. We will be facilitating a question-and-answer session towards the end of the conference. [Operator Instructions] As a reminder, this conference is being recorded for replay purposes. I would now like to turn the presentation over to your host for today's call, Paul Glaser, Head of Investor Relations. Please proceed.

Paul Glaser
Good afternoon. I'm Paul Glaser, Head of Investor Relations for Hewlett Packard Enterprise. I would like to welcome you to our fiscal 2024 third quarter earnings conference call, with Antonio Neri, HPE's President and Chief Executive Officer; and Marie Myers, HPE's Chief Financial Officer. Before handing the call to Antonio, let me remind you that this call is being webcast. A replay of the webcast will be available shortly after the call concludes. We have posted the press release and the slide presentation accompanying the release on our HPE Investor Relations web page. Elements of the financial information referenced on this call are forward-looking and are based on our best view of the world and our businesses as we see them today. HPE assumes no obligation and does not intend to update any such forward-looking statements. We also note that the financial information discussed on this call reflects estimates based on information available at this time and could differ materially from the amounts ultimately reported in HPE's quarterly report on Form 10-Q for the fiscal quarter ended July 31, 2024. For more detailed information, please see the disclaimers on the earnings materials relating to forward-looking statements that involve risks, uncertainties, and assumptions.
Please refer to HPE's filings with the SEC for a discussion of these risks. For financial information we have expressed on a non-GAAP basis, we have provided reconciliations to the comparable GAAP information on our website. Please refer to the tables and slide presentation accompanying today's earnings release on our website for details. Throughout this conference call, all revenue growth rates, unless otherwise noted, are presented on a year-over-year basis and adjusted to exclude the impact of currency. Finally, Antonio and Marie will reference our earnings presentation in their prepared remarks.

Antonio Neri
Thank you, Paul, and welcome to your new role leading investor relations at HPE. And thank you, all, for joining us today. HPE delivered a strong third quarter performance. We generated impressive revenue growth with notable acceleration of AI systems revenue conversion, as well as higher operating margin than the prior quarter. Net revenue was $7.7 billion, up 10% year-over-year and at the high end of our guidance. Non-GAAP diluted net earnings per share rose $0.01 from a year ago to $0.50 in Q3, $0.02 above the high end of our guidance. We generated free cash flow of more than $660 million, and will pay a dividend of $0.30 per share. Based on our year-to-date performance, we are raising our full year GAAP and non-GAAP earnings per share guidance. Marie will provide further details in her remarks. Lastly, we are also pleased to have received the first payment of $2.1 billion in proceeds from the sale of part of our equity position in H3C. Overall, the demand environment this quarter has improved. We saw sequential and year-over-year orders growth, but with some geographic variation. Demand was strong in North America, Asia-Pacific, Japan and India, while Europe and the Middle East lagged. We are aggressively going after the opportunities presented by better market conditions and are well-positioned in a competitive and dynamic environment as we close our fiscal year.
I am very proud of the progress we have made in delivering on our edge-to-cloud vision over the last several years, which is generating this performance momentum. We have accelerated innovation across all pillars of our strategy: networking, hybrid cloud, and AI, delivered through a unified cloud-native and AI-driven experience as part of our HPE GreenLake cloud platform. Today, almost 37,000 unique customers use our HPE GreenLake cloud to manage their hybrid IT estate, which drives our annualized revenue run rate subscription growth. In Intelligent Edge, we have invested in building an industry-leading AI-driven networking portfolio. HPE Aruba Networking is a recognized market leader in the campus and branch segment. The AI market requires a modern, high-performing networking fabric as a core foundation to deliver a more efficient data center cloud infrastructure as the world transitions to accelerated computing. We are excited to significantly expand our networking business with the pending acquisition of Juniper Networks. The acquisition of this high-margin business will accelerate our edge-to-cloud vision with a full networking IP stack, from silicon to infrastructure, to the operating system, to security, to software and services in a cloud-native and AI-driven approach. We expect our compelling value proposition will begin to deliver returns to our shareholders in the year post close. In Hybrid Cloud, we have redefined the cloud space by delivering an experience that is hybrid by design, with HPE GreenLake at the core of our strategy. We have transitioned our HPE server and HPE storage products to cloud-native and software-defined solutions while adding unique software and services to our HPE GreenLake cloud platform. Our innovation gives customers choice and flexibility across all workload types, while managing their public clouds, on-prem environments, colos, and edges in one unified hybrid cloud operations experience.
Our AI business is built on decades of large-scale infrastructure expertise, including technologies like direct liquid cooling that are powering our largest AI systems for large language model builders, service providers, and supercomputing users. We have rapidly expanded our AI portfolio, including the introduction of HPE Private Cloud AI, specifically engineered for enterprise customers, with the expectation of significant market expansion as we are still in the early stages of adoption. Pursuing this strategy has diversified the HPE portfolio to parts of the market with higher margins. Our differentiation not only makes us highly relevant to customers and partners, but also drives profitable growth for our shareholders. I have a few observations about Q3 performance in our key segments that I will share, and then I will let Marie review the results in more detail. Our Server segment again outperformed expectations in Q3, thanks to an acceleration in converting AI system orders to revenue. We converted about $1.3 billion in AI systems revenue this quarter, a 39% increase from Q2. Revenue from our traditional server business also climbed with a double-digit increase in product orders, both sequentially and year-over-year, reflecting an improvement in the market for traditional compute. We continue to pursue profitable deals within our target server margin range, underscoring stability in our operating profit profile. In AI, our momentum is very clear. Customer demand for HPE AI systems rose sequentially, with opportunities increasing in both enterprise and sovereign AI clouds as customers explore more use cases. AI system orders climbed by $1.6 billion in the quarter to a cumulative $6.2 billion since Q1 2023, an increase of approximately $3.5 billion over the last year. Customers are exploring new ways to use AI, adding to our already robust pipeline and creating even more runway for our broad AI offerings.
Enterprise interest in generative AI is high, and while adoption is still in the initial stages, it is accelerating. Customers tell us that they see the possibilities and are building the business cases. We see use cases across multiple verticals, from healthcare to financial services to manufacturing. As the use cases mature, customers need expertise to help guide the implementation across their enterprise business, not just IT. Direct liquid cooling continues to be a key differentiator and demand driver with large-scale AI customers. The expertise and IP required to build and run large direct liquid cooled systems create a significant margin-rich services opportunity for day zero, day one, and day two operations. HPE has one of the largest water cooling manufacturing and services footprints in the world. In the sovereign space, just recently, the U.S. Department of Energy National Renewable Energy Laboratory announced that its HPE-built Kestrel supercomputer came fully online over the summer. Kestrel is five times more powerful than NREL's previous supercomputer, Eagle, and is 100% direct liquid cooled. This system will enable research advancing energy efficiency, sustainable transportation, renewable energy, and energy system integration, including by leveraging the latest innovations in AI, large language models, modeling, and simulation. You saw at HPE Discover that HPE is deepening our strong partnership with NVIDIA. In June, we jointly announced NVIDIA AI Computing by HPE, a portfolio of codeveloped AI solutions and joint go-to-market integrations that enable enterprises to accelerate adoption of generative AI. One of those solutions, HPE Private Cloud AI, which just became generally available yesterday, is a turnkey solution that makes it easy for enterprises of all sizes to gain an energy-efficient, fast and flexible option for sustainably developing and deploying generative AI applications.
To drive this adoption, HPE Private Cloud AI will be available in four modular configurations. These start with small, for small-model inferencing needs on NVIDIA L40 GPUs, and scale up to extra large on NVIDIA Grace Hopper 200s for a configuration that allows for multiple use cases, running inferencing, retrieval-augmented generation, and large language model fine-tuning. We are offering customers two important choice points: self-managed or fully-managed service, with the ability to purchase as-a-service through an operating expenditure model or as a capital expense. With three clicks and less than 30 seconds to deploy, HPE Private Cloud AI dramatically simplifies DevOps, ITOps and FinOps for enterprise customers, allowing them to easily establish and meter their environments, monitor and observe their infrastructure and applications, and lifecycle manage all aspects of their private cloud AI system. And just last week, we further expanded our NVIDIA partnership by adding NVIDIA NIM Agent Blueprints to HPE Private Cloud AI for multiple generative AI use cases. This includes a digital human workflow for customer service, a generative virtual screening workflow for computer-aided drug discovery, and a PDF data extraction workflow for enterprise RAG that uses vast quantities of business data for more accurate responses. Integrating this catalog of pretrained, customizable AI workflows into our HPE Private Cloud AI stack enables customers to easily deploy key AI use cases to accelerate time to value. With a series of announcements about HPE Private Cloud AI, we are well-positioned to serve our enterprise customer needs. Since the announcement less than three months ago, we have seen very high customer interest, with requests for proof of concept demos exceeding our expectations. We are increasing sales resources and enabling our partner ecosystem to meet the high demand for demos.
We believe HPE Private Cloud AI is going to be an important growth driver for our hybrid cloud business, and we are filing numerous patents to ensure our leadership is recognized and protected. This innovation and customer interest position HPE Hybrid Cloud well, as AI is an accelerator for hybrid cloud solutions. We are on a positive trajectory with orders and we are beginning to realize returns from an ongoing investment in both our product portfolio and our specialized sales motions. There is more work to do, but we are pleased that our revenue and profitability both improved quarter-over-quarter. Customers require new data protocols in file and object to store and manage data to train AI applications. We expect that extending our HPE Alletra Storage software-defined offerings to these new data protocols will lead to a higher proportion of software and related services in our portfolio. We are seeing double-digit orders growth in HPE Alletra Storage and in HPE GreenLake hybrid cloud SaaS offerings. In just the last three months, almost 3,000 new customers began using our HPE GreenLake cloud, and we added almost 10,000 more customers in the last year. We are pleased with the momentum we are seeing with customers who have turned to HPE GreenLake. We are proud to announce a new agreement with Deloitte to utilize the HPE GreenLake cloud to help transform and centralize Deloitte's IT infrastructure, which includes AI computing. We continue to introduce new hybrid cloud offerings by adding more profitable software and services, which is clearly reflected in the software and services mix of our ARR, now at 71%. For example, at the end of August, we closed our acquisition of Morpheus Data. We believe this acquisition solidifies HPE's leadership position as the first vendor with a full suite of software capabilities across the hybrid cloud stack.
We look forward to integrating Morpheus Data's multi-cloud automation and orchestration capabilities into our HPE GreenLake cloud platform to complement the AI-driven multi-cloud and multi-vendor observability from HPE's OpsRamp acquisition and our own organic innovation. In Intelligent Edge, we believe we are seeing the beginning of a market recovery as customers finish digesting previous orders post-COVID. Revenue improved sequentially on gains in services and SASE. Momentum is building and we saw sequential orders increase in all regions with particular strength in North America. On the products front, orders growth was led by wireless LAN, SASE, and data center networking products. Conversely, campus-switching product orders have been slower to recover. Last month, we reinforced HPE Aruba Networking cyber defenses with new AI-powered network detection and response and campus-based zero trust network access. The solution leverages telemetry from HPE Aruba Networking Central's data lake to train and deploy AI models to monitor and detect unusual activity in vulnerable IoT devices, helping to support mission-critical business processes. Our security solutions are attracting customers like Nobu Hotels, which is leveraging a secure AI-powered network combining the HPE Aruba Networking Central platform and HPE Aruba Networking ClearPass to implement a zero trust strategy. This installation will help provide secure, seamless and hyper-personalized guest experiences, including an AI concierge across Nobu's global footprint of luxury hotels. We are excited for what comes next with our pending Juniper Networks acquisition and the comprehensive networking portfolio we will create. As I have said before, we expect that Juniper will be accretive to our margin profile and non-GAAP EPS in year one.
The deal recently received regulatory approvals in the European Union, UK, India, and several other jurisdictions, and we remain on track to close in late calendar year 2024 or early calendar year 2025. Plans are well underway to ensure successful integration post close. To conclude, I am very pleased with our third quarter performance. Our impressive revenue growth reflects the strength of our portfolio and the growing excitement customers have for our newest innovations across AI, networking and hybrid cloud. HPE is playing a crucial role in helping customers adopt this transformative technology across their business. As we innovate, we also continue to stay disciplined in the way we manage our business and cost structure. In Q3, we delivered profitable growth for our shareholders in a competitive and dynamic environment. Next month, we look forward to hosting some of you at our incredible Wisconsin manufacturing facility, where we build many of our industry-leading direct liquid cooled AI systems. The work we do there plays a key role in driving the transition to direct liquid cooled systems and the successful AI revenue conversion we saw this quarter. I hope you will take advantage of this opportunity to see it firsthand. Now, I will hand it over to Marie to go through the segment results in greater detail. Marie?

Marie Myers
We are pleased with our performance this quarter and we did what we said we would do. We delivered strong top-line revenues, grew revenues sequentially in each segment, prudently managed costs, improved profitability sequentially, and delivered non-GAAP diluted net earnings per share that exceeded the high end of our guidance range. As Antonio said, we are pleased to have received the $2.1 billion in proceeds from the partial sale of the H3C equity position. Today's results highlight our ability to deliver amidst a dynamic macro environment.
While some customers remain cautious and prioritize mission-critical projects, we are encouraged by the recovery in enterprise demand we are seeing in North America, followed by modest improvement across the other geographies. We remain excited about HPE's position across AI, hybrid cloud, and networking. HPE is well-positioned for the AI opportunity. This quarter, our AI systems backlog increased and we grew AI systems revenues approximately 40% sequentially. We continue to win deals with both model builders and sovereigns and are well-positioned to address enterprise AI demand. In traditional servers, we are seeing signs of a recovery as both demand and revenue increase sequentially. In hybrid cloud, we see an improvement in storage, led by the strong demand for our HPE Alletra MP offering, and we continue to drive ARR growth. We are encouraged by the early and strong customer response to our Private Cloud AI offering that we announced at Discover, which we expect to drive AI adoption in the enterprise. Lastly, results were solid in networking. Improving sequential demand in WLAN, data center networking and switching, along with continued growth in security and services, keeps us optimistic heading into the fourth quarter. We continue to make progress towards our strategic goals. Our recently announced acquisition of Morpheus Data, which expands our hybrid cloud capabilities, and our confidence in closing the Juniper acquisition by the end of calendar year 2024 or early calendar year 2025 are excellent examples. Overall, I am pleased with our performance in Q3 and look forward to carrying our momentum through the end of our fiscal year. Now, let me go through the details of the quarter. Revenue grew 10% year-over-year and 7% quarter-over-quarter in constant currency to $7.7 billion, near the high end of our guidance range for the quarter. Our as-a-service momentum continued this quarter.
We grew ARR 39% in constant currency year-over-year to more than $1.7 billion, led by AI through HPE GreenLake, as well as networking and storage subscriptions. With enterprise AI customers, we are noticing a strong appetite for a consumption model, both to alleviate investment pressures as well as to retain flexibility to grow workloads, though this is still early days. We continue to lift HPE GreenLake's value proposition with an increasing mix of higher-margin software and services revenue. This quarter, we expanded the software and services mix of ARR approximately 300 basis points year-over-year to 71%, due to stronger sales of AI services tied to hardware sales and Aruba Central platform subscriptions. Our non-GAAP gross margin was 31.8%, which was down 410 basis points year-over-year, driven by a lower mix of Intelligent Edge revenue and a higher mix of AI server revenue. The 130-basis-point sequential decline was driven by the higher AI server revenue mix. We have balanced gross margin pressures by executing on strong cost controls and by maintaining pricing discipline in a competitive AI server market. Our non-GAAP operating expenses decreased approximately 7% year-over-year and 1% quarter-over-quarter despite a seasonal increase in marketing expenses associated with our annual Discover event. Since joining as CFO, I have taken a rigorous, programmatic approach to streamlining our cost structure to drive operating expense improvements, and we expect to see the benefits of these actions in the second half of the year. This is already evident in our results as we drove a 50-basis-point sequential improvement in our non-GAAP operating margin, offsetting pressures we saw at the gross margin line. Our non-GAAP operating margin was stable at 10%. Profitability improvements and better-than-expected OI&E drove GAAP diluted net EPS of $0.38 per share and non-GAAP diluted net EPS of $0.50 per share, both above the high end of our guidance ranges.
Our non-GAAP diluted net EPS excludes $149 million in net costs, primarily from stock-based compensation expense, amortization of intangibles, and acquisition and other related charges. Now, let's turn to the segment results, starting with Servers. Strength in both AI systems and traditional servers drove healthy revenue growth and stable operating margins. Server revenues were $4.3 billion in the quarter, up 35% year-over-year and 11% sequentially. In traditional servers, we saw steady growth and are seeing signs of a recovery. We saw strength in North America where our installed base is spending more, though EMEA and APJ customers continue to evaluate spend. Our Gen11 product continues to ramp ahead of expectations and now represents a growing proportion of total server revenue. And we have been able to manage an inflationary component environment through dynamic pricing and by leveraging strong supplier relationships. In AI systems, demand remains strong, though large deals continue to be lumpy. AI systems product and services orders rose $1.6 billion sequentially, driving our cumulative orders since Q1 '23 to $6.2 billion. We are pleased with our current AI systems backlog, which has increased quarter-over-quarter to $3.4 billion. Demand remains healthy from the model builders. We are winning deals in this space and following a framework to manage risks and profits. While still early days, we continue to see positive signs from enterprise customers. In fact, more than 80% of enterprises are experimenting with GenAI initiatives, which supports our view that the number of customers will continue to trend favorably. This quarter, our enterprise AI pipeline more than doubled sequentially. And sovereign AI is an adjacency for HPE, right beside our market leadership in supercomputing. We continue to see increasing demand from this set of customers who are embracing AI. Among model builders and sovereign AI customers, there is a growing desire for liquid cooling.
However, adoption relies on data center readiness. We view HPE's multi-decade design and manufacturing expertise, intellectual property, patent portfolio, and global reach and dedicated services as clear differentiators as the market moves in this direction. Q3 was again a strong quarter for AI system revenue conversion. AI system revenues were $1.3 billion in the quarter, up approximately 40% from the prior quarter. We are pleased with the stability of our operating margins within our Server business. Our Q3 Server operating margin was 10.8%. This was up 70 basis points year-over-year, but down 20 basis points sequentially. AI systems make up a higher share of our total Server revenue compared to one year ago, 10% in Q3 FY '23 versus 30% this quarter. This underscores our disciplined focus on profitability in a competitive AI server market. Our operating margin performance was in line with expectations. For the full year, we continue to expect our Server margins to be at the low end of our target range of 11% to 13%. We will remain disciplined in cost and price as we pursue profitable growth. Now, moving to Hybrid Cloud. Both revenue and profitability improved quarter-over-quarter. Segment revenues of $1.3 billion were down 7% year-over-year, but up 4% sequentially. As we have previously discussed, we are managing both a sales model transition and product transition within the storage business. Our product model transition is to a more cloud-native, software-defined platform with HPE Alletra, which offers a unified storage architecture, comprehensive AIOps, and cross-stack analytics and aligns to customer preferences for a hybrid cloud model. Translating this storage growth to revenues will take time because of the higher mix of ratable software and services, which is deferred into future periods. We continue to see strength for our Alletra MP offering with sequential improvement led by our continued business transformation efforts, particularly in go-to-market. 
We are seeing signs of improving demand for block storage and early traction in file, and we are closely monitoring the impacts of commodity costs on demand. In our Private Cloud business, we are having constructive conversations with our customers to evaluate their virtualization strategy. At Discover, we announced efforts to develop our virtualization capabilities, which will be available within our Private Cloud Business Edition solution. Lastly, customers are reacting positively to our recently announced Private Cloud AI offering in partnership with NVIDIA, which unifies AI skills, data, architecture and solutions into one fully managed platform and accelerates time to value for enterprises looking to begin their AI journeys. We are seeing traction in both customer demos and pipeline, and as of yesterday, our Private Cloud AI offering is globally available to order. Our Hybrid Cloud operating margin was 5.1%, down 30 basis points year-over-year, but up 430 basis points sequentially, predominantly due to better OpEx controls. Moving to Intelligent Edge. Revenues were $1.1 billion, down 23% year-over-year on tough compares, but up 3% sequentially. Demand was steady quarter-over-quarter, backlog remains at normal levels, channel inventory remains healthy, and we believe that we have moved past the trough. On the order side, we are seeing a recovery that is in line with our industry peers. For the second consecutive quarter, we saw order improvements in each of our geographies, led by strength in North America, followed by modest growth in EMEA and APJ. By product, we saw sequential order improvements in data center networking and in campus. We grew both services and SASE orders mid- to high-single-digits year-over-year as customers remain excited about our Aruba Central platform that is part of our HPE GreenLake offering.
On the revenue side, we drove year-over-year strength in data center networking, SASE, and services, though we saw declines in campus and switching due to difficult annual compares. Our sequential revenue grew approximately 3%, consistent with our expectation that Q2 would be the trough. The segment operating margin of 22.4% was down 520 basis points year-over-year, driven by tough year-over-year revenue compares, offset slightly by a better year-over-year gross margin rate. Better OpEx was the primary reason for our 60-basis-point sequential improvement in operating margin. Our OpEx plan has put us on a path to achieve operating margins in the mid-20% range by Q4. Now, turning to Financial Services. Our HPE Financial Services revenues were $879 million, up 1% year-over-year, and financing volumes were $1.5 billion. Year-to-date, $800 million of $4.5 billion in financing volume went to AI wins with both cloud and enterprise customers, which illustrates that AI is driving demand across our portfolio. Operating margin of 9% was up 80 basis points year-over-year and down 30 basis points sequentially. Our portfolio remains healthy, and we continue to improve the investment-grade mix. Our Q3 loss ratio remained steady at below 0.5% and our return on equity is a solid 17.4%. Turning now to cash flow and capital allocation. We generated $1.2 billion in cash flow from operations and $669 million in free cash flow. We are on track for $1.9 billion in free cash flow. We remain confident in our ongoing ability to generate strong free cash flow even as we pursue strategic buys given the rising component cost environment. Our Q3 cash conversion cycle was a positive four days, a reduction of 19 days from Q3 '23. Our days of inventory and days of payables were both higher on AI systems orders that outpaced revenue conversion and on strategic purchases for our server business.
We continue to expect working capital to be neutral to free cash flow, as we expect declines in inventory led by strong AI system revenue conversion to balance declines in accounts payable as we exit the year. We are pleased to have received the $2.1 billion in proceeds from the partial sale of the H3C equity position. We remain committed to our balanced capital allocation framework and are focused on managing our balance sheet and maintaining our dividend. We returned $221 million in capital to shareholders in Q3 and $607 million year-to-date. Now, let's turn to guidance. As Antonio mentioned, we are making steady progress on securing the necessary regulatory approvals required for our pending Juniper acquisition and look forward to closing by the end of calendar year '24 or early calendar year '25. For the fourth quarter, we expect revenues in the range of $8.1 billion to $8.4 billion, GAAP diluted net EPS to be between $0.76 and $0.81, and our non-GAAP diluted net EPS to be between $0.52 and $0.57. For revenue, we expect to see sequential growth similar to the last couple of quarters, with revenue remaining indexed towards Server. We continue to manage the business at the operating income level and, therefore, expect a sequential decline in operating expenses in order to deliver our EPS targets. On a segment basis, we expect the following. For Server, we expect to convert AI systems to revenue at a strong pace. As mentioned, the market remains competitive and large deals remain lumpy. We continue to be time-to-market with GPUs and are looking forward to shipping H200 chips for the AIST supercomputer in Q4. We also expect continued demand in traditional servers, driven by Gen11 adoption and higher units. We are maintaining our expectation of achieving the low end of our 11% to 13% operating margin range for the full year.
For Hybrid Cloud, we expect a slight revenue increase to close the year, though expect pressures from rising commodity costs, particularly in SSDs. ARR growth should continue as our storage business accelerates and shifts to subscription under HPE GreenLake. And we expect operating margins to continue to trend favorably to the mid-single-digit range. For the Intelligent Edge business, we anticipate a slight sequential revenue increase for the fourth quarter on the recovery in campus and WLAN, as well as strength in data center networking. Benefits from our cost reduction efforts materialized in our third quarter results, and we expect a similar trend in the fourth quarter, supporting our outlook for a full year operating margin in the mid-20% range. For the full year, we are tracking towards the high end of our revenue guidance of 1% to 3% growth in constant currency. We expect to balance gross margin pressures from a higher mix of AI systems with continued operating discipline and expect to come in at the low end of our operating profit growth guidance of 0% to 2%. We now expect OI&E to be a $50 million to $100 million headwind versus our prior expectation of $150 million headwind. We have tightened our non-GAAP diluted net EPS expectations for the full year to $1.92 to $1.97. For GAAP EPS, we are now seeing increased costs related to the Juniper transaction, with our expectations now at $1.68 to $1.73. Lastly, we remain committed, in the long term, to our balanced capital allocation framework, our dividend, and to our investment-grade rating. In the near term, we expect to continue share repurchases at a pace in line with Q3, as we prudently manage our balance sheet ahead of the Juniper transaction closing. To summarize, our Q3 results were strong and demonstrated good sequential revenue growth. We will now begin the question-and-answer session. [Operator Instructions] The first question is from Meta Marshall with Morgan Stanley. Please go ahead. 
Meta Marshall Great. Thanks, and congrats on the results. Maybe just on the Server margins, given the AI revenue contribution, they were a little bit higher than expected. Could you just break down? Is this from kind of better ProLiant margins, better kind of proprietary Cray margins, just better pricing discipline? Just a little bit more insight into the margin strength we saw there? Thank you. Marie Myers Yeah, sure. Good afternoon, Meta, and thanks for your question on Server margins. First of all, as I said in my prepared remarks, we did, in fact, ship about $1.3 billion of AI servers in the quarter. So that constituted around about 30% of Server revenue. And despite that, our margins were at 10.8%. And to your point, what was it driven by? So, first of all, we're well on track on the shift to Gen11, and in itself, Gen11 has richer configurations and, therefore, comes with a higher margin profile. Also, I think we've been pretty successful at passing through those commodity costs, and despite the fact that we're in a pretty competitive both CPU and GPU environment. And then lastly, I'd say you would have seen that our OpEx was down sequentially and you're seeing the impact, frankly, of that OpEx discipline show up in the margins as well, Meta. The next question is from Samik Chatterjee with JPMorgan. Please go ahead. Samik Chatterjee Hi, thanks for taking my question, and I apologize in advance if there's some background noise because I'm at the airport. I guess, we're getting the most questions today on the gross margin. And Marie, I know you outlined the factors there in terms of the AI server mix. But hoping you can flesh that out a bit more in terms of was there anything outside of the AI server mix that impacted gross margins at the company level in the quarter itself? 
And as we think about sort of your operating discipline going into the next quarter, should we be expecting sort of further moderation in the gross margins and more operating discipline to help maintain that operating margin outlook? And if I can just add on there, the AI order execution that you're seeing, you mentioned lumpy deals with customers. Is that impacting your thinking on gross margins for the aggregate company as we look forward as well, in terms of which verticals those orders are coming from? Thank you. Marie Myers Hey, Samik, good afternoon. So, I think I've got most of your questions. I'll hit it up in terms of gross margins and what we've seen. So first of all, I would say, just if you step back and think about gross margins at a high level from a year-on-year perspective, just to remind everybody that the contribution from our networking revenue is lower. Obviously, that's impacted gross margins on a year-on-year basis. Now, when you look sequentially, as I mentioned in prepared remarks, we converted the AI mix of servers at a much faster pace. And obviously, that's really driven gross margin in the quarter. Now, I would add, to your point, we've been offsetting it with prudency on OpEx. You've seen that, and we're going to continue to do that. I think I mentioned that in my guide comments. And then, also, we continue to have a mindset about being very disciplined on cost and price as we pursue profitable growth. I think we talked about the fact we're selective on deals. And you see that frankly, if you look at the quality of our receivables -- you'll see it in our receivables. Now, a couple of things to sort of bear in mind as we go forward. As we see enterprise AI gain momentum, that's going to have a more favorable impact on gross margins. So, we do expect that we'll see improving profitability as the market moves in that direction.
And then finally, Samik, just to bear in mind that we're getting closer to the close of Juniper. We expect that at the end of calendar '24 or the beginning of '25. In itself, that transaction is going to have a significant impact on both gross and operating margins. And we expect that more than 50% of the company's operating profit will come from networking. So that's how I'd leave it with you in terms of thinking about margins. And as we said, we'll continue to focus on managing OpEx and being very prudent on deals. The next question is from Amit Daryanani with Evercore. Please go ahead. Amit, your line is open on our end; perhaps it's muted on yours. Amit Daryanani Sorry. Good afternoon. Thanks for taking my question. Marie, I was hoping you could talk a bit more about the free cash flow numbers. It was down a fair bit year-over-year. Maybe you can just talk about what is driving the downtick in free cash flow. Is there a way to think about the headwind you're seeing from the AI ramps versus the strategic pre-buys? Just talk about what those buckets are? And then, related to free cash flow, I think last quarter, it was at least $1.9 billion for the full year. I think this time it's $1.9 billion. So, it seems like a bit of a downtick. Maybe I'm overthinking it, but could you just talk about how you think about that October quarter free cash flow number as well? Thank you. Marie Myers Yeah, sure. No worries, Amit. And so, yeah, I think you probably are -- just in terms of where we're at in cash flow, let me hit Q3 and then I'll go into Q4. So, from a Q3 perspective, it's really driven by a couple of things. Firstly, the timing of working capital, and also just some of the normal seasonality that we see. And then, as we get into Q4, we do expect to see some of the benefits of working capital. So, we'll see a reversal in CCC, and that will benefit free cash flow. And then, obviously, we're going to see that stronger conversion of AI revenue.
I think I mentioned in my prepared remarks that we're going to see a sequential improvement in the shipment of AI revenue. So, you can see that will also have an impact on our free cash flow. So, for the full year, Amit, I'd just say we are still on track for $1.9 billion. It's just as you could imagine, we're in Q3 now, we're getting into Q4, we sort of tightened up the expectations given where we're at in the year. So, still on track for $1.9 billion, Amit. The next question is from Toni Sacconaghi with Bernstein. Please go ahead. Toni Sacconaghi Yes, thank you. You did mention repeatedly the lumpiness of AI server deals, and I'm wondering if there was any unusually large deals in Q3, or whether you're anticipating your AI bookings to be significantly different in Q4. And then just on the free cash flow, Marie, net income is going to be about $2.5 billion to $2.6 billion. Free cash flow is going to be about $1.9 billion. That's about 75% realization. I think since HPE split-off, free cash flow realization has actually been even lower than that. How do we think about that going forward? I presume free cash flow realization will be less than [1%] (ph) for next year and maybe even the following year because of Juniper integration charges? Or where is the pathway to where we can see free cash flow to net income being positive? Antonio Neri Yeah. Thank you, Toni. This is Antonio. I'm going to take the first part. The quick answer is no. There have not been very, very large deals or lumpy deals. It's been more spread and more uniform across the service provider space. And on the enterprise side, because obviously, we talk about this, the percentage of bookings relative to the $1.6 billion was -- as a mix, was in the mid-teens. So, very consistent with the prior year's quarter. However, obviously, the dollars are much larger, right, because now this quarter, we book $1.6 billion. So, I actually argue this is a good thing. 
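The "about 75% realization" figure in Toni's question follows directly from the numbers stated on the call. As a quick sketch of that arithmetic (the 73% to 76% range is implied by the stated figures, not separately guided by the company):

```python
# Sketch: free-cash-flow "realization" ratio implied by the figures in the question.
# Inputs are as stated on the call (full-year expectations, in $ billions).
fcf = 1.9                                    # full-year free cash flow target
net_income_low, net_income_high = 2.5, 2.6   # "about $2.5 billion to $2.6 billion"

realization_high = fcf / net_income_low      # best case: 1.9 / 2.5 = 0.76
realization_low = fcf / net_income_high      # worst case: 1.9 / 2.6 ~ 0.73

print(f"FCF realization: {realization_low:.0%} to {realization_high:.0%}")
# roughly 73% to 76%, consistent with "about 75%" in the question
```

This only checks the ratio of the two disclosed figures; it says nothing about the working-capital or integration-charge drivers discussed in the answer.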
And we don't expect significant super-large deals, I'd call it, in Q4 based on what we have visibility into in the pipeline, but more a continuation of what we saw in Q3. And Marie, you want to talk about the free cash flow? Marie Myers Yeah. No, just, Toni, I mean, I think in terms of your comments on free cash flow, from an FY '24 perspective, in terms of bridging net earnings to free cash flow, it's the normal puts and takes for the year -- so working capital, CapEx, et cetera, and sort of employee benefits. So, there are no specific charges in there in terms of bridging our net earnings for the year to free cash flow. In terms of '25, what we said, Toni, at the time of the transaction stays the same. And look, honestly, we'll be guiding '25 when we do our next earnings call. So, I'll provide more color around free cash flow for '25 as we get into the next call. The next question is from Mike Ng with Goldman Sachs. Please go ahead. Mike Ng Hey, good afternoon. Thank you for the question. I just had a question about the mix of products and services for the AI systems orders and revenues that you guys disclosed on Slide 12. I guess, I was struck by two things. First, the growing share of services as a percentage of AI system orders -- should we expect that to continue over time? And what are some of the key services you're selling with AI systems? And then, second, the very little services revenue that's being recognized to date -- as you recognize that services revenue in AI systems, should that improve server margins and AI system margins? And how much can that improve margins by? Thank you. Antonio Neri Yeah, thanks, Mike.
Yeah, listen, we are very pleased with the services attach momentum on the AI systems portion of our business, which I believe will continue to grow as we grow the enterprise segment of the market, because that segment of the market comes with richer services -- day zero, day one, and day two services, as we call them. And yesterday was the first day of availability for HPE Private Cloud AI, which has quite a bit of a services component with it. But right now, as we started disclosing last quarter, the services component of that -- which you saw in one of the slides as Marie was providing her remarks -- is pretty much all deferred. So, unless we are doing an installation, which gets recognized immediately, most of that is maintenance that gets recognized over the length of the contract. And therefore, over time, we expect that will contribute positively to our gross margins in the segment where we recognize that revenue, which obviously is the Server segment of the market. So, yes, I'm positive on both gross margin accretion as we recognize the revenue and more services as we start selling HPE Private Cloud AI in the enterprise space. The next question is from Simon Leopold with Raymond James. Please go ahead. Simon Leopold Thank you very much. I wanted to see, Antonio, if you could talk a little bit about the trends with traditional servers, given we hear these arguments that AI-accelerated platforms would cannibalize traditional servers, but you're seeing good growth and good order patterns. How should we think about the risk that maybe that cannibalization eventually happens, or how are you really thinking about traditional versus AI? Thank you. Antonio Neri Yeah. Thank you, Simon. We have seen no signs of cannibalization in the traditional server market. And remember, I always try to bring a segment point of view, right? So, from a segment point of view in the AI space, you have three segments.
You have the service providers, model builders, which obviously include the hyperscalers, and there we have not sold traditional servers in a long time, once we made the decision in 2017 to not participate in that market. Then, you have the sovereign space, which is now going up in terms of interest, but the sales cycles are longer because of the government engagements and the procurement; there, generally speaking, there is no traditional server per se. It's a combination of architectures and GPUs and CPUs in a unique form factor. And then, last but not least, we have the enterprise. And the enterprise, while it's growing, has been very much focused on the AI applications. And for customers to move a traditional workload -- call it legacy workloads and the like -- to an accelerated compute, the question is why you would do that when, A, you need to use the accelerated compute to either fine-tune the model or to do inferencing; and second, from a TCO perspective, there is no clear view that that will cost less. And so that's why, when I think about workloads and customer segments, we don't see signs of cannibalization from the AI deployments into the traditional workloads. The next question is from Wamsi Mohan with Bank of America Merrill Lynch. Please go ahead. Wamsi Mohan Yeah, thank you so much. Antonio, I was wondering if you could just share some color on what the AI backlog composition is across your portfolio and where you're seeing more strength versus not. And within the enterprise demand commentary that you're calling out, can you share some color on what kind of projects are being evaluated? I know you called out some verticals like healthcare and financial services. Curious if you could provide some color on that as well. Thank you so much. Antonio Neri Yeah.
So, first of all, the pipeline we have in front of us is multiples of the current backlog, which is positive news, because that tells you the momentum will continue in the next few quarters. Second, the backlog composition, as I said, is in the mid-teens for the enterprise space, and the rest is the traditional service provider space. The service provider space is basically compute capacity to train models or to do hosting, for that matter, in large colos. And the enterprise space is really focused on the use cases where customers see a clear line of sight to the return on that investment; there are several use cases, by vertical, that customers are driving. Obviously, many of them are very obvious. And now we are seeing a little bit more sophistication in some of those use cases and in their maturity. And that's why our Private Cloud AI offering is targeting those types of customers, because ultimately it comes with the entire stack, from what I call the workloads at the top, specifically designed for the verticals, down into the training models, all the way down to the infrastructure, sized for that type of deployment. And then, on sovereign AI, obviously, we see significant interest now. We are working across multiple geos on several opportunities. A lot of them are basically to open AI clouds for sovereign reasons or for privacy and compliance reasons on data. And a lot of them actually want to look a little bit like supercomputers in many ways, because many of those systems are designed to do AI large language models, and that's very obvious with some of the deployments we have done in the European Union and the one we're going to do now for the U.K. And other ones are basically for traditional supercomputing. So, the infrastructure in the end is the same. All of these systems are very much liquid-cooled systems. And so, that's an opportunity for us.
But on the enterprise side, I think you can see now expansion from traditional bots and customer service into other areas in finance, manufacturing, marketing, where they can see the clear return on that investment. And we're helping them even upfront, through a partner ecosystem, to define those use cases, because ultimately it goes beyond just deploying IT; it's really about realizing the business value. And the final question is from Ananda Baruah with Loop Capital. Please go ahead. Ananda Baruah Hey, guys, yeah, good afternoon. Really appreciate you taking the question. Maybe, Antonio, actually just dovetailing from there, I'd love any more insight you can give and context around what's the HPE sweet spot right now for business you win in GenAI -- what types of deployments or workloads? And then, how do you see that changing, or how do you see that evolving over the next few years as well? And that's it for me. Thanks, guys. Antonio Neri Well, I think right now, one of the key sweet spots is knowing how to build and deploy and run these large systems. That requires a unique expertise. That's why you see the services portion being attached to those systems. And ultimately, you need expertise both in the manufacturing space -- and again, we're going to host our AI Day in what I believe is one of the largest footprints in the world, where you can see how this gets done -- and then, on the services side, you should not underestimate the services expertise needed to run them. But for enterprise, the next big thing, in my view, is all about simplicity. And several of the patents we are actually filing are in areas like ease of use, automation and, obviously, security. These are all spaces where we are building those capabilities into our offering. And remember, all of this gets built inside HPE GreenLake as we deploy these optimized infrastructures and configurations.
And that's why, for me, GreenLake is an important component of our AI strategy, because ultimately, we manage a lot of the on-prem deployments for enterprise customers specifically through HPE GreenLake. And that's an accelerator and a way to upsell, cross-sell, and build, ultimately, customers' confidence in and control of their data, which is the fundamental value when it comes down to AI. And then, next year, once we close the Juniper transaction, we're going to add another key component, which is the networking piece. And it's very important that we recognize that AI, A, is a hybrid workload. The core foundation of that across hybrid is the network. And HPE will have unique IP and capabilities in that space, in addition to the traditional server and storage, which is now certified for AI, and then the GreenLake software and services attached to it. And that's how I want to think about it. Independently, the businesses are all accretive to AI, but when we get to a solution, HPE will have the full-stack solution to offer to our enterprise customers. Again, I will say we delivered a strong quarter. We drove very strong revenue growth. We did what we said. And honestly, I'm very confident about the next quarter and what comes next after the Juniper acquisition. I'm super-pleased that we also closed the first tranche of our H3C put option. Obviously, that took a lot of work in an environment that's complex. And as you think about it, our ability to deliver profitable growth is there. I understand the questions around margins, but when I think about margins, on the server side, we are consistently driving stable operating margins of around 11% or so. We think about it that way more than gross margin, because ultimately it's all about cash. And then, on networking and hybrid cloud, it's about gross margin, because our content is more software and services, while we deliver on the bottom line.
So, I think our strategy is all coming together, but it's a very competitive dynamic out there, and we have to execute every day with discipline, which is what we did again this quarter. And again, we raised guidance for the full year on the EPS side of the house. So, thank you very much for your time. And I look forward to hosting some of you at our facility in Wisconsin on October 10. Ladies and gentlemen, this concludes our call for today. Thank you.
[2]
Credo Technology Group Holding Ltd (CRDO) Q1 2025 Earnings Call Transcript
Dan O'Neil - Vice President, Corporate Development and Investor Relations Bill Brennan - President and Chief Executive Officer Dan Fleming - Chief Financial Officer Ladies and gentlemen, thank you for standing by. At this time, all participants are in a listen-only mode. Later, we will conduct a question-and-answer session. [Operator Instructions] I would now like to turn the conference over to Dan O'Neil. Please go ahead, sir. Dan O'Neil Good afternoon. Thank you for joining our earnings call for the first quarter of fiscal 2025. Today, I'm joined by Bill Brennan, Credo's Chief Executive Officer; and Dan Fleming, our Chief Financial Officer. As a reminder, during the call, we will make certain forward-looking statements. These forward-looking statements are subject to risks and uncertainties that are discussed in detail in our documents filed with the SEC, which can be found in the Investor Relations section of the company's website. It's not possible for the company's management to predict all risks, nor can the company assess the impact of all factors on its business or the extent to which any factor or combination of factors may cause actual results to differ materially from those contained in any forward-looking statements. Given these risks, uncertainties and assumptions, the forward-looking events discussed during this call may not occur, and actual results could differ materially and adversely from those anticipated or implied. The company undertakes no obligation to publicly update forward-looking statements for any reason after the date of this call to conform these statements to actual results or to changes in the company's expectations, except as required by law. Also during this call, we will refer to certain non-GAAP financial measures, which we consider to be important measures of the company's performance.
These non-GAAP financial measures are provided in addition to, and not as a substitute for or superior to, financial measures prepared in accordance with US GAAP. A discussion of why we use non-GAAP financial measures and the reconciliations between our GAAP and non-GAAP financial measures are available in the earnings release we issued today, which can be accessed using the Investor Relations portion of our website. Bill Brennan Welcome to our Q1 fiscal '25 earnings call. I'll start with a review of our Q1 performance and then discuss our future outlook. Our CFO, Dan Fleming, will then provide detailed Q1 results and share expectations for Q2. For Q1, Credo reported revenue of $59.7 million and non-GAAP gross margin of 62.9%. Our product revenues of $57.3 million were up 30% compared to the prior quarter, establishing a new quarterly record for the company. Product revenues were driven by rapidly expanding AI deployments. Credo is a pure-play high-speed connectivity company. We deliver a differentiated set of solutions, including active electrical cables or AECs, Optical DSPs, Line Card PHYs, SerDes Chiplets, and SerDes IP licenses for Ethernet port speeds ranging from 100 gig up to 1.6 terabits per second. The data center market is dynamic and evolving rapidly, which we believe will create even more opportunities for Credo. Although we are primarily targeting the leading hyperscalers, we are now observing increased spending by the next tier of data center operators. We see a group of emerging hyperscalers deploying an increasing amount of infrastructure to take advantage of opportunities presented by AI. With these customers, we find we have immediate credibility from our prior success with traditional hyperscalers, and that has indeed helped us to win new programs across our product lines. This is translating into material revenue across several growing customers. Notably, we expect to see a new 10% customer in Q2.
Credo aims to extend its reach into new markets as data rates rise. Later this year, we intend to enter the 64 gig PAM4 PCIe Gen 6 market offering Retimer and AEC solutions that are optimized for signal integrity, latency, power efficiency and cost effectiveness. Now regarding our AEC product line, during the first quarter, AEC has continued to be our main source of revenue and we anticipate that AECs will play a crucial role in driving growth in fiscal '25 and beyond. Today, we're in production with solutions for port speeds up to 800 gig and we expect to deliver power optimized 3-nanometer products in 2025 for the 1.6T port market. We've delivered AECs in a wide variety of form factors and with a range of functionality designed to meet the diverse needs of our customers. We think our approach of offering system-level products blending customized hardware and software with fast turnaround time is crucial to maintaining our competitive edge. And as a result, we've developed deep relationships with our customers to deliver very innovative AEC solutions. Based on these customer relationships, market feedback and the inherent advantages of AECs compared to alternative solutions, we have seen AECs become the de facto solution for in-rack connectivity at 50 gig per lane speeds and above. In addition, increasing rack power densities and the migration to liquid cooling are effectively reducing the physical length required for backend network connections and thereby increasing the opportunity for AECs. During Q1, our existing and new customer relationships continue to expand and develop providing us with more confidence in the growth prospects of our AEC business going forward. Given that, we continue to expect our ramp of AECs to drive an inflection point in our sequential growth in the back half of fiscal '25. Now I'll turn to our Optical DSP business. I'm pleased to report we're making continued progress with our optical DSP business on multiple fronts. 
With AI deployments accelerating adoption of Credo solutions, our optical module customers shipped AOCs and transceivers based on Credo DSPs to both US and international end customers. Based on Q1 results and our outlook, we remain on track to achieve our goal for Optical DSPs to be at least 10% of our fiscal '25 revenue. Beyond our existing production programs, we remain excited about future growth prospects in this category for several reasons. We are currently working with numerous optical module manufacturers to develop AOCs and transceivers, and notably, we've secured our first design win with an industry-leading module manufacturer. Last fall, Credo introduced the concept of LRO, or Linear Receive Optics, which maintains a DSP only in the transmit path of an optical module as an innovative way to reduce power in both 800 gig and 1.6T optics. The LRO concept is increasingly being adopted within the industry, and a new hyperscaler has decided to implement this strategy in their architecture. This application will target 800 gig module power of less than 10 watts, significantly below the typical 15 watts seen with full DSP architectures. We expect to start seeing LRO deployments in calendar '25. At 800 gig and 1.6T port speeds, we see energy efficiency becoming a more critical factor as power delivery and cooling infrastructure becomes even more challenging. Notably, power efficiency drove our decision to move directly to 3-nanometer for Credo's 1.6T DSPs, and we plan to tape out power-optimized solutions with both full DSP and LRO options later this calendar year. Taking into account customer feedback and rising momentum, we anticipate sustained growth in our optical business. Now regarding our Line Card PHY business. In the first quarter, our Line Card PHY business was once again an important contributor to our overall product revenue growth, driven by strong contributions from 400 gig and 800 gig solutions.
Our Line Card PHY revenue comes from Retimer and MACsec encryption products for port speeds up to 1.6 terabits per second. In this segment, our customers include networking OEMs for traditional switching applications and more recently, server ODMs for emerging AI appliance applications. We see new demand for our Line Card PHY solutions within Ethernet-based AI appliances for scale-out networks as rapidly increasing GPU performance places greater signal integrity demands inside the server. These emerging opportunities combined with traditional switch opportunities should lead to TAM growth into the future for our Line Card PHY solutions. Lastly, I'll review our SerDes licensing and chiplet businesses. We continue to make progress with customers and expand our funnel within our SerDes IP licensing and chiplet businesses. For fiscal '25, while we continue to expect quarterly variability due to the nature of revenue recognition, we see growth opportunities driven by a combination of license, royalty and chiplet revenues. With connectivity speeds rising fueled mainly by the needs of AI applications, Credo is poised for future growth. We offer a wide range of SerDes solutions up to 224 gig speeds in a wide range of process geometries from 28 nanometer to 3-nanometer. We continue to win due to our compelling combination of performance, power and exceptional technical support. To summarize, I'm very pleased with our team's performance in Q1, specifically in terms of the strong execution of our product ramp and our ongoing success engaging with customers. The rise of generative AI is driving greater demand for cutting-edge, power-efficient, high-speed connectivity solutions and Credo is dedicated to advancing our range of solutions to address this growing demand. This month, Credo will have strong presence at the CIOE Optical Conference in Shenzhen, followed by the ECOC Optical Conference in Germany. 
We expect these events will add to the momentum we've built since OFC in March. And next month, we'll be very visible at the OCP conference in Silicon Valley, showcasing a wide array of advanced solutions for AI clusters. Moving forward, we continue to see an inflection point in the second half of fiscal '25, driven by existing and new customer engagements across the entire range of our connectivity solutions. I'll now turn the call over to our CFO, Dan Fleming, who will provide additional details. Dan Fleming Thank you, Bill, and good afternoon. I will first review our Q1 results and then discuss our outlook for Q2 of fiscal year '25. In Q1, we reported revenue of $59.7 million, down 2% sequentially and up 70% year-over-year. Our IP business generated $2.4 million of revenue in Q1, down 14% year-over-year. IP remains a strategic part of our business, but as a reminder, our IP results may vary from quarter to quarter, driven largely by specific deliverables to pre-existing or new contracts. While the mix of IP and product revenue will vary in any given quarter, our revenue mix in Q1 was 4% IP, below our long-term expectation for IP, which remains 10% to 15% of revenue over time. Our product business generated $57.3 million of revenue in Q1, up 30% sequentially and up 77% year-over-year. Our product business, excluding product engineering services, generated a record $53.8 million of revenue in Q1, 21% higher than our previous product record and up 32% sequentially. Our top two end customers were each greater than 10% of revenue in Q1. Our team delivered Q1 non-GAAP gross margin of 62.9%, just below the low end of our guidance range and down 323 basis points sequentially, as a result of the lower IP contribution in the quarter. Our IP non-GAAP gross margin generally hovers near 100% and was 96.8% in Q1.
Our product non-GAAP gross margin was 61.5% in the quarter, up 784 basis points sequentially and up 472 basis points year-over-year, primarily due to increasing scale. Total non-GAAP operating expenses in the first quarter were $35.4 million, below the midpoint of our guidance range and up 8% sequentially due to a 14th week in the quarter. Our non-GAAP operating income was $2.2 million in Q1, compared to non-GAAP operating income of $7.5 million last quarter. Our non-GAAP operating margin was 3.7% in the quarter, compared to a non-GAAP operating margin of 12.3% last quarter, a sequential decrease of 8.6 percentage points. We reported non-GAAP net income of $7 million in Q1, compared to non-GAAP net income of $11.8 million last quarter. Cash flow used in operations in the first quarter was $7.2 million, down sequentially, primarily due to changes in working capital driven by a ramp in product shipments. CapEx was $5.9 million in the quarter, driven by R&D equipment spending. And free cash flow was negative $13.1 million, a decrease of $32.4 million year-over-year. We ended the quarter with cash and equivalents of $398.6 million, a decrease of $11.4 million from the fourth quarter. We remain well capitalized to continue investing in our growth opportunities while maintaining a substantial cash buffer. Our Q1 ending inventory was $31.6 million, up $5.7 million sequentially. Now turning to our guidance. We currently expect revenue in Q2 of fiscal '25 to be between $65 million and $68 million, up 11% sequentially at the midpoint. We expect Q2 non-GAAP gross margin to be within a range of 62% to 64%. We expect Q2 non-GAAP operating expenses to be between $36 million and $38 million. And we expect Q2 diluted weighted average share count to be approximately 182 million shares. As we move forward through fiscal year '25, we continue to expect sequential growth to accelerate in the second half of the year.
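The "up 11% sequentially at the midpoint" figure in the guidance follows directly from the numbers stated above. A quick sketch of that arithmetic, using only the disclosed figures:

```python
# Sketch: sequential growth implied by the Q2 FY'25 revenue guidance midpoint.
# Inputs are the figures stated on the call, in $ millions.
q1_revenue = 59.7                 # reported Q1 revenue
guide_low, guide_high = 65.0, 68.0  # Q2 guidance range

midpoint = (guide_low + guide_high) / 2   # 66.5
seq_growth = midpoint / q1_revenue - 1    # ~ 0.114

print(f"midpoint ${midpoint}M, {seq_growth:.0%} sequential growth")
# about 11% sequentially at the midpoint, matching the guidance commentary
```

The same pattern applies to the other sequential figures in the prepared remarks (e.g., the 8.6-point operating margin decrease is simply 12.3% minus 3.7%).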
We expect non-GAAP operating expenses to grow at half the rate of top line growth. And as a result, we look forward to driving operating leverage throughout the year. Thank you. [Operator Instructions] One moment for our next question. Our first question will come from the line of Toshiya Hari from Goldman Sachs. Your line is open. Toshiya Hari Hi. Good afternoon. Thanks so much for taking the question. My first question is on the AEC business and how we should be thinking about the acceleration in growth you guys spoke to as it pertains to the second half. Bill, I think on past calls, you've talked about your second customer ramping in the AEC space and your engagements with additional customers in that space as well. Can you speak to the key drivers that you see contributing to the acceleration in growth in the second half of the year? And how are you thinking about your AEC opportunity outside of in-rack connectivity as you move into calendar '25 and '26? Bill Brennan Sure. So I'll start with, I think, we're pretty happy with the fact that AEC adoption is continuing pretty broadly. It has really become a de facto standard for the lengths that we address, which is primarily in-rack at this point, but may expand to rack-to-rack 5 to 7 meter cables in the future as we see rack densities increasing. As that happens, connections that were typically made with 10 meter or 20 meter optical solutions can now be made with 5 to 7 meter AEC solutions. So we do see broad adoption continuing, and it's really with US hyperscalers, global hyperscalers, what we've introduced as a new term, emerging hyperscalers, as well as service providers. So I think as we stand right now, we're really well positioned to see future growth with both 400 gig and 800 gig AEC solutions that are in development now in engagements with customers.
And in the future, as there's a move toward 1.6T, I think we'll be in a really good position to address that market, especially because we'll bring meaningfully differentiated power compared to competitors. I will say that as Ethernet back-end networks are becoming more mainstream, we're seeing a focus shift within our customer base to network quality. And it's important to point out, when they have a single hardware failure or a link flap, it can cost 30 minutes of productivity and really cost them tens of thousands of dollars. And so if they look at the Pareto of things that would make their clusters more efficient, having solutions with really high reliability is becoming a primary objective. And so when we think about AECs, with a mean time between failures of 100 million hours, bit error rates five orders of magnitude or more better than IEEE requirements, and the fact that we've logged billions of operating hours that have been essentially flap-free, what we see is a shifting priority to move to these active copper solutions. And to further that, as I mentioned before, I think there's a desire to even figure out networking topologies that allow them to make a portion of the back-end network connections with longer 5 to 7 meter AECs that span two to three racks. And ultimately, I think we see this really driving an uplift in the AEC market and an expansion of the TAM long-term. I hope that gives you color; we're quite bullish on this space. Toshiya Hari Yes. That's really helpful. Thanks, Bill. And then as a quick follow-up on the Optical DSP side of your portfolio. It's really nice to hear the customer traction. I think you guys reiterated that for fiscal '25, it should be more than 10% or at least 10% of your business. Longer term, given your current engagements, the customer back and forth you're having, what are your market share aspirations in this business?
I know you're playing the role of disruptor, but curious how you're thinking about your market presence over time. And then within the context of LROs, I think, on the last call, you talked about 1.6T potentially being a catalyst for increased adoption. Is that still the case? And if so, is that pretty much a '25, '26 dynamic? Thank you. Bill Brennan Sure. So I'll say that we're quite happy with the Optical DSP business. We are on track to achieve the goal that you reiterated, which is 10% or more of our fiscal '25 revenue. For sure, we're building momentum. When we think about our position in the market today, there are more than 2 million modules with Credo DSPs that have been deployed in data centers. And this fiscal year, we'll ship more units than we've shipped in all previous years combined. So there is real momentum that is building. We've got programs identified that will drive sustained fast growth in fiscal '26 and beyond. And the bottom line is, as we sit here today, we've got a compelling set of solutions at 50 gig and 100 gig per lane, ultimately measured by signal integrity, power, and cost. So I think the stage is set for continued momentum for Credo in this market. From a market share standpoint, we're still small and rising. And the bottom line is it's one design at a time; converting one customer at a time is really the focus. And as I mentioned earlier, we feel very, very good about engaging in a first program, and it will lead to multiple programs, with a module manufacturer that's previously been considered a lock for the incumbent DSP competitor. The competitive landscape is shifting. And the bottom line is, as we look toward the 1.6T market, we're going to deliver both full DSP and LRO solutions, and the key for us is to deliver kind of a new definition of what competitive power is.
We believe we're going to deliver a full DSP solution that's on the order of half the power of the devices that have been introduced into the market, really almost prematurely, in the sense that their power is so high in comparison to what the optical module market is looking for. And so we're agnostic as it relates to full DSP and LRO. The bottom line is we'll deliver a full DSP at the 1.6T speed that is ultimately in the 10 watt range or less, and that will enable any optical player to actually build a standard OSFP or QSFP-DD and fit within the power ceiling there. We'll offer options with LRO to take it even further down from an energy efficiency standpoint. So we'll ultimately let our customers make that decision. Thank you. One moment for our next question. Our next question comes from the line of Tore Svanberg from Stifel, Nicolaus & Company. Your line is open. Tore Svanberg Yes. Thank you. Bill, you just said something that caught my attention. You said you're going to ship more DSPs in fiscal '25 than in all previous years. I mean, obviously, that includes the AEC business. But can you just elaborate a little bit on that? Because that seems like a pretty high number. Bill Brennan Yes, the reference was not really in regards to AECs. This is really in reference to our optical DSP business. And so I will say that we continue to be engaged with our first US hyperscaler in production, and we see a return to spending in the international space as well. And so I think that, from our perspective, that's no surprise. I think we've alluded to the ramp that's going to take place in this fiscal year. And I think we're well positioned to continue that in fiscal '26. Tore Svanberg Got it. Thanks for clarifying that. My second question is, you mentioned penetration or entrance into the PCIe retimer market. Could you just talk a little bit about why now? I mean, clearly, this is a market that's been around for a few years, but it seems to be expanding.
So help us understand a little bit the timing of entering this market? And when should we expect some early revenues for Credo in this market? Bill Brennan Yes. So I think that we've talked about the intent to enter the PCIe market, specifically at the Gen 6 speed, which is 64 gig PAM4. We opted not to pursue the Gen 5 market and we probably made a bad call on that. But we felt like from a SerDes standpoint that entering the market at 64 gig PAM4 will enable us to deliver the same kind of compelling benefits that we brought to Ethernet and really specifically the 50 gig and 100 gig level. And so when we talk about signal integrity, we talked about energy efficiency, talk about having the lowest cost basis of any competitor in the market. And really with PCIe, there's an opportunity here from a latency standpoint and a DSP architecture standpoint for us to really be differentiated as well in a sense of having a SerDes that's a full-blown DSP with latency numbers that are much lower than competitive solutions that have been announced. And so we think the timing is right. I'll also mention that you'll see us enter the market and then accelerate the market to Gen 7. We've already got 128 gig silicon that has been tested by a lead partner in the market. And so as we see AI driving the demand for higher and higher bandwidth, this is something we'll really lean into with PCIe. Thank you. One moment for our next question. Next question comes from the line of Suji Desilva from Roth Capital. Your line is open. Suji Desilva Hi, Bill. Hi, Dan. In terms of the customer concentration, you talked about a new expected 10% customer in F2Q. I just wanted to get a sense if that customer is kind of starting from the ground floor in F1Q or whether it's been a gradual growth. I just want to understand the contribution from that new ramp. Bill Brennan Yes, sure. This is not a new customer. This is a customer that we've worked with for going on a couple of years now. 
And so we've seen that they've been really receptive to the solutions that we're delivering. And as their spending plan has increased, we've seen them become a much more significant customer for us. But, yes, they've been a customer in the past. But we're encouraged by the fact that this would be one that we would consider an emerging hyperscaler. And we've talked about in the past how the market, really driven by AI solutions, is starting to look like more than just the top 5 US hyperscalers. And so I think this is like a first mover among the emerging hyperscalers. But we've been in a good position, and we're in a good position with others as well, from the standpoint that they've adopted an architecture that deploys AECs. Suji Desilva Okay, Bill. Good to hear the customer base diversifying. And then you talked a little bit about the opportunity as racks densify, if you would, and being able to handle closer racks and shorter reach. Can you just give us some sense of metrics and kind of the cooling technology, maybe the Hopper to Blackwell transition, that makes that possible, and some of the metrics to think about in terms of how the TAM increases for you guys? Bill Brennan Yes. Sure. I think we've all seen what leading edge solutions look like and the increase in densities that you're seeing just with that transition that you mentioned. But I think from our perspective, this is really going to be a customer-by-customer architecture decision. But theoretically, we could see the TAM expanding within a customer: if they deploy a solution that implements direct rack-to-rack AECs, we can see a doubling in TAM easily when we look at it from that perspective. Thank you. One moment for our next question. Our next question comes from the line of Matt Ramsey from TD Cowen. Your line is open. Sean O'Loughlin Hey, guys. It's actually Sean O'Loughlin on here for Matt. And he sends his regards, but we'll get to that later.
I wanted to ask a quick question on the Optical DSP product engagement. You mentioned the module maker, which sounds like really positive momentum there. But is this my naivety, or should I be surprised that there's not more connectivity between you guys and the hyperscaler customers themselves on the Optical DSP solutions, given you have the relationships at the AEC level and it's such an important power conversation? Is there any engagement on the hyperscaler side that's kind of helping you get pulled into some of these module maker designs? Bill Brennan Absolutely. Yes, we've talked about this in the past. This is a market that's a bit unique in the sense that if you only engage with the optical module manufacturers, you really aren't guaranteed anything. And so we've had a multiyear effort in working directly with hyperscalers, and we've been successful with some of them, even doing a joint development program where they specify the DSP that is to be used. But this is really an ongoing effort; it's really a three-party conversation between hyperscalers, module makers, and Credo. And so that's very much part of our strategy. And from the standpoint of breaking it down, on a weekly basis internally we actually break it down per hyperscaler as it matches up with module makers. So it's a very focused strategy that we've got. Sean O'Loughlin Got you. That's really helpful. And then just one clarification on the IP license revenue. I think last quarter, you had mentioned that you expected that to come in at the higher end of the long-term model, but license revenue obviously is lumpy and came in a little lower this quarter. Is that still your expectation for the full year, or are we just thinking about it incorrectly? Thanks, guys. Dan Fleming Yes. For the full year, our expectation hasn't changed. So we expect it to be in that long-term range of 10% to 15% for the full year.
And you're right in that it was lighter than expected at only 4%. And as the quarter evolved, it was offset by strong turns bookings within the quarter. But you're thinking about it right. As you say, it's hard for us to forecast IP in 90-day increments, but we do have confidence that over a longer period of time, like fiscal year '25, we will be within that range of 10% to 15%. Thank you. One moment for our next question. Our next question comes from the line of Karl Ackerman from BNP Paribas. Your line is open. Karl Ackerman I have two. Thank you very much. Gentlemen, first off, I wanted to discuss active copper cables, which have received a lot of attention recently and which use a redriver instead of the retimer that's used in active electrical cables. Each, of course, has its own trade-offs. But do you view these applications as cannibalistic to each other, or is the market opportunity for passive copper cables large enough for both applications? And as you address that question, how might using an AEC with a half-retimed DSP improve the power and cost trade-off between active electrical and active copper cables? And I have a follow-up. Thank you. Bill Brennan Yes. So we see the market for passive copper, as well as what you refer to as active, or ACC, or what we refer to as amplified solutions. We see the market for both of those not being big long-term. As it relates to some of the references in the market to ACCs, I think that's really been driven by NVIDIA's strategy. And so it's really not something that I would say is a broad market type of opportunity. And even with the introductions that they've made in the past three to six months, I think it's questionable what role ACCs, or amplified solutions, will play. The key is that we don't see anybody in the rest of the market, meaning hyperscalers that are looking at building their own ecosystems, considering those solutions.
And the reason is that they're really not following industry standards at this point. And so when we talk about interoperability, and we talk about things like signal integrity, having the AEC, the fully retimed, fully equalized solution, is really the way you deliver the kind of interoperability and performance that would be expected. And so that's where we really haven't seen any competition, and that's globally, especially at the 100 gig per lane level. As it gets to 200 gig per lane, 1.6T, I think it plays an even smaller role. And I think generally in the market, the game is over: fully retimed AECs are really the choice of the broad market. And when I refer to de facto, that's really the only solution that's being considered by many of the customers that we talk to. And so we could go down the path of doing LRO for AECs, but that has not really been a priority for the customer base, just because the power levels we're delivering are meeting the objectives. But the opportunity would exist in the future if that becomes a priority among our customers. Karl Ackerman Yes. Very clear. Thank you for that. I wanted to focus back on licensing revenue. I know it's hard to predict, but is there a seasonality component to the consumer USB licensing revenue? And I guess, more importantly, has your view changed at all on the licensing revenue being toward the 10% to 15% of your fiscal year revenue in fiscal '25, and perhaps your outlook for fiscal '26? Thank you. Dan Fleming I would say we've not witnessed any sort of seasonality with IP revenue. In fact, if you look at the last five quarters, each quarter is quite different from the last in terms of revenue mix or revenue contribution in total. So it's really difficult to draw too many conclusions from that. Again, as we've said in the past, we view IP as a very strategic imperative for us, and we're not incentivized to chase low value deals.
We want to make sure that we get a good ROI on all the IP that we sell, and we win where power tends to be an overriding feature that's required in a solution. So it's tough to predict quarter-to-quarter, as we all know. Longer term, we stated 10% to 15%. We think that will be the case this year. Next year, we haven't talked about, but I would expect that would be the case as well. Longer term, when we're talking about much bigger shipments to customers with products, that will probably reset that expectation lower in the future, maybe FY'27 and beyond. Thank you. One moment for our next question. Our next question comes from the line of Thomas O'Malley from Barclays. Your line is open. Thomas O'Malley Hey, guys. Thanks for taking my question. This one is for Dan. I just want to be a little bit more specific on a question that's kind of come up a couple of times here. So in the July quarter, product came in much better and licensing came in much lower. Could you just describe first what drove that product uptick? And also, if you look at the gross margins, they were much better as well. I know you said volume on the call, but I'd be surprised if volume drove all of that. And then looking into the October quarter guidance, is your assumption that IP goes back to that kind of $10 million a quarter range? Because that obviously matters for what product does in a quarter. I think that's what people are trying to figure out. So both of those would be super helpful. Dan Fleming Yes. So as far as product being a little stronger than normal, it's ordinary that we have some amount of turns bookings, and they came in stronger than we would have expected entering the quarter. So there's nothing really unanticipated or to talk about in terms of what it was versus what we've already talked about. Our largest AEC hyperscale customer contributed very significantly to the quarter.
And we expect them to really drive the ramp throughout the fiscal year. So maybe I could talk about the gross margin impact as well, since you mentioned that expansion of gross margin at the product level. From my perspective, that's one of the headlines for Q1, or one of the key takeaways. If you look at product gross margin excluding product engineering services, it was up over 900 basis points sequentially to 59.6%. Recall, though, that last quarter we had mentioned some onetime reserves in our Q4 number, so that lowered the product gross margin in Q4 a bit. But we're very happy in Q1 with the excellent progress we've made toward our long-term gross margin expectation of 63% to 65%, which has not changed. And as I've mentioned in the past, our long-term model, with the IP portion being 10% to 15% of revenue, implies a product gross margin right around 60%. So, in Q1, we were already in that same ZIP code. Throughout FY'25, you might see some quarterly fluctuations, but it really is driven by scale. There's, of course, some product mix impact as well, but the overarching driver of improving margins this year, thematically, is improving scale. And 32% was the sequential increase from Q4 to Q1 in terms of product shipments, so you do gain a lot of scale when that occurs. Thomas O'Malley And then just the second part about your expectations embedded in the guidance. You're kind of saying 10% to 15% for the year, still obviously lighter in Q1, so you would expect some acceleration. But just what is your expectation for October? Dan Fleming Yes. So for October, for IP, you might expect it to be a slightly larger contributor to revenue in terms of revenue mix than it was in Q1.
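The blended margin Dan reports can be reconciled from the mix figures he gives earlier in the call. This is a rough revenue-weighted sketch using those stated numbers:

```python
# Reconciling the blended Q1 non-GAAP gross margin from figures on the call:
# IP at 4% of revenue with a 96.8% gross margin, product at a 61.5% margin.
ip_mix = 0.04          # IP share of Q1 revenue
ip_gm = 0.968          # IP non-GAAP gross margin
product_gm = 0.615     # product non-GAAP gross margin

# Revenue-weighted blend of the two segments.
blended = ip_mix * ip_gm + (1 - ip_mix) * product_gm
print(f"blended gross margin ~{blended:.1%}")  # ~62.9%, matching the reported figure
```

The same weighting explains the long-term model: with IP at 10% to 15% of revenue near 100% margin, product margin around 60% blends to the 63% to 65% corporate target.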
But as you saw in Q1, one of the important takeaways is that in order to achieve the gross margin targets we set out for Q2, we don't need an oversized contribution from IP; it's not critical for us to achieve the gross margin goals. But expect IP to contribute modestly more to our revenue mix in Q2 than it did in Q1. Thank you. One moment for our next question. Next question comes from the line of Quinn Bolton from Needham. Your line is open. Quinn Bolton Hey, guys. Congratulations on the nice results and outlook. I guess, Dan, maybe to beat on the gross margin here a little bit. Would you expect product margins to decline in the October quarter? Because I'm struggling to see why margins wouldn't be at least at the high end of your 62% to 64% range if product gross margin is flat and IP is a slightly higher percentage of the mix in the October quarter. Dan Fleming I understand where you're coming from with that question. The one thing I'll say is everything is not always linear for us. We had quite a big step-up in gross margin percentage in Q1, and we've proven over the course of time that it's not always linear. We don't have a specific expectation that it will decline, or decline by much. But our overarching theme is to remain conservative in the way we look at things and guide things. So hopefully that helps you a little bit, Quinn. Quinn Bolton Got it. Yes, that does. And then I just wanted to come back to the license and IP. I think in the script you mentioned royalty as part of that IP business, and I know you had a big consumer license a few years ago. I'm wondering if you could give us any update on specific royalties that you're looking for as part of that IP revenue stream. Does royalty become something to call out here at some point over the next year or so, or do you think most of that IP is from straight licensing of your SerDes, like it's been historically?
Bill Brennan I think you're right on with the assumptions you're making on the royalty as it relates to our USB customer. As we think about things going forward, we don't think that's going to be a part of the business that we look at as material in comparison to licensing revenues themselves. So we think the weighting is going to stay the same, in the sense that the bulk of the revenues in this category will come from licenses. Thank you. One moment for our next question. Our next question comes from the line of Richard Shannon from Craig-Hallum. Your line is open. Richard Shannon Great, guys. Thanks for taking my questions. Maybe a very quick two-parter for Dan here. You called out record product revenues, a record in total, but I don't think it was in AECs. Can you declare whether you had a record AEC quarter? And then also, can you give us the percentages of the two 10% customers as well? Dan Fleming Yes. So we did have record product revenue. We don't break out by product line, but you can safely assume that AEC is a large driver of our story for the full year. So that was the most significant contributor to revenue. Turning to 10% customers. We had two 10% end customers in Q1, which I mentioned in the prepared remarks. And they were, in fact, our first two AEC hyperscalers at the top. When our Q is filed in a day or so, you'll see that our first AEC hyperscale customer remained a 10% customer during Q1, right at 10%. And the second AEC hyperscale customer was our largest customer at 62%. So we expect that second AEC hyperscaler to continue their meaningful ramp throughout fiscal '25, as we talked about. Of course, it might not always be a linear ramp. But as Bill mentioned earlier, we expect a new 10% customer in Q2.
So even with a meaningful ramp at our second AEC hyperscaler, we expect significant revenue diversification from both a customer and product perspective throughout the fiscal year as that plays out. Richard Shannon Okay. Great. Thanks for that, Dan. And I'll follow up with a question for Bill here. Bill, I probably typed it in incorrectly here, but in response to an earlier question, you talked about some next-gen AI clusters and racks kind of moving away from, if I caught you correctly, optics and moving toward active copper in some manner. Maybe you can delve into that a little bit more, and maybe also comment on whether you think that is a dynamic that can sustain for more than one generation of systems. Can it go two or three generations? My suspicion is not, but I'd love to get your comments on that. Thanks. Bill Brennan Absolutely. And so as we work with our customers more and more and really listen to their challenges, one of the big things that stands out is just the quality of AECs compared to solutions that are laser-based optics. And again, this gets back to the link flap phenomenon that's been seen in AI clusters; it's becoming more and more of a topic in the industry. Just to reiterate, if you've got a single connection that fails, or even if it just flaps, meaning it goes down and comes back up, it can shut down the entire cluster for up to 30 minutes, ultimately measured in dollar terms at $30,000 to $50,000. And so if you have three to five of these episodes per day, it becomes something that you really want to pay attention to. And so as we work with customers more and more, the focus is turning to network quality, and when we think about AECs, they just don't have link flaps like laser-based optics have. And so it's absolutely something that will benefit the AEC TAM long-term.
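Bill's downtime figures imply a rough daily cost exposure. The per-episode and per-day ranges below are the ones he cites; multiplying them into a daily total is our arithmetic, not a number from the call:

```python
# Rough daily cost exposure from link flaps, using the ranges cited on the
# call: $30k-$50k per episode, three to five episodes per day.
cost_per_episode = (30_000, 50_000)   # $ per flap/failure episode
episodes_per_day = (3, 5)             # episodes per day

daily_low = episodes_per_day[0] * cost_per_episode[0]    # $90,000
daily_high = episodes_per_day[1] * cost_per_episode[1]   # $250,000
print(f"daily exposure: ${daily_low:,} to ${daily_high:,}")
```

At that scale of recurring loss, prioritizing high-reliability interconnects over marginal component cost is straightforward to justify.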
And so you've got this other factor that's happening where racks are being built in a much more dense fashion. So now there becomes a possibility to network and really connect racks, and lean on the very high reliability of the AEC products. And so we're seeing much more activity along the lines of moving to an architecture where they're connecting not just in the rack, but rack-to-rack, and we're really talking 5 to 7 meters. Again, it's a customer-by-customer architecture objective. But this is something we see that is definitely encouraging, as it relates to a new factor that is driving demand in this direction. I will say that in addition to what we're doing with our AEC family, we're also looking at other system-level solutions that, quite frankly, help to address some of these areas of feedback that we're hearing. And so our experience in the AEC market has taught us a huge amount about these challenges, and we see potential to use this knowledge in the optical space, specifically to enhance AI cluster performance and energy efficiency. And so you've seen innovations from Credo along the lines of LRO, which was specifically for power, but there are other things that we're working on to help improve network quality. And that's not something I'm going to go into the details about, but within the next, say, 9 to 12 months, we'll probably talk more about it as we deliver these solutions to the market. Thank you. One moment for our next question. Our next question comes from Tore Svanberg from Stifel, Nicolaus & Company. Your line is open. Tore Svanberg Yes, thank you. I just had two follow-ups.
Bill, as we look at the second half of fiscal '25 and the ramps that you're expecting, could you talk a little bit about how broad those ramps are? I'm not talking about by customer here; I'm thinking more about the different AEC form factors, especially considering the 400 gig AI AEC back-end network solution that you have been sampling in the last few quarters. Bill Brennan Sure. So I guess that would relate to our business with Amazon. And we feel good about where we are in the ramp there. As we've said previously, their ramp is expected to drive a significant portion of our growth this year. Dan alluded to the fact that we don't expect it to be a linear ramp quarter-to-quarter, but we do expect them to be our largest customer. With that said, we see increasing demand across the board for our solutions from a range of customers, which we feel good about in the sense that it will drive customer diversity over the upcoming quarters, through the end of this year and even into fiscal '26. So as we've talked about before, we see Microsoft returning to historical levels. We've talked about a third hyperscaler; that relationship is really progressing, and we expect that they will become a significant customer toward the very tail end of fiscal '25, but really into fiscal '26. We've mentioned AECs continue to gain traction broadly. Optical is going to contribute to that inflection point this year and into '26, as we see that continuing as our fastest growth product line. And even the other product categories we haven't really spent too much time talking about, like Line Card PHYs, SerDes Chiplets, and licensing, are expected to be material contributors in our second half and into fiscal '26. And so that really gives you the near-term color.
A lot of the fun that we have in Credo is talking about things that are really outside the topic of these calls, which is really very near-term focused on fiscal '25. We have a lot of fun talking about next-generation solutions that we're developing. There's a lot of energy that we've got around PCIe Gen 6 and Gen 7 and exactly how are we going to add our unique and compelling value to that market. And this idea of continuing to innovate at the system level is something that we're highly engaged on internally at Credo. We've had just great experience in the markets that we've pioneered and we're going to continue that. And we expect to really bring meaningful and compelling solutions to the market at again the system level. Tore Svanberg That's very helpful. Just last question for Dan. Dan, you said that OpEx would grow at half the rate of revenue growth. Is that a specific sort of target for fiscal '25 or was that more a reference to sort of your long-term financial model? Dan Fleming It's related to fiscal '25. And in fact, I mean, it's even embedded in our Q2 guide, right? Revenue at the midpoint is up 11%. Gross margin flat, OpEx up 5%. So we expect that year-over-year, fiscal '24 to '25, that will be the case, and it will extend probably into fiscal '26 as well as we attain kind of that long-term operating model. Our operating margin should be in the 30% to 35% range next fiscal year. Thank you. And there are no further questions at this time. Mr. Brennan, I turn the call back over to you. Bill Brennan Absolutely. Thanks, everybody, for joining. We really appreciate the questions and we look forward to the call backs. Thanks. And with that, this concludes today's conference call. You may now disconnect. Everyone, have a great day.
Hewlett Packard Enterprise (HPE) and Credo Technology Group report their latest quarterly earnings, revealing contrasting performances in the tech sector. While HPE faces headwinds, Credo shows promising growth.
Hewlett Packard Enterprise (HPE) reported its Q3 2024 earnings, revealing a mixed performance amidst a challenging market environment. The company's revenue for the quarter came in at $6.8 billion, falling short of expectations and representing a 6.5% year-over-year decline [1]. Despite this, HPE managed to deliver non-GAAP earnings per share of $0.49, surpassing analyst estimates.
CEO Antonio Neri acknowledged the impact of macroeconomic headwinds on customer spending, particularly in the high-performance computing and AI segment. However, he emphasized the company's strategic focus on AI-native solutions and the potential for future growth in this area [1].
HPE also highlighted progress across its key strategic areas.
The company maintained its fiscal year 2024 outlook, projecting revenue growth of 2% to 4% in constant currency and free cash flow of $1.9 billion to $2.1 billion [1]. Despite near-term challenges, HPE remains optimistic about its long-term prospects in the evolving tech landscape.
In contrast to HPE's challenges, Credo Technology Group reported robust results for its first quarter of fiscal 2025. The company achieved record revenue of $53.3 million, a 15% sequential increase and 2% year-over-year growth [2].
Credo's success was driven by strong demand for its high-speed connectivity solutions, particularly in the data center and AI infrastructure markets. The company's non-GAAP gross margin improved to 62.1%, showcasing its operational efficiency [2].
Key factors in Credo's performance included broad demand for its AEC products, the ramp at its largest hyperscale customer, and growing contributions from its optical product line.
CEO Bill Brennan expressed confidence in the company's growth trajectory, citing ongoing design wins and the expanding market for high-speed connectivity solutions. Credo provided guidance for Q2 2025, projecting revenue between $58 million and $62 million, indicating continued momentum [2].
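The momentum implied by that guidance is easy to quantify. A quick check of the sequential growth baked into the $58 million to $62 million range against the $53.3 million Q1 figure (both taken from the article; the calculation itself is just arithmetic):

```python
# Sequential growth implied by Credo's Q2 FY2025 revenue guidance.
q1 = 53.3            # Q1 FY2025 revenue, $M (from the article)
low, high = 58.0, 62.0  # Q2 guidance range, $M

mid = (low + high) / 2
growth = (mid - q1) / q1

print(f"guidance midpoint: ${mid:.1f}M")
print(f"implied sequential growth at the midpoint: {growth:.1%}")  # ~12.6%
```

Even at the low end of the range, the guide implies high-single-digit sequential growth on top of a record quarter.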
The contrasting performances of HPE and Credo Technology highlight the varying dynamics within the tech sector.
As the industry navigates through these challenges and opportunities, companies' ability to adapt to changing market demands and capitalize on emerging technologies will likely determine their success in the coming quarters.
Summarized by Navi