The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2024 TheOutpost.AI All rights reserved
Curated by THEOUTPOST
On October 11, 2024
11 Sources
[1]
Why Nvidia, AMD, Arm Holdings, and Other Artificial Intelligence (AI) Stocks Sank on Tuesday | The Motley Fool
Even as the adoption of AI accelerates, there will be inevitable bumps in the road. Since early last year, investors have been bullish about the potential of artificial intelligence (AI), scooping up shares of companies best positioned to profit from this next-generation technology. However, as the bull market crosses the two-year mark, many are taking a step back to survey the landscape, and some are looking for any excuse to take profits. With that as a backdrop, chip designer Arm Holdings (ARM -6.90%) slumped 6.7%, AI chipmaker Nvidia (NVDA -5.06%) tumbled 4.9%, chipmaker Advanced Micro Devices (AMD -5.11%) sank 4.8%, semiconductor device supplier Broadcom (AVGO -3.56%) fell 3.7%, and chip foundry Taiwan Semiconductor Manufacturing (TSM -2.55%) dipped 2.6%, as of 12:50 p.m. ET on Tuesday.

The catalyst that sent these AI stocks lower was a report that the U.S. government is considering new curbs on chip exports. The Biden administration is considering limiting sales of advanced AI processors from Nvidia, AMD, and other companies, according to a report that first appeared in Bloomberg, citing "people familiar with the matter." This would mark the most recent step by regulators to address concerns that advanced technology like AI could be used against the U.S. and its interests. The government is discussing a cap on the number of export licenses for certain countries, citing national security as the reason for the potential move.

It's worth noting that the U.S. already has strict limits on the level of AI chip technology it allows to be sold to some countries, including China and 40 other countries in Asia, the Middle East, and Africa. Currently, U.S. chipmakers are required to obtain government licenses to sell advanced semiconductors to customers in certain countries. The current deliberations would extend the existing curbs, which might be set on a country-by-country basis, with an emphasis on countries within the Persian Gulf region.
The considerations are still in the early stages, and no final decision has been reached, but the plans have been "gaining traction in recent weeks," according to the report. Limiting the sales of advanced AI chips to certain countries has potential implications for all of these AI-centric stocks: While investors fear a hit to Nvidia's (and others') sales, history suggests they may be overreacting. There were similar concerns on multiple other occasions when the U.S. government considered or announced curbs on chips to countries like China. Despite those fears, Nvidia went on to generate triple-digit growth for five consecutive quarters. Furthermore, recent reports suggest the company's Blackwell chips are sold out for the next 12 months. This suggests that, potential curbs aside, demand for AI chips remains robust. Then, there are valuations to consider. Arm Holdings, AMD, Nvidia, Broadcom, and TSMC are selling for 96 times, 46 times, 46 times, 36 times, and 28 times forward earnings, respectively. For those looking for a good value, TSMC is likely the only one worth buying, but this doesn't account for the accelerating growth trajectory resulting from AI. Using the more appropriate forward price/earnings-to-growth (PEG) ratio -- which factors in that growth -- reveals that each of the remaining stocks boasts a multiple of less than 1, the standard for undervalued stocks. It's still early days for the adoption of generative AI, and while some experts peg the market value at $1.3 trillion, others believe the total could be much higher. For investors looking to profit from AI, the best strategy is to buy the best AI stocks you can find and hold tight for the long term.
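The PEG comparison above is simple arithmetic: forward P/E divided by the expected annual earnings growth rate, expressed as a percentage. Here's a minimal sketch in Python; the forward P/E multiples are the ones cited above, but the growth rates are hypothetical placeholders, since the article doesn't state the analyst estimates behind its sub-1 figures.

```python
def peg_ratio(forward_pe: float, growth_pct: float) -> float:
    """Forward price/earnings-to-growth: forward P/E divided by
    expected annual EPS growth, expressed as a percentage."""
    if growth_pct <= 0:
        raise ValueError("PEG is undefined for non-positive growth")
    return forward_pe / growth_pct

# Forward P/E multiples are those cited in the article; the growth
# estimates paired with them are hypothetical, for illustration only.
stocks = {"ARM": (96, 100), "NVDA": (46, 70), "AMD": (46, 50), "AVGO": (36, 40)}
for ticker, (pe, growth) in stocks.items():
    print(f"{ticker}: PEG = {peg_ratio(pe, growth):.2f}")
```

A PEG below 1 simply means the forward multiple is lower than the growth rate, which is the conventional "undervalued" threshold the article applies.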
[2]
Prediction: This Artificial Intelligence (AI) Stock Will Outperform Nvidia Over the Next Decade | The Motley Fool
Unfortunately for Nvidia, the company abandoned the deal as long-winded court cases revolving around antitrust concerns seemed to have no end in sight. Following the failed acquisition, Arm pursued an initial public offering (IPO) -- hitting the Nasdaq last September. Since going public, Arm stock has surged 138% against the backdrop of the artificial intelligence (AI) movement. But even after such a meteoric rise, I see much better days ahead for Arm. In fact, I think Arm stock will handily outperform Nvidia over the next decade. Below, I'll detail why I'm so bullish on Arm and explain how rising competition in the chip realm could ignite Nvidia's first uphill battle in quite some time. The semiconductor industry has many different components. Not all chip companies make graphics processing units (GPUs) like Nvidia or Advanced Micro Devices. There are far more applications for chips, and Arm dominates a pretty singular pocket of the market. At its core, Arm designs chip architecture for mobile devices, consumer electronics, networking equipment inside data centers, and other Internet of Things (IoT) devices. The company makes money by licensing out its intellectual property (IP) and earning royalties on its various architectures. As illustrated in the graphic above, Arm's architecture is deeply embedded across various applications. This provides the company with an enviable level of flexibility regarding new chips hitting the market in the future. In other words, companies running on Arm's architecture are less inclined to develop a new hardware and software system that is incongruent with Arm's architecture. Furthermore, the slide above shows that Arm's market share has increased across the board over the last two years. With that in mind, I think the company is well positioned to continue benefiting from new chip-based devices, since Arm's IP is already leveraged across so many devices around the world.
For this reason, I see Arm as less vulnerable to competitive forces in the chip space compared to peers such as Nvidia. Like Arm, Nvidia has a massive presence in its core end market. The company's A100 and H100 chipsets have helped Nvidia acquire an estimated 88% of the GPU market. However, I see some obvious risks that could expose Nvidia over the next several years, and I would not be surprised to see the company begin to lose market share. First, companies including Microsoft, Alphabet, Tesla, Amazon, and Meta Platforms are all investing in their own custom chip designs. Moreover, these companies have been labeled by Wall Street analysts as Nvidia's largest customers -- accounting for nearly half of the company's revenue. While you could argue that more competition is a good thing for Nvidia, I don't see it that way in this case. These companies will probably remain customers of Nvidia for the next several years, but the introduction of their own hardware could wind up being a bargaining chip in the long run. What I mean by that is that more GPUs on the market will likely weaken Nvidia's pricing power. In turn, I think Nvidia's revenue and profit growth could have a dramatic slowdown -- a dynamic that growth investors won't want to see. But rising competition isn't the only risk facing Nvidia. Given the company's near-monopoly position, there is a possibility that the Department of Justice (DOJ) could investigate Nvidia's business practices and force the company to loosen up its ecosystem. With so many unknowns revolving around Nvidia's future, I'm skeptical that the stock is a no-brainer at this juncture. There have been many periods of expansion and contraction in Arm's trading activity. But with a forward price-to-earnings (P/E) multiple of 96, it's hard to say the stock is cheap. Here's how I think about it: The market is clearly placing a premium on Arm stock for a reason. I think there are two core themes to unpack. 
At a macro level, AI appears to be here for the long run, and technology's biggest companies are committed to spending billions on future artificial intelligence initiatives. While spending will change from year to year, the secular tailwinds presented by AI should bode well for Arm. At a company-specific level, Arm's unique position in the chip space and its lucrative business model suggest that the company's growth will remain robust over time. For these reasons, I see Arm as the superior investment over Nvidia in the next decade. While the stock isn't a bargain, I think it still looks like a compelling opportunity for long-term investors.
[3]
Nvidia and Advanced Micro Devices Just Gave Magnificent News to AI Chip Investors | The Motley Fool
However, there is a considerable debate among investors as to whether this hypergrowth is sustainable, or whether the AI buildout is going to pop like the dot-com bust. This week, the CEOs of AI chip leaders Nvidia and AMD made announcements, each of which gave even greater weight to the bull case for their stocks and AI chip stocks generally. AI stocks pulled back hard over the summer after a strong 18 months or so of performance, as skepticism worked its way into the story. After the Magnificent Seven, which are the main buyers of AI chips, reported good but not blowout earnings in July, investors appeared concerned that these big chip buyers weren't seeing a requisite return on their investments in Nvidia chips. Most big tech stocks and AI chip plays sank in response. Giant hedge fund Elliott Management piled onto the skepticism, giving especially bearish commentary on what it perceives as an AI "bubble." Elliott wrote in its latest letter to investors that AI stocks were overhyped, declaring AI applications aren't "ever going to be cost-efficient, are never going to actually work right, will take up too much energy, or will prove to be untrustworthy." Elliott dismissed the technology as only good for a few things, such as summarizing reports and helping with computer coding. That's certainly a point of view that should be considered. It may even be true of the current models that are out today. However, virtually everyone participating in the technology industry believes in the benefits. If the benefits weren't likely to materialize, it seems unlikely that every major technology company would be greatly expanding its AI investments the way each is today. For his part, Oracle Chairman Larry Ellison dismissed these concerns, declaring the race for AI supremacy "goes on forever, to build a better and better neural network."
Ellison believes that AI capabilities will improve with more compute and better models, and that the large tech companies can't afford to cede the AI lead to competitors. With big tech armed with a ton of cash, he doesn't see the buildout ending for five to 10 years. This week then saw two massive announcements from the number one and two AI chip companies that should allay near-term fears about the durability of the AI trade. At the beginning of the month, Nvidia's CEO Jensen Huang said demand for its next generation chip Blackwell was "insane." Fast forward to last week, and analysts at Morgan Stanley revealed Blackwell is already sold out for the next 12 months, after the firm hosted Nvidia executives at their offices. Then on Thursday, AMD held its "Advancing AI" event during which it unveiled its new EPYC 9005 CPUs and Instinct MI325X GPUs. During the presentation, CEO Lisa Su increased her projection for the market size for AI accelerators. Last year, Su surprised investors by forecasting the AI accelerator market would increase from $45 billion in 2023 to a whopping $400 billion in 2027. So, has the past year made her more skeptical or anxious about all that spend, as Elliott surmises? Just the opposite, in fact. During the conference, Su raised her guidance for the AI accelerator market to reach a whopping $500 billion by 2028, saying, "Since [last year], AI demand has continued to take off and exceed expectations. It's clear that the rate of investment is continuing to grow everywhere, driven by more powerful models, new use cases, and actually just a wider adoption of AI use cases." If Su's and Huang's projections hold, more companies will benefit than just Nvidia and AMD. Any company with a strong competitive position in the related foundry, semicap equipment, server, or AI-integrated software industries should also see a benefit from this medium-term demand. 
Additionally, electricity and transmission providers should see strong growth, as AI data centers consume a huge amount of energy. While the dot-com bubble burst in the year 2000, remember that there was a five-year "boom" that preceded it. The boom coincided with the period after the Federal Reserve cut interest rates from 1995 through 1998. Looking at today's situation, the AI boom is only about two years old, and the Fed similarly just began a rate-cutting cycle in September. To this investor, it appears we may be closer to the "mid-90s" analogy than to the precipice of an enormous bubble bursting. There's also a case for the boom to go on longer than the internet boom did, as the companies investing in AI, the Mag Seven, are all extremely strong financially -- much stronger than a lot of the newer start-up tech companies of the mid-1990s. In addition, for all their success, the Mag Seven really don't trade at the crazy valuations seen in big tech in the late 1990s. That doesn't mean the AI buildout couldn't become a bubble -- it could. But it still seems early, and AI stocks are too reasonably priced for a huge fall, barring any exogenous shocks.
[4]
1 Top Artificial Intelligence (AI) Stock to Buy Right Now (Hint: It's Not Nvidia) | The Motley Fool
One key Nvidia competitor looks increasingly attractive in the current environment. When it comes to finding a top artificial intelligence (AI) stock to buy, investors may feel little motivation to look beyond Nvidia. Indeed, CEO Jensen Huang's decision to develop AI chips before Nvidia's competitors made it the dominant chip company in a lucrative and fast-growing niche. Naturally, investors want to own such a stock. However, despite Nvidia's market lead, many investors will balk at the stock's valuation. Indeed, given its stock price, Nvidia looks to have little room to exceed the market's high expectations. This situation may prompt investors to look for opportunities in other AI stocks, and one Nvidia competitor could become an excellent alternative. Investors looking for AI opportunities outside of Nvidia may want to consider one of its leading competitors, Advanced Micro Devices (AMD -4.00%). To counter Nvidia's lead in AI chips, AMD has released AI accelerators of its own. Its MI300 series of chips serves as an alternative for many customers who cannot obtain or afford Nvidia's more expensive AI chips. In fact, Oracle just chose AMD's MI300X chips to power its latest OCI Compute Supercluster instance. This win is a sign AMD has become a more formidable competitor in this market. Following up on the MI300X chips, AMD plans to release the MI325X chips late this year. These will allow up to 288GB of memory and will accommodate bandwidths of up to 6 terabytes per second. Moving forward, AMD will introduce its upgraded CDNA architecture in 2025, which should increase computational throughput. Given the company's history of catching and sometimes surpassing its competitors, these advances could be significant for AMD and the AI chip industry at large. Admittedly, AMD's current financials are unlikely to impress investors. The $11 billion of revenue for the first half of the year rose only 6% year over year.
Also, the $388 million in net income for that period shows it has only begun to recover from the $112 million in losses in the year-ago period. However, the data center segment, which houses its AI chips, earned revenue of $5.2 billion in the first half of the year, a 98% year-over-year increase. Also, the data center segment's proportion of total revenue grew from 24% in the first two quarters of 2023 to 46% in the first half of this year. This likely mimics the growth pattern of Nvidia, which now derives 88% of its revenue from the data center segment. Three years ago, the data center segment was not even Nvidia's largest, a testament to how much AI accelerators can transform a semiconductor company. Furthermore, massive revenue drops in AMD's gaming and embedded segments contributed to its tepid growth rate. Assuming AMD can return those segments to growth mode (or at least stop the declines), the company's financial picture should improve. So far, AMD's stock has lagged behind its rival's, rising by just 15% compared to Nvidia's more than 155%. Also, the recent return to profitability has arguably made the P/E ratio a poor representation of its valuation. Nonetheless, AMD stock sells for 12 times sales, compared to Nvidia's price-to-sales (P/S) ratio of 33. If AMD can close much of its competitive gap, the stock may be ready to skyrocket as AI chips become an increasingly dominant revenue stream for the company. Given this potential, the time may be now for AMD. Indeed, Nvidia's early lead in AI chips caught AMD by surprise. However, AMD has worked to compete and innovate in this field, and the revenue numbers confirm that. As with Nvidia, AMD's data center segment is becoming increasingly dominant thanks to its growing sales in AI chips. Additionally, AMD's stock price growth has been tepid this year, likely due to significant declines in the gaming and embedded segments.
Still, the market has priced Nvidia's stock for perfection, implying limited upside in the near term. In contrast, AMD has tremendous upside if it can maintain growth rates with AI chips and return these struggling segments to growth. That could set up the conditions needed for AMD's stock to outperform Nvidia, even if it has not closed the technical gap.
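The price-to-sales comparison in this piece reduces to market capitalization divided by trailing revenue. A quick illustration, with the caveat that the dollar figures below are made-up round numbers chosen only to reproduce the roughly 12x and 33x multiples the article cites, not the companies' actual market caps or revenue:

```python
def price_to_sales(market_cap: float, trailing_revenue: float) -> float:
    """P/S ratio: market capitalization divided by trailing-twelve-month revenue."""
    return market_cap / trailing_revenue

# Hypothetical figures in billions of dollars, for illustration only.
amd_ps = price_to_sales(market_cap=264.0, trailing_revenue=22.0)     # ~12x
nvda_ps = price_to_sales(market_cap=3200.0, trailing_revenue=97.0)   # ~33x
print(f"AMD P/S ~= {amd_ps:.1f}, Nvidia P/S ~= {nvda_ps:.1f}")
```

P/S is the metric the article reaches for precisely because AMD's thin, recovering profits make its P/E hard to interpret.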
[5]
The Newest Artificial Intelligence Stock Has Arrived -- and It Claims to Make Chips That Are 20x Faster Than Nvidia | The Motley Fool
Nvidia's stock has enjoyed incredible gains over the last five years. It's only natural for competitors to pop up. The artificial intelligence chipmaker Nvidia (NVDA -0.01%) has amassed close to a $3.2 trillion market cap, making it one of the world's largest chipmakers. It now makes up more than 6% of the broad benchmark S&P 500 index. Over the last five years, Nvidia has grown annual revenue by 458%, and the stock is up an incredible 2,009%. Given the potential for AI to disrupt life as we know it, it's understandable that investors are so excited about the stock. But the lure of these kinds of gains is naturally going to attract competition. Now, one of Nvidia's competitors is planning an initial public offering (IPO) and claiming to manufacture chips that can vastly outperform Nvidia's at a fraction of the price. Let's take a look. Last week, the AI chipmaker Cerebras filed its registration statement with the Securities and Exchange Commission (SEC) with the intent to go public. In a press release from 2021, Cerebras said it had a valuation of $4 billion after a $250 million Series F financing round. The company is targeting a $1 billion IPO at a $7 billion to $8 billion valuation. In its registration statement, Cerebras cites Nvidia as a competitor, as well as other large AI companies such as Advanced Micro Devices, Intel, Microsoft, and Alphabet. Here is a description of what Cerebras does: "We design processors for AI training and inference. We build AI systems to power, cool, and feed the processors data. We develop software to link these systems together into industry-leading supercomputers that are simple to use, even for the most complicated AI work, using familiar ML frameworks like PyTorch. Customers use our supercomputers to train industry-leading models. We use these supercomputers to run inference at speeds unobtainable on alternative commercial technologies." Cerebras' pitch is that bigger is better.
That's because the company has designed a chip that is the size of a full silicon wafer, and the largest ever sold. The company believes that the size advantage leads to less time moving data. Furthermore, Cerebras has a flexible business model in which clients can buy Cerebras products to have at their facilities or through a consumption-based subscription through the company's cloud infrastructure. Cerebras clearly wants investors to compare, or at least associate, the company with Nvidia. Nvidia is mentioned 12 times in the registration statement. Cerebras also provides a side-by-side comparison of its Wafer-Scale Engine-3 chip versus Nvidia's H100 graphics processing unit (GPU), which is considered the most powerful GPU on the market. Cerebras CEO Andrew Feldman publicly said the company's inference offering is 20 times faster than Nvidia's at a fraction of the price. In 2023, Cerebras generated about $78.7 million of revenue, up 220% year over year. Through the first half of 2024, Cerebras has grown revenue to $136.4 million. The company still hasn't earned a profit, having reported a nearly $67 million loss through the first half of 2024. These numbers also pale in comparison to Nvidia, which recently reported second-quarter revenue of $30 billion and a profit of roughly $16.6 billion. With big publicity from news publications and claims of being 20 times faster than Nvidia, I think it's safe to say that Cerebras already has and will continue to make a splash. Depending on the excitement investment bankers can drum up during the company's road show and market conditions, I wouldn't be surprised to see Cerebras go public at a higher valuation than expected. AI has been all the buzz and the IPO market has been flat for a few years now, so there could be pent-up demand on Wall Street. Will Cerebras overtake Nvidia? Only time will tell. Its product offerings are impressive, but it still has a ways to go to get its financial profile in line with Nvidia. 
Furthermore, there may be some advantages to Nvidia having smaller chips and it remains to be seen whether Cerebras can compete with Nvidia's software language CUDA -- although the company does say that its software program "eliminates the need for low-level programming in CUDA." While everything sounds great, there is likely still a "show me" component to this story. After all, the bulk of Cerebras' revenue comes from one customer. Nvidia also has a leading market share in the AI chip space and relationships with many large clients. Who's to say Nvidia couldn't use its size -- and likely resource -- advantage to develop a similar large wafer chip? There's a lot left to play out, but this could be one of the more interesting developments for market watchers to pay attention to.
[6]
With This IPO on the Horizon, Has Nvidia Met its Match? | The Motley Fool
Nvidia (NVDA -0.01%) isn't the only game in town when it comes to artificial intelligence (AI) chips. Customers can seek high-performance chips -- and at a lower price -- from rivals like Advanced Micro Devices or Intel, for example. But Nvidia has stood out, and claimed the lion's share of the AI chip market, thanks to its superior performance. Customers are willing to pay more to get ahead in the highly competitive AI race, and that's led to demand for Nvidia's chips surpassing supply. And all of this has translated into triple-digit increases in earnings for the company quarter after quarter and outsized share price gains. Nvidia reported a record $30 billion in revenue in the recent quarter, and the stock has soared 172% so far this year. Soon, though, another rival may launch an initial public offering (IPO), a way to raise capital and possibly set it along the path to major growth in the AI chip market. Cerebras Systems late last month filed a registration statement with the Securities and Exchange Commission (SEC) in preparation for a potential IPO. The company sells an AI processor that could rival Nvidia's current top seller, the H100, and even Nvidia's new Blackwell chip. With this IPO on the horizon, has the AI giant met its match? Let's find out. First, let's consider Nvidia's Blackwell chip, set for release this year. The Blackwell graphics processing unit (GPU) is loaded with 208 billion transistors -- the idea is the more transistors a chip has, the more impressive the speed, memory, and general performance. Customers using the latest NVLink can ensure high levels of communication among as many as 576 GPUs. Cerebras takes a completely different approach that sets it apart from other chip designers. The company bypasses the idea of linking together many GPUs to do a particular job and instead builds giant chips -- the size of an entire silicon wafer. The Cerebras Wafer-Scale Engine, or WSE-3, is the biggest chip ever commercialized. 
It's 57 times bigger than Nvidia's GPUs and has 52 times more compute cores, 880 times more on-chip memory, and 7,000 times more memory bandwidth. This impressive size allows Cerebras to keep the job on the chip, so customers don't have to link together many GPUs. As a result, they can solve problems faster and use less energy, the company says. "We believe Cerebras has built the world's fastest commercially available AI training and inference solution," the company said in its SEC filing. An analyst at Futurum Group took a deep dive into Nvidia's upcoming Blackwell chip versus WSE-3 technology and reported that Cerebras could indeed be viewed as the fastest -- thanks to top performances in memory and training, and power efficiency over time. Cerebras is seeing mind-boggling growth, with revenue skyrocketing 1,474% to more than $136 million in the first six months of this year compared with the year-earlier period. The company's gross margin slipped to 41% from 50%, but this was due to a change in the mix of hardware and services revenue -- in the prior year, the company sold more of the higher-margin services offerings. All this shows us that Cerebras' product could be a formidable rival for Nvidia and other chip designers. But does this mean Nvidia has met its match? Not necessarily. First, it's important to note that Cerebras' IPO may not happen overnight. The company hasn't yet set the number of shares to be offered or the price range and says the offer is "subject to market conditions." So we're in the early stages of the process. On top of this, Reuters reported Cerebras may postpone the IPO after facing delays in a U.S. security review of the UAE-based tech group G42's investment in the company -- the wire service cites people familiar with the matter. This is amid concern that third parties could leak U.S. technology to China -- a country not authorized to access the technology directly. 
It's also worth noting that G42 is Cerebras' biggest customer, and this reliance on one player represents some risk. Last year, G42 spent more than $65 million as a Cerebras customer, representing 83% of the company's total revenue. Finally, it's key to remember that Nvidia has built a solid position in the market, with customers investing in its complete system; this isn't something a customer drops overnight to switch to a competitor. Nvidia's focus on innovation means it's likely to launch fresh, high-performance products often enough to keep customers satisfied. In fact, it's already promised an annual update to its GPUs. Cerebras' technology sounds exciting and could progressively gain market share, especially after a potential IPO. It is definitely an AI company to watch. But I wouldn't worry about Nvidia losing its top spot anytime soon, meaning now is still a great time to buy and hold shares of this AI giant.
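The 1,474% growth figure quoted earlier is a standard year-over-year percentage change, and back-solving from it implies a prior-period revenue of roughly $8.7 million. That implied figure is inferred here for illustration, not a number quoted from the filing. A minimal sketch:

```python
def yoy_growth_pct(prior: float, current: float) -> float:
    """Percentage change versus the year-earlier figure."""
    return (current - prior) / prior * 100

# $136.4M of first-half 2024 revenue is from the article; the prior-period
# figure is back-solved from the reported 1,474% growth, for illustration.
current_m = 136.4
prior_m = current_m / (1 + 1474 / 100)
print(f"Implied year-earlier revenue: ${prior_m:.1f}M")
print(f"Check: {yoy_growth_pct(prior_m, current_m):.0f}% growth")
```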
[7]
Should You Forget Nvidia and Buy These 2 Other Artificial Intelligence (AI) Stocks Right Now? | The Motley Fool
Nvidia has been the center of the AI story for the past two years. But are these other AI winners better values today? Nvidia has been the poster child for the artificial intelligence (AI) revolution thus far, but there will be other AI winners as well. Now, Nvidia should continue to lead the AI chip market for a while. But the stock has surged 539% over the past three years, and trades at 62 times earnings. Moreover, a slew of competition is heading its way, as the majority of its large customers increasingly look to develop their own in-house accelerators. In light of Nvidia's high valuation and incoming wave of competition, the following AI-related stocks may be better values today. The artificial intelligence buildout is likely to continue strongly for the next few years. And while Nvidia has a near-monopoly on AI chips today, will that continue? A large majority of AI training happens in the cloud today, and all major cloud providers are now investing heavily in their own custom AI chips. Even large non-cloud tech companies like Meta Platforms are increasingly designing their own custom AI chips in order to lower costs. While the cloud providers and Meta each have solid chip design capabilities, they all need third-party intellectual property to make a significant portion of their custom AI ASICs (application-specific integrated circuits). This is where Broadcom (AVGO -2.27%) comes in, providing some of the key technologies that go into these custom chips, which are then produced by Taiwan Semiconductor Manufacturing. Broadcom already counts Alphabet, Meta Platforms, and TikTok owner ByteDance as customers. Moreover, it's been rumored that even generative AI model leader OpenAI has recruited Broadcom to design a custom chip for its own models. Broadcom noted its custom ASIC revenue grew more than 3.5 times year over year last quarter. And CEO Hock Tan said he'd flipped his view about the AI market being dominated by neutral merchant chips like Nvidia's.
Now, he sees custom AI ASICs being taken up at a faster pace, saying, "they [the cloud providers] will all go in that direction totally into ASIC or, as we call it, XPUs, custom silicon." Broadcom also has other strong businesses in AI networking chips for Ethernet and in VMware software, which the company bought in late 2023. These high-growth businesses are now set to make up more than 50% of Broadcom's revenue in short order, which could lead to a growth acceleration. Meanwhile, Broadcom's diverse business across many types of chips and software gives it ample opportunities to expand its empire where it sees fit. Since management has a growth-by-acquisition strategy, Broadcom has lots of options as to where it can expand its increasingly AI-focused empire. With a growing 1.2% dividend as a kicker, Broadcom looks like a solid value even after its recent run. Whether Nvidia keeps its lead or other competitors emerge in the AI chip space, next-generation AI chips and memory will all be produced by the same handful of semiconductor equipment companies. Of all these competitively advantaged, high-profit equipment stocks, Lam Research (LRCX 1.01%) looks like a solid bet right now after its recent 10-for-1 stock split and its 15% dividend increase. After surging earlier this year, Lam sold off with other chip names over the summer. Shares currently sit 29% below their highs set last July, and at just 23 times earnings estimates for the fiscal year ending in June 2025. That makes Lam one of the cheaper semicap equipment companies, despite the fact that it's primed for strong earnings growth. Lam specializes in etch and deposition machines that enable the stacking of chip transistors and other chip elements in a vertical fashion.
The NAND flash industry has long made widespread use of vertical stacking to layer NAND cells on top of each other, though that business became vastly oversupplied in the last few years. Leading-edge logic chips, like AI GPUs and CPUs, as well as DRAM memory, are only now beginning to use more vertical structures. Lam's machines help DRAM makers produce the vertical through-silicon vias (TSVs) in high-bandwidth memory (HBM) for AI applications, and the company should see increasing capital spend as a result. HBM also uses a larger die size, meaning there are fewer dies per wafer, which likely means more wafer-capacity expansions and more equipment needed for DRAM chipmakers. Meanwhile, Lam's atomic-layer deposition tools are required for gate-all-around transistors, the new type of transistor architecture all logic chipmakers are just beginning to use at the upcoming 2nm node next year. Lam expects an extra $1 billion in revenue from gate-all-around nodes this year, and strong growth above that into next year. Next year, as gate-all-around and HBM kick into high gear, Lam's NAND flash segment should also see incremental growth coming off just about the worst downturn in NAND flash history. With NAND equipment investment having nowhere to go but up and its AI-focused segments primed to take off, Lam looks to be a solid growth story for the next couple of years at least.
[8]
2 Best Artificial Intelligence Stocks to Buy in October | The Motley Fool
These companies could have more growth ahead than investors realize. The artificial intelligence (AI) market has created some monster growth stocks already, but companies involved in enabling this technological revolution are still seeing growing opportunities. Here are two AI leaders that could be profitable investments over the next year and beyond. C3.ai's (AI 3.85%) recent growth has been overshadowed by the stellar performance at Palantir Technologies, but investors shouldn't overlook C3. It recently reported its sixth consecutive quarter of accelerating revenue growth, which could set the stage for excellent returns over the next year. C3.ai continues to expand its sales in North America and Europe. In the latest fiscal quarter, it closed 71 agreements. New deals were forged with several clients, including GSK (formerly GlaxoSmithKline), Dolce & Gabbana, and the U.S. Department of Defense. It's also expanding its footprint across state and local governments, with 25 agreements across several states, including Texas, California, and Florida. All these deals are clear signs that C3.ai's momentum is real. Customers are seeing cost savings using generative AI, and its customer service could solidify long-term relationships with these clients. Despite the momentum, the stock has drifted down for most of the year. One factor hurting it is C3.ai's weak profitability. Management's guidance calls for a full-year adjusted loss from operations between $95 million and $125 million, a sizable loss relative to its revenue guidance of $370 million to $395 million. Still, the stock appears poised to rebound. The company's net loss is improving year over year, and it's reasonable to expect a profit down the road as the business continues to grow. If investors give the company credit for strong revenue growth, as they did for Palantir over a year ago, C3.ai's share price could rocket higher over the next year.
Nvidia (NVDA -0.01%) has been one of the best ways to invest in the AI boom in recent years. It is a pure play on the growing demand for AI-optimized computing hardware. With data centers still in the early stages of upgrading components for AI workloads, Nvidia is still a solid buy. Thomas Siebel, the CEO of C3.ai, made a comment on his company's last earnings call that speaks to the opportunity for Nvidia. Siebel said that it is very difficult to model the demand trends happening in the AI market right now. He said his company is seeing interest in enterprise AI from organizations it didn't anticipate, including law firms and medical diagnostic companies. Nvidia is seeing similar trends. While cloud service providers generated nearly half of its $26 billion in data center revenue last quarter, it is also experiencing strong demand from AI start-ups building generative AI applications for consumers, healthcare, education, and advertising. AI developers want to use Nvidia because it is the largest supplier of graphics processing units (GPUs), and its chips power every major cloud service. There are 4.7 million developers using Nvidia's CUDA computing platform, which provides access to software development kits and other tools designed to work with its GPUs. The stock is trading at an attractive forward price-to-earnings ratio of 34 based on next year's consensus earnings estimate. That is a steal for a company expected to grow earnings by 36% on an annualized basis. One factor holding down the valuation of the stock is that semiconductor companies can experience pauses in demand, which happened to Nvidia in 2018 and 2022. But investors who buy its stock with the intention of holding for the long term should see market-beating returns. The demand for AI technology in data centers will continue to grow over the next decade.
This should benefit Nvidia, since it controls over 70% of the AI chip market, and it has expanding revenue opportunities in data-center networking hardware and software.
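One way to frame the valuation claim above (a forward P/E of 34 against 36% expected annualized earnings growth) is the PEG ratio, a common rule-of-thumb in which a value below 1.0 suggests growth may not be fully priced in. A minimal sketch using the article's figures; the helper function is illustrative, not part of any quoted analysis:

```python
def peg_ratio(forward_pe: float, growth_rate_pct: float) -> float:
    """PEG ratio: forward P/E divided by expected annual EPS growth (in %).

    A PEG below 1.0 is a common (rough) heuristic that a stock is cheap
    relative to its expected growth. It is a screening tool, not a
    substitute for full analysis.
    """
    return forward_pe / growth_rate_pct

# Figures cited in the article for Nvidia.
nvda_peg = peg_ratio(forward_pe=34, growth_rate_pct=36)
print(f"Nvidia PEG: {nvda_peg:.2f}")  # roughly 0.94, below the 1.0 threshold
```
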
[9]
Should You Forget Nvidia and Buy These 3 Millionaire-Maker AI Stocks Instead? | The Motley Fool
AMD, Supermicro, and Microsoft might be compelling alternative AI plays. Over the past decade, Nvidia's (NVDA 1.63%) rally of approximately 29,530% would have turned a $20,000 investment into $5.9 million. That millionaire-making rally was initially driven by the market's robust demand for its gaming GPUs, but its recent growth spurt was sparked by the explosive AI market -- which drove more companies to buy its newest data center GPUs to process complex machine learning and artificial intelligence (AI) tasks. The market's demand for Nvidia's data center GPUs is still outstripping its available supply. Analysts expect the chipmaker's revenue and earnings per share (EPS) to grow at a compound annual growth rate (CAGR) of 50% and 56%, respectively, from fiscal 2024 to fiscal 2027 (which ends in January 2027). Its stock still looks reasonably valued at 36 times next year's earnings -- but it could still face more unpredictable regulatory, macro, and competitive headwinds. So while Nvidia is a great way to invest in the long-term expansion of the AI market, it might be smart for investors to look beyond the sector's bellwether and check out a few other underappreciated AI stocks. I believe these three other millionaire-making AI stocks are compelling alternatives to Nvidia right now: AMD (AMD -4.00%), Super Micro Computer (SMCI -1.73%), and Microsoft (MSFT -0.39%). In 2014, AMD's stock dropped to $3 as it failed to keep up with Intel in the CPU market and Nvidia in the GPU market. But if you had invested $20,000 in AMD back then, your investment would be worth $1.1 million today. AMD bounced back under CEO Lisa Su, who drove the chipmaker to sell more custom APUs (which merge CPUs and GPUs) for gaming consoles, design more powerful and more power-efficient chips, and shift its production to Taiwan Semiconductor Manufacturing.
It also acquired the programmable chipmaker Xilinx, rolled out more server chips, and expanded its lineup of data center GPUs for accelerating AI tasks. As AMD turned itself around, Intel struggled with persistent delays, production issues, and chip shortages. From 2023 to 2026, analysts expect AMD's revenue to grow at a CAGR of 20% as its EPS increases at a CAGR of 102%. That growth should be fueled by its market share gains against Intel in the CPU market, the stable growth of its gaming GPU business, and AI tailwinds for its cheaper data center GPUs. AMD's stock might seem a bit pricey at 50 times next year's earnings -- but its strengths could justify its premium valuation and drive its stock even higher over the next decade. Super Micro Computer, more commonly known as Supermicro, was once considered a slow-growth supplier of traditional servers. But if you had invested $20,000 in Supermicro 10 years ago, your investment would have grown to over $2.7 million this March...before shrinking back to about $400,000 today. Supermicro is an underdog in the server market, but it carved out a niche by producing high-end liquid-cooled servers. That focus made it an ideal partner for Nvidia, which granted it early access to its high-end data center GPUs to produce dedicated AI servers. It now generates over half of its revenue from its AI servers. From fiscal 2024 to fiscal 2026 (which ends in June 2026), analysts expect Supermicro's revenue and EPS to grow at a CAGR of 46% and 39%, respectively. Those are stunning growth rates for a stock that trades at just 12 times next year's earnings. However, its valuation is being compressed by a short-seller's allegations of accounting issues, a delayed 10-K filing, and rumors of a regulatory probe. Supermicro's stock could bounce back quickly if it resolves those pressing issues. If you'd invested $20,000 in Microsoft 30 years ago, your investment would be worth nearly $2.5 million today.
Much of that growth occurred after Satya Nadella took the helm as the tech giant's third CEO in 2014. Under Nadella, Microsoft transformed its desktop-based applications into cloud-based services, expanded Azure's cloud infrastructure platform, and rolled out more mobile versions of its apps for Android and iOS devices. It also expanded its Xbox business, rolled out more hardware devices, and tightly integrated OpenAI's generative AI tools into its ecosystem. That forward-thinking transformation turned Microsoft into a growth stock again. From fiscal 2024 to fiscal 2027 (which ends in June 2027), analysts expect its revenue and EPS to grow at CAGRs of 14% and 15%, respectively, as its cloud, mobile, AI, and gaming businesses continue to flourish. Its stock still looks reasonably valued at 27 times next year's earnings, and it could be a much more balanced play on the AI market than Nvidia.
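The lump-sum figures quoted throughout this article follow from simple compounding arithmetic. As a back-of-the-envelope check using the article's Nvidia numbers (the roughly 77% annualized rate is derived here, not stated in the article):

```python
def final_value(initial: float, total_return_pct: float) -> float:
    """Value of a lump sum after a cumulative percentage return."""
    return initial * (1 + total_return_pct / 100)

def implied_cagr(initial: float, final: float, years: int) -> float:
    """Compound annual growth rate implied by a start and end value."""
    return (final / initial) ** (1 / years) - 1

# Nvidia: a ~29,530% gain over 10 years on a $20,000 stake.
nvda_final = final_value(20_000, 29_530)
print(f"${nvda_final:,.0f}")  # about $5.9 million, matching the article

# The annualized rate that compounding implies over the decade.
print(f"{implied_cagr(20_000, nvda_final, 10):.1%}")  # roughly 77% per year
```
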
[10]
Should You Forget Nvidia and Buy These 2 Tech Stocks Instead? | The Motley Fool
If you're worried about investing in Nvidia at its current valuation, you may want to take a closer look at these two other names. Nvidia (NVDA 1.63%) seems well on its way to ending another year with stunning gains. Shares of the semiconductor giant have shot up more than 150% so far in 2024 after a stellar performance last year, and there is a good chance that it can continue to fly higher. After all, Nvidia management points out that the demand for its artificial intelligence (AI) chips remains solid, with its upcoming Blackwell processors experiencing stronger demand than supply going into 2025. So, it won't be surprising to see Nvidia delivering phenomenal growth in revenue and earnings next year as well, and that could lead to more stock market upside. However, certain investors may be looking for alternatives to capitalize on the AI boom because of Nvidia's valuation. The stock's earnings multiple of 59 is well ahead of the Nasdaq-100 index's earnings multiple of 32. Though Nvidia's healthy revenue and earnings growth can help justify its valuation, investors with a lower risk appetite may want to look at cheaper options. Here are two names that look like ideal alternatives to Nvidia stock. Micron Technology (MU 3.92%) is a manufacturer of memory chips and counts the likes of Nvidia among its customers. As the demand for Nvidia's AI graphics processing units (GPUs) has boomed, Micron has also witnessed robust growth in sales of its high-bandwidth memory (HBM) chips that are used in these GPUs. This is why Micron Technology's revenue for the fourth quarter of fiscal 2024 (which ended Aug. 29) increased a whopping 93% year over year to $7.75 billion. The chipmaker also posted a non-GAAP (adjusted) profit of $1.18 per share as compared to a loss of $1.07 per share in the same quarter last year. More importantly, Micron expects revenue of $8.7 billion for the current quarter, which would be a jump of 84% from the same period last year.
For comparison, Nvidia expects 80% year-over-year revenue growth in the current quarter. Of course, Nvidia has a much larger revenue base than Micron, as its revenue in the previous quarter shot up 122% year over year to $30 billion, but investors should note that they can buy Micron at a much cheaper valuation. Additionally, Micron's AI-related opportunity is more than just memory used in data centers. The company is also on track to benefit from the increasing integration of generative AI features in smartphones and personal computers. On its latest earnings conference call, Micron management pointed out that AI-enabled smartphones are carrying 12 gigabytes (GB) to 16GB of dynamic random access memory (DRAM), compared to the 8GB of DRAM available in flagship smartphones last year. On the same call, Micron CEO Sanjay Mehrotra cited a similar development in the PC market: "As an example, leading PC [original equipment manufacturers] have recently announced AI-enabled PCs with a minimum of 16GB of DRAM for the value segment and between 32GB to 64GB for the mid and premium segments, versus an average content across all PCs of around 12GB last year." Both of these markets are on track to see huge growth in shipments because of generative AI. In smartphones, generative AI-enabled devices are expected to clock an annual growth rate of 78% through 2028, according to IDC. Meanwhile, shipments of AI PCs are forecast to grow at an annual pace of 44% between 2024 and 2028, according to Canalys. And finally, Micron estimates that the size of the HBM market could jump from $4 billion in 2023 to $25 billion in 2025. In all, Micron has multiple lucrative growth drivers thanks to AI, which explains why its bottom line is forecast to take off remarkably from fiscal 2024's reading of $1.30 per share over the next couple of years.
As such, Micron looks like a solid bet for investors who want to make the most of the growth in the AI semiconductor market but are wary of buying Nvidia right now because of its expensive valuation. Taiwan Semiconductor Manufacturing (TSM -0.73%), popularly known as TSMC, is the world's largest chip foundry, manufacturing chips for major chipmakers and consumer electronics companies, including Nvidia. In fact, TSMC has played a central role in Nvidia's success in the AI chip market thanks to its advanced process nodes, which have allowed Nvidia to produce fast and power-efficient chips. For instance, Nvidia's A100 GPU, which was used for training ChatGPT, was fabricated using TSMC's 7-nanometer (nm) process. That was followed by the hugely popular H100 processor, manufactured using a more advanced 4nm process from TSMC. It is worth noting that the demand for TSMC's process nodes is so strong that its advanced chip packaging capacity is booked out for 2025, thanks to Nvidia and AMD, which will be using TSMC's lines to manufacture AI processors. Not surprisingly, TSMC is now working to expand its output. The company is expected to end 2024 with an advanced chip packaging capacity of 45,000 to 50,000 units per month, which would be a significant increase over its 2023 monthly packaging capacity of 15,000 units. This should allow TSMC to fulfill more orders from AI chipmakers and other key clients such as Apple, and that's probably why there has been a nice increase in TSMC's earnings projections for the next couple of years. Analyst estimates indicate that TSMC's earnings are on track to increase by more than 20% in both 2025 and 2026, following this year's projected jump of 27% from last year's reading of $5.19 per share. Given that TSMC stock is now trading at 22 times forward earnings (a big discount to the U.S.
tech sector's average earnings multiple of 45), investors are getting a good deal on this AI stock considering the potential upside it may be able to deliver. Assuming TSMC's earnings indeed hit $10.35 per share in 2026, and it trades at 30 times forward earnings at that time (in line with the Nasdaq-100 index's forward earnings multiple), the chip giant's stock price could hit $310. That would be a 72% increase from current levels, giving investors a solid reason to buy this stock while it is trading at an attractive valuation.
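The price-target math above is easy to reproduce. A minimal sketch, using the EPS and multiple cited in the article; the current share price of about $180 is an assumption inferred from the article's 72% upside figure, not a number the article states directly:

```python
# Inputs cited in the article.
eps_2026 = 10.35       # assumed 2026 earnings per share
exit_multiple = 30     # in line with the Nasdaq-100's forward multiple

# Assumption (hypothetical): price level implied by the article's 72% upside.
current_price = 180.0

# Target price = projected EPS times the assumed exit multiple.
target_price = eps_2026 * exit_multiple
upside = target_price / current_price - 1
print(f"Target: ${target_price:.2f}")  # about $310, matching the article
print(f"Upside: {upside:.3f}")         # roughly 0.72, i.e. ~72%
```
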
[11]
Is Broadcom Stock the Next Nvidia?
With its shares up by over 530% in the last five years, Broadcom (AVGO -2.27%) has been a solid pick for long-term investors. But past performance doesn't guarantee future returns. And investors will be wondering if the next half decade will be as kind as the previous one. Let's dig deeper into how this semiconductor company compares to the market leader, Nvidia, in the artificial intelligence (AI) market. An alternative to Nvidia? Since the launch of OpenAI's ChatGPT in late 2022, many tech companies have enjoyed a surge in demand for the data-center equipment needed to run and train large language models (LLMs). Nvidia has been the top beneficiary of this trend. But other companies like Broadcom have also boomed even though they serve different sides of the industry. While Nvidia's bread and butter comes from general-purpose AI chips like its flagship B100, designed to maximize speed and processing power, Broadcom focuses on application-specific integrated circuits (ASICs). These custom chips are tailor-made for a client's specific workload, making them more affordable and efficient. Broadcom makes ASICs for major tech companies like Alphabet and Meta Platforms. And this niche allows it to hold its own in an AI hardware industry otherwise dominated by Nvidia. Broadcom also provides software solutions to help clients build their own private cloud-computing networks. Business momentum is respectable but not fantastic Broadcom's growth is respectable although not mind-blowing. In the second quarter, revenue jumped 43% year over year to $12.5 billion on growing demand for its hardware and infrastructure software solutions. However, this includes sales from VMware, a software company Broadcom acquired for $69 billion in late 2023. Excluding the VMware acquisition, Broadcom's annual growth rate falls to just 12%.
For comparison, Nvidia's Q2 sales jumped 122% year over year to $30 billion, driven primarily by organic growth in its data-center chip business, which supplies its graphics processing units (GPUs) to AI clients. Gross margins are another significant difference between the two. While Nvidia keeps 75% of its sales before operating expenses, Broadcom keeps just 62%. To make matters worse, Nvidia also wants a slice of Broadcom's ASICs business. In February, it announced plans to create a new business unit focused on designing custom chips for cloud-computing companies. And if this new segment is successful, it will present even more competition for Broadcom's AI hardware business, possibly sending margins even lower. Still, there are some reasons why Broadcom might deserve a premium over its faster-growing cousin. For starters, Broadcom is much more diversified. While Nvidia gets almost all of its growth from its high-end AI GPUs, Broadcom has its fingers in many different pies, including software, routers, and internet-connectivity devices. Broadcom is a mass-market chipmaker; its sales come from many sectors of the economy, making it very resistant to a potential downturn in any specific one. The company's diversification is an asset as more analysts begin to fear that generative AI technology might not live up to the hype. With all this in mind, the stock could have a place in a diversified portfolio, but investors may also want to sit on the sidelines to see if growth eventually picks up.
As the AI chip market continues to grow, Nvidia faces increasing competition from AMD and newcomer Cerebras, while potential export restrictions loom.
Nvidia, the current leader in AI chip manufacturing, is facing increasing competition in the rapidly expanding artificial intelligence (AI) market. The company's near-monopoly position, with an estimated 88% of the GPU market [1], is being challenged by both established players and newcomers.
Advanced Micro Devices (AMD) has emerged as a formidable competitor to Nvidia. AMD's CEO, Lisa Su, recently increased projections for the AI accelerator market, forecasting it to reach $500 billion by 2028 [3]. The company's data center segment, which includes AI chips, saw a 98% year-over-year increase in revenue for the first half of the year [4].
Cerebras, a startup preparing for an IPO, claims to manufacture chips that are 20 times faster than Nvidia's at a fraction of the price [5]. The company's unique selling point is its Wafer-Scale Engine-3 chip, which is the size of a full silicon wafer. Cerebras is targeting a $1 billion IPO at a $7 billion to $8 billion valuation.
Adding to the competitive landscape, the U.S. government is considering new curbs on chip exports. These restrictions could limit sales of advanced AI processors from companies like Nvidia and AMD to certain countries, citing national security concerns [1].
Despite the challenges, demand for AI chips remains robust. Nvidia's next-generation Blackwell chips are reportedly sold out for the next 12 months [3]. However, investor sentiment has been cautious, with AI stocks experiencing a pullback over the summer due to concerns about the sustainability of the AI boom [3].
Major technology companies, including Microsoft, Alphabet, Tesla, Amazon, and Meta Platforms, are investing in their own custom chip designs [2]. This trend could potentially weaken Nvidia's pricing power in the long run.
While some skeptics, like hedge fund Elliott Management, question the long-term viability of AI applications, industry leaders remain optimistic. Oracle Chairman Larry Ellison believes the race for AI supremacy will continue for 5 to 10 years [3], suggesting sustained growth in the AI chip market.