Two High-Impact Sectors Ripe for AI Disruption

BY ANDY SWAN, FOUNDER, LIKEFOLIO

When investors – or Main Street consumers, for that matter – think of artificial intelligence (AI) titans, there’s one name that comes to mind immediately: Nvidia (NVDA). Nvidia controls the lion’s share of the market, to be fair. CEO Jensen Huang has even become a bit of a household name in America – a celebrity exec akin to Meta Platforms’ (META) Mark Zuckerberg.

But that singular focus is what allows great stocks like the ones we’ll feature today to fly under the radar… until their share prices suddenly head for the stratosphere.

It was the AI industry’s scrappy underdog, Advanced Micro Devices (AMD), that caught our attention back in June when the stock was cooling… but Main Street demand was racing higher. It was AMD that we named the “top AI hardware pick for the second half of 2025” for MegaTrends members. And it was AMD that took off like a rocket this week after confirming a multi-year supply deal with OpenAI… racing 40% higher in a matter of days to become one of the fastest movers in our MegaTrends model portfolio this year.

Turns out, second place to NVDA isn’t such a bad place to be for AMD. Not when you’re still riding the wave of the semiconductor and AI mega trends.

And as usual, we were there first. Because while most investors are stuck relying on the theoretical musings of Wall Street analysts, we’re tracking consumer behavior and sentiment in real time. That’s how we spotted the opportunity in AMD before the rest of the market caught on. And it’s how we’ll keep you ahead of the next wave of AI winners today…

Recommended Link

Futurist Eric Fry says Amazon, Tesla, and Nvidia are all on the verge of major disruption. To help protect anyone with money invested in them, he’s sharing three exciting stocks to replace them with. He gives away the names and tickers completely free in his brand-new “Sell This, Buy That” broadcast.
Click to stream now…

This Is Where AMD Bests NVDA

Nvidia is no longer the only source for industry-leading AI chips.

On Monday, AMD announced a massive multi-year deal with OpenAI that blew open the AI trade. The agreement covers up to 6 gigawatts of AMD’s GPUs (graphics processing units), the chips that power computers and AI servers. For AMD, the deal could translate to more than $100 billion in revenue. It also reinforced what we’ve been tracking for months: Enterprise demand for AI compute continues to expand.

Source: X

Running AI models like ChatGPT at scale takes vast amounts of computing power for training. That’s where Nvidia dominates. But training a model is a one-off effort. Inference, on the other hand, is the ongoing cost of serving millions of queries in real time.

Each time you ask ChatGPT a question, it draws on hundreds of millions of individual data points. It uses that data to draw conclusions or make decisions with human-like reasoning. And it delivers a tailored answer in mere moments.

Translation: Training ends. Inference never does. It’s where the next major phase of growth lies. And it’s where AMD has the edge.

AMD’s inference hardware delivers higher performance and lower cost than Nvidia’s best chips:

MI300X Accelerator Performance Compared to Competitors (Source: AMD)

Its MI300X accelerators allow very large models to run without data spilling in and out of memory storage. That increases efficiency and cuts costs for enterprise customers like OpenAI. So we weren’t surprised the two struck a deal. As we told MegaTrends members in June:

“AMD is delivering higher performance, lower cost, and full-rack deployment in the part of AI infrastructure where spending is ramping fastest.
If it captures even a modest share of the next $500 billion wave, the stock won’t stay underappreciated for long.”

With a scorching Social Heat Score of +86.7 (out of 100), our data suggests the AMD profits are far from tapped:

But there’s more to this trend than AI chips, inference hardware, or even energy demand. Because the same expansion creating AI chip demand for Nvidia and AMD is also tightening supply in the layers beneath it – from memory to storage. Every link in the AI chain is under strain. Each link is also an opportunity.

As new data centers come online and AI model sizes multiply, we’re now looking to a second wave of AI suppliers. These under-the-radar names provide the high-bandwidth memory, storage capacity, and controller technology that keep AI servers running. And they could deliver the next wave of investor profits.

The Broadening AI Trade: Memory and Storage

Every GPU depends on lightning-fast access to data. That dependency is turning memory and storage into the next major investment theme inside the AI buildout.

AI chips need high-bandwidth memory (HBM) – a stacked form of DRAM – to operate fast enough. More broadly, AI servers rely on two types of memory chips:

- DRAM (Dynamic Random Access Memory), which is fast but volatile – meaning it needs constant power to retain data
- NAND flash, which is slower but non-volatile – it retains data even if power is cut
These memory units are in high demand – and tight supply. Contract DRAM and NAND pricing has surged 15-20% this year alongside the AI buildout. Industry watchers expect AI-related memory demand to grow 30% a year through 2030, creating a massive secondary opportunity for memory suppliers.

Here are the names to know:

Micron Technology (MU)

Micron is the only U.S.-based high-bandwidth memory producer. So naturally, it tops the list. This company serves America’s leading tech titans – from Nvidia to IBM (IBM).

High-bandwidth memory is so scarce that Micron sold out its entire 2024 supply and most of 2025’s well in advance. Micron is now ramping production of its next-generation HBM4 memory units, which are designed for AI use. It’s also investing $200 billion to accelerate DRAM manufacturing domestically.

SK Hynix

While not traded on a U.S. stock exchange, South Korea-based SK Hynix is one of the largest suppliers of high-bandwidth memory in the world. It feeds data directly to Nvidia’s H100 and AMD’s MI300 accelerators, with its latest HBM3E chips moving over 1.2 terabytes of data per second. Keep this name on your radar as well.

Western Digital (WDC)

As AI models grow, data-center customers are buying more drives to store training datasets, inference logs, and retraining archives. As a major player in enterprise solid-state drives (SSDs) and hard-disk drives (HDDs), Western Digital is one to watch. Research firm IDC expects HDDs to account for 80% of data center storage capacity through 2028, maintaining WDC’s position in the years to come.

What Comes After the Memory Boom?

The setup for memory and storage remains strong. Demand is rising, pricing power has returned, and producers are exercising more discipline than in past cycles. MU and WDC are reaping the profits, gaining 82% and 148% over the last year, respectively:

But we’re also considering how the balance could evolve over time.
For example, Micron, SK Hynix, and Samsung have new semiconductor manufacturing plants coming online in 2026 and 2027. Once production ramps, pricing could begin to stabilize. In addition, the amount of memory needed to train or run AI systems will decline as companies create more efficient models.

Meaning: Demand could ease in the long run.

This is where sharp investors should be looking to the next layer of opportunities: the companies that build the infrastructure supporting both computational power and memory.

The Next Opportunity: Packaging, Cooling, and Connection

As AI chips grow larger and hotter, new bottlenecks are forming around the systems that connect, cool, and power them. These companies rarely make headlines, but they are capturing steady orders from hyperscalers and expanding capacity worldwide.

Vertiv (VRT)

Large AI data centers require power systems, cooling equipment, and racks like Vertiv’s before chips can even be switched on. By keeping servers from overheating or shutting down, Vertiv enables more compute capacity to come online safely.

Arista Networks (ANET)

Arista Networks provides the data highways that connect racks of GPUs into a functioning supercomputer. Meta and Microsoft use Arista’s specialized Ethernet switches to carry AI workloads – making it a key player in the large-scale AI buildout.

Modine (MOD)

Modine designs large-scale cooling systems built to handle the extreme heat of AI servers. As data-center power density grows, Modine’s systems will become increasingly important.

These companies build the systems that make GPU and memory performance possible. Yet compared to memory players like Micron and WDC, their stocks are relatively undiscovered – signaling room for upside:

The Bottom Line

AMD remains a compelling play in AI infrastructure. MegaTrends members have already logged an 80% profit in less than four months – and look forward to further upside from here.
The AI buildout continues to accelerate, and we believe global infrastructure spending on AI remains in its early stages.

Source: McKinsey & Company

The next opportunity for investors sits beside that strength, in the companies supplying the hardware backbone that keeps these systems online. Memory and storage suppliers such as Micron and Western Digital are well positioned for this phase of the cycle. Looking further ahead, the builders of the physical systems behind them – such as Vertiv, Modine, and Arista – could define the next stage of AI hardware growth. With these names on your radar today, you’ll be well ahead of the crowd.

AI compute drives growth. Memory supports it. Infrastructure sustains it.

At TradeSmith, we’re going all-in on AI to supercharge our followers’ trading results. The latest release, coming this Wednesday, Oct. 15: an AI-powered strategy that could’ve made an average annual gain of 374% over the last five years. Find out how you can use this new “Super AI” strategy to potentially quadruple your money over the next 12 months. Reserve a spot for the big reveal on Oct. 15 at 10:00 a.m. ET here.

Until next time,
Andy Swan
Founder, LikeFolio