Netflix shares are up, but some analysts are concerned about the streamer

This story originally appeared in The Technology Letter and is republished here with permission.

Netflix shares are rising nine percent early Friday after the company Thursday evening beat expectations on revenue and profit, forecast 2025 revenue about in line with consensus, and said operating profit will continue to expand.

This was a turnaround from the last report, on July 18th, when the shares sold off slightly because the revenue outlook at the time came up short. (I would note that Netflix shares rose over seven percent from then to now, better than the Nasdaq over the same stretch, so buying the stock after the selloff was a good tactic.)

It’s worth asking, however, whether the company’s trends have in some sense stalled. At least one observer is asking that.

I’ll come back to that, but first, investors this morning seem to be focusing on better-than-expected subscriber additions: just over five million versus roughly four and a half million expected, for a total of 282.7 million paid memberships.

The profit outlook is the star, to my mind. The company’s operating profit margin of 29.6% was well above the 28.1% the company had forecast back in July.

The company raised its outlook for this year’s operating profit margin to 27% from what had been a 26% outlook, and for next year it sees that rising to 28%. Not too shabby, considering the company ended 2023 with a 21% margin. Profitability has been one of the pleasant surprises at Netflix in the last several years, given what we all thought, at one time, was going to be a perennial money-loser.

And yet, the forecast for revenue growth in 2025, in a range of eleven to thirteen percent, seems to some a little “weak.” One thing that caught my eye is that advertising is still a weak spot, meaning the lower-priced, ad-supported tier the company rolled out in late 2022.

The company expects that in 2025, ads still won’t be a substantial money-maker. The issue at the moment is that ad-supported viewing is growing faster than the company can sign advertisers. As Netflix puts it in the investor letter, “the near term challenge (and medium term opportunity) is that we’re scaling faster than our ability to monetize our growing ad inventory.”

Why are they lagging in their ad sales, I wonder? Are they having a hard time finding buyers for some of that video inventory? Robert Fishman of the boutique research house MoffettNathanson notes that the company has had “measurement and targeting capabilities below industry standards.” He’s hopeful recent partnerships with The Trade Desk and Google will fix that.

Fishman actually has a different bone to pick: engagement, the time spent watching. He thinks it has stalled. Fishman asks why the company isn’t talking about raising prices. “While it is likely that the company still has room to grow here, stalled total time viewed per subscriber may imply stalled pricing power growth as well,” he writes. Netflix is vague in its letter on the topic of engagement, saying simply that it is “healthy: around two hours per day per paid membership on average.” The company says view time was up among households, without specifying by how much.

I don’t have any particular insight about view time, but I suspect investors are going to grumble about why the company is not raising prices.

Fishman notes the stock is the most expensive among large-caps, at twenty-eight times projected free cash flow, well above multiples of twenty or twenty-five times for other large-caps such as Amazon and Meta. “Netflix’s stock is massively expensive for a company whose own guidance implies a revenue deceleration into 2025,” he writes.

OFF TO THE CLOUD WE GO

Several analysts this week were in San Jose, California, at the annual Open Compute Project conference and trade show, which features deep dives into highly technical topics in server computing for the cloud, things such as “liquid cooling,” as well as attendant technologies such as fiber optics.

Several of those observers filed reports on what they saw.

Attendance was over seven thousand people this year, notes Hans Mosesmann of Rosenblatt Securities, way up from forty-four hundred last year. He attributes that to the fact that “Data center AI challenges have generally become more acute in dealing with compute, interconnect/networking, memory scaling, liquid cooling, chiplets, and compatibility.”

Mosesmann shares that many people at the show were talking about “breaking down the AI memory wall,” meaning how to connect AI computing to massive amounts of computer memory. (See the memory-chip report in March.)

“AI models are growing exponentially, which are moving the primary performance limitations from processing power to memory and interconnect bandwidth,” writes Mosesmann.

“Solutions include accelerated transition to next generation DDR5 modules [a version of DRAM], introduction of MRDIMMS (Multiplexed Rank Dual Inline Memory Modules), CXL-based memory tiering and photonic interconnects.”
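To make that “memory wall” talk concrete, here is a rough back-of-envelope sketch of my own (not from Mosesmann’s note; the peak-compute and memory-bandwidth figures are assumed round numbers, not any particular chip’s specs) showing how, once a workload reuses each byte it fetches only a few times, bandwidth rather than raw compute sets the ceiling:

```python
# Back-of-envelope "memory wall" sketch. All figures are illustrative round
# numbers, not the specifications of any real accelerator.

PEAK_COMPUTE_TFLOPS = 1000.0   # assumed peak compute, in teraFLOPs
MEM_BANDWIDTH_TBPS = 3.0       # assumed memory bandwidth, in terabytes/sec

def attainable_tflops(flops_per_byte: float) -> float:
    """Simple roofline bound: throughput is capped by the lower of raw
    compute and (arithmetic intensity x memory bandwidth)."""
    return min(PEAK_COMPUTE_TFLOPS, flops_per_byte * MEM_BANDWIDTH_TBPS)

# Memory-bound work such as big-model token generation reuses each weight
# fetched from memory only a handful of times (low FLOPs per byte).
for intensity in [2, 10, 50, 200, 500]:
    t = attainable_tflops(intensity)
    pct = 100 * t / PEAK_COMPUTE_TFLOPS
    print(f"{intensity:>4} FLOPs/byte -> {t:7.1f} TFLOPs ({pct:5.1f}% of peak)")

# With these assumed numbers, the chip only reaches its compute peak above
# roughly 333 FLOPs per byte; below that, memory and interconnect bandwidth
# rule, which is the "memory wall" the show floor was chewing on.
```

The fixes Mosesmann lists all attack the same denominator: MRDIMMs, CXL memory tiering, and photonic interconnects are different ways of raising the effective bytes per second the accelerators can pull in, so that the compute actually gets used.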

Mosesmann also relates that chatter on the show floor is that Broadcom is looking to compete with Nvidia in AI chips. “Word on the floor is that Broadcom is working on an XPU or merchant accelerator,” he writes, “which would make things interesting for Nvidia, AMD, and maybe Intel (Gaudi datapoints mixed).”

That makes sense: Broadcom already helps Google and others make custom chips (“ASICs”) for AI in the cloud, so why shouldn’t it try to parlay that into a broader AI-chip business?

George Notter of Jefferies & Co., who follows Arista Networks, notes that Arista shares were weak when Meta Platforms unveiled some home-grown networking switches. The stock is down three percent this week.

“We assume that many investors thought that the Meta switches were competitive alternatives to Arista,” he writes. In fact, “Based on our conversations at the trade show, we believe that Arista’s position in Meta remains very secure.”

Arista’s own switch, unveiled at the show, “is huge—it supports 102TB of capacity,” Notter observes. “We presume it’s based on Broadcom’s next-generation silicon (not yet announced by Broadcom). Larger and larger switches are critical to Cloud Providers’ need to scale to larger and larger GPU clusters.”

Notter notes that the market for fiber-optic “transceivers” for those cloud networks is surging. “At the conference, we spoke with one major transceiver supplier who noted that they’re sold out through 2025. They see that trend going into 2026,” he writes. “The biggest issue is the supply of laser/datacom chips.” Good news for Coherent and other laser makers, I should think.

Wedbush hardware analyst Matt Bryson offered an interesting tidbit from a panel featuring Arista’s co-founder and “chief architect,” Andy Bechtolsheim. Bechtolsheim, writes Bryson, emphasized what he sees as “a seminal issue being created by the rise of accelerators,” meaning, Nvidia GPUs and the like for AI.

For “NVDA et al. to maintain the current growth rate in processing power, either chip sizes need to increase dramatically (until they are limited by reaching waferscale) or communication speeds have to increase substantially to allow more efficient clustering (e.g shifting to optics),” as Bryson explains it. “But in facilitating the latter, communications companies have to figure out a means of not creating another energy consumption problem.”
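To put a rough number on that energy worry, here is an illustrative calculation of my own; every figure in it, the cluster size, the per-chip bandwidth, and the picojoules-per-bit energy costs, is an assumption made for the sake of the sketch, not a number from the panel:

```python
# Illustrative sketch of the point Bryson relays: if per-chip gains stall,
# clusters need far more interconnect bandwidth, and the energy spent moving
# bits starts to dominate. The pJ/bit figures are assumed, not measured.

NUM_ACCELERATORS = 100_000      # assumed cluster size
BW_PER_ACCEL_TBPS = 10.0        # assumed off-chip bandwidth per chip, terabits/sec

def interconnect_megawatts(picojoules_per_bit: float) -> float:
    """Power = total bits moved per second x energy per bit."""
    total_bits_per_sec = NUM_ACCELERATORS * BW_PER_ACCEL_TBPS * 1e12
    watts = total_bits_per_sec * picojoules_per_bit * 1e-12
    return watts / 1e6

for label, pj_per_bit in [("copper links (assumed)", 10.0),
                          ("pluggable optics (assumed)", 15.0),
                          ("co-packaged optics (assumed)", 3.0)]:
    print(f"{label:>28}: {interconnect_megawatts(pj_per_bit):5.1f} MW just to move bits")
```

Even with forgiving assumptions, shuttling bits around a cluster of that size runs to megawatts, which is why the interest in lower-energy optics is as much about the power bill as about raw speed.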

That makes fiber optics even more crucial, Bryson concludes. That’s good not just for the likes of Coherent, but also, suggests Bryson, for startups pursuing optical computing, such as Lightmatter, based in Mountain View, in Silicon Valley. The company just got another $400 million in venture financing this week, for a total haul, over eight rounds in seven years, of $822 million. The company’s now worth about four and a half billion dollars.

My takeaway from all of this is that the AI infrastructure business continues to be white-hot, even if the AI payoff is still not clear to many.
