Dona Ferentes
beware the aedes of marsh
... and then ... with the rise and rise of Nvidia, how are investors going to piggyback ...
There are many AI-associated stocks out there which have the global stock pickers in a frenzy. Thanks @Dona Ferentes for posting John Davidson's summary of Alex Pollak's take. I just thought I'd post my experience with semiconductor stocks, with some background ... and then ...
look, I don't even know where to start. Missed the Nvidia bus, maybe there's another one along soon.
Opinion
Alex Pollak is already investing in ‘the very next’ Nvidia
A shift in where AI queries are being handled has opened up the investment field to more chipmakers, and to apps we haven’t even dreamed of yet.
John Davidson Columnist
29 May 2024
It can’t be a coincidence that, when asked at The Australian Financial Review AI Summit on Tuesday what artificial intelligence-related stocks the fund management firm Loftus Peak was investing in, its investment chief Alex Pollak immediately mentioned US chipmaker Qualcomm.
On stage, Pollak had just finished talking about what he sees as the “next thing” in AI – AI PCs and AI phones – and arguing that the chip systems made by Qualcomm, together with similar systems made by Apple, Google and Samsung, are now beginning to power that next thing.
Just how much that inclusion of Qualcomm, Apple, Google and Samsung in the new AI ecosystem will affect Nvidia, the current AI market darling, remains to be seen.
Over the past six months Pollak has halved Loftus Peak’s holding in Nvidia, expecting he’ll find better growth opportunities elsewhere. Even so, Nvidia climbed more than 7 per cent this week, on news that another big AI player, Elon Musk’s xAI, had been funded to build AI supercomputers using Nvidia chips.
But more AI data centres running yet more Nvidia chips doesn’t seem to be the race Pollak is interested in any more.
“The very next thing we are going to see – and the gate for the horses to run this race opened a few days ago – is the AI PC and the AI smartphone,” he said.
Pollak was referring to the Copilot+ PCs announced by Microsoft last week, which the company is describing as a “new era” in computers. They are AI-centric laptops that, for the moment at least, run exclusively on Qualcomm chips.
Later this year they will also use chipsets from chipmakers such as Intel and AMD, but the first generation of Copilot+ PCs will all be Qualcomm-based when they go on sale in June.
Meanwhile, AI smartphones, which began to appear late last year when Google launched its Pixel 8 and accelerated early this year with Samsung launching its Galaxy S24, should reach a crescendo next month when Apple is expected to announce AI changes to its iPhones. These types of devices either use Qualcomm chips or, in the case of those three particular companies, run chips that are in the same family as Qualcomm chips.
But to understand what all this has to do with Nvidia, one first needs to understand the difference between the two big consumers of processor power in the AI era: training and inference.
Training an AI model, which in the case of the generative AI models being built by companies such as Google, OpenAI, Microsoft and Meta, involves hoovering up all the world’s data and looking for statistical relationships between things, is enormously compute intensive, and has seen customers lining up to fill their data centres with powerful Nvidia systems.
But inference, which involves taking a model and getting it to do something useful like writing an email for you, doesn’t require nearly as much compute. Inference, too, has typically been run centrally in huge data centres powered by Nvidia (or similar) chips, but that’s starting to change.
What the AI phones from Google, Samsung and (soon) Apple have in common – and also Microsoft’s Copilot+ PC – is that they all do AI inferencing locally, on low-powered chips inside the device rather than on high-powered chips in the cloud.
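To make that distinction concrete, here is a minimal sketch of what "inferencing locally" can look like in practice, using the onnxruntime Python package. The model file name and input shape are placeholders, and the QNNExecutionProvider (ONNX Runtime's Qualcomm NPU backend) is only selected if it is actually available on the machine:

```python
# Minimal sketch of local ("on-device") inference with ONNX Runtime.
# "small_model.onnx" and the input shape are placeholders.
import numpy as np
import onnxruntime as ort

# Prefer the Qualcomm NPU backend (QNN) when present, else fall back to CPU.
preferred = ["QNNExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in ort.get_available_providers()]

session = ort.InferenceSession("small_model.onnx", providers=providers)

input_name = session.get_inputs()[0].name
x = np.random.rand(1, 128).astype(np.float32)  # placeholder input tensor

# The same call runs on the NPU or the CPU; the app code doesn't change.
outputs = session.run(None, {input_name: x})
print(outputs[0].shape)
```

The point is the provider list: the identical model call lands on an NPU when one is present and silently falls back to the CPU otherwise, which is what lets inference migrate out of the data centre without rewriting the application.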
Neural processing units
Training, by and large, is staying in the cloud, but inferencing is spreading out to the edge devices.
To qualify for the Copilot+ PC branding, for instance, laptops need to have an inferencing chip known as a neural processing unit on them, capable of running 40 trillion operations per second, or 40 TOPS.
The Qualcomm Snapdragon X Elite chipset on the first generation of Copilot+ PCs will actually be capable of more than that – 45 TOPS.
That’s not a lot of compute power compared to the 800 TOPS Nvidia’s laptop GPUs are capable of, but Microsoft is betting it’s enough for AI inferencing, even if it’s not enough for AI training.
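As a rough sanity check on those numbers, a back-of-envelope calculation suggests why 45 TOPS can plausibly be "enough". The model size and ops-per-token figures below are illustrative assumptions, not benchmarks:

```python
# Back-of-envelope: what a TOPS budget buys for on-device inference.
# Model figures are illustrative assumptions, not benchmarks.
npu_tops = 45    # Snapdragon X Elite NPU (per the article)
gpu_tops = 800   # Nvidia laptop GPU (per the article)

# Rule of thumb: generating one token with an N-parameter model costs
# roughly 2 * N operations. Assume a small 3-billion-parameter model.
ops_per_token = 2 * 3e9

tokens_per_sec_npu = npu_tops * 1e12 / ops_per_token
tokens_per_sec_gpu = gpu_tops * 1e12 / ops_per_token

print(f"NPU peak: ~{tokens_per_sec_npu:,.0f} tokens/s")  # ~7,500
print(f"GPU peak: ~{tokens_per_sec_gpu:,.0f} tokens/s")  # ~133,333
# Real throughput is far lower (memory bandwidth, utilisation), but even
# a small fraction of 7,500 tokens/s is plenty for a chat assistant.
```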
Indeed, to help inferencing run more effectively on consumer devices like AI PCs and AI phones, Microsoft, Google and others are training new, lightweight versions of their models that run fast on low-powered NPUs, but still have enough accuracy to satisfy consumers.
Microsoft’s Copilot+ PCs are going to have 40 different models of differing sizes, and similarly Google has multiple sizes for its Gemini model, some of which will be small enough to have their inferencing done “on device”, and some so big they still need to run in data centres in the cloud.
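The article doesn't say which techniques Microsoft and Google use to slim their models down, but post-training quantization is one common way to make a model NPU-friendly. A minimal PyTorch sketch, with a toy stand-in network:

```python
# Sketch: shrinking a model for on-device inference with post-training
# dynamic quantization (PyTorch). The network here is a toy stand-in.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, 128),
)

# Store Linear weights as int8; activations stay in float at run time.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
print(quantized(x).shape)  # same interface, roughly 4x smaller weights
```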
From an AI stock investment perspective, Loftus Peak's Pollak says it's still very much an open question how much value this shift to NPUs for inferencing will draw away from Nvidia and hand to companies like Qualcomm.
But what it does do is open up the possibility of a whole new generation of apps that will take advantage of this local AI inferencing to produce results that were either impossible or impractical to achieve using the cloud.
Even if local inferencing of small models has the disadvantage of not being as accurate as cloud-based inferencing of large models, it has the distinct advantage of being fast, cheap and, above all, private.
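That trade-off naturally suggests a hybrid dispatch pattern – answer on device when possible, escalate to the cloud when necessary. A hypothetical sketch (the function names and the length heuristic are invented for illustration):

```python
# Hypothetical hybrid dispatcher: answer on device when the request is
# small, escalate to the cloud otherwise. All names here are invented.

def run_local(prompt: str) -> str:
    """Stand-in for a small on-device model: fast, free, private."""
    return f"[on-device answer to: {prompt[:40]}]"

def run_cloud(prompt: str) -> str:
    """Stand-in for a large cloud model: slower, dearer, more capable."""
    return f"[cloud answer to: {prompt[:40]}]"

def answer(prompt: str, max_local_chars: int = 2000) -> str:
    if len(prompt) <= max_local_chars:
        return run_local(prompt)   # never leaves the device
    return run_cloud(prompt)       # only long/complex jobs go out

print(answer("Draft a two-line email declining a meeting."))
```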
Quizzed on which of those apps might be worth investing in, Pollak was reluctant to say. It’s early days, and we’re yet to see how app developers are going to take advantage of the new AI PCs and AI phones.
As it was with the dawn of the internet and the smartphone, most likely it will be apps no one has even thought of.
> Is this just me: "Australia's semiconductor future"? Seriously?

gotta talk the book. No-one else will.
> Is this just me: "Australia's semiconductor future"? Seriously?

lol, sad but true.
> lol, sad but true.

I have no trouble with that; I am sure we have an Australian winter sports committee dreaming of Australia as a world-first ski destination.
Let us pretend, frog, let us pretend.
> Is this just me: "Australia's semiconductor future"? Seriously?

Well, the only ASX-listed semiconductor stock I can think of is RFT. Are there any others?
> "Biden likely to prevent TSMC and Samsung from making AI chips for China: analyst" (Investing.com, au.investing.com)

As if China can't make their own.
Another nail in the coffin for South Korea, and not so good for Taiwan either. Besides, Biden is history in three weeks.

> As if China can't make their own.

China is currently unable to make chips below 7 or 8 nanometres. There are a number of complex reasons why this is so; if you want a good explanation, have a look at the video from Peter Zeihan on TSMC, and/or this one on high-end chips.
Mick

> China is currently unable to make chips below 7 or 8 nanometres.

But do they NEED 5 nanometre chips?
> But do they NEED 5 nanometre chips?

I'm sure they developed a workaround. There are other ways of achieving that productivity, and China is still basically a STATE-run economy, so if they NEED 1,000 hectares for a new server complex and three nuclear reactors to power it, it WILL be done (and it won't take 20 years to complete the project).
And besides all that, the real push should be for low-power-consumption chips and faster RAM (and CPU/GPU cache).
But the prevalence of insanity is widespread and expanding.