Nvidia šŸŸ šŸ’» - NVDA

https://x.com/The_AI_Investor/status/1766346389943128080?s=20
" even when the competitor’s chips are offered for free, it’s still not cheap enough" :rofl: :star_struck:

You missed out the important part: ā€œthat is our goalā€.

It’s clearly not the current situation: TPU v5 is already more efficient, and although the MI300X is less efficient, it’s still going to end up much cheaper from a TCO perspective.

This whole situation is feeling very similar to Tesla, where there was a misguided sense of a lack of competition/deep moats and people extrapolating temporary aberrations into a new trend. I wasn’t brave enough to short the stock - but I called the margin collapse so I’m willing to do the same here:

4 Likes

I have as much in AMD as NVDA :innocent:. I can see the MI300X doing well. As for Google, unless I’m missing something, the TPU is only good for AI, whereas Nvidia’s H100 and H200 can do everything.

Yeah, but the AI compute overhang is what’s driving the explosive revenue growth. As an extreme example, if you remove that story it’s very hard to come up with a target above $200.

So yes, ASICs by design have limited applications, but in return they give huge efficiency, and at the moment the workloads they do cover make up a huge fraction of Nvidia’s profits.

I think Nvidia is going to continue to do well as a company but the valuation is getting absurd.

3 Likes


Three fat ladies and a flake. Back up we go :yum:

Just read an article in the Telegraph:
ā€œEvery investor should own Nvidia – here’s why

Don’t be rattled by the tech company’s red-hot share priceā€
:face_exhaling:
That’s that then.
Destined to go to zero now the mainstream rags are pushing it.
Top has been called :joy:

1 Like

At least you made a million while it was going good

1 Like

https://www.tomshardware.com/pc-components/gpus/nvidia-expected-to-give-developers-a-peek-at-next-gen-blackwell-b100-gpu-next-week
The catalyst to push us over $1,000?
:money_mouth_face:

https://m.youtube.com/watch?si=hiac1gJBvAoq8Yj_&v=By7vn3AVy94&feature=youtu.be
SP to $8000 :scream:

Unstoppable. It’s like a train with no brakes :sweat_smile:

Blackwell
There’s no stopping to get off now
Choo choo Nvda train

NVLink
Chips talking to chips :exploding_head:

If we get a drop tomorrow on all this news
Because people just don’t understand
I’ll be buying more :money_mouth_face:

Nvidia’s new Blackwell GPU platform is coming to AWS

  • AWS to offer NVIDIA Grace Blackwell GPU-based Amazon EC2 instances and NVIDIA DGX Cloud to accelerate performance of building and running inference on multi-trillion-parameter LLMs

When you think it can’t get any better :blush:

Microsoft Azure to adopt NVIDIA Grace Blackwell Superchip to accelerate customer and first-party AI offerings

  • NVIDIA DGX Cloud’s native integration with Microsoft Fabric to streamline custom AI model development with customers’ own data
  • NVIDIA Omniverse Cloud APIs coming first to Azure to power ecosystem of industrial design and simulation tools
  • Microsoft Copilot enhanced with NVIDIA AI and accelerated computing platforms
  • New NVIDIA generative AI microservices for enterprise, developer and healthcare applications coming to Microsoft Azure AI
    :money_mouth_face:

The ā€˜Blackwell’ chip marks a historic achievement of 1,000x AI compute in just 8 years. In 2016, the ā€˜Pascal’ chip delivered just 19 teraflops, vs. 20,000 teraflops for ā€˜Blackwell’ today.

The first ā€˜Blackwell’ chip, dubbed the GB200, will be available later this year and will have 208 billion transistors, vs. 80 billion in Nvidia’s ā€˜Hopper’ H100, on the same 4nm-class process.

ā€˜GB200’ AI performance is expected to be 5x that of the ā€˜H100’, up from 4 petaflops to 20 petaflops.
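
A rough sanity check on those numbers, using only the figures quoted above (the exact teraflops depend on which numeric precision you count, so treat this as back-of-the-envelope):

```python
# Back-of-the-envelope check of the figures quoted in the post above.
# All numbers are as stated there; actual throughput varies by precision.

pascal_2016_tflops = 19          # 'Pascal', 2016
blackwell_2024_tflops = 20_000   # 'Blackwell', today
years = 8

growth = blackwell_2024_tflops / pascal_2016_tflops
cagr = growth ** (1 / years) - 1
print(f"Pascal -> Blackwell: {growth:,.0f}x over {years} years (~{cagr:.0%}/year)")
# Pascal -> Blackwell: 1,053x over 8 years (~139%/year)

h100_pflops, gb200_pflops = 4, 20
print(f"H100 -> GB200: {gb200_pflops / h100_pflops:.0f}x")
# H100 -> GB200: 5x
```

So the ā€œ1,000x in 8 yearsā€ and ā€œ5xā€ claims are at least internally consistent with the quoted teraflops figures.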

They are so far ahead of the game now
:yum:

4 Likes


Bull Flag?
$1088?
:thinking:

1 Like

Interesting new announcement

Are you referring to the GB200?
Or have I missed a new announcement?

2 Likes