Most of what this letter says does make sense. Alphabet management should take heed.
Alphabet and Amazon were two companies that Terry Smith would never buy.
He has changed his mind on Amazon; will he on Alphabet? It's not just their business model he didn't like but the price. I would think it was probably double what it is now when he said that.
It's a very bold claim to make, but then again the quality of results in Google Search has been declining for a few years now, and some of the ChatGPT answers are really great.
It's a very odd claim considering Google literally wrote the paper Attention Is All You Need, on which GPT is based, and discovered the Chinchilla scaling laws, probably two of the most influential recent papers related to Large Language Models.
Google's PaLM is a significantly more capable LLM than GPT-3, and Alphabet seems to have large natural advantages in scaling transformers (from its data and TPUs).
As ever, Google could mess up how they commercialise this and turn it into a product, but I don't think there is any catching up to do in ML.
Google recognise AI as a risk.
But I do think they're in a strong position. If Google starts delivering AI answers on their results page, why would users go elsewhere?
Ultimately, competition is good for the consumer!
Often big businesses are too slow to transition and get beaten by smaller, more agile companies, though in this case the smaller company, OpenAI, is backed by Microsoft. It will be interesting to see how it goes.
I've said before that it's always difficult to predict what will replace the current state of the art. I certainly couldn't have predicted what would be the next "Google Search", but this could be it, and it's interesting to see it happening in real time.
The racing dynamic is very scary from an alignment perspective; I did not expect it so soon.
I think it'll be a while before we see an LLM integrated into Search; the inference cost is still too high. Maybe when they have TPU v5/v6 or some other big cost reduction. Good commentary here:
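Separately, a back-of-envelope sketch of why per-query LLM inference is expensive relative to a search query. Every figure here (model size, answer length, accelerator throughput and price) is a hypothetical round number for illustration, not from the thread or the linked commentary:

```python
# Back-of-envelope LLM-in-Search cost sketch.
# All figures below are hypothetical round numbers for illustration.
PARAMS = 175e9               # assume a GPT-3-sized model
TOKENS_PER_ANSWER = 250      # assumed length of a generated answer
FLOPS_PER_TOKEN = 2 * PARAMS # rough rule: ~2*N FLOPs per generated token

flops_per_query = FLOPS_PER_TOKEN * TOKENS_PER_ANSWER

# Assume an accelerator delivering an effective 1e14 FLOP/s,
# rented at $1.50/hour (hypothetical cloud-ish figures).
EFFECTIVE_FLOPS = 1e14
DOLLARS_PER_HOUR = 1.50

seconds = flops_per_query / EFFECTIVE_FLOPS
cost = seconds * DOLLARS_PER_HOUR / 3600
print(f"~{seconds:.2f} accelerator-seconds, ~{cost * 100:.3f} cents per query")
```

Even a small fraction of a cent per query, multiplied by billions of searches a day, runs to billions of dollars a year, which is presumably why the integration hasn't happened yet.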
I wonder if Gmail (rather than Search) might be the first place we see an LLM used - it seems a better fit.
Google invests $300m in Anthropic, but basically in the form of cloud compute vouchers for them to spend at GCP.
Certainly a sign of the times when even very big, well-capitalised AI companies are becoming reliant on big tech to fund the next round of training.
Brad vs. ChadGPT
We are watching a battle of giants here. Super exciting times.
Disappointing that it gave wrong information in the demo though, which looks to have had a negative impact on the share price.
That mail was very good. Dylan actually just put out a follow-up to it today:
I think the lightweight model they are alluding to is probably something like LaMDA following Chinchilla scaling, which would mean roughly a ~50% reduction in parameters vs GPT-3, and so quite a significant inference cost reduction.
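The Chinchilla point can be roughly quantified. A minimal sketch using public rough figures (GPT-3 at ~175B parameters trained on ~300B tokens, the ~6*N*D training-FLOPs approximation, and the Chinchilla rule of thumb of ~20 tokens per parameter); these are my assumptions, not numbers from the thread:

```python
import math

# Rough public figures (assumptions for illustration):
GPT3_PARAMS = 175e9   # GPT-3 parameter count
GPT3_TOKENS = 300e9   # GPT-3 training tokens (reported ~300B)

# Training compute approximation: C ~ 6 * N * D FLOPs.
compute = 6 * GPT3_PARAMS * GPT3_TOKENS

# Chinchilla rule of thumb: compute-optimal D ~ 20 * N tokens,
# so C ~ 6 * N * (20 * N) = 120 * N^2, giving N = sqrt(C / 120).
n_opt = math.sqrt(compute / 120)
print(f"Chinchilla-optimal params for GPT-3's budget: ~{n_opt / 1e9:.0f}B")

# Per-token inference cost scales roughly with parameter count (~2*N FLOPs
# per token), so the cost ratio is approximately the parameter ratio.
print(f"Relative inference cost vs GPT-3: ~{n_opt / GPT3_PARAMS:.2f}x")
```

Under these rough assumptions the compute-optimal model comes out around 50B parameters, i.e. serving cost well under half of a GPT-3-sized model, consistent with (if anything a bit stronger than) the ~50% reduction suggested above.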