GM

We may be in an AI bubble, but a bubble of over-exuberance, not hype. The 2000 dot-com bubble was not wrong about the future potential of the Internet; it was just wrong about the speed.

So it is with AI. We think we’re doing better than we are, because AI has evolved so quickly since its breakout in 2017.

We are not, however, in a bubble like the semi-mythical tulip bulb bubble, which survives mostly as a fictionalised story rather than as understood historical reality.

An important, and often missed, point in this evolution is that we are moving from an era of general computing (the CPU) to one of specialist computing: AI (currently on GPUs), quantum, and ASIC compute such as Bitcoin mining. AI has yet to complete this journey, still relying mostly on GPUs for its compute.

Bitcoin is a good lesson for us here. Mining moved from CPUs at first, to GPUs, and now almost 100% of it is done on specialist ASICs.
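For a sense of why that happened: Bitcoin's proof of work is one fixed function, double SHA-256, run over and over with a changing nonce, which is exactly the kind of workload you can etch into dedicated silicon. A toy sketch (the header and difficulty here are simplified purely for illustration, not real miner code):

```python
# Toy sketch of Bitcoin's proof-of-work loop (simplified for illustration).
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header_prefix: bytes, difficulty_bits: int) -> int:
    """Try nonces until the hash falls below the target."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        candidate = header_prefix + nonce.to_bytes(4, "little")
        if int.from_bytes(double_sha256(candidate), "big") < target:
            return nonce
        nonce += 1

# Trivially low difficulty so the toy run finishes instantly.
print(mine(b"example block header", difficulty_bits=16))
```

Because the entire job reduces to that one fixed hash, hard-wiring it into an ASIC gave orders-of-magnitude gains over general-purpose chips.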

AI has made some moves into Application-Specific Integrated Circuits (ASICs): chipsets such as Google's TPUs (Tensor Processing Units), Fujitsu's DLU (Deep Learning Unit), Intel's AI ASICs, and specialised chips like Habana Labs Gaudi and Cerebras Wafer Scale Engine. But we have yet to see a dominant AI ASIC that beats general-purpose GPUs. That will come, and it may not come from Nvidia.
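The AI workload those chips target is, at its core, dense matrix multiplication feeding simple nonlinearities. A minimal numpy sketch, with made-up layer sizes, of the arithmetic a forward pass spends almost all its time on, and which chips like TPUs dedicate matrix-multiply units to:

```python
# Minimal sketch (illustrative shapes only): a neural-network forward pass
# is dominated by dense matrix multiplies plus cheap element-wise ops.
import numpy as np

rng = np.random.default_rng(0)

x  = rng.standard_normal((32, 512))     # a batch of 32 input vectors
w1 = rng.standard_normal((512, 1024))   # first layer weights
w2 = rng.standard_normal((1024, 256))   # second layer weights

# Each layer: multiply by a weight matrix, then apply a nonlinearity (ReLU).
h = np.maximum(x @ w1, 0)
y = h @ w2

print(y.shape)  # (32, 256)
```

An AI ASIC specialises by hard-wiring that multiply-accumulate pattern, rather than carrying the general-purpose flexibility a GPU does.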

Discussion

A breakout in 2017? My first memory of AI was Copilot in VS Code and ChatGPT, in 2023 I think...

Why 2017 Was Significant

In 2017, AI moved to the centre of public conversation, driven by several key factors and advancements:

Transformer Architecture: The transformer architecture, a fundamental innovation that underpins modern Large Language Models (LLMs) and the recent AI boom, debuted in 2017 with the paper "Attention Is All You Need" (a minimal sketch of its core attention operation follows below).

DeepMind's AlphaGo Zero: An improved version of Google DeepMind's AlphaGo system, AlphaGo Zero, defeated its predecessor 100 games to 0, using less processing power and discovering tactics never before seen by humans or machines.

Rapid Adoption and Investment: Reports from the time, such as those by the World Economic Forum and Deloitte, described 2017 as "the year of artificial intelligence" and a "breakthrough year for machine learning" due to its rapid proliferation into smartphones, drones, cars, and the Internet of Things (IoT).

Practical Applications: Companies like Microsoft, Apple, and Facebook announced major AI initiatives, and an AI algorithm was shown to outperform human radiologists at diagnosing pneumonia.
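Picking up the transformer point above: a minimal sketch of scaled dot-product attention, the core operation that 2017 paper introduced. Shapes here are arbitrary, and real implementations add learned projections, multiple heads, and masking:

```python
# Minimal sketch of scaled dot-product attention, the core operation of the
# transformer architecture introduced in 2017. Illustrative only.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    """Each position outputs a weighted mix of all values, with weights
    given by how well its query matches every key."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)  # (seq, seq) similarities
    return softmax(scores) @ v

rng = np.random.default_rng(0)
seq_len, d_model = 8, 64
q = rng.standard_normal((seq_len, d_model))
k = rng.standard_normal((seq_len, d_model))
v = rng.standard_normal((seq_len, d_model))
print(attention(q, k, v).shape)  # (8, 64)
```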

We tested zaps on this note… we made six attempts to ⚡zap this note, at so@nuts.cash, over a period of about 1 hour. In each case, we found that your lightning address service or server did not respond correctly. If you wanted to fix this... you could try getting a free Rizful lightning address -- https://rizful.com ... If you get it set up, please reply here so we can run this ⚡zap test again.

It’s coming 🫠

Yeah, that's interesting; I never thought about the question of "what's the ASIC for AI?"... I think the premise, though, bakes in certain assumptions about what AI is or isn't, which in turn relate to the "bubble" talk around the subject.

From the perspective of the economy, AI = LLM, for all intents and purposes, right now. And LLMs have certain inherent limitations that stand between them and the "irrationally exuberant" expectations being foisted on them.

Until there are systems that can actually learn things, rather than just being pretty good at predicting the "next word", there will always be limits to their application in society. Nobody's going to be OK with an AI bus driver that, 1 out of every 100 days, decides to just go GTA and mow down pedestrians.
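To make "predicting the next word" concrete, here is a toy sketch of the generation loop an LLM runs at inference time; the "model" below is a random stand-in over a tiny vocabulary, not a real language model:

```python
# Toy sketch of next-token generation. A real LLM runs the same loop with a
# neural network scoring tens of thousands of tokens instead of a random stub.
import random

VOCAB = ["the", "bus", "driver", "stops", "at", "lights", "."]

def fake_model(context: list[str]) -> list[float]:
    """Stand-in for an LLM: score every vocabulary token given the context."""
    return [random.random() for _ in VOCAB]

def generate(prompt: str, max_new_tokens: int = 5) -> str:
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        scores = fake_model(tokens)
        # Greedy decoding: always append the highest-scoring token.
        tokens.append(VOCAB[scores.index(max(scores))])
    return " ".join(tokens)

print(generate("the bus"))
```

Nothing in that loop learns anything at inference time; it only extends the context one token at a time.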

Going back to ASICs: unlike BTC, the types of computation that _true_ AGI might need could look different from what the current generation of LLMs needs... So I personally would consider any large-scale investment in "AI ASICs" to be premature.

A few things I've learned in my AI journey.

To humans, knowledge and language are two distinct and separate things.

This distinction does not apply to AIs. They only predict output; they have no knowledge as we would understand it.

GPUs emulate neurons within their architecture. In the future, neurons will be better emulated by dedicated tensor ASICs. We don't yet know what those will look like in detail, but once we do, GPUs will no longer be used in AI infrastructure.

Was AI used to produce this note?

Just the following:

"Google's TPUs (Tensor Processing Units), Fujitsu's DLU (Deep Learning Unit), Intel's AI ASICs, and specialised chips like Habana Labs Gaudi and Cerebras Wafer Scale Engine"

I asked Gemini to advise me 😂

Had a hunch especially in that portion. You're a sneaky one

You are remarkably astute 🫡

Paranoid*