I’ve spent the last few weeks immersed in reasoning and deep research models. I’ve seen enough.

At least one (if not multiple) models will cross 50% on the Humanity’s Last Exam benchmark by the summer, and that will be a massive inflection point.

2025 is the last year when there will be a need for Product directors, product managers, engineering managers, and even scrum teams.

The product and software development lifecycles will never be the same again


Discussion

Who will use all the products created by AI if everyone is unemployed?

Let's just get away from the screens and live a good life.

They will be nearly free. The models will be open source and the only cost will be compute, which will be commoditized. Hyperdeflation is coming (if you’re a bitcoiner). Time to start pursuing beauty

Hyperdeflation without hyperbitcoinization is hard to imagine btw...

2nd renaissance may be upon us

Invest in a grove of giant sequoia trees in your backyard for you and your descendants to tend for the next 3000 years

What will you do then?

Pursue beauty

I cannot keep up.

We need sound money, and I need my hands in the dirt tending to nature and my family.

That is all I know.

All I see with ChatGPT and my offline models are language-based models with a large amount of data that respond to my input based on mathematical probability weights.

They are absolutely rock bottom for reasoning and logic.
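The "probability weights" description above is roughly right: at each step an LLM assigns a score to every candidate next token, converts the scores into a probability distribution, and samples from it. A minimal sketch of that mechanism, using toy hand-written scores rather than a real model:

```python
import math
import random

def sample_next_token(logits, temperature=1.0):
    """Pick the next token by softmax-weighted random choice.

    `logits` maps candidate tokens to raw model scores; higher
    scores mean higher probability after the softmax.
    """
    # Softmax: turn raw scores into a probability distribution.
    scaled = {tok: score / temperature for tok, score in logits.items()}
    max_score = max(scaled.values())  # subtract max for numerical stability
    exps = {tok: math.exp(s - max_score) for tok, s in scaled.items()}
    total = sum(exps.values())
    probs = {tok: e / total for tok, e in exps.items()}
    # Weighted random draw: likelier tokens are chosen more often.
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

# Toy scores a model might assign after the prompt "The sky is".
logits = {"blue": 4.0, "green": 1.0, "loud": -2.0}
print(sample_next_token(logits))  # usually "blue", occasionally others
```

Lowering the temperature sharpens the distribution toward the top-scoring token; raising it flattens the distribution and makes output more random. Real models do the same thing over a vocabulary of tens of thousands of tokens.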

What model are you impressed with?

The chatbots are not reasoning models. Try OpenAI's o3 model or DeepSeek's R1 for reasoning. There's a new one coming out every week now

How do you feed these models the business and organizational context needed to solve most software problems?

What's this "reasoning models" though. Claude all the way ⚔️

I highly doubt this. There are way too many security implications for companies to start firing these roles and switching to AI immediately. You're also forgetting that a vast majority of the population, including those making the decisions within corporations, has a healthy dose of AI skepticism, as well as an outright justifiable fear of the technology, thanks to books, movies, TV, music, and science fiction stories in general.

Personally I think it shows promise, but after spending a lot of time with AI models myself, I believe we are still a long way off from it becoming trustworthy and ready enough to start replacing humans in the workplace altogether.

The scenario you are proposing is maybe 25-50 years off. It is going to take decades of normalization before the justifiable fear and skepticism are replaced by comfort with this emerging technology. That is, if there isn't a rogue AI situation that causes humanity and governments to say fuck this and ban its use in many sectors entirely.

So artificial general intelligence is actually closer than we think?

True AGI is still some years away, IMO. What we're seeing with the latest models (and certainly will see once 50% on the HLE benchmark is breached) is that a vast majority of research, requirements gathering, and implementation will be automated.

Should we be scared avi?

No, we should be happy. Embrace the acceleration 🚀

I've been using AI for a while, and I'm developing a physical product. And I don't see what you mean.

Or maybe you're saying AI now isn't able to develop a real product... But will be later this year.

At the moment, AI usually gets even basic math wrong. And it lacks so much of the context it would need to take into consideration to create a physical product.

Plus, prototyping is a PITA that requires a human. I don't see that changing any time soon.

Scrum teams have sucked so deeply as just another silly silver bullet that I am surprised they made the list. Until you have actual AGI, which an LLM cannot be, you can't have AI do everything.

This is not AGI yet, but a vast majority of the processes and structures that exist within PDLCs and SDLCs today don’t need true AGI to make them go away. They just need what’s about to happen this year

👀

Powerful perspective. While automation will reshape roles, history shows tech often creates new opportunities even as it disrupts. The real risk? Leaders who dismiss this as hype. What’s your take: Will 2025 be a cliff edge or a catalyst for reinvention?

Whoa, if even half of this plays out, 2025 will look nothing like today's playbook. Product/engineering roles evolving? Absolutely. But extinction? Maybe it's less about replacement and more about reinvention.

Dismissing automation's potential is a huge risk. History does show us that technological shifts create new opportunities.