Replying to Egge

By now, you probably know I’m not a big fan of "vibe coding." But before I dive into why, let me clarify: I’m not talking about using AI as a helpful tool to assist with coding. I’m referring to the concept as originally described by Andrej: fully giving in to the "vibe," ignoring implementation details, and letting the AI take the wheel.

In my view, that approach is mostly useless, if not outright harmful, in many situations. I think most would agree: you can’t build critical systems this way. And you certainly can’t unleash vibe coding on legacy codebases without risking regressions all over the place. Sure, it might work fine for weekend projects (which, to be fair, is exactly the context Andrej was referring to). But those weekend projects are where developers learn the craft. When you "vibe" your way to the finish line, you miss out on a huge part of the learning experience. Software design and implementation are hard, and they take time and repetition to learn. At the same time, the actual typing is what lets you master your environment, making you a more productive engineer.

When I bring this up, vibe coders often respond with something like, “Well, mastering AI tools is just as important. If you don’t learn to use them, you’ll fall behind.” Really? Come on. “Mastering” vibe coding tools might take an experienced engineer a day or two. Mastering software engineering (if that’s even possible) takes a lifetime. And at the same time, the vibe bros will constantly tell you that AI is gonna be a completely different thing 12 months from now, so everyone has to relearn all the tricks anyway.

When I hear that university students are solving their data structures and algorithms homework by just pasting it into ChatGPT, while companies like Amazon are still asking candidates to implement a running median using heaps, it feels like there’s a growing disconnect. And honestly, there’s a bit of delusion on both sides.
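(If you haven’t seen that interview problem: the textbook answer is two heaps, one per half of the stream. Here’s a minimal Python sketch from memory; the class and method names are my own:)

```python
import heapq

class RunningMedian:
    """Median of a stream: max-heap for the lower half, min-heap for the upper half."""

    def __init__(self):
        self.lo = []  # max-heap via negated values (lower half)
        self.hi = []  # min-heap (upper half)

    def add(self, x):
        # Push onto the lower half, then move its largest element up.
        heapq.heappush(self.lo, -x)
        heapq.heappush(self.hi, -heapq.heappop(self.lo))
        # Keep lo at least as large as hi.
        if len(self.hi) > len(self.lo):
            heapq.heappush(self.lo, -heapq.heappop(self.hi))

    def median(self):
        # Assumes at least one value has been added.
        if len(self.lo) > len(self.hi):
            return -self.lo[0]                  # odd count: top of the lower half
        return (-self.lo[0] + self.hi[0]) / 2   # even count: average of the two tops
```

Inserts cost O(log n) and reading the median is O(1), which is exactly the trade-off the question is probing for.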

What I’m trying to get at is: don’t let the vibe take away your chance to grow and learn. Used correctly, though, it is an incredible tool.

I feel like I've lived through this before, when digital cameras got good enough to replace analog. You'll learn a lot when you shoot on film and develop the pictures yourself, but given enough progress the point becomes moot.


Discussion

I actually think that is a great example to support my point. Personally, I prefer to shoot digital too. However, it really helps to understand how aperture and shutter speed work, and how focal length and sensor (or film) size affect your depth of field. White balance is way easier to adjust on digital cameras, but it is still good to understand the underlying concepts. And most photographers will agree that, in the end, shooting on "Auto" sucks.

Learning the fundamentals first will make you a better photographer, no matter whether you end up shooting digital or analog.
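(For the curious, the standard back-of-the-envelope approximation makes that dependence explicit. This assumes a subject distance well short of the hyperfocal distance, and the symbols are the usual textbook ones, not anything from this thread:)

\[
\mathrm{DoF} \approx \frac{2\,N\,c\,s^{2}}{f^{2}}
\]

where N is the f-number, c the circle of confusion (which scales with sensor size), s the subject distance, and f the focal length. Matching the framing on a larger sensor means a proportionally longer f, so the f² term wins out and the depth of field gets shallower.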

Yes, you'll have to learn these things eventually (and you will, if you want to get good). But it lowers the barrier to entry massively, allowing humanity to create photos/software more easily, which I think is a good thing.

Not every photo has to be a masterpiece. Think dash cams / insurance claims / remembering where you parked.

I think that is the key!

- Lowers the barrier.

- You benefit from understanding the underlying thing.

- Quality matters, but it's wasted time to chase it everywhere.

It’s moving from expertise to accessibility. 100 years ago, taking a photo required so much specialized knowledge that it was a profession. Now anyone can take virtually unlimited photos, and “better” comes down to post-processing algorithms that few see or understand. Up the stack we go.

I suspect people’s assignment of value to a craft is more a reflection of the time they personally have invested in learning it than of the craft itself. Photographers, developers, writers, lawyers, etc.