I’m becoming deeply concerned by the explosion of growth in high-level software, alongside a slow abandonment of low-level engineering.

I’ve been in tech for as long as I can remember, and I don’t remember having the issues I have now, nor in this number. Every sentence is beginning to end with “if it works.”


Discussion

What exactly do you mean? 🤔

That there are way more developers than there are developers who actually understand how software works. That software has become an inverted pyramid growing ever smaller at the lower layers close to the hardware, and ever larger at the top.

This leads to system instability, poor performance, and a general stifling of innovation. Computers have not substantially changed in 50 years. It’s just layers upon layers of code, some of which very few people even understand anymore.

This is true, especially with the invention of tools that just do it for you.

It’s astounding. Even the “low level” languages are so far removed from the hardware now as to be high-level abstractions.

Yeah, I am not a developer; the most I can do is light WordPress 😂. But this does seem pretty apparent now.

> Computers have not substantially changed in 50 years

Is that due to a lack of low-level engineering talent, or the limitations of our scientific knowledge that computers depend on?

yep.

Before, you could not tinker with computers without knowing a little bit. Discrete electronics were everywhere, with TTL logic gates. Basic DOS/Unix commands used stuff like regular expressions, redirection of stdout, etc.

Today kids grow up with cell phones, so high-level they do not know what a directory is. They code websites, but do not know what a socket is.

Lower-level disciplines are thought of as “boring”, “distant from the market”, “not useful”, “I ain’t going to design chips anyway”.

AI has made it worse. Undergrads ride the hype wave, programming in super-high-level abstractions, PyTorch, etc., doing their final projects with two pages of code (and happy about how short it is) that they do not understand deeply.

Low-level engineers are the oil, the energy, in the system. Neglecting low-level work is like taking energy independence for granted: a mistake, because the whole thing stops without it.

That’s amazing. It reminds me of the problems that arose when architects began to rely on systems versus having a deep knowledge of craft, space and structure. Buildings still get built, obviously, but the variety of things going deeply wrong has exploded.

This. So much this. Honestly I expected this post to be ignored. This is exactly what is happening in software.

From craft and understanding, to memorization and bureaucracy.

It will… Just stay on the path. 🤞

Semi-tangential, but some Python libraries abstract away so much yet fail so horribly (JAX, Numba, etc.) that I got fed up chanting the dark arts needed to install them/run code and learned Julia. The syntax might be wonky at times, but it actually feels closer to the machine and it’s fun to work with 😁

I haven’t done a deep dive on Julia. That’s geared towards numerical computing right?

I’m very interested in playing with immutability, especially with an eye towards concurrency.

General-purpose language, but definitely with a focus on numerical computing. "Flexibility of Python with the speed of C" is the slogan. Functions are JIT-compiled, with the capacity to introspect the generated assembly-like code, benchmark, and check memory allocations and type instability.

Personal take on multithreading/multi-core/GPU processing: it's 🤌 (chef's kiss) (with the caveat that the DL ecosystem is not as mature).

To top it off, you can even call Python libraries from Julia. Meaning: do your high-performance stuff in Julia and call Python for any plotting or other statistical functions you'd rather not write yourself.

Every time some complexity (of a system or language) is abstracted away, we get a result that is easier to handle in "most" cases, but a lot harder to understand in edge cases. Modern languages and software are usually abstractions of abstractions of abstractions... so it's no wonder it grows in size, and mostly not even linearly.

I think one of the first employees at Google solved an early problem they had with RAM that proved hard to track down. They could not have solved it without the lower-level understanding.

the small, low level things is where i like to work.

i also only work in a language that is small and simple for much the same reasons.

to me, reliability and low latency are the two most important things.

How low is low for you?

mambo low...

unless it's bulk cpu/gpu bound processing latency should be in the nanoseconds above the lightspeed limit.

i suppose you also maybe were talking about hardware level? i'm of the opinion that everything should be as close to the metal as possible, while still allowing you to work at arbitrarily higher levels of abstraction.

some things don't need to be handled so closely. this is why i agree with the Go approach to memory management, and the way in Go that, if you need to bypass this system, it is simple to write your own schedulers and cache managers. again, when you need to, you can tickle the hardware, but most of the time you can work at the level of abstraction that is most efficient and suited for reasoning about the task.

More fuel for the fire: Several high level engineers at my company just told me they didn't trust open source software.

Y?

Usually because there is no guarantee it’ll be around or supported.

We are actively migrating away from a lot of it at my company. But tbh that’s The Qt Company’s fault, not FOSS in general.

My observation, though I'm highly ignorant, is that the sheer speed of hardware allows exceptionally sloppy code to run so fast as to not really have an effect on nearly everything that people do. I know a bit more about assembly than most, because I dabbled in it for a while in my younger years for the purpose of reprogramming early fuel-injection ECUs. It's pretty amazing what you can do with an 8-bit microcontroller if you understand the basics of taking in data and manipulating it in a concise way to get useful outputs.

I look at today's hardware in awe. And it leads to horrors like Windows 11 (I hate it so, so much). I think even Linux is affected by this, though it still runs wonderfully on much older hardware most of the time.

I'm not sure where I wanted to end up with this little ramble, but it's pleasant to see others with interesting opinions about this.

It really does boggle the mind that Microsoft Office doesn’t do anything it didn’t do in 2000, yet doesn’t run any faster.

It crashes more, from what little I have used it.

But now you can send every detail of your device to the developers automatically. Earth shattering.

Human development depends on the fine balance between abstraction and rigor. Humanity detached from rigor in the 20th century with the introduction of fiat money. That's why you can coexist with high-level software, but it suffers the nightmare of bugginess, because it isn't rooted in rigor.

Bitcoin is the return of the fine balance between abstraction and rigor, a sort of algebra of the future.