Replying to ChipTuner

Yes.

In theory, I agree that there is often little need to reinvent the wheel, and that DRY is worth practicing. I've basically been a library dev for the past 5 years and that's been my focus. C specifically doesn't have any concept of package management by design; it's portable at the binary level (or at the literal translation-unit level, .c -> .o), and no other modern language is by default, as far as I know. C# has assembly files, which are PE images, so semi-portable but not machine code; Rust can be made to produce object files, but the ecosystem isn't built on the basis of binary objects. Point is, C is "portable" at the OS/maintainer level, not the source level. Unless a package is well distributed on an OS platform, it's generally best to vendor it.
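To make the translation-unit point concrete, here's a minimal sketch of what vendoring looks like in practice (all file names and paths are hypothetical, not from any real project): the vendored .c files compile straight into the consumer's build like any other translation unit, with no package manager or install step involved.

```make
# Hypothetical Makefile fragment: vendored C sources are just more
# translation units in the consumer's build -- no package manager needed.
VENDOR_SRCS = vendor/secp256k1/src/secp256k1.c vendor/zlib/inflate.c
VENDOR_OBJS = $(VENDOR_SRCS:.c=.o)

CFLAGS += -Ivendor/secp256k1/include -Ivendor/zlib

myapp: main.o $(VENDOR_OBJS)
	$(CC) $(CFLAGS) -o $@ $^

%.o: %.c
	$(CC) $(CFLAGS) -c -o $@ $<
```

This is what "distributable at the OS/maintainer level" means in practice: the maintainer ships and signs the exact sources, and any platform with a C toolchain can build them.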

I choose vendoring for many reasons:

- I want to be transparent and able to offer a better guarantee of the EXACT product a customer receives, and to prove it was tested exactly as they receive it

- project maintainers must check changes in by hand. I have to hand-verify the changes secp256k1 introduces and take that responsibility for my customers

- repeatable builds are great for homogeneous apps, but a polyglot project requires a mess of build tools; my build containers are around 3-5 GB, testing natively on Windows requires hours of setup without an image, and builds still take 10+ minutes on a 32-core machine

- ALL of the source used is signed by me

- Doesn't require any 3rd party servers or trust, if you have a signed archive and build tools, it's all you need

- Doesn't require strict API/ABI contracts; while those are ideal, most immature projects don't have any API contracts/guarantees and introduce breaking changes even in patch versions

- Allows me to release my own updates by pulling in and verifying upstream changes, so I can get fixes out faster. Huge core libraries like argon2, zlib (Cloudflare fork), brotli, rpmalloc, and mimalloc haven't released a tag in years despite patches and upgrades still being committed

- Unlikely to introduce regressions due to dependency changes (especially in immature projects) because the source is verified; tests are great, but they're never truly exhaustive

Cons

- that's a lot of responsibility and regular work

- codebase gets large and git gets slow

- unless verified, maintainers can alter the code (like I do), and those changes aren't reviewed by the original maintainers and their downstreams, so it can require trust

- updates can happen less frequently

I give users a choice; they can:

- use my packages (binaries) [easy]

- build from source [hard]

- plug the original, unmodified library in, or add it to the linker args and build [hard]
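That last option can be sketched as a build-time switch (a hypothetical fragment; the variable name, library, and paths are illustrative, not from my actual build):

```make
# Hypothetical: let the user choose the unmodified upstream library
# over the vendored copy, e.g. `make USE_SYSTEM_SECP256K1=1`.
ifdef USE_SYSTEM_SECP256K1
  LDLIBS += -lsecp256k1            # link the original, unmodified library
else
  OBJS   += vendor/secp256k1.o     # fall back to the vendored, verified copy
  CFLAGS += -Ivendor/include
endif
```

Because both paths satisfy the same headers, the choice stays with the person doing the build rather than with me.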

All of this said, you are right: I still use some well-maintained libraries from NuGet in production, and the same with npm. I try to keep them minimal, but I'm not really a "web" developer, so I'm going to rely on npm. It's a culture problem, IMO. The fact that the same person who vendors 8 C files and maintains his own makefiles will also just run `npm install tailwind` is the problem. I care far less about the 80 dependencies in the UI than I do about the 8 C files that will be executing raw machine instructions on my customers' processors. It's my area of focus. I probably would focus more if it weren't so easy to `npx run x`. From the outside looking in, the focus of JS has been rapid development. Just run this command, just add this package, just do X. It's that simple! Convenience > security.

I apologize if I became too sycophantic and attempted to volley the ball your way.

Correction: C is DISTRIBUTABLE at the OS/maintainer level, not source level.


Discussion

No need to apologise at all. I’m still insisting on using social media exactly for this kind of exchange.

And I do get what you mean. My (admittedly small) experience with C involved some decompilation and FPGA work, but ultimately it was mostly about package management. And as you said, due to the very nature and culture of C, even for core dependencies like libc this grew into a lot of work, and things eventually evolved to put the ball back in the developer’s court with standards such as Flatpak.

Your observation is more than valid: by their very nature, Java and Go sit somewhere between C and JavaScript.

The problem is that as the boundaries between OS and browser, privileged and unprivileged, backend and frontend (and now chat stuff due to AI) blur, and as utilitarian code development explodes and eats the world, the whole division of concerns becomes a grey area. That’s when “utilitarian” code can end up doing a lot of damage.

For example, when a JavaScript dependency running in an app, inside a JavaScript engine theoretically sandboxed in the browser, can steal a hot wallet or bank account credentials, we really need to rethink how much the division between privileged, mission-critical code and unprivileged, "low-risk" utilitarian code actually holds in practice. Which brings us back to the point of finding balance.

If you ask this dev over here, with the unavoidable downfall of some of the big techs behind the "move fast and break things" movement, the balance will likely swing back towards caution for a while. But I'm notoriously bad at extrapolation, so take my views with a grain of salt and be equally prepared for a future where what we do becomes a dying art.

> If you ask this dev over here, with the unavoidable downfall of some of the big techs behind the "move fast and break things" movement, the balance will likely swing back towards caution for a while. But I'm notoriously bad at extrapolation, so take my views with a grain of salt and be equally prepared for a future where what we do becomes a dying art.

I would agree, but I think it's still going to trend towards more centralization. We've already seen npm, just in the past year, become more like an app store than a package repository. I'd imagine they're going to start requiring KYC, possibly even app-store-like reviews on packages.

Agreed. And that’s where enterprise development will end up. Which, all things considered, is not the worst outcome from an enterprisey dev perspective (I’d still prefer if JavaScript and TypeScript simply didn’t exist, or were AGPL-no-loopholes licensed, but that ship has sailed).

Meanwhile, the cool JavaScript vibe coder crowd will jump to Deno 3 Soy Latte Edition, or Man Bun 100% Natty and KYC-Free Edition, which is also fine by me.

One way or another, between movements and counter-movements, my bet is that the big tech mess will slow things down until more conservative industries come to the rescue.

Case in point: Meta’s presentation. TZ/DW (Too Zuckery / Didn’t Watch): Nothing works, we don’t know why, but buy our stuff.

https://youtu.be/1cpnK9AfIhg

Please have your sats back, I laughed my arse off XD

Weird thing though, I do like some of TypeScript, so I wouldn't go as far as wishing the language itself didn't exist (even though TypeScript isn't a usable language by itself). But I do think the ecosystem and culture have degraded and are leaking into the rest of SE culture. Especially for tech-bro types.

"What do you mean that app will cost $30,000 to build? I can build it with 25 npm packages and push it to Netlify in an afternoon for free. Or Claude can do it for $25 in tokens."

This could be my bias leaking in: people never want to pay for good, they want good enough. Or maybe good for the price of good enough. When I say what I'm thinking out loud, it often sounds like a barrier-to-entry problem. This is what it feels like when the barrier is too low. I hope I'm wrong in saying that.

Fair enough. AI users get their £25 (or really £150 if you consider the actual costs once investor money is no longer there to keep the tooling running at a loss) worth of fast-food software, with the occasional "credential stealing" extra.

I know I sound cynical, but this can often be a fair deal at the experimentation stage. Like I said, there’s a compromise to be had. And despite my "sassy attitude" towards vibecoding, I do think the tools themselves are useful when used in sensible ways.

Also, thanks for the sats 🫡. Tiniest "closed circular economy" loop ever 🤣. I just… inevitably end up dialling cynicism up to 11 sometimes. I’ll try to keep it back at the usual 2 or 3 now. But glad that it made you laugh a bit!

https://youtube.com/shorts/v0SCDzwLN8k

Sir, you are talking to someone who perpetually sounds like a cynic despite being an optimist; otherwise I wouldn't be here. I wholly agree with you on this front. I could rant about my personal economics of AI tools, but I don't yet find them viable. If AI could remain consistent, in my experience, I'd find myself using it more. My standards are HIGH. That's hard for an AI to hit. Prompts are source code at this point, except worse, because source code usually gives you the same output for the same input... I could spend 8 hours making prompts and helper files and still get disappointed. It's hard to even call instruction files and prompting an investment. And I'm glad the vibers can keep jumping between different IDEs, but enterprises can't really do that.

And yeah, you're talking AI, but the whole $20-retail-cost-versus-$150-investment-cost thing is all of big tech, or was, anyway. AWS was floating on government contracts, then ratcheted up the prices to the point where I see something like a new blog post weekly about buying a small power plant and a datacenter because, over a 60-month lease term, it's cheaper than using AWS. That's no accident, IMO.

I'm not even in the corporate world; that's just outside looking in.

And... let's once again circulate the 1000 sats in our two-cynic-man circular economy ;). We just need to be careful not to do it too often, or we'll end up featured in the top-zapped lists and approached to promote crypto stuff as Nostr influencers 🤣

Okay this time I'll hold onto your sats XD