Replying to GLACA

I’m asking this sincerely - not to argue, but because I genuinely want to learn.

I run a Bitcoin Core node. I mine solo with Bitaxe. I watch the mempool daily. I’m here because I care about this protocol and want to understand it better from all sides - including yours. But some of the current narratives around UTXO bloat and filters feel a bit disconnected from what I see.

Block size hasn’t changed - it’s still capped at 4 million weight units (roughly 4MB) - and no one on the other side of this debate is pushing to increase it. In fact, despite inscriptions and ordinals, we often see periods where the mempool is quiet and blocks aren’t even full. I’ve watched it myself. That tells me the protocol is far from being overwhelmed.

The UTXO set? Yes, it’s growing. I’ve read it’s around 12GB now. That’s notable - but not alarming. Most of that growth came during the BRC-20 inscription spike. Since then, it’s slowed. My node runs fine on a 1TB SSD, and if I need to upgrade to 2TB, that’s already affordable - and will only get cheaper. At current growth, that upgrade would buy me years.
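To put rough numbers behind the “buys me years” claim, here is a back-of-the-envelope sketch. Every figure except the 12GB UTXO size and the 4MB block cap is an illustrative assumption (the assumed chain size and the pessimistic every-block-full growth rate), not a measurement:

```python
# Back-of-the-envelope storage headroom estimate.
# Assumed values are marked; the UTXO size and block cap come from the post.

chain_size_gb = 650        # ASSUMED current full-blockchain size, for illustration
utxo_set_gb = 12           # UTXO set size cited above
max_block_mb = 4           # worst-case block size
blocks_per_year = 52_560   # ~144 blocks/day * 365

# Pessimistic worst case: every single block is completely full.
max_growth_gb_per_year = max_block_mb * blocks_per_year / 1000  # ~210 GB/year

disk_gb = 2000             # a 2 TB SSD
headroom_gb = disk_gb - chain_size_gb - utxo_set_gb
years_of_headroom = headroom_gb / max_growth_gb_per_year

print(f"Worst-case growth: {max_growth_gb_per_year:.0f} GB/year")
print(f"Headroom on 2 TB: {headroom_gb} GB -> ~{years_of_headroom:.1f} years")
```

Even under the deliberately pessimistic assumption that every block is full forever, a 2TB drive gives on the order of six years of headroom under these numbers; real growth has historically been well below the cap.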

So my question is - why are we treating this like an emergency?

If the blocks are capped at 4MB, and fees exist to regulate access, and pruning is available for those who can’t store everything… isn’t this system already self-regulating?

I know you care deeply about Bitcoin’s longevity and decentralization. So do I. But I also believe the idea that “whoever pays the fee gets into the chain” has always been one of Bitcoin’s strongest promises. If a group decides certain transactions aren’t worthy - even if they pay their way - doesn’t that edge toward selective permissioning?

That’s the heart of my concern. Where’s the line?

Who decides what’s “good” or “bad” usage?

When I ask these questions, I’m not trying to be difficult. I’m trying to understand whether filters are a tool - or a step toward censorship. Because I believe that Bitcoin will outlast every storage system, every government archive, and every digital library. And because of that, I see huge potential in its permanence - not just for finance, but for truth, history, and even art. Bitcoin as timechain. Bitcoin as the modern Library of Alexandria.

I know that might sound idealistic, but I say it as someone who’s using Bitcoin for those very things - and paying full fees to do so. That’s why the blanket dismissal of inscriptions as “trash” feels like it misses the nuance. Not all inscriptions are noise. Not all projects are spam. Some are attempts to preserve meaning on the most resilient ledger humanity has ever built.

If that’s not something Bitcoin can be used for, then what’s the alternative?

Censorship-prone servers? Social networks that disappear our work overnight?

Everything else is corruptible. Only Bitcoin has the permanence.

And I get it - Bitcoin wasn’t made for art. But it also wasn’t made for multi-sig or Lightning or Taproot. Use cases emerge. The best ones stick because consensus allows them to.

Which brings me to a final point: filters don’t change consensus rules - but they do shape who gets seen, what gets relayed, what feels welcome. So are we sure we’re not just replacing openness with a preference?

Because if we go down that path - of choosing what Bitcoin “should be” - we risk forgetting what made it special in the first place.

I want to get this right. That’s why I’m asking. And I’m asking as someone who wants to learn - not to win.

I think the real question you’re trying to ask is, should Bitcoin strictly be a monetary network?

My answer to that question is yes. That’s why I run Knots.

Is it an emergency? Absolutely not.

Core developers pushed the controversial update forward despite objections from other contributors and from the community. I run Knots as a statement to Core developers: if you take control away from node runners, I will run Bitcoin software that gives me control over which transactions flow into my node.

My node is for monetary data only. That’s my vote.
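For readers wondering what that control looks like in practice, relay policy is set through node configuration. The sketch below uses Bitcoin Core’s long-standing policy options (Knots ships these plus additional filters of its own; exact defaults vary by version):

```ini
# bitcoin.conf relay-policy sketch - policy only, not consensus.
datacarrier=0          # do not relay or mine transactions with OP_RETURN outputs
datacarriersize=83     # if datacarrier=1, cap OP_RETURN payload size in bytes
permitbaremultisig=0   # do not relay bare multisig outputs
```

Note these are relay/mining preferences for your own node; they do not make such transactions invalid, and blocks containing them are still accepted.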


Discussion

Thanks for the thoughtful reply. Just to clarify - there wasn’t a controversial update pushed by Core developers. From what I understand, there was a proposal to raise the default OP_RETURN relay limit. It triggered discussion, but nothing was merged. The PR was closed. No rules were changed. No behavior was imposed on node runners.

That said, I have a genuine question, and I’m not asking this to argue - I really want to understand better.

Do I understand correctly that what your node is doing is mostly symbolic?

From my current understanding:

Your node still downloads and processes those valid transactions before dropping them.

Then, when those same transactions appear in a block, your node processes them again to validate the block.

So essentially, it’s doing twice the work.

The data still hits the blockchain, because it’s valid and paid for through fees.

And the mempool is still exposed to the same total load, just shuffled differently.

Also - by rejecting valid transactions at relay level, nodes like Knots don’t just increase their own processing burden. They may also add inefficiencies to the broader network by breaking natural transaction propagation paths. It’s a kind of bottlenecking, but without stopping the end result.
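The double-work point above can be sketched with a toy model. This is purely illustrative: “validation” here is just a counter, and the `is_monetary` flag stands in for whatever heuristic a filtering node applies; real nodes validate scripts and signatures, not labels:

```python
# Toy model of a relay-filtering node: it spends work inspecting a
# transaction before dropping it, then must validate the same transaction
# again when a miner includes it in a block.

from dataclasses import dataclass, field

@dataclass
class Tx:
    txid: str
    is_monetary: bool  # stand-in for "passes this node's relay filter"

@dataclass
class FilteringNode:
    mempool: set = field(default_factory=set)
    validations: int = 0

    def receive_from_peer(self, tx: Tx) -> None:
        # The node must download and check the tx before deciding to drop
        # it - that bandwidth and CPU is spent either way.
        self.validations += 1
        if tx.is_monetary:
            self.mempool.add(tx.txid)

    def receive_block(self, block: list[Tx]) -> None:
        for tx in block:
            if tx.txid not in self.mempool:
                # Filtered txs were never cached, so they are fetched and
                # validated a second time at block time.
                self.validations += 1
            self.mempool.discard(tx.txid)

txs = [Tx("a", True), Tx("b", False), Tx("c", False)]
node = FilteringNode()
for tx in txs:
    node.receive_from_peer(tx)
node.receive_block(txs)  # a miner included everything that paid fees

# 3 checks at relay time + 2 re-validations of the filtered txs
print(node.validations)  # -> 5
```

A non-filtering node in the same scenario would cache all three transactions and do three validations total; the filter converts cached work into repeated work without keeping the data off the chain.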

If I’m missing something here, I’m open to being corrected. I just want to get to the root of how this improves the system beyond making a statement.

The Bitcoin Core team announced that they were removing the OP_RETURN limit in the next iteration of Core. If that has changed, I am unaware.

Yes, I understand my actions are symbolic. I only have 1.5 TH/s pointed at my node.

Yes, I understand this is more computational work for my node.

Doing the right thing is usually not the easiest thing.

We disagree on what a “valid transaction” is. I believe it is data that includes information pertaining to monetary transactions only. You believe pictures are a “valid transaction”.

Peter Todd is at the forefront of this OP_RETURN limit removal. He also supports genocide. So…