Rusty Russell
f1725586a402c06aec818d1478a45aaa0dc16c7a9c4869d97c350336d16f8e43
Lead Core Lightning, Standards Wrangler, Bitcoin Script Restoration ponderer, coder. Employed full-time on Free and Open Source Software since 1998. Joyous hacking with others for over 25 years.

Don't do this to me, man: I'm on holidays!

Ok, off the top of my head. Let's do a single UTXO, which you can spend. We represent all of the pubkey+balance pairs as a hash (separate output? In the script? I don't know).

Three ways to spend it:

1. New funds. Appends your pubkey and the new input amount (maybe allow a change output?). You have to provide all the previous pubkey+amount pairs, so this gets more expensive as the pool grows.

2. Early withdrawal. You provide all the pubkeys and amounts, a signature from your pubkey, and the offset of your pubkey in that list. Your amount gets divided by 10, then divided by the number of remaining participants (the last one can't exit, too bad!), rounded up; see the sketch after this list. Adding that to each participant's amount is *hard*, because there is no iteration in Script. This means open-coding the iteration, so an upper limit on how many participants you can have.

3. Final withdrawal. This is easier: simply spend with a single tx paying an output to each pubkey/amount.
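Purely to make the arithmetic in option 2 concrete, here's an off-chain sketch in Python. The pool representation, the hash commitment, the rounding, and the assumption that the leaver keeps the other 90% are mine for illustration, not part of the scheme above:

```python
from hashlib import sha256
import math

def pool_commitment(pairs):
    """Hash of all pubkey+amount pairs (whether this lives in an output or
    in the script is the open question above)."""
    preimage = b"".join(pk + amt.to_bytes(8, "little") for pk, amt in pairs)
    return sha256(preimage).digest()

def early_withdraw(pairs, index):
    """Option 2: the leaver forfeits amount//10, split (rounded up) among
    everyone remaining; the last participant can't exit."""
    assert len(pairs) > 1, "last participant can't exit"
    _, amount = pairs[index]
    remaining = [p for i, p in enumerate(pairs) if i != index]
    penalty = amount // 10
    share = math.ceil(penalty / len(remaining))   # per-remaining-participant top-up
    new_pairs = [(pk, amt + share) for pk, amt in remaining]
    payout = amount - penalty                     # assumption: leaver keeps the rest
    return new_pairs, payout, pool_commitment(new_pairs)

# Hypothetical pool: three participants with made-up pubkeys and sat amounts.
pool = [(b"\x02" * 33, 50_000), (b"\x03" * 33, 120_000), (b"\x02" + b"\x01" * 32, 7_001)]
new_pool, payout, commitment = early_withdraw(pool, 2)
print(payout, commitment.hex())
```

Doing that per-participant top-up *inside* Script is the painful bit: with no loops, the addition has to be open-coded once per possible participant, which is where the upper limit comes from.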

So, you need introspection, ideally full introspection (OP_TX or multiple opcodes), to deal with amounts. You want Script restoration to divide sanely and to handle whale-sized amounts.

To do this *well* you want stack iteration, for which I am unaware of any proposal. varops could be amended to allow this in future, but it deliberately doesn't charge for some opcodes because we know there's a weight limit, and that will need to change if we have iteration.

But it's a cute idea!

gm!

want to get into bitcoin development but don’t know where to start to understand the basics of how transactions work?

sign up for nostr:npub1vmpf90hq56wzyxht6teg3llpa74rzcepw9suj5unxl3tph24zd4qgtxhm7 ’s online intro class and get 50% off now through the end of the year using this link

https://www.udemy.com/course/base58-bitcoin-transactions-one/?referralCode=4405769CAB73A12745F4

a lot of people have told me this class is life-changingly good; others say it's the best way to get prepped for Chaincode's #BOSS program, which is starting its second cohort in a few weeks!

These courses are genuinely a gift to the world. It's hard to describe how invaluable it is to have an expert guide curate your first walk through this chaotic technological landscape.

nostr:nprofile1qqsvh300dvquh50l5t9et2257pxrsk5ndsdgdcdmnnxl9nc0f6l2ejcpz3mhxue69uhhyetvv9ujuerpd46hxtnfduq3samnwvaz7tmjv4kxz7fwwdhx7un59eek7cmfv9kqzrrhwden5te0vfexytnfdu7y9l4j does not charge enough.

I am a little surprised by those buying into the idea that Trump will lead a deficit-reducing administration. I expect conflict, chaos, and a massive falling-out, with the result being business as usual.

Replying to Mandrik

I suspect few people in the world have interacted with more individuals who lost bitcoin than I have.

I answered support tickets for a non-custodial web wallet that, at the time, was the most popular in the world.

I'm talking about 100,000+ tickets over five years, many from users who lost access to their funds. Not just tiny amounts, mind you.

Sometimes hundreds of bitcoin.

My inability to help them still weighs on me.

We added warnings and info about the importance of backups. It's not that I could have done more. The nature of the old Blockchain(.)info wallet made that impossible.

The bottom line is personal responsibility demands extraordinary effort, and not everyone is up for the challenge.

Lost password? Sorry, I can't help.

Lost seed phrase? Sorry, I can't help.

Funds stolen by a phishing site? *Sigh*

What troubles me most isn't the sadness I felt from doing this daily for so many years. No, eventually you grow numb to it.

That's what truly hurt.

I imagine this is a lesser version of what people in the medical field have to do to cope with their jobs - learning to stop caring so much.

It takes a toll on your humanity if you live this way for too long.

I could have stayed in that job. Stacked more sats. It made sense, financially. I'd have a lot more bitcoin today if I had.

Instead, I left, choosing to be with my family and focus on self improvement.

Anyone who has worked during the early years of a startup will understand how incredibly burnt out you are once you finally step away. It took me years to push through that.

But I still think about those users.

The ones who made all the mistakes of the past that you, the bitcoiners of today, would learn from.

Almost seven years have passed since I left, and I'm no longer numb to their pain. I feel sadness for them again.

And I'm grateful for that.

I hope you all have a Merry Christmas, and take some time to reflect on the things that truly matter in this life. 🧡✌️

Yes. When MtGox went down, and I decided to start working on Bitcoin, I forced myself to read through those loss threads on Reddit.

Some days, balancing optimism for the future against pessimism about future *mistakes* seems intractable. But if not us, then who?

I don't usually find Nic compelling, but I'm enjoying the thoughtful contrariness of this piece, opposing a Bitcoin Strategic Reserve (and predicting it won't happen):

https://bitcoinmagazine.com/politics/i-dont-support-a-strategic-bitcoin-reserve-and-neither-should-you

#CLN release 24.11.1, for those fans of xpay. I've been impressed by how many people are testing it, especially those who go all-in on the #reckless `xpay-handle-pay` setting!

BTW: did you know you can use the "config" command to set `xpay-handle-pay` *on the fly*?

https://github.com/ElementsProject/lightning/releases/tag/v24.11.1

https://antoinep.com/posts/softforks/

In which Darosior explains quite coherently why there's not a great deal of *technical* motion on a Bitcoin soft-fork.

Seems like a fair, rational summary.

Sure, but we wouldn't have anything to talk about! 🤣

The only thing dumber than talking about the Bitcoin price is making Bitcoin price predictions.

I mainly end up hiring workaholics. This is a consequence of seeking passionate, smart people who love their work.

So as a manager I mainly find myself telling them to take more leave, and asking pointed questions if I receive an email from them far outside working hours in their TZ.

But it also means I model the behavior I want, which helps me regulate my own hours. I have youngish kids, and my wife has her own career, so I try to stick to my weekly work hours. And I broadcast that to my team.

I want to work with these people for a decade, so it's a marathon not a sprint.

xpay (not *coat*, thanks autocorrect!) bug reports trickle in. I'll try for a .1 release this week with fixes.

I am impressed by the number of people banging on it: some of the things I knew were sub-optimal (esp if you tell it to override the pay command) now seem more important.

Away early January, and Blockstream gave us all the Xmas week off, so this week is critical. Like, y'know, every other week!

Replying to calle

It would be possible to have a deliberately deanonymized ecash which allowed the mint to make people whole, but that would be a whole different kind of irresponsible.

Developers must feel responsible for bugs, as you clearly do, but you cannot let that prevent you from bringing new things into the world and improving them.

You're doing great! Carry on! 🧡

So, a lovely interaction with Jeremy Rubin, where he shattered my XOR-simplified CTV scheme. Damn.

So I'm banging my head against the problem some more. I want "txid with this input txid zeroed" but that can involve too much hashing in the worst case. Even if you move the txids to the end: about 250 GB according to my rough calc.

Jeremy suggested a merkle tree, which can work, but we're getting uncomfortably far from "simple" now. Specifically, my bar is "how hard would it be to produce this *in Script*, assuming that's fully re-enabled?". Not too bad with a known number of inputs, but I don't want to even think about dealing with arbitrary numbers.

Varops budget doesn't really help here, either. Everywhere else, you can't hit the varops limit unless *your input script* is doing wild things: this would mean you can hit the limit with a single opcode in a reasonable script :(

You're better off just saying "your tx which uses this opcode must have no more than 64 inputs" or "no larger than 10k", but that feels totally arbitrary.

For those following along at home: CTV solves this by committing to just the number of inputs, and if that's not 1 you're kind of on your own. It's not *banned*, just shrugged. I dislike this hole, but do I dislike complexity more?
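To make that concrete, here's a deliberately simplified sketch of the *shape* of a CTV-style template hash. This is not the exact BIP-119 field layout (scriptSigs are omitted, and the serialization details are mine); the point is that the input *count* and sequences are committed, but no prevout txid appears anywhere:

```python
from hashlib import sha256

def sha(b: bytes) -> bytes:
    return sha256(b).digest()

def ctv_style_template_hash(version: int, locktime: int,
                            sequences: list[int],
                            serialized_outputs: list[bytes],
                            input_index: int) -> bytes:
    """Simplified CTV-style commitment: the number of inputs is committed
    (via len(sequences)) but *which* txids feed those inputs is not."""
    r = version.to_bytes(4, "little")
    r += locktime.to_bytes(4, "little")
    r += len(sequences).to_bytes(4, "little")                      # number of inputs
    r += sha(b"".join(s.to_bytes(4, "little") for s in sequences))
    r += len(serialized_outputs).to_bytes(4, "little")
    r += sha(b"".join(serialized_outputs))
    r += input_index.to_bytes(4, "little")
    return sha(r)
```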

This is what I ponder over morning coffee before Real Work.

BTW, Rearden (apparently from Jeremy?) pointed out that my simplified CTV-like scheme was flawed because it didn't commit to the order of input txids.

You need to XOR together SHA(inputnum | intxid) for each input to fix this.
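A minimal sketch of that fix in Python; the 4-byte little-endian serialization of the input number and the function name are my guesses, not a spec:

```python
from hashlib import sha256

def input_txid_commitment(input_txids: list[bytes]) -> bytes:
    """XOR together SHA256(inputnum | intxid) for each input, so the result
    depends on which txid sits at which position, not just the set of txids."""
    acc = bytes(32)
    for n, txid in enumerate(input_txids):
        h = sha256(n.to_bytes(4, "little") + txid).digest()
        acc = bytes(a ^ b for a, b in zip(acc, h))
    return acc
```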

I still like the scheme, because it clearly commits to everything the txid commits to (with modifications required by efficiency concerns). Like a "forward txid" to mirror the normal txids which are backwards references.

I should write it up, for comparison with CTV. Maybe once I've done that I'll no longer think it's a significant simplification?