Not more than ONCE :D Yes, that makes sense. I sorta had the inkling that uniqueness was part of the concept, but I hadn't connected it to the word "once".

On a side note, since malice and error are indistinguishable without evidence of intent, you could say that pedophiles represent high-entropy humans who replicate their entropy onto other, immature humans, that replication (normalization) being the most fundamental element of their intent.

Random values in messages are definitely needed for optimal security, unless there is inherent entropy in the message. 32 bits of timestamp don't constitute strong entropy. Maybe if it were nanoseconds and 64 bits we'd be starting to talk about something with low collision potential.
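A rough way to see why 32 bits is weak: by the birthday bound, collisions among n random k-bit values become likely once n approaches 2^(k/2). A minimal Go sketch of that estimate (my own illustration, not anyone's protocol code):

```go
package main

import (
	"fmt"
	"math"
)

// collisionProb approximates the birthday-bound probability that at least
// two of n uniformly random k-bit values collide: 1 - exp(-n(n-1)/2^(k+1)).
func collisionProb(n float64, kBits int) float64 {
	return 1 - math.Exp(-n*(n-1)/math.Exp2(float64(kBits+1)))
}

func main() {
	for _, k := range []int{32, 64, 96} {
		for _, n := range []float64{1e3, 1e5, 1e7} {
			fmt.Printf("k=%2d bits, n=%.0e values: p(collision) ≈ %.2e\n",
				k, n, collisionProb(n, k))
		}
	}
}
```

With 32-bit values a collision is essentially guaranteed after a few million messages; at 64 or 96 bits the same volumes barely register.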

Keep in mind that this encryption is not just for the relatively low volume of human-readable text. It will inevitably be needed to encrypt massive amounts of data, and the more data you process under one key, the more likely a collision becomes. GCM, for example, has hard limits: a single key/nonce pair can only encrypt about 64 GiB before its 32-bit block counter runs out, and with random nonces a single key shouldn't be used for more than about 2^32 messages before the chance of a repeated nonce becomes unacceptable. This is directly related to the size of the nonce, which for GCM is 12 bytes, or 96 bits.
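For concreteness, here's a minimal Go sketch of AES-256-GCM with a fresh random 96-bit nonce per message (purely illustrative, not tied to any of the software discussed here); the whole point of the limits above is that this nonce must never repeat under the same key:

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
)

func main() {
	key := make([]byte, 32) // 256-bit key
	if _, err := rand.Read(key); err != nil {
		panic(err)
	}
	block, err := aes.NewCipher(key)
	if err != nil {
		panic(err)
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		panic(err)
	}

	// One fresh 96-bit nonce per message; never reuse a nonce under the same key.
	nonce := make([]byte, gcm.NonceSize()) // 12 bytes for standard GCM
	if _, err := rand.Read(nonce); err != nil {
		panic(err)
	}
	ct := gcm.Seal(nil, nonce, []byte("hello"), nil)
	fmt.Printf("nonce=%x ct=%x\n", nonce, ct)
}
```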

SSL/TLS uses 128- and 256-bit secrets because reversing them is unlikely to happen within the time window that the protocol's security has to survive. The cheaper that computation gets, and the more effective parallelisation becomes, the more likely we are to see a need to scale up to 384 and 512 bits. For now, it is very safe to bet on 256.
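To put a rough number on "unlikely within the time window", a back-of-the-envelope sketch; the 10^18 guesses-per-second rate is purely an assumption picked for illustration:

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	// Hypothetical brute-force budget: 10^18 key guesses per second,
	// a generous figure for a very large parallel attacker.
	const guessesPerSecond = 1e18
	const secondsPerYear = 365.25 * 24 * 3600

	for _, bits := range []float64{128, 256} {
		keys := math.Exp2(bits)
		years := keys / guessesPerSecond / secondsPerYear
		fmt.Printf("%.0f-bit keyspace: ~%.2e years at %.0e guesses/s\n",
			bits, years, guessesPerSecond)
	}
}
```

Even at that rate, 128 bits works out to on the order of 10^13 years, and 256 bits to around 10^51.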

There have been several recent instances of people messing around with dice-roll entropy to generate bitcoin addresses and having them almost immediately hacked. Computing private keys by iterating through the counting numbers will pick up a short seed in very few iterations.
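A quick way to see how small a dice-roll keyspace can be: n rolls of a fair die give 6^n possible seeds, about 2.58 bits per roll, so anything much short of ~99 rolls falls well below a 256-bit target and is trivially enumerable. A small Go sketch of that arithmetic (just my illustration):

```go
package main

import (
	"fmt"
	"math"
	"math/big"
)

func main() {
	// Keyspace of a seed built from n rolls of a six-sided die: 6^n.
	// A brute-forcer enumerating small seeds covers the short ones almost
	// instantly; only large n approaches a real security level.
	for _, rolls := range []int64{10, 32, 64, 99} {
		space := new(big.Int).Exp(big.NewInt(6), big.NewInt(rolls), nil)
		bits := float64(rolls) * math.Log2(6)
		fmt.Printf("%2d rolls: 6^%d = %s possible seeds (~%.0f bits)\n",
			rolls, rolls, space.String(), bits)
	}
}
```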

The purpose of Password-Based Key Derivation Functions (is HKDF a term used in papers now?) is to increase the amount of time required to test each candidate. It is like raising the difficulty on proof of work: it works on the same basis as everything I've discussed previously, except for the number of repetitions and the expansion of the output values. (PBKDFs like Argon2 include a memory-usage factor that adds memory hardness to the derivation, similar to how the old Ethereum PoW used a huge dataset, which was essentially a type of salt, to lower the optimization potential.)
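To make the "repetitions plus memory" idea concrete, here's a minimal Go sketch using Argon2id from golang.org/x/crypto/argon2; the parameters (3 passes, 64 MiB, 4 lanes) are placeholders for illustration, not a recommendation:

```go
package main

import (
	"crypto/rand"
	"fmt"

	"golang.org/x/crypto/argon2"
)

func main() {
	password := []byte("correct horse battery staple")

	// Per-password random salt, stored alongside the derived key / ciphertext.
	salt := make([]byte, 16)
	if _, err := rand.Read(salt); err != nil {
		panic(err)
	}

	// Illustrative parameters: 3 passes, 64 MiB of memory, 4 lanes, 32-byte output.
	// More passes and more memory mean more cost per password guess.
	key := argon2.IDKey(password, salt, 3, 64*1024, 4, 32)
	fmt.Printf("derived key: %x\n", key)
}
```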

FYI, Argon2d won that contest, but I've heard from cryptographers I trust that the older scrypt is better (https://eprint.iacr.org/2016/989, "Scrypt is Maximally Memory Hard") and that the contest happened while the field was in flux, before we were as good at evaluating these kinds of systems. So I chose scrypt for encrypting your nostr nsec under a password in the gossip client.
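Not gossip's actual code, just a Go sketch with golang.org/x/crypto/scrypt to show the shape of the call; the parameters (N=2^18, r=8, p=1) and the passphrase are placeholders:

```go
package main

import (
	"crypto/rand"
	"fmt"

	"golang.org/x/crypto/scrypt"
)

func main() {
	password := []byte("my nsec passphrase")

	// Random per-encryption salt, stored next to the ciphertext.
	salt := make([]byte, 16)
	if _, err := rand.Read(salt); err != nil {
		panic(err)
	}

	// Illustrative parameters: N=2^18 (CPU/memory cost), r=8, p=1, 32-byte key.
	// Bigger N means more memory and more time per password guess.
	key, err := scrypt.Key(password, salt, 1<<18, 8, 1, 32)
	if err != nil {
		panic(err)
	}
	fmt.Printf("symmetric key for the nsec ciphertext: %x\n", key)
}
```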


Discussion

yeah, I used argon2 for my `signr` https://mleku.online/git/signr but its parameters are pretty hardcore. It takes about 3 seconds on my 2021 Ryzen 5 laptop, about the same delay as my dm-crypt passwords (the GRUB unlock typically takes around 8 seconds).

I wrote a memory-, cache- and processor-hard hash function for my work on parallelcoin. I designed it to do about 10 iterations per second, and that rate was pretty flat across several different CPUs, because it's based on very-large-integer long division.
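I don't know the actual parallelcoin hash, but a toy Go sketch of the general shape (iterate, widen the state into very large integers, long-divide, fold back down) might look something like this; all names and parameters here are made up for illustration:

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"math/big"
)

// divHashToy is a toy work function in the spirit described above: it repeatedly
// performs long division on very large integers derived from the input. This is
// NOT the parallelcoin hash, just an illustration of the general shape.
func divHashToy(input []byte, rounds, widthBytes int) []byte {
	sum := sha256.Sum256(input)
	state := sum[:]
	for i := 0; i < rounds; i++ {
		// Stretch the state into a very wide numerator and a narrower divisor.
		numerator := new(big.Int).SetBytes(expand(state, widthBytes))
		divisor := new(big.Int).SetBytes(expand(state, widthBytes/2))
		divisor.SetBit(divisor, 1, 1) // make sure the divisor is never zero
		quotient, remainder := new(big.Int).DivMod(numerator, divisor, new(big.Int))
		// Fold the division result back into a fixed-size state.
		next := sha256.Sum256(append(quotient.Bytes(), remainder.Bytes()...))
		state = next[:]
	}
	return state
}

// expand repeats hashing to build n pseudo-random bytes from a seed.
func expand(seed []byte, n int) []byte {
	out := make([]byte, 0, n)
	block := seed
	for len(out) < n {
		h := sha256.Sum256(block)
		block = h[:]
		out = append(out, block...)
	}
	return out[:n]
}

func main() {
	digest := divHashToy([]byte("example input"), 200, 1<<14) // 16 KiB-wide numerator
	fmt.Printf("%x\n", digest)
}
```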

Makes me think I should use it in signr: add a config option and make the long-division-based one the default, so that anyone already using it doesn't get locked out of their keychain. A job for the weekend.

Large-integer long division is the hardest operation, with the least variation across all the available hardware. CPUs have 64-bit words, and division circuitry takes up about a quarter of the die. ARM and GPUs can do it too, but GPUs have less die area devoted to it, and both take two cycles to process 64 bits.

I'm too busy scratching out a living to have the luxury of solving what I consider the big problems, but maybe in a year or two I'll really have some freedom to play.