Doh sorry I meant "Samourai and Wasabi" 🤦♂️
I'm still blown away by the 2021 paper (https://arxiv.org/pdf/2109.10229.pdf), updated twice, about decentralized coinjoin, which states, to explain why it only studies Samourai (Whirlpool) and Wasabi:
"While the role of centralized mixing services like JoinMarket, where a trusted third party matches CoinJoin participants, has been studied in the past [ 16], decentralized wallet implementations have not yet been the focus of a comprehensive measurement study."
(It takes extreme, tortuous logic to conclude that JoinMarket is centralized, yet somehow this howling error remains.)
More recently, a new paper on address clustering came out:
https://arxiv.org/pdf/2107.05749.pdf
I haven't read it yet, so fair warning: it may be very interesting or not at all. But the researchers are pretty serious.
However, I find this comment of interest:
" Our extraction mechanism relies on change outputs revealed by the multi-input heuristic. This heuristic is effective in practice [15] and widely used, but vulnerable to false positives from techniques like CoinJoin and PayJoin that are intentionally designed to break the heuristic (e.g., [9, 23, 24, 26]). While we take measures to detect CoinJoin transactions and pre-existing cluster collapse, some errors can remain."
Notice how they completely fail to inform the reader of a difference between traditional coinjoin and payjoin that is *crucial* in this context: with payjoin, they will not (in the general case) have *any* way to know it has happened, and therefore no way at all to measure whether such a measurement error has occurred, whereas with traditional coinjoin this is emphatically not the case. Disappointing; I hate it when academics gloss over the failures of their method.
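To make the asymmetry concrete, here is a rough sketch of the kind of structural check an analyst can run for an equal-output coinjoin; the threshold and the transaction values are hypothetical, my own illustration rather than either paper's method. The point is that nothing comparable exists for a payjoin, which looks like an ordinary two-output payment.

```python
from collections import Counter

def looks_like_equal_output_coinjoin(tx, min_equal_outputs=3):
    """Rough structural check: a traditional coinjoin has a batch of
    equal-valued outputs. The threshold is illustrative, not taken from
    either paper."""
    _inputs, outputs = tx  # outputs: list of (address, value) pairs
    value_counts = Counter(value for _addr, value in outputs)
    _value, count = value_counts.most_common(1)[0]
    return count >= min_equal_outputs

# A Whirlpool-style coinjoin: five equal outputs -> easily flagged.
coinjoin = (["A", "B", "C", "D", "E"],
            [(f"out{i}", 5_000_000) for i in range(5)])

# A payjoin: one input from each party, two arbitrary-valued outputs.
# Structurally identical to a plain payment, so there is nothing to flag,
# and hence no way to measure how often the heuristic was fooled by it.
payjoin = (["sender1", "receiver1"],
           [("merchant", 7_345_210), ("change", 1_209_887)])

print(looks_like_equal_output_coinjoin(coinjoin))  # True
print(looks_like_equal_output_coinjoin(payjoin))   # False
```

Real chain-analysis tooling uses more elaborate templates than this, of course, but the asymmetry is the same: coinjoins leave a structural fingerprint, payjoins don't.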