Since we are required to check for uniqueness of decoy picks within any given
ring, and since some decoy picks may fail due to unlock time or malformed EC points,
the wallet2 decoy selection code was building up a larger-than-needed *unique* set of
decoys for each ring, drawn according to a certain distribution *without replacement*.
After filtering out the outputs that it couldn't use, it then chose from the remaining
decoys uniformly at random, again *without replacement*.
The problem with this is that later picks are not independent of earlier picks, and
the later picks do not follow the intended decoy distribution as closely as the
earlier ones. To understand this intuitively, imagine that you have 1023 marbles.
You label 512 marbles with the letter A, 256 with the letter B, and so on, halving
each time, until you finally label one marble with the letter J. You put them all
into a bag, shake it well, and pick 8 marbles from the bag, but every time you pick
a marble of a certain letter, you remove all the other marbles with the same letter
from the bag. On the very first pick, the odds of picking a certain marble are
exactly what you would expect: you are twice as likely to pick A as B, twice as
likely to pick B as C, and so on. However, on the second pick, the odds of repeating
the first pick's letter are 0%, and the chances for every other letter are higher.
As you go down the line, your picked marbles will carry letters that would have been
increasingly unlikely had you not removed the other marbles. In other words, the
distribution of the later marbles will be more "skewed" in comparison to your
original distribution of marbles.
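
A quick way to see the skew is to simulate the marble experiment directly. Below is a small, self-contained C++ sketch (illustrative only, not wallet2 code) that repeats the 8-pick experiment many times and tallies which letter each pick position lands on:

    // Simulate picking 8 marbles from 1023 (512 A's, 256 B's, ..., 1 J),
    // removing all same-letter marbles after each pick, and tally which
    // letter each pick position lands on. Later positions drift away from
    // the intended 2:1 ratios.
    #include <cstdio>
    #include <random>
    #include <vector>

    int main()
    {
        constexpr int LETTERS = 10, PICKS = 8, TRIALS = 100000;
        std::mt19937_64 rng{42};
        long tally[PICKS][LETTERS] = {};

        for (int t = 0; t < TRIALS; ++t)
        {
            // counts[i] = remaining marbles labelled with letter 'A' + i
            std::vector<double> counts(LETTERS);
            for (int i = 0; i < LETTERS; ++i)
                counts[i] = double(1 << (LETTERS - 1 - i)); // 512, 256, ..., 1
            for (int p = 0; p < PICKS; ++p)
            {
                std::discrete_distribution<int> pick(counts.begin(), counts.end());
                const int letter = pick(rng);
                ++tally[p][letter];
                counts[letter] = 0.0; // remove every marble with that letter
            }
        }
        // The first row matches the intended distribution (~50% A, ~25% B, ...);
        // later rows shift probability mass toward the rarer letters.
        for (int p = 0; p < PICKS; ++p)
        {
            std::printf("pick %d:", p + 1);
            for (int i = 0; i < LETTERS; ++i)
                std::printf(" %c=%4.1f%%", 'A' + i, 100.0 * tally[p][i] / TRIALS);
            std::printf("\n");
        }
        return 0;
    }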
In Monero's decoy selection, this same statistical effect applies. It is not as dramatic,
since the distribution is not as steep and there are far more unique values to choose
from, but the effect *is* measurable. Because of the protocol rules, we cannot have
duplicate ring members, so unless that restriction is removed, we will never have
perfectly independent picking. However, since the earlier picks are less affected by
this statistical effect, the workaround that this commit offers is to store the order
in which the outputs were picked and to commit to this order after fetching output
information over RPC.
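
The shape of that workaround can be sketched as follows. This is an illustrative C++ sketch, not the actual wallet2 code: select_decoys, pick_output_index, output_is_usable, and the over-picking factor of 2 are all hypothetical stand-ins.

    #include <cstddef>
    #include <cstdint>
    #include <random>
    #include <vector>

    // Hypothetical stand-ins for wallet2's output picker and the RPC-backed
    // usability check (unlock time, valid EC point, etc.).
    static std::mt19937_64 stub_rng{1};
    static uint64_t pick_output_index() { return stub_rng() % 1000000; }
    static bool output_is_usable(uint64_t index) { return index % 10 != 0; }

    std::vector<uint64_t> select_decoys(std::size_t ring_size)
    {
        std::vector<uint64_t> picked_order; // unique picks, kept in pick order
        while (picked_order.size() < 2 * ring_size) // over-pick to survive filtering
        {
            const uint64_t idx = pick_output_index();
            bool duplicate = false;
            for (uint64_t seen : picked_order)
                duplicate = duplicate || (seen == idx);
            if (!duplicate)
                picked_order.push_back(idx);
        }
        // Commit to the original pick order after filtering out unusable
        // outputs: earlier picks follow the intended distribution more
        // closely, so the first ring_size usable picks form the ring,
        // rather than a uniform sample from all the survivors.
        std::vector<uint64_t> ring;
        for (uint64_t idx : picked_order)
            if (ring.size() < ring_size && output_is_usable(idx))
                ring.push_back(idx);
        return ring;
    }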
Multisig keys per-transfer were being wiped, but not erased, which led to ginormous
quadratic bloat as you performed more transfers and exports with the wallet.
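
A minimal sketch of the difference between wiping and erasing, assuming a per-transfer container of multisig scalars (the field name m_multisig_k, the struct, and the 32-byte key type here are stand-ins, not the exact wallet2 definitions):

    #include <array>
    #include <cstring>
    #include <vector>

    using multisig_k = std::array<unsigned char, 32>; // stand-in for a scalar

    struct transfer_details_sketch
    {
        std::vector<multisig_k> m_multisig_k; // assumed per-transfer key list
    };

    void wipe_only(transfer_details_sketch &td)
    {
        // Zeroes the secret bytes but leaves the entries in the vector, so
        // each export appends fresh entries on top of the zeroed ones and
        // the container grows without bound. (Real code would use a secure
        // wipe such as memwipe so the zeroing is not optimized away.)
        for (auto &k : td.m_multisig_k)
            std::memset(k.data(), 0, k.size());
    }

    void wipe_and_erase(transfer_details_sketch &td)
    {
        for (auto &k : td.m_multisig_k)
            std::memset(k.data(), 0, k.size());
        td.m_multisig_k.clear();         // the fix: also drop the entries
        td.m_multisig_k.shrink_to_fit(); // and release the capacity
    }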
Ensures both transfers and sweeps use a fee that's calculated
from the tx's weight. Using different logic could theoretically
enable distinguishability between the two types of txs. We don't
want that.
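
As a rough illustration, a weight-based fee computation has this shape (a sketch with illustrative parameter names; in the actual wallet the per-weight-unit fee and the quantization mask come from the daemon's fee estimate):

    #include <cstdint>

    // Fee = tx weight times the per-weight-unit fee, rounded up to the
    // fee quantization mask. Routing both transfers and sweeps through one
    // function like this keeps their fee logic indistinguishable.
    uint64_t calculate_fee_from_weight(uint64_t fee_per_weight_unit,
                                       uint64_t tx_weight,
                                       uint64_t quantization_mask)
    {
        if (quantization_mask == 0)
            quantization_mask = 1; // guard against a degenerate mask
        uint64_t fee = tx_weight * fee_per_weight_unit;
        fee = (fee + quantization_mask - 1) / quantization_mask * quantization_mask;
        return fee;
    }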
69de381 add a test for the long term weight cache (Boog900)
810f6a6 Fix: long term block weight cache. The cache was performing an incorrect calculation when adding a new block. (Boog900)