Monetizing Honesty

3/18/23

DID (Decentralized ID) systems are a hot topic right now. They are needed for everything from free-to-play games, social media networks, and airdrops to governance systems like OP's Citizens' House. There is (rightly so) a lot of buzz and a variety of takes on how to approach this topic. I think this is one of the existential risks of DeFi, and I've written about that before (that sure aged well). I'm personally in favor of the Gitcoin Passport framework, which lets each user define their own calculation for Sybil resistance by identifying their stamps and then having an attestation system like I proposed a while ago compress long-form off-chain calculations into simple on-chain maps. The focus of these systems ends at being a proof of humanity per address. They are specifically attempting to solve Sybil resistance.

However, I think there is the potential to layer reputation systems like credit scores on top of Sybil resistance, and that potential is largely unexplored. For the sake of this discussion I'll limit reputation to a purely quantitative, monetary history of honesty rather than a history of competence, impact, or ideology. In that context, reputation is synonymous with trust. If we want to keep this purely quantitative, what can we objectively measure about something (e.g. code, people, institutions) that makes people trust it? Here are what I think are the three primary pillars of trust:

  • Social Attestation. When someone signs off on a code review, they are making an attestation. Systems like BrightID and Proof of Humanity are entirely social attestations. These systems are inherently subjective and usually gameable, so they are unreliable on their own, but much of the world is built entirely on attestation layers (your government ID, for example).
  • Time. The longer something has existed, the more time attackers have had to inspect it, identify a vulnerability, and attack it. The longer a person has been around, the more opportunities have arisen that might lure them into dishonesty. In a permissionless system, time proves doubts illusory in the same way that a $20 bill on a busy street is probably fake or glued to the floor the longer it sits there untaken.
  • Opportunity Cost of Honesty. Watch carefully how trust is used once granted. Any position of trust is exploitable to some degree; otherwise, what are you trusting? Honesty is the degree to which an actor is suboptimal at exploiting that trust. Yes, I'm aware how unintuitive that definition of honesty sounds, but it's as close as I've come to a formal one. Your unexploited trust value (UTV) is your maximum exploitable trust minus your exploited trust.

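The UTV definition above can be sketched in a few lines. This is a toy illustration, not a standard; the position names and amounts are made up:

```python
# Toy sketch of UTV: maximum exploitable trust minus exploited trust,
# summed over every position of trust an actor has held.
from dataclasses import dataclass

@dataclass
class TrustPosition:
    max_exploitable: float  # value the actor could have extracted
    exploited: float        # value the actor actually extracted

def utv(positions):
    """Unexploited trust value across an actor's history."""
    return sum(p.max_exploitable - p.exploited for p in positions)

history = [
    TrustPosition(max_exploitable=10_000, exploited=0),    # honored a trusted role
    TrustPosition(max_exploitable=2_500, exploited=500),   # partially extracted value
]
print(utv(history))  # → 12000
```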
These pillars stand together to form the base on which a credit system could be built. They are a subset of what Vitalik identified in his post on legitimacy.

The first two are straightforward to measure. The third is more like measuring carbon deltas for carbon credits, Sybil values, or MEV, because the maximum exploitable trust term is so subjective. With PoS, Sybil resistance is derived from the money you have. Conversely, your UTV is derived (in a sense) from the money you don't: the money you don't have because you chose to be honest. This is akin to the forgery-resistance score from Gitcoin Passport being floored at the amount you have spent or lost.

The fact that this is subjective (and potentially monetizable) will probably lead to competing standards for how to calculate it, much like credit agencies today. Your address will have the equivalent of an Experian score from multiple aggregators, each combining the three pillars above into a single number that anyone can query. Each DeFi protocol could choose to use a different score, a combination of scores, or create its own. Could this lead to abuse by an agency? Yes, but less than the current system, which isn't permissionless.
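As a toy illustration of such an aggregator, here is one possible way to fold the three pillars into a single queryable number. The weights, caps, and normalization curves are entirely my own assumptions; competing agencies would differentiate on exactly these choices:

```python
# Hypothetical aggregator score. All weights and saturation constants
# are illustrative assumptions, not a real scoring standard.
def trust_score(attestations, age_days, utv,
                w_attest=0.2, w_time=0.3, w_utv=0.5):
    # Normalize each pillar into [0, 1] with simple saturating curves.
    a = min(attestations / 10, 1.0)   # social attestations received
    t = min(age_days / 1825, 1.0)     # time pillar saturates at ~5 years
    u = utv / (utv + 10_000)          # diminishing returns on UTV
    return round(100 * (w_attest * a + w_time * t + w_utv * u), 1)

print(trust_score(attestations=6, age_days=730, utv=12_000))  # → 51.3
```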

But how do you bootstrap trust if no one will give you a position of trust to honor or exploit? You can't bootstrap a system like this on unsecured loans without first getting robbed by a few ten thousand Sybilooors. The capital-efficient answer today is to get involved in communities until someone is willing to grant you trust based on social attestation and time alone. That lacks a certain scalability, though. I think we can do better, even while remaining permissionless. You have to rely on some form of suboptimality; otherwise the system will be gamed for profit. So what forms of permissionless economic suboptimality are already on-chain? The few that come to mind for me are donations to public goods, unrealized losses on airdrops, suboptimal votes in the face of bribes, and delegated credit.

None of these are perfect. The first criterion requires money to bootstrap. The second and third may have ulterior selfish motives the system can't see. The fourth is basically an extension of social attestation and just begs the question. So how do you bootstrap trust in someone with nothing to give and nothing to lose? I have no good answer, but the above is the beginning of a formulation.

Why does it matter? Because this value can be integrated into tokenomic systems. I mentioned unsecured lending above, which is probably the first thing that comes to mind when you hear "credit system", but I think that's actually one of the worst use cases for this. To explain why, we first have to break DeFi collateral into two categories. The first is money used as collateral for good behavior but without a credibly neutral recipient. ETH staking is the largest example, but there are many others. The only criterion here is that the system has an objective, enforceable slashing condition; the money just needs to be forfeited by the staker, but it doesn't need to go anywhere. The second category is money used to reimburse someone. This includes all your money markets, collateralized stablecoins, insurance protocols, etc. It is more viable to use UTV as collateral for the former category. I wouldn't recommend it for the Ethereum base chain, because the data to calculate it lives on the base chain and the score is subjective, not credibly neutral.

So where does this get us? Take a PoS system (not DPoS, that's dumb). The purpose of such a system is to reach consensus on something. There are two necessary economic components. First, you have to be able to reward honest actors to incentivize participation. Second, each actor needs something at stake they can lose by being dishonest, so that bad actors decay out of the system over time. I've never seen a system designed so that the failure of dishonest actors is the main reward driver for the honest ones, so I assume any such system has a revenue stream attached to it. Now, could we use reputation credit/UTV for such a system in lieu of capital? It certainly satisfies the something-at-stake requirement. You just have to prevent collusion and prevent bad actors from joining faster than the dishonesty entropy removes them.
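The two economic components above can be sketched with UTV standing in for capital as the stake. The names, amounts, and slash fraction are illustrative assumptions:

```python
# Toy consensus round: reward honest validators, slash the UTV stake
# of dishonest ones so bad actors decay out of the set over time.
def apply_round(validators, reward, slash_fraction, honest):
    """Mutate each validator's UTV stake based on behavior this round."""
    for name in validators:
        if name in honest:
            validators[name] += reward                 # incentive to participate
        else:
            validators[name] *= (1 - slash_fraction)   # entropy of bad actors
    return validators

stakes = {"alice": 1_000.0, "bob": 1_000.0, "carol": 1_000.0}
apply_round(stakes, reward=10.0, slash_fraction=0.5, honest={"alice", "bob"})
print(stakes)  # carol's stake is halved; alice and bob earn rewards
```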

To prevent collusion I'll rely on what I call Bitcoin's/Satoshi's great insight: the failure rate of attacking a network scales super-linearly with the number of anonymous actors in that network. This is because the chance of leaking a secret scales exponentially with the size of the network, and there is a common Schelling point to unify against dishonest actors once a secret is exposed. This is the primary value of decentralization. Past a certain threshold it is safe and profitable to be honest, and risky and expensive to be dishonest. So the key to preventing collusion is to increase decentralization and protect the anonymity of actors. DeFi satisfies this.
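That insight can be made concrete with a toy model: if each conspirator independently leaks with some probability, the chance the collusion stays secret decays exponentially in the number of actors. The per-actor leak probability here is an arbitrary assumption:

```python
# Toy collusion model: an n-party conspiracy stays secret only if
# every member keeps quiet, so secrecy decays exponentially in n.
def secrecy_probability(n, p_leak=0.05):
    """Probability an n-party collusion leaks no secret."""
    return (1 - p_leak) ** n

for n in (2, 10, 50):
    print(n, secrecy_probability(n))
```

Past a modest group size, the odds of keeping the conspiracy quiet collapse, which is the super-linear failure rate claimed above.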

To solve dishonesty entropy we're just using a different collateral (which is still a scarce resource). Today we use capital for Sybil resistance. This does not provably satisfy the entropy condition, which is why plutocratic attacks on PoS systems are possible today. Given the source of this alternative collateral (historical honesty), UTV may prove to be a superior criterion for selecting honest consensus participants.

This actually creates a beautiful sort of flywheel. By acting honestly you build a reputation score. You can stake your reputation to promise certain services. By fulfilling your promise you generate both an income stream and further reputation, creating a compounding benefit to continued honesty. A world in which honest actors can build an income stream predicated on their reputation, and potentially bootstrapped through community service, is also a basis for UBI. It's a step closer to the type of world in which I'd like to live. A few disclaimers, though. First, nothing is ever fully proven trustworthy. Second, trust is not monotonically increasing: as systems, or the world with which they integrate, change, trust in them fluctuates.


P.S. I've been asked to plug an on-topic game from the EVMavericks called Layer Zero. The basic premise is to willingly enter into a contract with a group of people who can all openly rug it. Everyone puts a share of collateral into the pot to join, and anyone in the game can take the pot. If no one takes it, everyone is generating UTV at pot size × participants. Conceptually, a set of non-conspiring participants in a game like this can generate more in UTV than the money at stake. While UTV as defined above is scarce, if it can be generated faster than capital it could out-compete capital for shares of revenue streams because of this.
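The accounting works out like this, in a hedged sketch where the share size and player count are made up; each honest player's UTV grows by the full pot, so the group mints pot × participants in UTV from pot-sized capital:

```python
# Toy Layer Zero accounting: n players each stake `share`; anyone can
# take the pot. If nobody does, each player's UTV grows by the pot,
# so the group generates pot * n in UTV.
def group_utv_generated(share, participants, rugged=False):
    pot = share * participants
    if rugged:
        return 0  # the rugger exploited their maximum trust; no UTV minted
    return pot * participants  # each of n players forgoes the whole pot

print(group_utv_generated(share=100, participants=8))  # → 6400
```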

Their particular game only works because it can't be joined permissionlessly and the participants are in a tight social circle whose reputation in the community extends beyond the Machiavellian definition of reputation I'm after here (an ulterior motive). For a more permissionless variety you just have to nail down the random selection (just like the beacon chain). This could be turned into a seasons format, with players tiered among people of similar UTV. The result is a DeFi UTV factory that (if respected in a credit score) indirectly outputs revenue streams for participants.