Today, I don’t plan to dive too deep into technical aspects. Instead, I want to discuss a social issue we face in the crypto space. The title of this talk is “Social Consensus and Self-Regulation.” Let me start by asking: has anyone here heard of the “Lemon Problem”? Does this term sound familiar?
Alright, not too many people, not really.
In American slang, a “lemon” refers to an unreliable car—one that you didn’t know was unreliable beforehand. I’m not entirely sure about the origin of this term, but that’s what “lemon” means.
On the other hand, a good, reliable car is called a “peach.” I actually didn’t know this myself until I looked it up—kind of a cute contrast.
The Lemon Problem is essentially an issue faced by used car dealerships. When you go to a used car market, it often feels a little sketchy because you don’t know whether you’re buying a “peach” or a “lemon.” This is also a major problem in the crypto industry today—everything may look like a “peach,” but in reality, many protocols turn out to be “lemons.”
So when you buy a car or use a crypto protocol, there’s always a certain probability that it’s a “peach” and a certain probability that it’s a “lemon.” The question is: How much are you willing to pay for it? What is the expected value—the weighted average price—you’d be willing to pay for something that could either be a “peach” or a “lemon”?
The price you’re willing to pay essentially follows a weighted average concept. There’s a certain probability that it’s a “lemon,” multiplied by the value of a “lemon.” There’s a certain probability that it’s a “peach,” multiplied by the value of a “peach.”
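The weighted-average idea can be written out in a few lines. This is just the arithmetic from the paragraph above, with illustrative numbers I've made up for the example:

```python
# Akerlof-style "market for lemons" expected value, with illustrative numbers.
p_peach = 0.5          # buyer's believed probability the car is a "peach"
value_peach = 10_000   # what a known-good car ("peach") is worth to the buyer
value_lemon = 2_000    # what a known-bad car ("lemon") is worth to the buyer

# The most a rational buyer will pay is the probability-weighted average.
willingness_to_pay = p_peach * value_peach + (1 - p_peach) * value_lemon
print(willingness_to_pay)  # 6000.0 — above a lemon's value, below a peach's
```

Note that 6,000 is more than a lemon is worth, which is exactly the seller's incentive discussed next.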
Intuitively, you might think that the price you’d be willing to pay falls somewhere between what you’d pay if you knew for sure it was a “peach” and what you’d pay if you knew it was a “lemon.” But why is this a weird dynamic? Why are we even talking about fruit?
So what incentive does this create for used car dealers? What is your incentive if you know that everyone will pay a price somewhere between a “peach” and a “lemon”?
Your incentive is to only sell lemons, right? If people are willing to pay more than a “lemon” is actually worth, you have no reason to sell “peaches.” You can just sell “lemons” to them and profit.
This is essentially what we call a scam.
I want to pause here because this is a huge issue in the crypto industry today—the Lemon Problem.
Currently, in crypto, because of this Lemon Problem, the probability of finding a “peach” has actually gone down. Fewer people are willing to build “peaches” because they are expensive to produce, while “lemon” dealers flood the market. They see an opportunity: “Wow, I can just sell ‘lemons’ to people who are willing to pay more than they are worth because they mistakenly believe they’re buying ‘peaches.’” As a result, users are losing confidence and participating less in the ecosystem, which is completely understandable.
At this point, I can already hear some of you—or at least an imaginary critic—saying:
“This is just the price of permissionless systems. We have to accept the good with the bad. It’s like the ‘30% crypto discount’—you just have to deal with it.”
But the Lemon Problem is not a one-time cost—it’s a death spiral.
When trust declines, it becomes harder for “peaches” to outperform “lemons.” Eventually, “peaches” exit the market, and all that remains are “lemons”—which is not a good place to be.
That’s why we need to find ways to help consumers identify “lemons.” Because if we don’t, Gary will—and he’s already working hard on it. This is why I advocate that if we want to uphold the spirit of innovation in crypto while addressing the Lemon Problem, we need some form of self-regulation.
Now, let’s compare this to industries that have successfully tackled similar issues—this might be a controversial discussion.
Okay, what am I talking about?
So am I saying that the crypto space is a casino?
No, I’m saying the crypto industry is actually worse than a casino.
At the very least, we need to be as good as a casino. If crypto is going to succeed, we need to adopt the things that casinos do well. I think that’s worth a closer look, and it’s what I’m going to talk about next.
Casinos are known for their emphasis on fairness and security, and they actively promote it. Why? They go to great lengths to prove the casino is not rigged—except, of course, in the one way it openly is: the house edge.
Let me give you a few examples. This is an automatic card shuffler. Why do they use this instead of having the dealer shuffle and deal the cards manually?
Because they want to prove to you that you’re not being cheated—at least, not in any way beyond the built-in house edge. They want to show you verifiable randomness.
They ban cheaters and share cheater information with other casinos. Why are they willing to gang up on cheaters? If I were at the Flamingo (a casino in Las Vegas) and I discovered a cheater, why would I share this information with a competitor like the Wynn?
Casinos use precision dice calipers to ensure that dice weight is evenly distributed. All of these measures exist to convince consumers that they are not being scammed. You may be playing against the odds, but at least the game itself is fair.
The government and casinos actually invest together in making casinos safe. People forget that casinos are a fully legal and rapidly growing industry. Ethereum is projected to generate $2 billion in fees this year—meanwhile, the global casino industry is on track to reach $300 billion in revenue.
Marketing security is something casinos do exceptionally well in collaboration with governments. They persuade regulators that making casinos safe benefits everyone.
Okay, how does this work? It’s a virtuous cycle: more trust leads to more investment in fairness and security, which in turn builds more trust.
And we need to achieve this in a decentralized way. One thing we all know—yet I haven’t heard mentioned even once this week—is three letters: FTX. Nobody is talking about it. We love to pretend it was just a bad dream. But in reality, bad actors completely eroded trust in the entire ecosystem—not just among their victims, but for everyone.
We have the technology to prove security and legitimacy—we just need to implement it at the social layer. So, let’s give this week’s obligatory mention—zero-knowledge (ZK), right? It’s a term we’re all familiar with.
We have the ability to prove integrity—to verify identity, reputation, and computational correctness.
The problem is not the technology. We keep attending these conferences, constantly talking about technical solutions—but part of the real issue lies in social consensus and ideology.
We already have the capability to create new forms of social consensus that focus on protecting applications and users. We need to accept that this is something we must do—we need to self-regulate before others step in to regulate us.
Right now, we tend to take an extreme ideological stance—either completely permissionless or entirely permissioned. It’s often seen as black or white, all or nothing.
But in reality, there is a broad spectrum of social consensus between these two extremes.
Let me give you an example of what ZK and related research could unlock—something that challenges traditional ideological thinking. Imagine a liquidity pool where only token holders who can prove the legitimacy of their funds (through a third-party verification system) are allowed to participate. This model can be both permissionless and permissioned at the same time. I can create a pool with these rules, and you have the freedom to decide whether or not to participate.
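As a minimal sketch of that pool idea—every name here is hypothetical, not an actual Aztec or Ethereum API, and the ZK proof is stubbed out as a simple attestation check:

```python
# Hypothetical sketch of an opt-in "permissioned" pool: anyone is free to
# create or ignore such a pool (permissionless), but deposits require a
# legitimacy attestation (permissioned). In a real ZK system the attestation
# would be a zero-knowledge proof verified on-chain; here it is stubbed as
# a set of pre-verified addresses.

class LegitimacyVerifier:
    """Stand-in for a third-party proof-of-funds verifier (hypothetical)."""
    def __init__(self, attested_addresses):
        self.attested = set(attested_addresses)

    def is_attested(self, address):
        return address in self.attested


class OptInPool:
    def __init__(self, verifier):
        self.verifier = verifier
        self.balances = {}

    def deposit(self, address, amount):
        # The pool's social-consensus rule: only attested funds may enter.
        if not self.verifier.is_attested(address):
            raise PermissionError(f"{address} has no legitimacy attestation")
        self.balances[address] = self.balances.get(address, 0) + amount


verifier = LegitimacyVerifier({"alice"})
pool = OptInPool(verifier)
pool.deposit("alice", 100)       # allowed: alice holds an attestation
try:
    pool.deposit("mallory", 50)  # rejected: no attestation
except PermissionError as err:
    print("rejected:", err)
```

The point of the design is that the restriction lives inside one pool's rules, not in the protocol itself—users who dislike the rule simply use a different pool.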
This introduces the idea of voluntary paternalism—where a shared social consensus within a given community (like the people in this room) decides on the rules for safe operation, while users still retain the ability to choose whether they engage with it. It’s not a binary, black-or-white choice where any form of permission—whether social or democratic—is deemed unacceptable.
Another example is the concept of decentralized “cleaning” providers, something that Vitalik and our co-founder Zach Williamson have been exploring. This model introduces a social graph where individuals verify the legitimacy of funds and transactions. Users can observe behaviors and collectively decide, “This is not something we want to be associated with.” This is fundamentally different from centralization and radically different from censorship. Instead, it represents a democratic form of social consensus, where we, as a community, decide what behaviors we will not tolerate within our ecosystem.
The goal here is not to restrict freedom, but to give users more choices in expressing their preferences across different protocol designs.
So, ZK enables permissionless functionality at the base layer, while at the application layer, it allows for permissioned social consensus.
There are already many examples of this in action—discussions around proof of reserves, anti-phishing mechanisms, opt-in compliance pools, and legitimacy proofs for funds.
But ultimately, what we’re saying is: We need to turn zachXBT into ZK—we need to replace reliance on trust and centralized compliance with mathematical proofs and social consensus.
To summarize, we need ZK to enable three major advancements:
First, we need self-regulation and compliance while preserving user choice. As a community and an ecosystem, we haven’t had a real conversation about self-regulation. We’ve mostly just been hoping and praying that regulators won’t notice us.
But Web3 will not succeed if we allow this to continue. We need to prove—to both regulators and users—that we take care of each other and that we take care of our users.
Second, freedom through choice, not ideology. We shouldn’t force ideology onto users. Instead, let’s give them the choice to decide where they want to participate. That’s what this space is ultimately about—freedom and autonomy.
Finally, we need to improve security and reliability, and make crypto a necessity rather than an option. We forget that governments are, at least supposedly, made up of voters. Why were Uber and Airbnb once illegal and now legal? Because people walked up to the steps of Congress and said, “You can’t take my Uber away from me.” That actually happened—I don’t know if you remember it.
One way we can make crypto a necessity and woven into the fabric of our economic lives is by ensuring it is reliable and secure, and that we support our users.
This is how we turn lemons into peaches.
This edition features a video from BlueYard Capital published on YouTube: “Jon Wu (Aztec) @ If Web3 is to Work… A BlueYard Conversation”
Original video link: https://www.youtube.com/watch?v=o17GnPJXxgU&t=244s