How can we make the use of web2 data in web3 actually private and verifiable?


Many people who claim that web3 is the new internet define it with the phrase “read, write, own.” The “read” and “write” parts are clear, but when it comes to “own” in terms of data, we hardly own anything today.

User data is often stolen by corporations and used in ways that benefit them; we don’t truly own anything on the internet.

However, we can’t just shift to a world where only web3 exists without sharing anything. No, we still need to share, but only what’s necessary.

As someone with a weaker passport, I’m stuck applying for e-visas and submitting endless details about myself to prove I’m eligible for specific visas. And I always end up asking myself:

• Why should I share my entire bank statement when they only need to confirm a specific income level?

• Why should I provide the exact hotel reservation instead of just proving I’ve booked a hotel in this country?

• Why do I have to submit my full passport details when all they need is to verify my permanent residence in my current country?

There are two main concerns here: services know far more than they need to, and the data you’re providing isn’t private. But how does this relate to security and privacy in crypto?

1. Web3 is not gonna make it without web2 data.

As most of you know, smart contracts essentially have no idea how much BTC, ETH, SOL, or any other asset costs. This task is delegated to oracles, which constantly post public data from the outside world to the smart contract.

In the Ethereum world, this role is almost monopolized by @chainlink, whose decentralized oracle networks ensure we don’t rely on a single node. And we really do need web2 data for many more use cases than just knowing the price of certain assets.
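To make the oracle part concrete, here’s a minimal Python sketch of how an off-chain script would read such a price feed with web3.py. It’s purely illustrative: the RPC URL is a placeholder, and the address is assumed to be Chainlink’s mainnet ETH/USD aggregator (check Chainlink’s docs for the current one).

```python
# Minimal sketch: reading a Chainlink price feed with web3.py (v6 API).
# The RPC URL is a placeholder; the feed address is *assumed* to be the
# mainnet ETH/USD aggregator -- verify it against Chainlink's documentation.
from web3 import Web3

RPC_URL = "https://eth.llamarpc.com"  # placeholder RPC endpoint
ETH_USD_FEED = "0x5f4eC3Df9cbd43714FE2740f5E3616155c5b8419"  # assumed ETH/USD feed

# Only the two functions we call from Chainlink's AggregatorV3Interface.
AGGREGATOR_ABI = [
    {"name": "decimals", "inputs": [], "type": "function", "stateMutability": "view",
     "outputs": [{"name": "", "type": "uint8"}]},
    {"name": "latestRoundData", "inputs": [], "type": "function", "stateMutability": "view",
     "outputs": [{"name": "roundId", "type": "uint80"},
                 {"name": "answer", "type": "int256"},
                 {"name": "startedAt", "type": "uint256"},
                 {"name": "updatedAt", "type": "uint256"},
                 {"name": "answeredInRound", "type": "uint80"}]},
]

w3 = Web3(Web3.HTTPProvider(RPC_URL))
feed = w3.eth.contract(address=Web3.to_checksum_address(ETH_USD_FEED), abi=AGGREGATOR_ABI)

decimals = feed.functions.decimals().call()
_, answer, _, updated_at, _ = feed.functions.latestRoundData().call()
print(f"ETH/USD ~ {answer / 10**decimals} (last updated at unix time {updated_at})")
```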

However, this only applies to public data. What if I want to securely connect my bank account or Telegram account and share sensitive information that isn’t publicly available but is private to me?

The first thought is: how can we bring this data onto a blockchain with proof that the private data is secure?

Unfortunately, it doesn’t work that way because servers don’t sign the responses they send, so you can’t reliably verify something like that in smart contracts.

The protocol that secures communication over a computer network is called TLS: Transport Layer Security. Even if you haven’t heard of it, you use it daily. For example, while reading this article, you’ll see “https://” in your browser’s address bar.

If you tried accessing the website over an “http://” connection (without the “s”), your browser would warn you that the connection isn’t secure. The “s” stands for “secure” and means the connection runs over TLS, which ensures privacy and prevents anyone from stealing the data you’re transmitting.
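If you’re curious what that looks like in code, here’s a minimal Python sketch using only the standard library: it opens a TLS connection to an arbitrary host and inspects the negotiated protocol and certificate. Nothing here is zkTLS yet, it’s plain TLS.

```python
# Minimal sketch: opening a TLS connection with Python's standard library and
# inspecting the negotiated protocol and server certificate. This is the same
# handshake your browser performs for any https:// URL.
import socket
import ssl

HOST = "example.com"  # any HTTPS-enabled host

context = ssl.create_default_context()  # verifies the certificate chain and hostname
with socket.create_connection((HOST, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("TLS version:", tls_sock.version())   # e.g. 'TLSv1.3'
        print("Cipher suite:", tls_sock.cipher()[0])
        cert = tls_sock.getpeercert()
        print("Certificate subject:", cert["subject"])
        # Note: the server never *signs* the application data it returns over
        # this channel, which is exactly the verifiability gap discussed next.
```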

2. The connection is already secure, so can’t we just carry that data over and use it in web3?

As I mentioned before, we face a verifiability problem: servers don’t sign the responses they send, so we can’t really verify the data.

Even if a data source agrees to share its data, the standard TLS protocol can’t prove its authenticity to others. Simply passing along a response isn’t enough: clients can easily alter the data locally, and sharing the full responses exposes everything in them, putting user privacy at risk.

One approach to the verifiability problem is an enhanced version of TLS called zkTLS.

The working mechanism of zkTLS is similar to TLS, with a few additions. Here’s how it works (a toy sketch follows the list):

• You visit a website through a secure TLS connection and send the required request.

• zkTLS generates a zk proof attesting to the data while revealing only the specific details the user wants to prove, keeping everything else private.

• The generated zk proof is then used by other apps to confirm and verify that the provided information is correct.
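Here’s the promised toy sketch of the selective-disclosure idea in Python. It’s not zero-knowledge and not real zkTLS (a real system proves statements about the actual TLS transcript inside a zk circuit); it only shows the shape of “commit to everything, reveal one field.” The response fields are hypothetical.

```python
# Toy sketch of "reveal only what you want to prove" using salted hash
# commitments. Real zkTLS uses zero-knowledge circuits over the TLS transcript;
# this only illustrates the selective-disclosure shape.
import hashlib
import json
import os

def commit_field(name: str, value: str, salt: bytes) -> str:
    return hashlib.sha256(salt + f"{name}={value}".encode()).hexdigest()

# 1. The prover has a (hypothetical) API response obtained over TLS.
response = {"account_id": "12345", "balance_usd": "8200", "full_name": "Alice Example"}

# 2. Commit to every field individually, then to the whole set of commitments.
salts = {k: os.urandom(16) for k in response}
commitments = {k: commit_field(k, v, salts[k]) for k, v in response.items()}
root = hashlib.sha256(json.dumps(commitments, sort_keys=True).encode()).hexdigest()

# 3. Reveal only one field (plus its salt); everything else stays hidden.
disclosed = ("balance_usd", response["balance_usd"], salts["balance_usd"])

# 4. The verifier checks the disclosed field against the published commitments.
name, value, salt = disclosed
assert commitments[name] == commit_field(name, value, salt)
print(f"Verified that {name} = {value} without seeing the other fields (root {root[:16]}...)")
```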

When I say zkTLS, I’m referring to projects building on this idea, and they take different approaches to data verifiability:

  1. TEE (Trusted Execution Environment)

  2. MPC (Multi-Party Computation)

  3. Proxy

Interestingly, each approach introduces its own set of unique use cases. So, how do they differ?

3. Why isn’t there a single standard for zkTLS? How are they different?

zkTLS isn’t a single technology because verifying private web data without exposing it can be approached from multiple angles, each with its own trade-offs. The core idea is to extend TLS with proofs, but how you generate and validate those proofs depends on the underlying mechanism.

As I mentioned before, the three main approaches are TEE-TLS, MPC-TLS, and Proxy-TLS.

TEE relies on specialized hardware, like Intel SGX or AWS Nitro Enclaves, to create a secure “black box” where data can be processed and proofs generated. The hardware ensures the data stays private and computations are tamper-proof.

In a TEE-based zkTLS setup, the TEE runs the protocol and attests to the TLS session’s execution and content. Trust therefore rests on the TEE itself, i.e. on its manufacturer and its resistance to attacks. This approach is efficient, with low computational and network overhead.

However, it has a major flaw: you have to trust the hardware manufacturer, and vulnerabilities in TEEs (like side-channel attacks) can break the whole system.
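For intuition, here’s a hypothetical Python sketch of that trust chain using plain Ed25519 signatures: the verifier trusts the vendor’s key, the vendor attests the enclave, and the enclave attests the claim about the TLS session. None of this mirrors a real SGX or Nitro attestation format; names and structure are illustrative.

```python
# Hypothetical sketch of the TEE trust chain: trust in the claim reduces to
# trust in the hardware vendor. Not a real SGX/Nitro attestation format.
from cryptography.hazmat.primitives.asymmetric import ed25519

# Keys that exist in the real setting: the vendor's root key and the enclave's key.
vendor_key = ed25519.Ed25519PrivateKey.generate()
enclave_key = ed25519.Ed25519PrivateKey.generate()

# 1. The vendor signs an "attestation" binding the enclave key to known enclave code.
attestation = (b"enclave_pubkey=" + enclave_key.public_key().public_bytes_raw()
               + b";measurement=zkTLS-v1")
attestation_sig = vendor_key.sign(attestation)

# 2. Inside the enclave, the TLS session runs and the enclave signs its claim.
claim = b"host=bank.example.com;statement=balance_usd>5000;result=true"
claim_sig = enclave_key.sign(claim)

# 3. A verifier only needs the vendor's public key to check the whole chain.
vendor_key.public_key().verify(attestation_sig, attestation)  # vendor vouches for enclave
enclave_key.public_key().verify(claim_sig, claim)             # enclave vouches for the claim
print("Chain verified: trust in the claim reduces to trust in the hardware vendor.")
```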

Proxy-TLS and MPC-TLS are the most widely adopted approaches due to their broader range of use cases. Projects like @OpacityNetwork and @reclaimprotocol, both built on @eigenlayer, leverage these models to ensure data security and privacy along with an additional layer of economic security.

Let’s see how secure these solutions are, which use cases zkTLS protocols enable, and what’s already live today.

4. What’s so special about MPC-TLS and Opacity Network?

During the TLS handshake (where a client and server agree on how to communicate securely by establishing shared encryption keys), the website’s role remains unchanged, but the browser does something different.

Instead of generating its own secret key, the browser works with a network of nodes to create a multiparty secret key via MPC. The handshake is then performed with this key on the browser’s behalf, ensuring that no single entity knows the shared key.

Encryption and decryption require cooperation among all nodes and the browser, with each party adding or removing its part of the encryption sequentially before data reaches or leaves the website. MPC-TLS provides strong security and can be distributed so that no single group holds all the power.
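Here’s a tiny Python sketch of the underlying idea: secret-sharing the session key so no single party holds it. Real MPC-TLS runs the handshake and record encryption as a joint computation without ever reconstructing the key in one place; below it’s reconstructed only to show that every share is required.

```python
# Toy sketch of the core MPC idea: split a session key into XOR shares so that
# no single node (or the browser alone) holds the whole key. Real MPC-TLS never
# reconstructs the key in one place; we do it here only to illustrate.
import os
from functools import reduce

KEY_LEN = 32
NUM_NODES = 3

# The browser and each node hold one share of the session key.
shares = [os.urandom(KEY_LEN) for _ in range(NUM_NODES + 1)]  # browser + nodes

def xor_all(chunks: list) -> bytes:
    return bytes(reduce(lambda acc, c: [a ^ b for a, b in zip(acc, c)], chunks, [0] * KEY_LEN))

full_key = xor_all(shares)
missing_one = xor_all(shares[:-1])  # leaving out any share yields a completely different value

print("Full key:        ", full_key.hex()[:16], "...")
print("Missing one share:", missing_one.hex()[:16], "...  (useless to an attacker)")
```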

Opacity Network enhances the classic @tlsnotary framework by adding safeguards to minimize trust issues. It employs multiple security measures like:

  1. On-chain verification of web2 account IDs

  2. Commit scheme

  3. Reveal scheme

  4. Random MPC-network sampling

  5. Verifiable log of attempts

Account IDs, being static in web2 systems, allow for proof by committee, where ten different nodes must confirm ownership. This links the account to a unique wallet and prevents a user from retrying with different wallets until they find a colluding node.
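As a side note, the commit and reveal schemes on that list are variations of a very standard primitive. Here’s a minimal Python sketch of hash-based commit-reveal; Opacity’s actual construction is more involved, and the committed “answer” below is purely illustrative.

```python
# Minimal commit-reveal sketch. Opacity's actual scheme is more involved; this
# only shows the primitive: a party commits to its answer before seeing
# others', and the later reveal must match the commitment exactly.
import hashlib
import os

def commit(answer: str, nonce: bytes) -> str:
    return hashlib.sha256(nonce + answer.encode()).hexdigest()

# Commit phase: the node locks in its answer without revealing it.
answer = "account_id=12345 belongs to wallet 0xABCD (illustrative)"
nonce = os.urandom(16)
commitment = commit(answer, nonce)

# Reveal phase: the node publishes (answer, nonce); anyone can check the binding.
assert commit(answer, nonce) == commitment
print("Reveal matches the earlier commitment; the node could not change its answer.")
```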


Opacity nodes operate within a TEE, making collusion almost impossible as long as the TEE is secure. Beyond TEEs, Opacity also uses EigenLayer to leverage an AVS, requiring nodes to restake 32 stETH, with immediate slashing for misconduct rather than the delays associated with cooldown periods.

You can see that Opacity uses both MPC and TEEs, but because MPC handles the zkTLS part while the TEE mainly secures the nodes themselves, the approach is still called MPC-TLS.

However, if the TEEs were compromised, a node could collude within the MPC. That’s one of the reasons an additional layer of economic security is needed to deter this behavior.

That’s also why Opacity is developing a whistleblower mechanism where users who can prove that a notary has acted improperly will be rewarded with a share of the penalty imposed on the notary’s stake.

Thanks to its ease of integration, security, and the privacy it offers, Opacity has attracted various protocols to integrate it into their products across the consumer, DeFi, and AI agent sectors.

The team from @earnos_io is developing a platform where brands can reward users for engagement or task completion. EarnOS uses Opacity’s tech to prove traits via popular apps without revealing personal info, letting brands target audiences while users keep privacy and earn rewards.

Opacity is also integrated into the @daylightenergy_ protocol, which is developing a decentralized electric utility network where users can earn rewards for contributing to clean energy solutions. Thanks to Opacity, users can prove energy device ownership on-chain without specialized hardware.

Opacity can even be integrated with AI agents, bringing more verifiability and transparency to a field that currently faces significant challenges. zkTLS was recently integrated into @elizaOS, allowing for verifiable AI interactions without privacy loss.

However, TEE-TLS and MPC-TLS are only two variations of zkTLS. There’s also a third, Proxy-TLS, with Reclaim Protocol being its most famous representative. So how does it differ technically from the other two, and which use cases does Proxy-TLS enable?

5. What’s so special about Proxy-TLS and Reclaim Protocol?

HTTPS proxies, common on the internet, forward encrypted traffic without accessing its content. The zkTLS proxy model works almost the same way, with a few additions:

• The browser sends requests to the website through a proxy, which also handles the website’s responses.

• The proxy sees all encrypted exchanges and attests to their authenticity, noting whether each is a request or response.

• The browser then generates a zk proof showing that it can decrypt this data with the session key, without revealing the key itself, and shares the result.

• This works because it’s nearly impossible to create a fake key that turns the data into anything sensible, so just showing you can decrypt it is enough.

Revealing the key itself would compromise all prior messages, including sensitive data like usernames and passwords, which is why a proof is used instead. Proxy-TLS is fast, affordable, and handles large data volumes well, making it ideal for high-throughput settings.
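Here’s a toy Python sketch of that core check using AES-GCM from the `cryptography` package: the proxy attests only to ciphertext, and a claim is verified by showing that the attested bytes decrypt, under a key the client knows, to a plaintext containing the claimed field. In real Proxy-TLS that decryption happens inside a zero-knowledge proof so the key is never handed over; below it is handed to the verifier purely for illustration.

```python
# Toy sketch of the proxy model's core check. The proxy only ever sees
# ciphertext; the client later shows that this exact ciphertext decrypts
# (under a key it knows) to a plaintext containing the claimed field.
# Real Proxy-TLS does this decryption inside a zero-knowledge proof.
import os
from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# --- TLS session between browser and website (simplified to one record) ---
session_key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
plaintext = b'{"hotel_booking": "confirmed", "guest": "Alice Example"}'
ciphertext = AESGCM(session_key).encrypt(nonce, plaintext, None)

# --- The proxy attests only to the opaque bytes it forwarded ---
proxy_attestation = {"direction": "response", "nonce": nonce, "ciphertext": ciphertext}

# --- Later, a verifier checks the claim against the attested ciphertext ---
def verify_claim(attestation: dict, key: bytes, claimed_substring: bytes) -> bool:
    try:
        recovered = AESGCM(key).decrypt(attestation["nonce"], attestation["ciphertext"], None)
    except InvalidTag:
        return False  # a wrong key cannot produce a validly authenticated plaintext
    return claimed_substring in recovered

print(verify_claim(proxy_attestation, session_key, b'"hotel_booking": "confirmed"'))   # True
print(verify_claim(proxy_attestation, os.urandom(16), b'"hotel_booking": "confirmed"'))  # False
```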

The majority of servers don’t restrict access based on varying IP addresses, making this method pretty widely applicable.

Reclaim Protocol uses Proxy-TLS for efficient data verification and employs proxies that can get past web2 firewalls, preventing large-scale proxy blocking.


The main problem here is collusion: if the user and the attestor collude, they can sign basically anything and act maliciously. To mitigate this, Reclaim selects a randomly chosen subset of validators, introducing randomness that blocks such exploits (a simple sketch of the idea follows below).
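Here’s an illustrative Python sketch of randomized attestor sampling: the committee is derived from a public seed (a hash of the request id and an epoch), so the requester can’t steer the proof toward a colluding attestor. The committee size and seed format are my assumptions, not Reclaim’s exact parameters.

```python
# Illustrative sketch of randomized attestor sampling: the committee is derived
# deterministically from a public seed, so it cannot be hand-picked by the user.
# Committee size and seed format are assumptions, not Reclaim's exact design.
import hashlib

ATTESTORS = [f"attestor-{i}" for i in range(20)]
COMMITTEE_SIZE = 5

def select_committee(request_id: str, epoch: int) -> list:
    seed = hashlib.sha256(f"{request_id}:{epoch}".encode()).digest()
    # Derive a deterministic but unpredictable-in-advance ordering of attestors.
    scored = sorted(ATTESTORS, key=lambda a: hashlib.sha256(seed + a.encode()).hexdigest())
    return scored[:COMMITTEE_SIZE]

print(select_committee("proof-request-42", epoch=1337))
```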

Reclaim uses an EigenLayer AVS to decentralize the validation of the data. EigenLayer operators can act as attestors, but they will need to deploy their own AVS to specify the attestation logic for their service.

Reclaim is a platform enabling unique use cases like importing ride-sharing data for transportation apps, bridging off-chain data for blockchain economics, verifying identities with national ID data, creating custom data solutions via developer tools, and more.

The Reclaim ecosystem is home to 20+ projects, but I’d like to highlight 4 of them in the money markets, digital identity, consumer, and hiring sectors.

@3janexyz is the first credit-based money market on Base, offering secured credit lines to crypto users by assessing their creditworthiness and future cash flows, using both on-chain and off-chain financial data.

3Jane uses Reclaim’s proxy model to verify credit data from VantageScore, Cred, Coinbase, and Plaid, ensuring privacy of this data.

Another use of credit scores with zkTLS is @zkme_’s zkCreditScore feature. It uses Reclaim Protocol to fetch your US credit score securely via zkTLS, letting zkMe check a user’s credit score and mint unique soulbound tokens (SBTs) that store this data.

Can there be any other use cases besides credit scores? Of course, there are.

Take @zkp2p as an example: a consumer goods marketplace that leverages Reclaim to verify users’ Ticketmaster data as well as their payments.

At the same time, @bondexapp, one of the most popular job boards in crypto, uses Reclaim to generate proofs of a profile’s work history, ensuring the data is real while remaining private and verifiable.

Looking at the use cases possible via zkTLS, the ability to verify TLS transcripts on-chain is already unlocking numerous new functionalities, allowing users to control their own data without needing permission from large corporations.

More importantly, zkTLS is made to ensure that your personal data is not used against you. So, where is this headed?

6. Is zkTLS here to stay?

There is still work to be done, but different zkTLS protocols are already introducing new use cases that redistribute power back to the users.

@Tim_Roughgarden on the a16z crypto podcast highlighted that zk proofs, proposed in 1985, only gained popularity with blockchain applications, thanks to hundreds of developers working to reduce proof size and costs.

And now, contributions from the blockchain industry are finding uses in other areas beyond just crypto itself.

I expect a similar story to play out with zkTLS: first implementation in web3, then expansion beyond it, because, as I said before, today we “read” and “write,” but we are hardly protected and hardly “own” even our own data.

Disclaimer:

  1. This article is reprinted from [Pavel Paramonov]. All copyrights belong to the original author [Pavel Paramonov]. If there are objections to this reprint, please contact the Gate Learn team, and they will handle it promptly.
  2. Liability Disclaimer: The views and opinions expressed in this article are solely those of the author and do not constitute any investment advice.
  3. The Gate Learn team does translations of the article into other languages. Copying, distributing, or plagiarizing the translated articles is prohibited unless mentioned.
