Ethereum developers Vitalik Buterin and Davide Crapis have proposed a payment and access model designed to let users interact with AI chatbots privately while still paying for services. They argue that current approaches can expose sensitive information through API calls that may be recorded, tracked, and potentially linked back to the user.
Buterin and Crapis say AI chatbot usage raises serious privacy concerns because users often share personal and sensitive information when interacting with providers. They contend that providers cannot ignore the issue as adoption grows.
They outline two common payment options. One is requiring users to sign in with an email address or pay by credit card. In that setup, every chatbot request can be tied to a real-world identity, enabling profiling and tracking and potentially creating legal exposure if logs are presented in court.
The second option is using blockchain payments for anonymity. Buterin and Crapis say this approach has drawbacks: users would need to pay on-chain for every request, which they describe as slow and costly. They also note that it creates a visible record of each message, making per-request privacy difficult because transaction histories can be tracked.
The developers’ proposal centers on a system where a user deposits funds into a smart contract once and then makes thousands of private API calls. Buterin and Crapis say this structure helps the provider verify that requests have been paid for, while reducing the need for the user to confirm their identity for each interaction.
Buterin and Crapis say the system would use zero-knowledge cryptography to prevent cheating and abuse. They describe zero-knowledge tools as enabling a user to prove that a statement is true without revealing the underlying information, such as their identity.
The model includes a mechanism called Rate-Limit Nullifiers (RLN), which is intended to allow anonymous requests while detecting attempts to cheat the protocol.
According to the proposal, the process begins when an account owner generates a secret key and adds funds to a smart contract. Those funds act as a buffer for API calls. The account owner funds the account once and then makes private calls using the deposited balance rather than submitting separate transactions for each API call.
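The deposit flow described above can be illustrated with a minimal sketch. The class name, field names, and the use of a SHA-256 commitment are illustrative assumptions, not details from the proposal; in the actual scheme only a cryptographic commitment to the secret key would appear on-chain.

```python
import hashlib
import secrets

class PrivateApiAccount:
    """Hypothetical sketch of the one-time deposit flow (names assumed)."""

    def __init__(self, deposit_units: int):
        # The account owner generates a secret key locally; only a
        # commitment to it would be published in the real scheme.
        self.secret_key = secrets.token_bytes(32)
        self.commitment = hashlib.sha256(self.secret_key).hexdigest()
        # Deposited funds act as a buffer for future API calls.
        self.balance = deposit_units

    def can_call(self, cost: int) -> bool:
        # Each private call spends from the deposited balance instead
        # of submitting a separate on-chain transaction.
        return self.balance >= cost

acct = PrivateApiAccount(deposit_units=1000)
print(acct.can_call(3))  # True while the buffer covers the cost
```

The point of the structure is that funding happens once, while the per-call checks are purely local balance accounting.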
The developers note that this creates an inherent limitation: a user can only make as many calls as they have deposited funds for. For each request, the protocol assigns a ticket index. The user must then produce a special proof, called a ZK-STARK, to show they are spending from the deposited funds and to account for any refunds they are entitled to, since AI requests may not all cost the same amount.
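The ticket-and-refund accounting can be sketched as follows. This is client-side bookkeeping only; the names (`TicketLedger`, `max_cost_per_call`) are assumptions, and in the real protocol a ZK-STARK would prove this accounting is honest rather than it being trusted as plain code.

```python
class TicketLedger:
    """Hypothetical client-side accounting for variable-cost requests."""

    def __init__(self, deposit: int, max_cost_per_call: int):
        self.deposit = deposit
        self.max_cost = max_cost_per_call
        self.next_ticket = 0   # the protocol assigns a ticket index per request
        self.refund_owed = 0   # calls cheaper than the maximum accrue refunds

    def spend(self, actual_cost: int) -> int:
        # Reserve the worst-case cost per ticket; since AI requests may
        # not all cost the same, the difference is tracked as a refund.
        assert actual_cost <= self.max_cost
        idx = self.next_ticket
        self.next_ticket += 1
        self.deposit -= self.max_cost
        self.refund_owed += self.max_cost - actual_cost
        return idx

ledger = TicketLedger(deposit=1000, max_cost_per_call=10)
first = ledger.spend(actual_cost=7)
print(first, ledger.refund_owed)  # ticket 0 is consumed; 3 units refundable
```

This also makes the inherent limitation concrete: once `deposit` no longer covers `max_cost`, no further tickets can be spent.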
The protocol also generates a unique nullifier for each ticket to prove usage and to identify attempts to reuse the same ticket index for multiple requests.
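The nullifier idea can be sketched with a simple hash construction. The SHA-256 derivation below is a stand-in assumption (real RLN-style schemes use zk-friendly hashes and proofs), but it shows the two properties the article describes: the same ticket index always yields the same tag, so reuse is detectable, while the tag alone does not identify the account.

```python
import hashlib

def nullifier(secret_key: bytes, ticket_index: int) -> str:
    # Deterministic per-ticket tag: same secret and index always produce
    # the same nullifier, making double-use detectable without revealing
    # which account spent the ticket.
    return hashlib.sha256(secret_key + ticket_index.to_bytes(8, "big")).hexdigest()

class Provider:
    """Sketch of provider-side double-spend detection (names assumed)."""

    def __init__(self):
        self.seen: set[str] = set()

    def accept(self, null: str) -> bool:
        # Reject any nullifier seen before: that ticket index was reused.
        if null in self.seen:
            return False
        self.seen.add(null)
        return True

sk = b"\x01" * 32
p = Provider()
print(p.accept(nullifier(sk, 0)))  # first use of ticket 0 is accepted
print(p.accept(nullifier(sk, 0)))  # reusing ticket 0 is rejected
```

The provider only ever stores opaque hashes, so the list of seen nullifiers reveals usage volume but not identities.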
Buterin and Crapis say abuse is not limited to double-spending. They also describe scenarios where users may try to break provider rules by sending harmful prompts, jailbreaks, or requests for illegal content such as weapon instructions.
To address these risks, the proposal adds a layer called dual staking. Buterin and Crapis describe it as subjecting the user’s deposit to strict mathematical rules while also applying provider policy enforcement.