Every spring, “quarterly recap” and “state of the market” reports start making the rounds—and so do big claims about crypto “adoption.” The tricky part is that most adoption headlines aren’t actually measuring people. They’re measuring activity.
This isn’t about dunking on reports (many are careful and useful). It’s about reading them with confidence. Below is a myth-busting framework for interpreting common on-chain metrics—what they can tell you, what they can’t, and what to verify before you accept a “users are up” conclusion. This is general education, not financial advice.
Why transactions and active addresses don’t equal “users”
When a report says “adoption is rising,” it often points to proxy metrics—numbers that stand in for something harder to measure. The most common are transactions, active addresses, new addresses, and fees. They’re real signals, but they aren’t the same as unique people.
Transaction count can mean many things: a person sending funds, an exchange moving funds internally, automated activity from bots, or one “user action” that creates multiple on-chain transactions. So “more transactions” can reflect growth, but it can also reflect changes in how services process transfers.
Active addresses generally refers to addresses that sent or received in a period. The limitation is right in the design: one person can control many addresses, and many people can interact through one address (for example, via custodial services). Address reuse policies, privacy practices, and wallet defaults can all change the number—without any change in real-world adoption.
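The gap between "active addresses" and "unique people" is easy to see with a toy example. The sketch below uses entirely synthetic data (the addresses, owners, and the custodian are made up for illustration): three addresses belong to one person, and one address is a custodial service fronting many customers, so the address count and the controller count diverge in both directions.

```python
# Synthetic illustration: "active addresses" counts labels, not people.
# One person can control several addresses, and one custodial address
# can serve many customers. All names here are hypothetical.

# Each record: (sender_address, receiver_address)
transfers = [
    ("addr_a1", "addr_b"),  # person A, wallet 1 -> custodian
    ("addr_a2", "addr_b"),  # person A, wallet 2 -> same custodian
    ("addr_a3", "addr_c"),  # person A, wallet 3 -> person C
]

# Naive metric: any address that sent or received in the period.
active_addresses = {a for pair in transfers for a in pair}
print(len(active_addresses))  # 5 "active addresses"

# Ground truth that on-chain data alone never gives you: who controls what.
owner_of = {
    "addr_a1": "A", "addr_a2": "A", "addr_a3": "A",
    "addr_b": "custodian (many customers)", "addr_c": "C",
}
unique_controllers = set(owner_of.values())
print(len(unique_controllers))  # 3 controllers behind 5 addresses
```

Five "active addresses," three actual controllers, and the custodial address could represent thousands of customers. The metric is not wrong, it just measures labels.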
New addresses can indicate fresh interest, but it can also be driven by services that generate a new deposit address per customer, or by users creating multiple addresses for organization or privacy. Think of it as “new labels,” not “new humans,” unless a methodology explicitly links it to a user model.
Fees (and fee revenue) can reflect demand for block space, but fees also rise with congestion and fall with efficiency improvements. A fee spike can be about market conditions, not necessarily a lasting expansion of the user base.
The context that changes everything: Layer 2s, batching, and incentives
Even if a metric is calculated correctly, the context can flip the interpretation. Three common examples show up again and again in adoption narratives.
Layer 2s and off-chain activity: If more activity moves to Layer 2 networks or other off-chain mechanisms, the main chain may show fewer transactions or lower fees—even while overall usage grows. The opposite can happen too: a change in how transactions settle back to a main chain can create visible bursts that look like “new demand.”
Batching and aggregation: Exchanges and payment services often bundle many transfers into fewer on-chain transactions to save costs. That can reduce transaction count while the number of customers served stays the same (or even increases). In other cases, technical changes can create more on-chain transactions per user action.
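The arithmetic behind batching is simple but worth making concrete. In this hedged sketch (the customer count and batch size are invented for illustration), a service processes the same number of customer withdrawals either as one on-chain transaction each or bundled into multi-output transactions:

```python
# Illustrative only: how batching shrinks on-chain transaction count
# while the number of customers served stays the same.

withdrawals = [f"customer_{i}" for i in range(100)]  # 100 customer withdrawals

# Unbatched: one on-chain transaction per withdrawal.
txs_unbatched = len(withdrawals)

# Batched: the service bundles withdrawals into multi-output transactions.
BATCH_SIZE = 20  # hypothetical; real batch sizes vary with fees and policy
txs_batched = -(-len(withdrawals) // BATCH_SIZE)  # ceiling division

print(txs_unbatched, txs_batched)  # 100 vs 5 transactions, same 100 customers
```

A 95% drop in transaction count with zero change in customers served: a headline reading only the transaction metric would call that contraction.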
Incentives and automation: Airdrops, points programs, and “quests” can motivate users (or scripts) to generate activity that’s temporary. Bots can inflate certain metrics without representing meaningful adoption. Responsible reports will usually discuss how they identify or exclude suspicious activity—if they can.
A checklist for evaluating any adoption report
Instead of asking, “Do I believe this adoption claim?” try asking, “What exactly did they measure, and what else could explain it?” Here’s a practical checklist you can use with any report or headline.
- Define the metric: Is “active addresses” send-only, receive-only, or both? Is the timeframe daily, weekly, or monthly?
- Look for methodology: Do they link to definitions, data cleaning steps, and known limitations?
- Check the unit of analysis: Are they counting addresses, transactions, entities (clustered addresses), or estimated users? Each tells a different story.
- Ask what’s included/excluded: Are exchange/internal transfers filtered? Are smart contract interactions treated differently from simple transfers?
- Watch for timeframe effects: A short window can capture a campaign-driven spike; a longer view may show reversion.
- Consider structural shifts: Did Layer 2 usage, batching behavior, or fee conditions change during the period?
- Seek corroboration: Do multiple independent providers show a similar pattern, using clearly described methods?
- Separate activity from adoption: A safe takeaway is often “network activity changed,” not “millions of new users arrived,” unless the report supports that leap.
If a report doesn’t provide definitions or limitations, you don’t have to discard it—but you should treat any adoption conclusion as tentative.
Sources
Consult these sources for neutral metric definitions and methodology notes, and to verify any specific adoption claim with date-stamped data. Before comparing numbers, confirm each provider's exact definition of "active addresses," "transactions," and any entity-clustering approach; methodologies differ, and results may not be directly comparable across sources.
- Coin Metrics — coinmetrics.io
- Chainalysis (research/resources) — chainalysis.com
- Glassnode (academy/resources) — glassnode.com
- Cambridge Centre for Alternative Finance — ccaf.io
- Investopedia (definitions) — investopedia.com