Overview
Agnostic delivers a comprehensive web3 analytics ecosystem that transforms how developers interact with on-chain data through high-performance SQL interfaces and collaborative tools. Unlike traditional blockchain data solutions, it combines near real-time data across multiple chains with dramatically faster query performance and decoded on-chain data.
Near real-time data across Ethereum, Arbitrum, Base, and other major EVM chains, with announced query performance up to ~100× faster than traditional solutions
Comprehensive decoded on-chain data with announced ~90% coverage, eliminating the need for manual hex decoding
Multiple access methods (web app, desktop, PostgreSQL protocol, HTTP API) with native integration to BI tools
One-click API generation via Tailor for rapid implementation of data-driven applications
Collaborative notebook environment (DGN) for sharing analyses with both technical and non-technical stakeholders
Ideal for mid-to-senior level blockchain developers, researchers, and product managers who need efficient, cross-chain data analytics capabilities with minimal infrastructure overhead and maximum performance.
What Works Well 🚀
Lightning-Fast Queries Turn Minutes into Milliseconds
Agnostic's most impressive technical achievement is its query performance, which transforms blockchain data analytics from a patience-testing exercise into a responsive, interactive experience. Built on ClickHouse, a high-performance column-oriented database, the platform can deliver significantly faster query responses than traditional solutions, with internal benchmarks showing up to 100× speed improvements for certain workloads. Actual performance may vary depending on query complexity and dataset size.
This performance leap fundamentally changes what's possible in on-chain analytics workflows. Complex multi-join queries that would time out or require pre-aggregation in other platforms complete in under a second with Agnostic. This enables real-time monitoring and analysis that's simply not feasible with conventional blockchain indexing solutions.
For developers, this means the ability to monitor critical metrics like transaction flows, contract interactions, and user behaviors in near real-time without maintaining expensive custom indexing infrastructure. The sub-second query response also enables interactive exploration of data patterns, making it possible to investigate anomalies or optimize parameters on the fly rather than waiting minutes between hypothesis tests.
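As a rough illustration of the kind of query that stays interactive, consider a month-long scan-and-aggregate over decoded events. This is a sketch rather than a benchmark; table and column names follow the Ethereum mainnet schema used in the case study later in this review.
-- Illustrative sketch (not a benchmark): aggregate a month of decoded
-- swap events per day and pool, the kind of scan that stays interactive
-- on a column-oriented store like ClickHouse.
SELECT
    date_trunc('day', timestamp) as day,
    address as pool,
    COUNT(*) as swap_count
FROM
    agnostic__blockchain__ethereum_mainnet__decoded_logs
WHERE
    timestamp >= now() - INTERVAL 30 DAY
    AND signature = 'Swap(address,address,int256,int256,uint160,uint128,int24)'
GROUP BY
    day, pool
ORDER BY
    day DESC, swap_count DESC
LIMIT 100;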
Decoded Data Eliminates the Hex-Decoding Nightmare
A second major strength is Agnostic's comprehensive decoded on-chain data. The platform automatically decodes a large proportion of contract interactions (reportedly around 90% for supported protocols) into human-readable fields, though coverage may vary for less common or non-standard contracts.
This represents a considerable productivity improvement. Instead of writing custom decoders for each contract or protocol, you can immediately query meaningful data using familiar SQL syntax. For example, analyzing Uniswap V3 pool events becomes as simple as selecting from the appropriate table with human-readable field names rather than decoding position IDs and tick ranges from raw hex data.
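A minimal sketch of what this looks like in practice, using the decoded-logs table and the JSON field extraction conventions from the case study later in this review (argument positions follow that example):
-- Minimal sketch: decoded swap amounts are queryable directly, with no
-- manual hex decoding; argument positions follow the case study below.
SELECT
    timestamp,
    address as pool,
    JSONExtract(inputs, 'arg2', 'Int256') as amount0,
    JSONExtract(inputs, 'arg3', 'Int256') as amount1
FROM
    agnostic__blockchain__ethereum_mainnet__decoded_logs
WHERE
    signature = 'Swap(address,address,int256,int256,uint160,uint128,int24)'
    AND timestamp >= now() - INTERVAL 1 HOUR
LIMIT 10;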
The platform achieves this through an extensive library of mapped contract ABIs, transforming otherwise opaque blockchain data into structured, queryable information. This structured approach extends across the entire data model, with consistent schemas for common entities like blocks, transactions, and events across different chains, making cross-chain analytics straightforward through simple SQL UNION queries (demonstrated in the case study below).
For developers constantly switching between protocols and chains, this consistency reduces the context-switching overhead that typically plagues multi-chain development. Rather than learning different data models for each chain or indexer, you can apply consistent query patterns across ecosystems.
Open Architecture Prevents Vendor Lock-in
Agnostic's commitment to open-source principles extends throughout its entire stack, from the core query engine to datasets and collaborative tools. This approach provides flexibility and reduces the risk of vendor lock-in that plagues many blockchain data platforms.
All core components - AGX (SQL interface), DGN (notebooks), and the data schemas - are open-source and available on GitHub. Some datasets are accessible via public S3 buckets for offline analysis, though comprehensive historical data may require hosted access.
This means you can inspect the code, contribute improvements, or even self-host the entire platform if needed. The data itself is accessible through multiple channels: the web app (agx.app), desktop application, HTTP API, PostgreSQL protocol, or even public S3 buckets for offline analysis.
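For offline analysis, one hedged sketch of what S3-based access could look like is ClickHouse's s3() table function. The bucket URL and file layout below are placeholders, not Agnostic's actual paths; consult the documentation for the real dataset locations.
-- Hedged sketch: reading a public dataset dump with ClickHouse's s3()
-- table function. The bucket URL and layout are placeholders; check
-- Agnostic's documentation for the actual dataset locations.
SELECT COUNT(*)
FROM s3('https://example-bucket.s3.amazonaws.com/ethereum/blocks/*.parquet', 'Parquet');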
This architectural openness is particularly valuable for teams who need to ensure long-term data availability and control. Rather than being locked into proprietary access methods or closed data models, Agnostic provides standardized interfaces that integrate with existing analytics workflows and BI tools. The PostgreSQL compatibility is especially useful, allowing direct connection from tools like Grafana, Superset, or custom dashboards without any special adapters.
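As one hedged example, once Agnostic's PostgreSQL endpoint is registered as a Grafana data source, a panel query is plain SQL using Grafana's standard time-range macro (the table name follows the Ethereum examples later in this review):
-- Hedged sketch: a Grafana panel query over the PostgreSQL endpoint.
-- $__timeFilter is Grafana's standard time-range macro for SQL data
-- sources; Grafana expands it into a WHERE condition on the column.
SELECT
    date_trunc('hour', timestamp) as time,
    COUNT(*) as events
FROM
    agnostic__blockchain__ethereum_mainnet__decoded_logs
WHERE
    $__timeFilter(timestamp)
GROUP BY
    time
ORDER BY
    time;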
For teams building critical infrastructure, this flexibility to switch between hosted and self-hosted modes provides valuable redundancy and control. If you've ever been affected by API changes or service deprecations from blockchain data providers, Agnostic's open architecture offers a welcome alternative.
One-Click API Generation Bridges Analytics and Applications
Agnostic Tailor enables rapid conversion of supported SQL queries into GraphQL endpoints. While most common analytical queries are supported, highly complex or non-standard SQL may require additional adjustment.
The technical implementation is notably efficient: SQL queries with special column aliases (e.g., gql_arg_string_tokenAddress) automatically generate GraphQL schemas with appropriate argument typing. This means a query tested and refined in the SQL interface can be quickly exposed as a production API endpoint without additional backend development.
For teams juggling both analytics and user-facing applications, this capability can reduce development cycles. Instead of maintaining separate analytics databases and application APIs, Agnostic unifies these workflows into a single platform. A metrics dashboard that previously required a custom backend, API layer, and frontend can potentially be implemented more directly from analytics queries, reducing development time.
The resulting GraphQL APIs are deployed to regional endpoints with authentication via API tokens, making them suitable for production use rather than just prototyping. This production-readiness can help reduce the technical debt typically associated with analytics-driven features in blockchain applications.
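A minimal sketch of the aliasing convention follows; the argument and field names here are hypothetical, and a fuller worked example appears in the case study later in this review.
-- Minimal sketch of Tailor's aliasing convention (hypothetical names):
-- gql_arg_* aliases become typed GraphQL arguments, and gql_field_*
-- aliases become typed fields in the generated schema.
WITH
    evm_hex_decode(gql_arg_string_contractAddress) as contract_address
SELECT
    COUNT(*) as gql_field_int_eventCount
FROM
    agnostic__blockchain__ethereum_mainnet__decoded_logs
WHERE
    address = contract_address
    AND timestamp >= now() - INTERVAL 1 DAY;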
Improvement Opportunities 🌱
Limited DEX Protocol Coverage Requires Custom Analytics for Newer Exchanges
While Agnostic provides standardized data for major DEX protocols, its current coverage is limited to Uniswap V2, Uniswap V3, Curve, and forks that retain identical ABIs. Forks with modified ABIs may require custom integration. This represents only a subset of the constantly evolving DeFi exchange landscape.
The practical impact of this limitation becomes apparent when analyzing trading activity across a diverse protocol ecosystem. If your project integrates with or monitors newer DEX models, specialized venues, or chain-specific exchanges, you'll need to implement custom queries against raw event logs rather than using the convenient pre-structured trades tables.
For example, analyzing activity on Arbitrum-native exchanges or newer concentrated liquidity models requires working directly with the lower-level event logs rather than the more convenient trades tables. While entirely possible through Agnostic's comprehensive event logging, it requires additional development effort and introduces potential inconsistencies when comparing metrics across different exchange types.
The silver lining is that Agnostic's open-source approach means motivated teams can contribute new DEX decoders or create their own views. The platform provides all the raw event data needed to build these extensions, but teams should budget additional development time for exchanges not covered by the standard models.
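As a hedged sketch of that extra effort, an uncovered exchange can still be analyzed by filtering events on their full signatures. The signature below is a Uniswap V2-style Swap and stands in for whatever event the target protocol actually emits.
-- Hedged sketch: analyzing an exchange outside the standard trades models
-- by filtering on its event signature. The signature shown is a Uniswap
-- V2-style Swap; substitute the target protocol's actual event.
SELECT
    date_trunc('hour', timestamp) as hour,
    COUNT(*) as swap_count
FROM
    agnostic__blockchain__ethereum_mainnet__decoded_logs
WHERE
    timestamp >= now() - INTERVAL 24 HOUR
    AND signature = 'Swap(address,uint256,uint256,uint256,uint256,address)'
GROUP BY
    hour
ORDER BY
    hour DESC;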
Limited Historical Data for L2 Chains Impacts Analysis
If your protocol has a significant Layer 2 presence, you should be aware that Agnostic currently provides less historical data for networks like Arbitrum and Optimism (often covering only recent months) compared to the more extensive multi-year history available for Ethereum mainnet. This disparity creates challenges for comprehensive historical analysis across chains.
This limitation becomes particularly relevant for year-over-year comparisons or long-term trend analysis on L2 networks. For protocols that migrated to L2s early or maintain significant L2 footprints, this historical data gap may necessitate maintaining separate data solutions for comprehensive longitudinal analysis.
The constraint is understandable given the relative youth of many L2 ecosystems, but it creates a meaningful workflow gap for teams deeply invested in these networks. If historical pattern analysis is critical to your protocol's operation, you may need to implement a hybrid approach that combines Agnostic's recent data with historical data from other sources for a complete picture.
That said, for real-time monitoring and recent trend analysis — which represent the majority of day-to-day analytics needs — the available historical window may be sufficient. Teams focused on current operations rather than historical research will find the limitation less impactful.
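A practical first step is to measure the available window before committing to a longitudinal study. A minimal sketch against the Arbitrum transactions table used in the case study below:
-- Minimal sketch: check the available historical window for an L2 table
-- before building year-over-year analyses on top of it.
SELECT
    min(DATETIME) as earliest_row,
    max(DATETIME) as latest_row
FROM
    aws__public_blockchain_data__sonarx__arbitrum__transactions;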
Chain Support Expansion Requires Team Involvement
While Agnostic supports the major EVM chains where most DeFi activity occurs (Ethereum, Polygon, BSC, Base, and Arbitrum), its support for emerging L2/L3 networks requires custom implementation rather than being immediately available.
For protocols pursuing aggressive multi-chain strategies or exploring newer networks, this can create a gap between chain deployment and analytics capability. Rather than having immediate data access on new chains, teams must work with Agnostic to add support for these networks, potentially creating delays in monitoring capabilities.
The platform notes that additional chains can be added on demand, but this still introduces a dependency on an external timeline that may not align with aggressive deployment schedules. For protocols at the bleeding edge of multi-chain expansion, this represents a workflow limitation that requires advance planning.
However, for teams focused on the established EVM chains where most liquidity and user activity remains concentrated, the current coverage is more than sufficient. The supported chains represent the vast majority of DeFi TVL and transaction volume, making this limitation primarily relevant for early adopters of emerging L2/L3 solutions.
Real-World Application: Multi-Chain Liquidity Health Monitoring 🔍
The Challenge
A DeFi protocol operating across Ethereum and Arbitrum needs to implement comprehensive liquidity monitoring to detect anomalies, optimize parameters, and provide transparency to users. The system must track metrics across both chains with near real-time updates, generate alerts for concerning patterns, and expose key data through public APIs.
Implementation Process
Exploring Data Structure and Building Initial Queries
The first implementation step involves exploring Agnostic's data structure across both chains to build effective queries. For Ethereum, this means working with the agnostic__blockchain__ethereum_mainnet__decoded_logs table, while Arbitrum data comes from the SonarX datasets (sonarx.arbitrum.logs and sonarx.arbitrum.transfers), which appear in queries under names like aws__public_blockchain_data__sonarx__arbitrum__transfers.
This exploration reveals the slight schema differences between chains, requiring chain-specific query patterns. For example, monitoring Uniswap V3 pools on Ethereum can be implemented with a query like:
-- Example: Monitoring trading volume on Uniswap V3 USDC/WETH pool
WITH
evm_hex_decode('0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640') as usdc_weth_pool_address,
18 as weth_decimals,
6 as usdc_decimals
SELECT
date_trunc('hour', timestamp) as hour,
COUNT(*) as swap_count,
SUM(ABS(JSONExtract(inputs, 'arg2', 'Int256')) / pow(10, weth_decimals)) as eth_volume,
SUM(ABS(JSONExtract(inputs, 'arg3', 'Int256')) / pow(10, usdc_decimals)) as usdc_volume
FROM
agnostic__blockchain__ethereum_mainnet__decoded_logs
WHERE
timestamp >= now() - INTERVAL 24 HOUR
AND address = usdc_weth_pool_address
AND signature = 'Swap(address,address,int256,int256,uint160,uint128,int24)'
GROUP BY
hour
ORDER BY
hour DESC;
Note: 💡 Queries on large datasets like Arbitrum transfers may require optimization in production environments. Consider adding additional filters (like block number ranges), indexes, or LIMIT clauses to improve performance.
While Arbitrum requires a different table structure:
-- Example: Monitoring token transfers on Arbitrum
SELECT
date_trunc('hour', DATETIME) as hour,
COUNT(*) as transfer_count,
SUM(VALUE) as total_value,
COUNT(DISTINCT FROM_ADDRESS) as unique_senders,
COUNT(DISTINCT TO_ADDRESS) as unique_receivers
FROM
aws__public_blockchain_data__sonarx__arbitrum__transfers
WHERE
DATETIME >= now() - INTERVAL 7 DAY
AND LOWER(TOKEN_ADDRESS) = LOWER('0xFF970A61A04b1cA14834A43f5dE4533eBDDB5CC8')
GROUP BY
hour
ORDER BY
hour DESC;
Note: 💡 The table structure for Arbitrum follows a different naming pattern than Ethereum mainnet tables. When implementing in production, you'll need to account for these structural differences between chains.
The implementation requires accommodating these structural differences while maintaining consistent metrics across chains.
Building Cross-Chain Dashboard for Comprehensive Monitoring
With basic queries established, the next step involves creating a comprehensive cross-chain monitoring dashboard using Agnostic's DGN notebook environment. This entails developing queries that normalize and combine data from both chains to provide unified metrics:
-- Example: Cross-chain transaction comparison (replace X with the desired lookback in days)
-- Note: the Ethereum CTE counts decoded log events as an activity proxy, while the
-- Arbitrum CTE counts transactions, so the two series are only approximately comparable.
WITH
eth_txs AS (
SELECT
date_trunc('hour', timestamp) as hour,
COUNT(*) as tx_count,
'Ethereum' as chain
FROM
agnostic__blockchain__ethereum_mainnet__decoded_logs
WHERE
timestamp >= now() - INTERVAL X DAY
GROUP BY
hour
),
arb_txs AS (
SELECT
date_trunc('hour', DATETIME) as hour,
COUNT(*) as tx_count,
'Arbitrum' as chain
FROM
aws__public_blockchain_data__sonarx__arbitrum__transactions
WHERE
DATETIME >= now() - INTERVAL X DAY
GROUP BY
hour
)
SELECT * FROM eth_txs
UNION ALL
SELECT * FROM arb_txs
ORDER BY hour DESC, chain;
Note: 💡 Cross-chain comparisons may require performance optimization due to the large data volumes involved. Consider partitioning queries by time ranges, adding additional filters, or using Agnostic's caching capabilities for frequently accessed metrics.
The dashboard includes visualizations for key metrics like swap counts, volumes, unique users, and price movements across both chains, providing at-a-glance assessment of protocol health.
Implementing Alerting System with GraphQL APIs
For real-time monitoring, developers could leverage Agnostic Tailor to create GraphQL APIs from analytical queries, enabling programmatic access for custom alerting systems. Here's how such an implementation might look:
-- Example GraphQL API design (conceptual - not tested)
WITH
evm_hex_decode(gql_arg_string_poolAddress) as pool_address,
gql_arg_int_token0Decimals as token0_decimals,
gql_arg_int_token1Decimals as token1_decimals,
swaps as (
SELECT
timestamp,
JSONExtract(inputs, 'arg2', 'Int256') as amount0,
JSONExtract(inputs, 'arg3', 'Int256') as amount1,
1 / (
pow(1.0001, JSONExtract(inputs, 'arg6', 'Int32')) / pow(10, token0_decimals - token1_decimals)
) as price
FROM
agnostic__blockchain__ethereum_mainnet__decoded_logs
WHERE
timestamp >= now() - INTERVAL gql_arg_string_timeWindow
AND address = pool_address
AND signature = 'Swap(address,address,int256,int256,uint160,uint128,int24)'
),
aggregated as (
SELECT
date_trunc('hour', timestamp) as time_interval,
COUNT(*) as swap_count,
AVG(price) as avg_price,
MIN(price) as min_price,
MAX(price) as max_price,
SUM(ABS(amount0) / pow(10, token0_decimals)) as token0_volume,
SUM(ABS(amount1) / pow(10, token1_decimals)) as token1_volume
FROM
swaps
GROUP BY
time_interval
)
SELECT
time_interval as gql_field_string_timeInterval,
swap_count as gql_field_int_swapCount,
avg_price as gql_field_float_avgPrice,
min_price as gql_field_float_minPrice,
max_price as gql_field_float_maxPrice,
token0_volume as gql_field_float_token0Volume,
token1_volume as gql_field_float_token1Volume,
max_price / min_price - 1 as gql_field_float_priceVolatility
FROM
aggregated
ORDER BY
time_interval DESC;
Note: 💡 The query above represents a conceptual design for the GraphQL API. According to the platform, Agnostic Tailor is available only to Enterprise customers, so this query has not been directly tested. When implementing in production, work with the Agnostic team to properly configure the GraphQL endpoints.
Potential Implementation Extension
Once these GraphQL endpoints are created with Agnostic Tailor, developers could build external systems to consume this data. For example, a custom Node.js service could poll these endpoints at regular intervals, store the results, and implement business logic to do the following (a SQL sketch of the baseline comparison appears after this list):
Compare current metrics against historical baselines
Trigger alerts for significant deviations (e.g., >5% price swings, >50% volume drops)
Detect unusual activity patterns based on custom thresholds
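Much of the baseline logic can also be pushed into SQL itself. A hedged sketch (thresholds and names are illustrative) that surfaces recent hours whose swap counts deviate sharply from the trailing seven-day average:
-- Hedged sketch: surface recent hours whose swap count deviates more than
-- 50% from the trailing 7-day hourly average. Thresholds are illustrative.
WITH hourly AS (
    SELECT
        date_trunc('hour', timestamp) as hour,
        COUNT(*) as swap_count
    FROM
        agnostic__blockchain__ethereum_mainnet__decoded_logs
    WHERE
        timestamp >= now() - INTERVAL 7 DAY
        AND signature = 'Swap(address,address,int256,int256,uint160,uint128,int24)'
    GROUP BY
        hour
)
SELECT
    hour,
    swap_count,
    baseline,
    swap_count / baseline - 1 as deviation
FROM (
    -- compute the 7-day average before filtering to the recent window,
    -- so the baseline reflects the full week
    SELECT hour, swap_count, avg(swap_count) OVER () as baseline
    FROM hourly
)
WHERE
    hour >= now() - INTERVAL 24 HOUR
    AND abs(swap_count / baseline - 1) > 0.5
ORDER BY
    hour DESC;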
Expected Benefits
The monitoring system described would provide several key advantages:
Near real-time liquidity monitoring across both chains
Automated detection of anomalous liquidity events
Faster identification and response to liquidity issues compared to manual monitoring
Unified cross-chain liquidity dashboard for both team and community visibility
Public GraphQL API enabling community integration and transparency
Cross-chain monitoring would be particularly valuable for identifying arbitrage opportunities and potential MEV exposure that might be missed by chain-specific monitoring systems.
Final Verdict 🤔
Agnostic represents a significant advancement in web3 analytics capabilities, offering considerable value for blockchain development teams through its combination of performance, flexibility, and decoded data. The platform's SQL-based approach and multi-chain support address many painful aspects of blockchain data analysis, reducing the time and resources required for effective protocol monitoring, research, and optimization.
For mid-to-senior level developers working on established chains like Ethereum, Arbitrum, and Base, Agnostic could be an important addition to the development toolkit. The time saved through fast queries on decoded data, combined with the reduction of custom indexer maintenance, creates potential ROI for teams of all sizes. The platform's open architecture and PostgreSQL compatibility allow it to integrate with existing workflows rather than becoming another siloed tool.
Teams focused on emerging L2/L3 networks or highly specialized protocols may need to supplement Agnostic with additional solutions in the short term, given the current limitations in chain coverage and specialized protocol data models. However, the platform's open-source foundation makes it adaptable to these specific needs for teams willing to contribute extensions.
In a landscape where many blockchain data solutions involve tradeoffs between performance, flexibility, and accessibility, Agnostic offers a promising alternative. For web3 teams looking to streamline their analytics workflows and reduce time spent wrestling with hex-encoded data, it presents a compelling option that can transform data analysis from a technical burden into a strategic advantage.
Technical Details
Pricing: Free tier available on agx.app; enterprise pricing for additional features like Tailor (GraphQL API generation)
Access Requirements: Web application (agx.app), desktop application, or API access with authentication tokens for enterprise users
Supported Networks: Ethereum, Polygon, BSC, Base, Arbitrum with additional chains available on demand
Learning Curve: Low-to-Medium - Requires SQL knowledge and basic understanding of blockchain data models, but eliminates complex decoder development and custom indexing
Documentation Quality: Comprehensive - Available through documentation site with query examples, schema definitions, and API references
Integration Options: Web app, desktop application, HTTP API, PostgreSQL protocol, GraphQL endpoints (via Tailor for enterprise users)
Support: Community forums for free tier, dedicated support channels for enterprise customers