At its core, EO is an open infrastructure platform that empowers developers to build secure blockchain oracles backed by Ethereum's battle-tested security model. EO creates a foundation for specialized data services that combine deep domain expertise with unmatched cryptoeconomic security.
Smart contracts are powerful tools for executing transparent, immutable logic. However, to reach their full potential, they need reliable access to real-world data. Blockchain oracles serve as the critical bridge between blockchain systems and external data sources, enabling smart contracts to interact with the world beyond their native blockchain.
Trust minimization is achieved through a decentralized network of 100+ stake-backed operators, an immutable data layer, and a coordination system that lets unrelated entities participate and contribute while putting stake at risk, all through infrastructure designed to maintain trusted, reliable operation of decentralized data services.
The EO stack provides a comprehensive platform that anyone can use to build specialized blockchain oracle services without permission or gatekeeping. Domain experts can leverage our infrastructure to deliver high-quality data solutions for their specific use cases, focusing on what they do best. With EO handling the complex security and coordination challenges, builders can freely innovate within their expertise – whether it's financial data, real-world events, or computational services – and deploy their solutions immediately.
This open architecture creates a vibrant marketplace where specialized blockchain oracle services thrive through competition and innovation. The ecosystem benefits from a rich diversity of secure data solutions, each optimized for specific use cases while maintaining rigorous security standards. This permissionless approach unlocks Web3's true potential by giving users and applications access to an ever-expanding landscape of secure, specialized data services – all protected by Ethereum-grade security.
Choose your path in the EO ecosystem through three distinct roles:
Blockchain Oracle Validated Services (OVS) enable domain experts to build specialized blockchain oracle solutions using EO's secure infrastructure.
ePrice is EO's flagship product, providing secure and reliable price feeds for decentralized applications. Built with robust security mechanisms and availability across all major networks, ePrice offers permissionless integration for any application requiring trusted price data.
Operators form the backbone of EO's decentralized network, validating data and maintaining network security. By staking ETH through EigenLayer, operators earn rewards while contributing to the ecosystem's security and reliability.
How does the eOracle protocol compare to a typical oracle?
Today's blockchain oracle landscape is dominated by a small group of companies that must build and maintain everything: infrastructure, security, node networks, and data solutions. This "do-it-all" approach has created an unsustainable bottleneck in blockchain innovation.
When diving deep into solving the blockchain oracle problem, two layers of complexity emerge that make any centralized solution fundamentally insufficient.
First is the inherent complexity of data itself. Each domain demands its own specialized understanding. Financial data requires sub-second updates and sophisticated anomaly detection. Social media data demands extreme throughput and ability to track emerging patterns in real-time. Risk and cyber analysis require complex prediction models and automated decision-making. Each domain comes with its own patterns, failure modes, and expertise requirements. A solution perfect for one becomes fundamentally flawed for another.
But that's only half the challenge. Building infrastructure for trustless data delivery - with high reliability, consistent uptime, and true decentralization - requires solving deep problems in cryptography, game theory, and distributed systems. The technical complexity here rivals that of the base layer blockchains themselves.
Complex coordination problems of this scale cannot be solved by any single entity. Instead, they require the power of free market innovation, where diverse participants can contribute their specialized knowledge and compete to deliver the best solutions. Just as no single provider can solve every blockchain's unique challenges, no single blockchain oracle provider can possess all the expertise needed to bring the world's data on-chain.
EO introduces a fundamental shift: separating blockchain oracle services into distinct layers:
The security layer (EO) handles infrastructure and cryptographic guarantees
The data layer (OVS) enables specialists to focus purely on their domain expertise
This separation represents the natural evolution of a maturing market - where different participants can specialize and perfect their part of the solution, resulting in better outcomes for everyone.
The EO stack creates the foundation for a marketplace of blockchain oracle services where specialized knowledge can flourish and compete. Through this marketplace, the blockchain oracle problem will solve itself through the distributed efforts of countless builders and innovators, secured by Ethereum itself.





This section provides a comprehensive overview of EO's smart contract infrastructure. Our design creates a modular and programmable platform that supports new blockchain oracle development through flexible operator management, diverse data types, and customizable aggregation methods.
EO Chain contracts
Target Contracts

The Aggregation Library is a collection of smart contracts that provides reliable on-chain data aggregation for various use cases. Each method addresses specific data scenarios to maintain accuracy and integrity in decentralized environments.
Considerations for choosing the proper aggregation:
Time variance. Since time is continuous, fetchers access the source at slightly different times. We do not expect these differences to be significant; in particular, they should not exceed one second, which is the rate of our blockchain I/O.
Simplicity. This is crucial both for runtime considerations and for explaining to users how our mechanism works (KISS/Occam's razor).
(Honest) mistakes. Although we have incentive mechanisms to reward or punish fetcher behavior, mistakes such as downtime and latency are unavoidable. These can happen even when fetchers are honest, and should therefore be accounted for.
Malicious behavior. Our solution should be as robust as possible to attacks. Namely, it should minimize the consequences of an attack and facilitate punishing attackers.
For detailed explanations, please refer to Robust Aggregation 101
For a short example of an aggregation method, please refer to Median
A core feature of PoS protocols is the ability to implement transparent and objective rewards and penalties mechanisms. The objective is to incentivize data validators to maintain data quality and accuracy and discourage attempts to manipulate the blockchain oracle.
The fundamental prerequisites to enable a quality mechanism are:
Inaccurate data reports should be detectable
Honest mistakes and malicious manipulation attempts should be distinguishable
Faults should be objectively attributable (which is trivial in the canonical case where data validators sign their data reports)
Metrics on data quality should be defined
EO chain serves as a robust infrastructure to achieve these abilities as:
Operators are stake-backed
Data validators' past activity is transparent and recorded in an immutable manner
Prepared on-chain components to support rewards distribution and slashing
For a fundamental analysis of blockchain oracle cryptoeconomic security and key considerations behind slashing in a subjective environment, please refer to:
🥢 On-chain Subjective Slashing Framework
Cryptoeconomic Security

This section outlines the Principal Token (PT) pricing methodologies supported by EO. It introduces three foundational pricing models and discusses a hybrid approach designed to balance stability and market responsiveness. The guidelines provided are general; specific implementation details depend significantly on the issuer's protocol design and the underlying AMM architecture.
PT pricing involves two primary components: selecting a price source and defining the pricing model. Price sources may include trading data or fixed APY assumptions. The three core pricing models are: linear discounting, zero-coupon bond, and AMM-based pricing.
Deterministic models rely solely on time-based formulas, independent of external price feeds. They offer predictability and manipulation resistance, making them ideal for low-liquidity environments.
Linear Discount
Zero-Coupon Bond (ZCB) Pricing

The first step in the data processing flow is defining what off-chain data the blockchain oracle service needs to submit on-chain and implementing the appropriate operator code.
Define and implement data validator execution logic - Write the DV code that retrieves the required data. Data here refers broadly to any computation output, including fetching information from APIs, querying databases (blockchains included), and calculating statistics on account balances.
Transaction format - Data validators must pack their computation results into a transaction structure before sending them to the EO chain. Data reports may include additional fields such as timestamps, proofs of computation, etc. Transactions are signed using either BLS or ECDSA.
Sending transaction to EO chain - The EO chain is EVM-compatible, supporting the standard transaction formats used by common libraries like ethers.js and web3.js.
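The packing step above can be sketched as follows. This is a minimal illustration, not EO's actual report format: the field names are hypothetical, and SHA-256 stands in for the keccak256 hashing and BLS/ECDSA signing that a real data validator would perform before submitting the transaction.

```python
import hashlib
import json

def pack_report(feed_id: int, value: int, timestamp: int) -> bytes:
    """Serialize a data report deterministically before signing.
    Field names and layout here are illustrative, not EO's actual format."""
    payload = {"feed_id": feed_id, "value": value, "timestamp": timestamp}
    # Sorted keys + compact separators give a canonical byte encoding,
    # so every validator hashing the same report produces the same digest.
    return json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()

def report_digest(packed: bytes) -> str:
    # Stand-in digest; in production the transaction would be hashed with
    # keccak256 and signed with BLS or ECDSA before submission to EO chain.
    return hashlib.sha256(packed).hexdigest()

packed = pack_report(feed_id=1, value=314159, timestamp=1700000000)
digest = report_digest(packed)
```

Canonical serialization matters here: if two honest validators encode the same observation differently, their signatures attest to different bytes and cannot be aggregated.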
After the data validator code is established, the EO Wrapper facilitates interaction with the EO chain:
Retrieving OVS configuration, including update notifications
Sending reports to the EO chain with minimal latency
Publishing Prometheus metrics to monitor the off-chain component's health
Key considerations
Data validators typically run identical logic. This redundancy achieves decentralization and ensures that malicious behavior can be easily validated on-chain. However, replication creates coordination challenges: the data validator task should scale easily across all OVS operators.
In certain scenarios, computation integrity evidence can be recorded on-chain to verify the actions of data validators. This process helps in validating data accuracy and quality.
EO addresses the Blockchain Oracle problem from the first principles. As the first blockchain oracle built on EigenLayer, we believe true innovation in blockchain data requires:
Decoupling complex problems into layers
Solving each layer with dedicated, high-quality solutions
Maintaining uncompromising security standards
Current blockchain oracle solutions, managed by a few centralized organizations, create stagnation in blockchain innovation. Domain experts from diverse data fields are unable to bring their valuable insights on-chain due to the complexity of building secure blockchain oracle infrastructure.
The EO stack changes this by providing production-ready infrastructure components. Data experts can now build custom blockchain Oracle solutions focused on their domain expertise while EO handles the security and infrastructure complexities. Through blockchain Oracle Validated Services (OVS), specialized data solutions can be built and deployed without compromising on security or decentralization.
These docs guide you through the core concepts of the OVS framework, introducing the components and features of the EO stack and outlining how to start building.
EO contracts on the target chain: managing publication, verification, and data usage.
EOFeedManager: The entry point for submitting feeds to the target chain, holding the latest feeds EO has published to the chain.
https://github.com/eodata/target-contracts/blob/develop/src/EOFeedManager.sol
EOFeedVerifier: Used by the feed manager to verify the integrity of the payload and the signatures of the witnesses who signed it.
https://github.com/eodata/target-contracts/blob/develop/src/EOFeedVerifier.sol
EOFeedAdapter: This smart contract exposes an interface identical to Chainlink's contracts (the industry standard of having a dedicated contract per feed) on the client's side. The adapter is needed because, while our novel batching method reduces costs, all feeds are managed by the feed manager.
https://github.com/eodata/target-contracts/blob/develop/src/adapters/EOFeedAdapter.sol
Blockchain Oracle Validated Services (OVS) are decentralized blockchain Oracle protocols built on top of EO. The OVS framework introduces a new paradigm for blockchain Oracle design, focusing on separating infrastructure from specialized data solutions, creating a mature market with a clear separation of concerns.
By providing a platform for diverse participants to create specialized blockchain oracles, EO aims to foster a more competitive market that enhances the quality and reliability of data available to smart contracts. This approach not only democratizes access to data services but also ensures that blockchain applications can benefit from the expertise of a wide range of data and computation providers.
For a deeper understanding of the vision and necessity behind OVS, please refer to
Blockchain Oracles enhance blockchains by integrating off-chain data while maintaining the core properties of blockchains. Blockchain oracles eliminate dependence on a central third party by introducing redundancy through a distributed reporting system.
A decentralized blockchain oracle needs to address the following questions:
Sources - What are the qualitative sources for this type of data? What off-chain computation should be transmitted to the blockchain?
Aggregation - What is the optimal method for aggregating validator reports into a single value? Considerations include resistance to manipulation (robustness), performance, simplicity, and more.
Incentive Management and Security - How can validator participation be assessed and appropriately rewarded or penalized? In particular, how can misreports be detected and distinguished between honest mistakes and malicious manipulation attempts?
Answering these questions requires expertise in various domains, including data science, cryptography, game theory, and the specific field from which the data originates. We do not want a cryptographer performing poor data science nor a data scientist executing inadequate cryptography.
Potential OVS builders include:
Blockchain Oracle-First Companies - Teams that build specialized blockchain oracle services using EO's infrastructure.
Blockchain Oracle-Dependent Products - Projects that need custom blockchain Oracle solutions not currently available.
Internal Blockchain Oracle Needs - Teams looking to connect their existing operations on-chain through blockchain Oracle integration.
Next, the key features of the EO stack are explored, which teams can use to build their OVS.
Prior to examining the OVS framework and architecture, we present a high-level overview of the data processing flow through the EO chain.
Data processing consists of three stages:
Data validators obtain data from sources using WebSockets, APIs, or perform some general off-chain computations. They sign this data and send it as a transaction to the EO-chain.
Chain validator nodes receive the signed reports, verify the reporter's identity, and aggregate the verified reports using dedicated schemes. This computational aggregation process and its result become an immutable part of the EO-chain.
EO Cryptographic broadcaster bridges data feeds from the EO-chain to various target chains. This involves monitoring EO-chain events in real-time, validating feed data against predefined criteria, signing data, and submitting verified feed updates to target chains.
All of the above actors are stake-backed EO/EigenLayer operators.
For more detailed information, please refer to
The following sections describe how builders can use and configure EO's features to meet their OVS needs.
This section covers the process of designing and building an OVS on EO. It outlines the essential components developers need to configure and develop during integration.
The OVS design and deployment process consists of four parts:
Set up - Register as an OVS on EO chain, specify operator prerequisites, and define data feeds and sources.
Off-chain Components - Develop the logic for the data validator to retrieve data and define the update format. Use the EO Wrapper to retrieve the OVS configuration from the EO chain, and publish Prometheus metrics to monitor the health of the off-chain component.
On-chain components - Design aggregation logic and incentive management related contracts including slashing, rewards distribution, and so on.
Target chains publishing - Configure the EO broadcaster to set target chain update parameters, triggering conditions, and verification schemes.
Target chain updating - Teams should establish their publishing requirements according to their data reporting needs and specific use cases:
Deciding on exact data to be delivered
Setting triggering parameters
Chain-specific batching strategies and optimizations
Data consumption interface - EO implements a Chainlink-compatible interface, enabling seamless transitions between blockchain oracle providers.
Verification scheme - The EO broadcaster supports BLS, ECDSA, and additional cryptographic techniques for bridging data cross-chain and proving integrity. While the canonical broadcaster flow is recommended, different use cases may require different points on the performance-security tradeoff curve.
Key considerations
Performance-Security Tradeoff Analysis for Optimized Verification Per Use Case
Gas minimization through appropriate data compression
End users experience
For more information, see
The EO stack is production-ready, with a phased approach to permissionless building. EO's flagship product ePRICE, focused on price feeds, is operational alongside EtherOracle, the peg-securing blockchain oracle.
Established companies are building on EO to decentralize their existing products, while new teams leverage our stack to build from day one. Both benefit from focusing on their domain expertise without compromising on security and infrastructure design.
The potential for OVS spans across blockchain innovation. From zkTLS blockchain oracles validating encrypted web data to prediction market dispute mechanisms, any decentralized consensus-based process can be built using the EO stack.
Let's explore the current projects building on EO.
Price feeds make or break DeFi - too vital to run on promises alone. ePRICE represents a fundamental shift in price feed design, built on the core principles of restaking and cryptoeconomic security. Where traditional solutions rely on reputation and centralized trust, ePRICE ensures reliability through transparent architecture and Ethereum-based security guarantees.
DeFi protocols depend on price feeds as their source of truth, creating two fundamental challenges that ePRICE addresses head-on through innovative design and robust infrastructure.
The core challenge of price feeds lies in the fundamental nature of crypto markets: trading occurs simultaneously across multiple venues, prices vary between markets, and liquidity shifts continuously. At its heart, this creates a complex price discovery problem where market manipulation must be distinguished from legitimate price movements.
A robust price indexing model must maximize responsiveness to legitimate market-wide price movements while minimizing reactions to temporary or isolated price anomalies.
The ePRICE Approach: Our solution combines continuous data collection from multiple high-quality sources with sophisticated indexing algorithms that adapt to market behavior and liquidity changes in real-time. Each price feed is tailored through dedicated DeFi research, considering the unique characteristics and trading patterns of different assets. This market-aware approach ensures our price discovery remains accurate even as market dynamics evolve.
The most accurate price data becomes worthless if it can't be reliably delivered, especially during market stress periods when price feeds are most critical. A secure blockchain oracle infrastructure must maintain true decentralization while ensuring no single point of failure can compromise the system.
The ePRICE Approach: As the first blockchain oracle built on the EO Stack, ePRICE leverages a decentralized, modular system designed specifically for high-stakes data delivery. Our network of 130+ validators, backed by over $2M in restaked ETH, ensures data integrity through real-time performance tracking and sophisticated outlier detection. Each price update flows through a transparent process where operators submit signed data to our proof-of-stake chain, achieving consensus before being cryptographically broadcast to target chains.
ePRICE combines battle-tested infrastructure with sophisticated price discovery to deliver a new standard in blockchain oracle security and reliability. Built on Ethereum's security through restaking, our solution offers permissionless integration for any protocol requiring trusted price data.
This approach combines AMM TWAP pricing with a protective price floor. The floor can be implemented in two ways:
ZCB Floor: Sets a lower bound using the ZCB model with a relatively high APY.
Linear Discount Floor: Leveraging linear discount's tendency to underprice the market value.
This approach allows protocols to choose between stronger price protection (linear) or better market alignment (ZCB) while maintaining TWAP's market responsiveness. The price at time t is given by:

With Linear Discount Floor:

P(t) = max(1 − r·(T − t), TWAP(t))

With ZCB Floor:

P(t) = max((1 + APY)^(−τ), TWAP(t))

where

TWAP(t) = (Σᵢ pᵢ·Δtᵢ) / (Σᵢ Δtᵢ),

pᵢ is the AMM-observed spot price over each sampling interval, Δtᵢ is the elapsed time of that interval, and the sums run over all samples in the averaging window ending at t. In the linear floor, T denotes the total term (so T − t is the time remaining) and r is the linear discount rate; in the ZCB floor, τ denotes the time to maturity (in years).
This design leverages the natural tendency of deterministic models to underprice PTs, creating a protective lower bound that ensures stability. Meanwhile, when market conditions are favorable and trading prices exceed this floor, the TWAP component allows the PT to track higher market prices, enabling better capital efficiency. This balance helps protect against downside risk while maintaining upside potential.
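As a minimal sketch of the hybrid model described above (an AMM TWAP bounded below by a deterministic floor), under illustrative parameter names (`rate`, `apy`, `term`) that are not taken from any specific protocol:

```python
def linear_discount_price(t: float, term: float, rate: float) -> float:
    """Linear discount: PT trades below par by rate * time remaining."""
    return 1.0 - rate * (term - t)

def zcb_price(time_to_maturity_years: float, apy: float) -> float:
    """Zero-coupon bond: discount par value at a fixed APY."""
    return 1.0 / (1.0 + apy) ** time_to_maturity_years

def hybrid_price(twap: float, floor: float) -> float:
    """Hybrid model: AMM TWAP protected by a deterministic price floor."""
    return max(twap, floor)

# A PT with 0.5 years to maturity and a 10% APY ZCB floor (~0.9535):
# a depressed TWAP of 0.93 is floored, while a TWAP of 0.97 passes through.
floor = zcb_price(0.5, 0.10)
floored = hybrid_price(0.93, floor)      # floor binds
tracking = hybrid_price(0.97, floor)     # market price tracked
```

The `max` captures the design intent stated above: deterministic models tend to underprice the PT, so they serve as a protective lower bound, while the TWAP lets the price track the market when trading exceeds that bound.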
The on-chain components consist of aggregation logic and incentive management related mechanisms.
Aggregation logic- Teams must define the proper method for aggregating validator reports into a single blockchain oracle update. Considerations include resistance to manipulation, performance, simplicity, and more. Methods vary significantly for different data types and use cases.
For an exhaustive explanation of robust aggregation and the EO aggregation library, please refer to
Incentives mechanism- Define metrics for data quality and implement tools to evaluate operator performance. Based on these metrics, rewards and penalties are distributed through smart contracts. For an exhaustive elaboration on incentives design please refer to
More sophisticated pricing methods can be obtained by incorporating multiple pricing methods and deviation-based parameter updating, enabling higher capital efficiency while maintaining stability.
Market-driven models derive prices from observed trading activity, enabling real-time responsiveness. They are better suited to high-liquidity environments where pricing should reflect market conditions.




The median algorithm returns the middle value from a sorted list of validator reports. It is most naturally used for numerical data but can be applied to any ordered set of data.
A possible variant is the weighted median. Let sᵢ be the stake of data validator i, and assume that s₁ + … + sₙ = 1. Also, for simplicity, assume the reports x₁ ≤ x₂ ≤ … ≤ xₙ are sorted. The weighted median is an element xₖ such that the total stake of reports below it and above it are each at most one half: Σᵢ₌₁ᵏ⁻¹ sᵢ ≤ 1/2 and Σᵢ₌ₖ₊₁ⁿ sᵢ ≤ 1/2.
Resistant to extreme values and outliers, as they only affect the tails of the sorted list—ignores the impact of anomalous data points due to its reliance on central values.
The aggregation value will always be between the honest reports, assuming a majority of the stake belongs to honest validators (weighted median)
May not reflect the influence of large/small prices if they are outliers
Non-incremental - the aggregated value cannot be updated upon each report in an online manner (as opposed to the mean, for example, which supports μₙ = μₙ₋₁ + (xₙ − μₙ₋₁)/n, where μₙ is the mean after n reports and xₙ is the n-th report).
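The weighted median described above can be sketched as follows (a minimal illustration; a production implementation would operate on signed, stake-weighted reports):

```python
def weighted_median(reports, stakes):
    """Return the smallest report at which cumulative stake reaches half
    of the total, matching the weighted-median definition above."""
    total = sum(stakes)
    pairs = sorted(zip(reports, stakes))      # sort reports, carrying stake
    acc = 0.0
    for value, stake in pairs:
        acc += stake
        if acc >= total / 2:                  # half the stake is reached
            return value
    return pairs[-1][0]

# Equal stakes reduce to the plain median; a single high-stake outlier
# still cannot move the result past the honest majority's reports.
equal = weighted_median([100, 101, 250], [1, 1, 1])        # plain median
skewed = weighted_median([100, 101, 250], [1, 1, 1.9])     # outlier outvoted
```

Note the robustness property from the text: as long as a majority of stake reports honestly, the output always lies between the honest reports.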
Prices are susceptible to noise and volatility. Therefore, financial applications often average prices over time. Well-known methods include Moving Average, Exponential Smoothing, Time-Weighted Average Price (TWAP), and Volume-Weighted Average Price (VWAP).
TWAP (Time-Weighted Average Price) - calculates an average value over a specified time period, weighting each data point by the duration or frequency of updates.
TWAP is calculated as follows:

TWAP = (Σᵢ pᵢ·Δtᵢ) / (Σᵢ Δtᵢ)

where pᵢ is the value at the i-th measurement and Δtᵢ is the change in time since the previous measurement.
Reduces the impact of rapid fluctuations in the data by spreading them out over time.
Incorporates time factors, making the final output reflective of trends rather than single-point events.
Offers smoother price data for algorithmic trading and liquidity management.
Less sensitive to fast market shifts or abrupt changes - smoothing over time makes TWAP less responsive than other algorithms.
Requires additional storage of the historic data.
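The TWAP formula above can be sketched directly from its definition, with each sample weighted by the time it was in effect:

```python
def twap(samples):
    """Time-weighted average over (price, elapsed_seconds) samples:
    sum(p_i * dt_i) / sum(dt_i)."""
    weighted_sum = sum(price * dt for price, dt in samples)
    total_time = sum(dt for _, dt in samples)
    return weighted_sum / total_time

# A 10-second spike to 150 barely moves a 10-minute window: this is the
# smoothing property (and the reduced responsiveness) described above.
window = [(100.0, 290), (150.0, 10), (101.0, 300)]
smoothed = twap(window)   # close to 101, far from the 150 spike
```

This also makes the storage cost mentioned above concrete: every sample in the averaging window must be retained until it rolls out of the window.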
Borsa Network is building a specialized blockchain Oracle Validation Service (OVS) on EO to bring sophisticated intent-solving capabilities to DeFi. Their system ensures efficient, fair execution of user intents through decentralized validation.
The protocol transforms DeFi intent solving through a three-stage process:
Intent Composition & Submission
Network-Level Processing & Bundling
On-chain Execution
Unlike traditional MEV-driven models, Borsa's case-agnostic approach prioritizes user outcomes while preventing exploitation.
Built on EO's infrastructure, Borsa's OVS implements:
Proof-of-stake validation for intent fulfillment
Optimization verification ensuring best execution
Full ERC-4337 compatibility for standardized processing
Leveraging EO enables Borsa to focus on intent solving while gaining:
Ethereum-based security through restaked ETH
Decentralized operator network
High-performance infrastructure for complex computations
This implementation demonstrates how specialized builders can leverage EO's infrastructure to create sophisticated, secure blockchain oracle services. Borsa's OVS represents a significant advancement in DeFi infrastructure, enabling more efficient and trustless operations.
The setup phase involves registering as an OVS and specifying prerequisites for operators who wish to participate in the OVS.
OVS registration on EO chain - Includes registration as an OVS and defining data feeds and sources. The relevant contracts to deploy are described in EO Chain contracts
Setting operator requirements - Specify prerequisites for operators to participate in the OVS, including:
Stake - The amount and type of tokens operators must hold. OVS native tokens can serve as a prerequisite, requiring a matching strategy on EigenLayer. For more details: https://github.com/Layr-Labs/eigenlayer-contracts/tree/dev
Reputation requirements based on operator activity history
Software, runtime environment, and technology prerequisites
EO supports both permissionless operator registration (where operators join the OVS quorum upon meeting predefined criteria) and manual approval by the OVS manager.
Key considerations
The Corruption-Analysis Model helps establish the minimum stake requirement by comparing the potential costs and benefits of corrupting the blockchain oracle. For a detailed explanation, please refer to Cryptoeconomic Security
Default settings can be applied for stake and reputation requirements, using restaked ETH and auto-approved EO operators
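The Corruption-Analysis Model referenced above can be sketched as a simple inequality: the oracle is cryptoeconomically secure when the profit from corrupting it is smaller than the stake an attacker stands to lose. The parameters below are illustrative simplifications, not EO's actual model.

```python
def min_total_stake(profit_from_corruption: float,
                    corruption_threshold: float = 1 / 3,
                    slash_fraction: float = 1.0) -> float:
    """Smallest total stake for which corruption is unprofitable.
    An attacker must control `corruption_threshold` of total stake, and
    `slash_fraction` of that stake is burned when the attack is detected,
    so we need: threshold * slash_fraction * total_stake > profit."""
    cost_per_unit_of_stake = corruption_threshold * slash_fraction
    return profit_from_corruption / cost_per_unit_of_stake

# To protect $10M of extractable value behind a 1/3 threshold with full
# slashing, the network needs roughly $30M of total stake.
required = min_total_stake(10_000_000)
```

This is why the stake prerequisite is a security parameter, not just an entry barrier: it should scale with the value the OVS secures.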
The EO cryptographic broadcaster is responsible for publishing data to target chains.
Overview
The EO Broadcaster is a distributed system designed to bridge data feeds from the EO Chain to various target chains with high reliability, low latency, and cryptographic security.
The broadcaster's multilayer architecture enables flexibility and dynamism in handling data report formats, verification schemes, and update triggering logic.
System Architecture
The system consists of four main components that work together through message queues and a shared state database:
Witnesses
Aggregators
Processors
Publishers
Witnesses
Witnesses monitor the EO Chain for specific events and cryptographically attest to their validity.
Key responsibilities:
Monitor EO Chain events in real-time
Validate feed data against predefined criteria
Generate Merkle trees for efficient data verification
Sign data using both BLS and ECDSA signatures
Submit signed attestations to the aggregators
The witness component ensures that each feed update is independently verified and signed by multiple parties.
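The Merkle tree generation mentioned above can be sketched as follows. SHA-256 stands in for the production hash, and the leaf encoding is illustrative; EO's actual tree layout may differ.

```python
import hashlib

def merkle_root(leaves):
    """Build a Merkle root by pairwise hashing each level; an odd node
    is promoted to the next level unchanged."""
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level) - 1, 2):
            nxt.append(hashlib.sha256(level[i] + level[i + 1]).digest())
        if len(level) % 2 == 1:          # odd node carried upward
            nxt.append(level[-1])
        level = nxt
    return level[0]

# A batch of feed updates collapses into one 32-byte commitment; a target
# chain can then verify any single feed with a logarithmic-size proof.
updates = [b"feed:ETH/USD|3500", b"feed:BTC/USD|97000", b"feed:stETH/ETH|0.999"]
root = merkle_root(updates)
```

This is what makes the verification "efficient": witnesses sign one root instead of every feed, and any tampered leaf changes the root.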
Aggregators
Aggregators process witness signatures and combine them when sufficient voting power is achieved. They implement a stake-weighted consensus mechanism based on witness voting power.
Key responsibilities:
Threshold-based sealing
BLS and ECDSA signature aggregation
Distributed coordination
Aggregators achieve distributed coordination through the state database and are capable of horizontal scaling.
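The threshold-based sealing described above can be sketched as follows. This is a simplified illustration: signatures are treated as opaque values, whereas in production they would be BLS-aggregated, and the 2/3 threshold is an assumed default.

```python
def try_seal(attestations, stakes, threshold=2 / 3):
    """Seal an aggregation once attesting stake crosses the threshold.
    `attestations` maps operator -> signature (opaque here);
    `stakes` maps operator -> staked voting power."""
    total = sum(stakes.values())
    attesting = sum(stakes[op] for op in attestations)
    if attesting / total < threshold:
        return None                      # not enough voting power yet
    # In production the collected signatures would be aggregated here.
    return {"signers": sorted(attestations), "power": attesting / total}

stakes = {"op1": 40, "op2": 35, "op3": 25}
sealed = try_seal({"op1": "sig1", "op2": "sig2"}, stakes)   # 75% of stake
pending = try_seal({"op1": "sig1"}, stakes)                 # only 40%
```

Because the decision depends only on the shared state (attestations and stakes), any aggregator instance reaches the same verdict, which is what permits horizontal scaling.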
Processors
Processors handle sealed aggregations and prepare them for target chain distribution. They maintain the latest state of feeds and generate proofs for target chain verification.
Key responsibilities:
Verify aggregated signatures
Maintain latest feed values in the state database
Generate and store Merkle proofs
Route updates to appropriate target chains and schedules.
Implement feed routing logic based on target chain configurations
Publishers
Publishers handle the final submission of verified feed updates to target chains, implementing chain-specific optimizations and transaction management to ensure reliable delivery. The publisher's one and only task is to deliver the package: it performs no alteration or other logic and never opens the package.
Key features:
Dynamic gas price optimization
Transaction retry logic
Chain-specific batching strategies
Groups identical or similar data points together and, at the end of each aggregation window, outputs the value corresponding to the heaviest cluster, assuming it holds sufficient weight (x% of the total validator stake).
Default parameters:
Aggregation window of one block
Sufficient stake is 67% of the total validator stake
Appropriate in cases where we expect the same result from all data validators, for example, fetching from a single stable source or performing the same deterministic computation
The final result is strictly backed by a majority of validators by stake weight (given the 67% requirement)
Given a relative majority submitting accurate reports, inaccurate reports do not affect the final result (deviant or malicious reports fall into smaller clusters)
Efficient online implementation with possible optimizations (e.g., occasionally pruning small clusters)
Storage inefficiency in worst-case scenarios (occurs when reports have high variance)
Limited effectiveness with volatile data—when values change rapidly, multiple small clusters form instead of a clear majority, making it difficult to reach consensus.
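The clustering scheme above can be sketched in a few lines. This is a simplified model under the stated defaults (exact-match clusters, 67% quorum); the function and variable names are illustrative, not EO's code.

```python
from collections import defaultdict

def cluster_aggregate(reports, total_stake, quorum=0.67):
    """reports: list of (value, stake) pairs from one aggregation window.
    Returns the value of the heaviest cluster if it carries at least
    `quorum` of total stake, otherwise None (no consensus)."""
    clusters = defaultdict(int)
    for value, stake in reports:
        clusters[value] += stake          # identical values share a cluster
    value, weight = max(clusters.items(), key=lambda kv: kv[1])
    return value if weight >= quorum * total_stake else None

# 70% of stake agrees on 100 -> consensus reached.
reports = [(100, 30), (100, 40), (101, 10), (99, 20)]
print(cluster_aggregate(reports, total_stake=100))  # 100

# High variance: heaviest cluster holds only 40% -> no consensus.
print(cluster_aggregate([(100, 30), (101, 30), (99, 40)], total_stake=100))  # None
```

The second call illustrates the volatility limitation above: when reports scatter, no single cluster reaches quorum.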
When billions in liquid staking assets depend on accurate reward calculation, centralized blockchain oracles become an unacceptable risk.
Liquid staking protocols have revolutionized ETH staking, but they've introduced new challenges. When millions of users depend on accurate reward distribution and token rebasing, the blockchain oracle that powers these calculations becomes critically important. This isn't just about price feeds – it's about complex validator performance analysis, reward accrual verification, and secure rebasing execution.
Rebasing in liquid staking protocols orchestrates a delicate balance between staked ETH and liquid tokens. When validators accumulate rewards over time, the protocol must precisely calculate these earnings and adjust token supply accordingly. This process demands real-time monitoring of thousands of validators, precise reward calculations, and immediate detection of critical events like slashing or exits. Even a minor calculation error could cascade through the entire DeFi ecosystem, potentially leading to protocol insolvency or unwarranted liquidations. The complexity of this task, combined with the magnitude of assets at stake, demands a solution far beyond traditional blockchain oracle capabilities.
EtherOracle leverages EO's infrastructure to create a robust, decentralized validation system:
Multi-Source Data Collection
Beacon chain API integration
Ethereum execution layer monitoring
Smart contract state analysis
Validator performance tracking
Distributed Computation
Independent reward calculations
APR verification
Validator state monitoring
Withdrawal request processing
Consensus Mechanism
Hash-based report verification
Quorum-driven agreement
Multiple integrity checks
Secure mainnet execution
EtherOracle's role extends far beyond the immediate needs of liquid staking protocols. While it directly serves protocols managing validator sets and distributing rewards, its accuracy ripples through the entire DeFi ecosystem. When a user stakes their ETH, they trust the rebasing mechanism to fairly distribute their rewards. When lending protocols accept liquid staking tokens as collateral, they rely on accurate rebasing to maintain proper risk parameters. Even AVS protocols building on EigenLayer depend on precise liquid restaking token balances to ensure their security guarantees. EtherOracle thus becomes a critical infrastructure piece, securing not just individual protocols but the interconnected fabric of DeFi itself.
Learn more about Etherfi and Liquid Restaking - https://www.ether.fi/
By Bitpulse, in collaboration with EO
As DeFi protocols embrace innovations like Bitcoin staking and restaking, the ecosystem is unlocking billions in liquidity. However, these assets introduce unprecedented complexity in risk management. Traditional risk assessment frameworks were not designed for these new primitives, leaving lending protocols struggling to effectively scale and manage their collateral.
Pulse is the first specialized risk intelligence layer for synthetic assets. Built by Bitpulse in partnership with EO, Pulse delivers real-time, multidimensional risk assessments that enable protocols to make informed decisions about synthetic assets in real-time.
Pulse helps the DeFi ecosystem prosper by democratizing risk:
For lending protocols: Confidently underwrite collateral.
For investors: Deploy capital into better-managed markets.
For the DeFi ecosystem: Establish industry standards for synthetic asset risk assessment.
Comprehensive synthetic asset risk analysis using multidimensional scores across protocol, counterparty, liquidity, and market risk.
Real-time monitoring and risk alerts
Cross-protocol exposure analysis to detect cascading risks across interconnected assets and protocols.
Stress-Test Simulations to analyze resilience during black swan events like Luna or FTX collapses.
Integrate with Pulse to:
Understand whether a synthetic asset should be accepted as collateral given your risk profile
Make dynamic lending decisions based on real-time risk metrics
Protect against complex failure modes in synthetic assets
Adjust parameters based on sophisticated risk analysis
Access institutional-grade risk assessment tools
Developed by Bitpulse and EO, combining deep expertise in:
Asset risk analysis
Machine learning for financial systems
DeFi protocol architecture
Building scalable risk and data platforms at institutions like Anchorage Digital and tech giants like YouTube
We're actively partnering with lending protocols to validate and refine our risk assessment framework. If you're building or operating a lending protocol interested in safely expanding into synthetic assets, we'd love to collaborate -> Website, X
EO's infrastructure and components enable builders to design custom blockchain oracle solutions for their specific needs while avoiding the complexities of low-level implementation details.
Key features of the EO stack include:
Modular stake-backed operators - OVS builders can selectively configure a subset of EO EigenLayer validators to handle blockchain oracle tasks—ranging from fetching data from custom APIs, running specialized off-chain computations, to monitoring real-time events. Each validator’s service is secured by restaked ETH (through EigenLayer), EO tokens, or an OVS-native token, ensuring alignment and integrity through staking-based incentives.
Execution & Consensus Layer - EO's purpose-built blockchain manages decentralized blockchain oracle operations, from data aggregation to consensus achievement.
Modular and Programmable blockchain oracle platform - A comprehensive smart contract architecture creates a modular and programmable blockchain oracle platform that supports new blockchain oracle development. This design provides a programmable layer for operator management, diverse data types, and aggregation algorithms while integrating with off-chain components and EO chain.
Data aggregation library - Collection of audited contracts that provides reliable on-chain data aggregation for various use cases. Each algorithm addresses specific data scenarios and security requirements.
Immutable data layer - EO chain serves as a permanent, transparent record of all blockchain oracle-related activity. This enables:
Verifiability: anyone can independently confirm the origin and accuracy of data.
Incentive Mechanisms: Permanent logging of operator performance enables the implementation of incentive mechanisms, including slashing, rewards distribution, insurance, and other advanced strategies to ensure consistently high service quality.
Cryptographic broadcaster - A distributed system designed to bridge aggregated data from the EO Chain to various target chains with high reliability, low latency, and cryptographic security.
EO Ecosystem - EO Chain serves as a fertile ground for collaboration between multiple blockchain oracle protocols. Different OVS solutions can integrate to provide richer, more comprehensive data services. For example, a risk analysis oracle and a price feed oracle can combine to offer lending protocols price rates accompanied by risk analysis.
OVS Management - An app enabling OVS builders to manage operators, fetch data, publish, and configure various parameters.
The ZCB model uses a compound interest formula to determine the PT price:

P = F / (1 + r)^T

where r is the APY fixed at inception, T represents years to maturity, and F is the face value. This model captures the exponential accumulation of yield over time and is widely adopted in traditional finance. It offers a more precise representation of time value compared to linear discounting, making it particularly appropriate for longer-term PTs or contexts requiring alignment with standard bond valuation practices.
The linear discount model is commonly used due to its simplicity and its ability to capture the essential behavior of PTs: a steady appreciation toward face value as maturity approaches. It models this convergence through a linear function, and can be viewed as a first-order approximation of traditional bond pricing, substituting exponential growth with a linear interpolation over the term.
The price at time t is given by:

P(t) = F × (1 − d × T)

where T represents time to maturity, F stands for the face value, and d is the base discount discussed below.
Setting the base discount:
There are two principal approaches to determining the base discount:
Direct APY assignment: The base discount is often set equal to the current APY at the time of market creation. This method is widely used for its simplicity, but tends to underprice by not accounting for compound interest effects.
Calibration to a target initial price: The base discount can be chosen to achieve a desired initial PT price. This approach treats the initial price as the primary input from which the base discount is derived. For example, in a one-year market, setting the base discount to d = r / (1 + r) yields an initial price that aligns with the target APY.
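The two models and the calibration step can be compared numerically. This sketch uses the symbols from the text (F face value, r APY, T years to maturity, d base discount); the figures are illustrative, not protocol parameters.

```python
def zcb_price(face: float, apy: float, years_to_maturity: float) -> float:
    """Zero-coupon bond model: P = F / (1 + r)**T."""
    return face / (1 + apy) ** years_to_maturity

def linear_price(face: float, base_discount: float, years_to_maturity: float) -> float:
    """Linear discount model: P = F * (1 - d * T)."""
    return face * (1 - base_discount * years_to_maturity)

F, r = 1.0, 0.05                       # face value 1, 5% APY
print(zcb_price(F, r, 1.0))            # 1/1.05 ~ 0.9524

# Direct APY assignment (d = r) underprices relative to the ZCB model:
print(linear_price(F, r, 1.0))         # 0.95

# Calibration: d = r / (1 + r) makes the one-year initial prices agree.
d = r / (1 + r)
print(linear_price(F, d, 1.0))         # ~0.9524, matching the ZCB price
```

The gap between 0.95 and 0.9524 is exactly the compound-interest effect the direct-assignment method ignores.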
EO contracts for Rootstock chain
Compatible with AggregatorV3Interface.
mBTC/USD
8
mTBILL/USD
8
Explorer: https://rootstock.blockscout.com/
Chainlist: https://chainlist.org/chain/30
YAP-nALPHA-pUSD-4/USD
0xAe982B8dc9E53dF3dCd0bf44295Cc6D980f4A3dc
YAP-pUSD-WPLUME-9/USD
0x5875b6582dc7a050c7e10d9bC52e005e8870B483
This model uses AMM-observed prices averaged over time. This method smoothens volatility and provides resistance to price manipulation, though it responds slowly to rapid market changes. This trade-off can be adjusted through different time window settings.
The price at time t is given by:

TWAP(t) = Σᵢ pᵢ × Δtᵢ / Σᵢ Δtᵢ

where pᵢ denotes the AMM-observed spot price over each sampling interval and Δtᵢ is the elapsed time of that interval. The sums cover all samples within the window ending at t. If all intervals are equal (Δtᵢ is constant), TWAP reduces to the arithmetic mean of the sampled prices.
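The formula above is a weighted average, which a few lines of Python make concrete. The sample prices and interval lengths are made up for illustration.

```python
def twap(samples):
    """samples: list of (price, interval_seconds) pairs within the window.
    Returns the time-weighted average price."""
    weighted = sum(price * dt for price, dt in samples)
    elapsed = sum(dt for _, dt in samples)
    return weighted / elapsed

# Equal intervals reduce to the arithmetic mean of the sampled prices:
print(twap([(100.0, 60), (102.0, 60), (104.0, 60)]))  # 102.0

# Unequal intervals weight longer-lived prices more heavily:
print(twap([(100.0, 120), (104.0, 60)]))  # (100*120 + 104*60) / 180 ~ 101.33
```

Widening the window adds more terms to both sums, which smooths volatility at the cost of responsiveness, the trade-off described above.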
PT pricing involves balancing trade-offs between manipulation resistance, market responsiveness, simplicity, and capital efficiency. The optimal approach depends on the protocol's risk preferences, market conditions, and asset liquidity.
In a world where social signals drive markets and shape digital communities, bringing verified social data on-chain unlocks new horizons for web3 applications.
Social data isn't just about likes and follower counts. Understanding social influence requires deep analysis of engagement patterns, network effects, and temporal dynamics. Echo begins with X, parsing through complex social signals to deliver verified, meaningful insights about social impact and reach.
Echo starts by solving the immediate challenge of bringing verified X engagement metrics on-chain. But this is just the beginning. Our architecture is designed to expand across all social platforms, creating a comprehensive social blockchain oracle that can capture and verify social signals wherever they emerge. Through standardized data models and flexible validation frameworks, Echo enables progressive integration of new social platforms and metrics.
Echo's infrastructure processes social data through multiple layers:
Raw Data Collection: Direct API integration with X, capturing real-time engagement metrics and network analysis
Verification Layer: Cross-reference multiple data sources to ensure accuracy and detect manipulation attempts
Pattern Analysis: Advanced algorithms to identify genuine engagement versus artificial inflation
Network Effect Calculation: Measuring true reach and impact across social graphs
Echo leverages EO's infrastructure and Ethereum security through EigenLayer to ensure reliable, tamper-proof social data delivery.
Echo's first deployment focuses on X engagement metrics crucial for web3 communities:
Real-time monitoring of specified accounts
Engagement analysis (likes, retweets, replies)
Reach calculation and virality tracking
Community growth patterns
Network influence scoring
While Echo begins with X metrics, its implications reach across the entire web3 ecosystem:
DAO governance enhanced by social signal integration
DeFi protocols incorporating social sentiment
NFT valuations informed by creator engagement
Community platforms with on-chain social reputation
Marketing campaigns with verifiable impact metrics
Try the ECHO beta version on Camp Network testnet -
Powered by EO infrastructure and secured by Ethereum through EigenLayer
Method
Formula
Pros
Cons
Preferred Environment
Linear Discount Rate
Simple, predictable, manipulation-resistant
Underprices, unresponsive to market dynamics
Low-liquidity markets
Zero-Coupon Bond Rate
Reflects compounding, predictable, manipulation-resistant
Unresponsive to market dynamics
Longer-term markets with stable yield
AMM-TWAP
Market-responsive
Latency, liquidity-dependent
Active DeFi markets with deep liquidity
Hybrid Approach
Manipulation-resistant, market-responsive
Higher complexity
Markets requiring both stability and adaptability
Base
Linear Discount
26th March, 2026
Hybrid
10th December, 2025
Hybrid
20th November, 2025
HyperEVM
Hybrid
25th September, 2025
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
TBTC/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
WBTC/USD
8
1.0%
24 hours
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
sFRAX/FRAX
8
1.0%
24 hours
Fundamental exchange Rate
sfrxETH/frxETH
8
1.0%
24 hours
Fundamental exchange Rate
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
Compatible with AggregatorV3Interface.
xUSD/USD*
0x91f37a3058a7fd3f4f66ed87d715cf05bb4fbfbd
8
Fundamental exchange Rate
yUSD/USD
0x2da05F177485264D432878D4A17d722bc64Db0EF
8
Fundamental exchange Rate
hgETH/USD
0x70cf192d6b76d57a46aafc9285ced110034eb013
8
Fundamental exchange Rate
sUSDf/USDC
0xc3DBFA1c994762Aab9EFa0D7651e0C1cBb3F8bb2
18
Fundamental exchange Rate
xBTC/BTC
0xacFC9fd2F8379d76F07B0C06d8282B5dE5b7b1Cb
8
Fundamental exchange Rate
xETH/ETH
0xe59005c9f5BcA6F78184cF6BFe8ABc3Db50c3FA5
18
Fundamental exchange Rate
cUSDO/USD
0xde2b298544633394e0578fd0b8c27bad60a80e5d
8
Fundamental exchange Rate
upUSDC/USD
0x0a9cf62e0a8160659a53cabc833809e5fe76a20f
8
Fundamental exchange Rate
coreUSDC/USD
0xe493b4542bb9b242f8330136542648adc2407d70
8
Fundamental exchange Rate
xETH/USD
0xc7b98e6629665edeb91911a5183f2ca092b4c894
8
Fundamental exchange Rate
xBTC/USD
0xdc5761a0c1af03d476ce9bc17a2091aa8b676a25
8
Fundamental exchange Rate
* The xUSD feed reflects the exchange rate published by Stream’s xUSD vault contract (https://etherscan.io/address/0xE2Fc85BfB48C4cF147921fBE110cf92Ef9f26F94), sourced from the roundPricePerShare function. This rate is based on Stream’s internal NAV reporting and is not backed by an independent proof of reserves.
Explorer: https://etherscan.io/
ChainList: https://chainlist.org/chain/1
Compatible with AggregatorV3Interface.
BTC/USD
8
0.5%
24 hours
ETH/USD
8
0.5%
24 hours
USDC/USD
8
0.5%
24 hours
USDT/USD
8
0.5%
24 hours
miUSD/USD
8
0.5%
24 hours
Explorer: https://katanascan.com/
Chainlist: https://chainlist.org/chain/747474
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
Explorer: https://explorer-sepolia.inkonchain.com/
Chainlist: https://chainlist.org/chain/763373
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
DAI/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
Chainlist: https://chainlist.org/chain/1123
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
sFRAX/FRAX
18
1.0%
24 hours
Fundamental exchange Rate
sfrxETH/frxETH
18
1.0%
24 hours
Fundamental exchange Rate
Explorer: https://sepolia.explorer.mode.network
Chainlist: https://chainlist.org/chain/919
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
PLUME/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
wOETH/OETH
18
1.0%
24 hours
Explorer: https://test-explorer.plumenetwork.xyz/
Chainlist: https://chainlist.org/chain/98867
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
Explorer: https://turin.explorer.tac.build/
Chainlist: https://chainlist.org/chain/2390
Compatible with AggregatorV3Interface.
BTC/USD
8
0.5%
24 hours
ETH/USD
8
0.5%
24 hours
USDC/USD
8
0.5%
24 hours
USDT/USD
8
0.5%
24 hours
XDC/USD
8
0.5%
24 hours
ynRWAx/USDC**
18
0.5%
24 hours
⚠️
**Exchange-rate feed self-reported by YieldNest. Reflects protocol-defined accounting and does not track traded market price or include independent Proof of Reserves.
Explorer: https://xdcscan.com/
RPC: https://rpc.xdc.org
Chainlist: https://chainlist.org/chain/50
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
xUSD/USD*
8
1.0%
24 hours
Fundamental exchange Rate
* The xUSD feed reflects the exchange rate published by Stream’s xUSD vault contract (https://etherscan.io/address/0xE2Fc85BfB48C4cF147921fBE110cf92Ef9f26F94), sourced from the roundPricePerShare function. This rate is based on Stream’s internal NAV reporting and is not backed by an independent proof of reserves.
Explorer: https://plasmascan.to/
Chainlist: https://chainlist.org/chain/9745
EO contracts for Soneium chain
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
Explorer: https://soneium.blockscout.com/
Chainlist: https://chainlist.org/chain/1868
Price feeds are a crucial component in the decentralized finance (DeFi) ecosystem, allowing for a wide range of financial activities such as lending, borrowing, trading, and derivatives. Price feeds enable dapps to access accurate and updated pricing data in a secure and trustless manner.
EO price feeds aggregate information from many different data sources and are published on-chain for easy consumption by dapps. This guide shows you how to read and use EO price feeds using Solidity.
Reading EO price feeds on EVM-compatible blockchains follows a consistent format for both queries and responses across different chains.
EO follows Chainlink's AggregatorV3Interface, allowing a smooth transition between blockchain oracle providers.
You have a basic understanding of Solidity.
An interface named IEOFeedAdapter defines several functions for accessing data from an external source.
These functions include retrieving the decimals, description, and version of the data feed, as well as fetching round data and the latest round data.
The constructor initializes a public variable named _feedAdapter, which is of type IEOFeedAdapter. It sets _feedAdapter to connect to a specific EOFeedAdapter contract deployed on the Holesky network at address 0xDD8387185C9e0a173702fc4a3285FA576141A9cd. This adapter is designated for the BTC feed.
The getPrice() function retrieves the latest price data from the _feedAdapter by calling the latestRoundData() function. It returns the answer, which represents the latest price of the BTC feed.
The usePrice() function internally calls getPrice() to fetch the latest price, illustrating how the price could be parsed and used.
Contact us for more details on deployments and usage.
The service allows users to easily query for recent price updates via a REST API or a WebSocket.
A user could use this endpoint to query symbol price quotes via REST API.
Our REST API endpoints can be found at: https://api.eo.app
Endpoint: /api/v1/get_rate
Method: GET
Parameters:
symbol (string, required): The symbol for which to get the rate.
Authentication
The REST API authentication method uses a username and password. The username is the API key provided to you by EO, and the password should be left blank.
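This username-and-blank-password scheme is standard HTTP Basic auth, so any HTTP client can construct it. The sketch below builds (but does not send) an authenticated request with Python's standard library; the API key value and the `symbol` parameter are placeholders.

```python
import base64
import urllib.request

api_key = "your_api_key"  # placeholder: your EO API key

# Basic auth encodes "username:password"; here the password is empty.
token = base64.b64encode(f"{api_key}:".encode()).decode()

req = urllib.request.Request(
    "https://api.eo.app/api/v1/get_rate?symbol=btc",
    headers={"Authorization": f"Basic {token}"},
)
print(req.get_header("Authorization"))
# Sending the request would be: urllib.request.urlopen(req)
```

This mirrors what `curl -u your_api_key:` does under the hood.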
Endpoint: /api/v1/get_symbols
Method: GET
Example
Response
We provide a simple socket stream using the standard Socket.IO library. The message protocol is described in the following section and includes connection, authentication, and subscription phases.
Connection is made to our EO API endpoint at https://api.testnet.eo.app. Once connected, the consumer can initiate an authentication event.
Request Events
authenticate
name: authenticate
payload:
subscribe
name: subscribe
payload:
authenticated
This message is received when your client was successfully authenticated with a valid API key. Once you receive this it's a good trigger point for subscribing to quote feeds.
name: 'authenticated'
payload: none
quote
name: quote
payload:
$ curl -X GET -u your_api_key "https://api.testnet.eo.app/api/v1/get_feed?symbol=eth"
{
"symbol":"eth",
"rate":"2235520000000000000000",
"timestamp":1704645014000,
"data":"0x0c64cc6cb523085bac8aa2221d5458999...2309417974f4a72b98"
}
curl -u your_api_key: 'https://api.eo.app/api/v1/get_symbols'

["jpy", "eth", "btc"]

{
"token": "your_api_key" // your blockchain oracle api key
}

{
"topic": "feed", // (string, required): The topic to subscribe to (e.g., 'feed').
"symbols": ["btc", "eth", "jpy"] // (array of strings, optional): An array of symbols to subscribe to. If empty, subscribe to all symbols.
}

{
"symbol": "eth",
"rate": "1923.45",
"timestamp": 1238971228,
"data": "0xbd3de674b8bd65........93c9220fb3202e405266" // provable merkle tree
}

BSTONE/USD
8
1.0%
24 hours
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
FDUSD/USD
8
1.0%
24 hours
M-BTC/USD
8
1.0%
24 hours
SolvBTCm/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
BTC/USD
8
0.5%
24 hours
ETH/USD
8
0.5%
24 hours
USDC/USD
8
0.5%
24 hours
USDT/USD
8
0.5%
24 hours
xBTC/BTC
8
0.5%
24 hours
Fundamental exchange Rate
xETH/ETH
18
0.5%
24 hours
Fundamental exchange Rate
xUSD/USDC*
18
0.5%
24 hours
Fundamental exchange Rate
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
MANTA/USD
8
1.0%
24 hours
STONE/USD
8
1.0%
24 hours
Fundamental exchange Rate
SolvBTC/USD
8
1.0%
24 hours
TIA/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
wstETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
BTC/USD
0xEB0CDef56e02A334B7eaB620560aDa727bB994f6
8
1.0%
24 hours
ETH/USD
0x544eC0881831ade3ff4fa45A08095f20ad3e6c03
8
1.0%
24 hours
USDC/USD
0x4bc26034a74a3455F745Ebdbc08F4254d5FeFe01
8
1.0%
24 hours
USDT/USD
0x0cFb159a62b9C812728D347C1fBCb686CBBAa07F
8
1.0%
24 hours
ezETH/ETH
0xA4B0AA259242029F881298AA0EE359C953D7202a
18
1.0%
24 hours
Fundamental exchange Rate
weETH/ETH
0x1F14709a7138A92F71bEeAF7c17B66aff0B729d7
18
1.0%
24 hours
Fundamental exchange Rate
wstETH/ETH
0x1b9D482AEA73F8E232e8849FD71dC59c6c358AdC
18
1.0%
24 hours
Fundamental exchange Rate
yUSD/USD
0xa0B704EcF567F41cB8Be659342Be9988571d9240
8
1.0%
24 hours
Fundamental exchange Rate
// SPDX-License-Identifier: MIT
pragma solidity 0.8.25;
interface IEOFeedAdapter {
function decimals() external view returns (uint8);
function description() external view returns (string memory);
function version() external view returns (uint256);
function getRoundData(uint80 _roundId)
external
view
returns (uint80 roundId, int256 answer, uint256 startedAt, uint256 updatedAt, uint80 answeredInRound);
function latestRoundData()
external
view
returns (uint80 roundId, int256 answer, uint256 startedAt, uint256 updatedAt, uint80 answeredInRound);
}
contract EOCLConsumerExample {
IEOFeedAdapter public _feedAdapter;
/**
* Network: Holesky
* EOFeedAdapter: 0xDD8387185C9e0a173702fc4a3285FA576141A9cd
* Feed Symbol: BTC
*/
constructor() {
_feedAdapter = IEOFeedAdapter(0xDD8387185C9e0a173702fc4a3285FA576141A9cd);
}
function getPrice() external view returns (int256 answer) {
(, answer,,,) = _feedAdapter.latestRoundData();
}
function usePrice() external {
int256 answer = this.getPrice();
// Do something
// .............
}
}

BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
MANTA/USD
8
1.0%
24 hours
STONE/USD
8
1.0%
24 hours
Fundamental exchange Rate
SolvBTC/USD
8
1.0%
24 hours
TIA/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
wstETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
BTC/USD
8
0.5%
24 hours
ETH/USD
8
0.5%
24 hours
USDC/USD
8
0.5%
24 hours
USDT/USD
8
0.5%
24 hours
xUSD/USD*
8
0.5%
24 hours
Fundamental exchange Rate
BTC/USD
8
0.5%
24 hours
ETH/USD
8
0.5%
24 hours
USDC/USD
8
0.5%
24 hours
USDT/USD
8
0.5%
24 hours
miUSD/USD
8
0.5%
24 hours
xUSD/USD*
8
0.5%
24 hours
Fundamental exchange Rate
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
BTC/USD
8
0.5%
24 hours
ETH/USD
8
0.5%
24 hours
USDC/USD
8
0.5%
24 hours
USDT/USD
8
0.5%
24 hours
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
DAI/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
Explorer: https://bob-sepolia.explorer.gobob.xyz/
Chainlist: https://chainlist.org/chain/808813
Compatible with AggregatorV3Interface.
ETH/USD
8
1.0%
24 hours
M-BTC/USD
8
1.0%
24 hours
MODE/USD
8
1.0%
24 hours
STONE/USD
8
1.0%
24 hours
Fundamental exchange Rate
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
ezETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
uniBTC/USD
8
1.0%
24 hours
Fundamental exchange Rate
weETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
wrsETH/USD
8
1.0%
24 hours
Explorer: https://explorer.mode.network
Chainlist: https://chainlist.org/chain/34443
Compatible with AggregatorV3Interface.
BTC/USD
8
0.5%
24 hours
ETH/USD
8
0.5%
24 hours
LBTC/USD
8
0.5%
24 hours
Fundamental exchange Rate
TON/USD
8
0.5%
24 hours
USDC/USD
8
0.5%
24 hours
USDT/USD
8
0.5%
24 hours
cbBTC/USD
8
0.5%
24 hours
pufETH/ETH
18
0.5%
24 hours
Fundamental exchange Rate
rsETH/ETH
18
0.5%
24 hours
Fundamental exchange Rate
wstETH/ETH
18
0.5%
24 hours
Fundamental exchange Rate
yUSD/USDC
18
0.5%
1 hour
Fundamental exchange Rate
Explorer: https://explorer.tac.build/
Chainlist: https://chainlist.org/chain/239
Compatible with AggregatorV3Interface.
ETH/USD
8
1.0%
24 hours
STONE/USD
8
1.0%
24 hours
Fundamental exchange Rate
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
WBTC/USD
8
1.0%
24 hours
ZRC/USD
8
1.0%
24 hours
ezETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
weETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
wrsETH/USD
8
1.0%
24 hours
wstETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
Explorer: https://explorer.zircuit.com/
Chainlist: https://chainlist.org/chain/48900
EO contracts for Soneium chain
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
Explorer: https://soneium-minato.blockscout.com/
Chainlist: https://chainlist.org/chain/1946
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
STONE/ETH
18
1.0%
24 hours
Fundamental exchange Rate
STONE/USD
8
1.0%
24 hours
Fundamental exchange Rate
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
ZRC/USD
8
1.0%
24 hours
Chainlist: https://chainlist.org/chain/48898
ETH/USD
8
0.5%
24 hours
TBTC/USD
8
0.5%
24 hours
USDC/USD
8
0.5%
24 hours
USDe/USD
8
0.5%
24 hours
WBTC/USD
8
0.5%
24 hours
sUSDe/USDe
18
0.5%
24 hours
Fundamental exchange Rate
xUSD/USD*
8
0.5%
24 hours
Fundamental exchange Rate
yUSD/USDC
18
0.5%
24 hours
ETH/USD
8
0.5%
24 hours
TBTC/USD
8
0.5%
24 hours
USDC/USD
8
0.5%
24 hours
WBTC/USD
8
0.5%
24 hours
EO contracts for Scroll chain
Compatible with AggregatorV3Interface.
ETH/USD
8
1.0%
24 hours
STONE/USD
8
1.0%
24 hours
Fundamental exchange Rate
SolvBTC/USD
8
1.0%
24 hours
SolvBTCb/USD
8
1.0%
24 hours
SolvBTCm/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
USDe/USD
8
1.0%
24 hours
WBTC/USD
8
1.0%
24 hours
pufETH/USD
8
1.0%
24 hours
uniETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
weETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
wrsETH/USD
8
1.0%
24 hours
wstETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
Explorer: https://scrollscan.com
Chainlist: https://chainlist.org/chain/534352
Compatible with AggregatorV3Interface.
BTC/USD
8
1.0%
24 hours
ETH/USD
8
1.0%
24 hours
LINK/USD
8
1.0%
24 hours
M-BTC/USD
8
1.0%
24 hours
SOL/USD
8
1.0%
24 hours
SolvBTC/USD
8
1.0%
24 hours
SolvBTCBBN/USD
8
1.0%
24 hours
Fundamental exchange Rate
TAIKO/USD
8
1.0%
24 hours
USDC/USD
8
1.0%
24 hours
USDT/USD
8
1.0%
24 hours
ezETH/USD
8
1.0%
24 hours
Fundamental exchange Rate
sFRAX/FRAX
18
1.0%
24 hours
Fundamental exchange Rate
sfrxETH/frxETH
18
1.0%
24 hours
Fundamental exchange Rate
stETH/ETH
18
1.0%
24 hours
Fundamental exchange Rate
uniBTC/USD
8
1.0%
24 hours
Fundamental exchange Rate
ynETH/ETH
18
1.0%
24 hours
Fundamental exchange Rate
Explorer: https://taikoscan.io
Chainlist: https://chainlist.org/chain/167000
This document analyzes how to aggregate multiple data reports into a single reliable estimate. We review aggregation methods and their properties, focusing on resistance to errors and manipulation.
When protocols encounter tasks that cannot be solved on-chain (blockchain oracle tasks) and require a decentralized solution, it's crucial to distinguish between two fundamentally different purposes of decentralization:
Decentralization for Better Results - When there's no single "right" answer, decentralizing the methodology itself improves outcomes. This applies to scenarios where different approaches provide complementary insights or where aggregating multiple viewpoints improves accuracy.
Decentralization for Security - When the task is well-defined but requires protection against manipulation or failure. This applies to operations requiring high reliability and fault tolerance where a single point of failure must be avoided.
The distinction between these purposes is fundamental: the first seeks to improve quality through diverse methodologies, while the second ensures integrity through distributed execution of a single, well-defined process.
We analyze how to aggregate multiple data reports into a single reliable estimate, reviewing aggregation methods and their resistance to errors and manipulation in search of an optimal strategy that balances accuracy and efficiency. We place special emphasis on the weighted median, since it is the most common aggregation method for price feed blockchain oracles.
We first consider a simple setting where data originates from a single source.
Let $x_t$ be the relevant quantity at time $t$, e.g., the BTC/USD price. Notice that $x_t$ is unknown. Instead, we can access an estimate $\tilde x_t$ from a known and agreed-upon source, for instance, Interactive Brokers. A set of $n$ fetchers fetch $\tilde x_t$ for us, where $x_t^i$ denotes the quantity reported by fetcher $i$. The aggregate $\hat x_t$ is the number we forward as our estimate of $\tilde x_t$ (and essentially of $x_t$).
The question is how to aggregate $x_t^1, \dots, x_t^n$ into one number, $\hat x_t$, representing the price according to that source at time $t$.
Time-variance. Since time is continuous, fetchers access the source at slightly different times. We don’t expect the time differences to be significant; in particular, they should not exceed one second, which is the rate of our blockchain I/O.
Simplicity. This is crucial due to runtime considerations and the need to explain to users how our mechanism works (KISS/Occam’s razor).
(Honest) mistakes. Although we have incentive mechanisms to punish/reward fetchers’ behavior, mistakes are unavoidable. For instance, downtime, latency, etc. These can happen even if fetchers are honest and thus should be accounted for.
Malicious behavior. Our solution should be as robust as possible to attacks. Namely, it should minimize the consequences of an attack and facilitate punishing attackers.
To quantify the last point, the Breakdown Point of an aggregate is the minimal fraction of reports a malicious actor must control in order to achieve arbitrary deviation from $\tilde x_t$.
We review below several options for our selection of an aggregation. All of them are special cases of minimizing an objective function. While there are infinite such functions, our analysis focuses only on median, average, and their trimmed counterparts.
Simple Average (Mean)
Pros: Easy to calculate; treats all reports equally; good for consistent data without outliers.
Cons: Can be skewed by outliers; its breakdown point is zero.
Weighted Average
Pros: Accounts for the varying significance of each report (e.g., based on stake); more accurate if reports are not equally reliable.
Cons: More complex to calculate, can still be skewed.
Median
Pros: Less affected by outliers than the mean; simple to understand and calculate.
Cons: May not reflect the influence of large/small prices if they are outliers.
Mode (most common value)
Pros: Represents the most frequently occurring price; useful in markets with standard pricing.
Cons: Can be undefined or unrepresentative if prices vary widely or if there is no repeating price.
Trimmed Mean
Pros: Excludes outliers by trimming a specified percentage of the highest and lowest values before averaging; balances the influence of outliers.
Cons: Arbitrariness in deciding what percentage to trim; could exclude relevant data.
Quantile-based Aggregation
Pros: Can focus on a specific part of the distribution (e.g., median is the 50% quantile); useful for risk management strategies.
Cons: Not representative of the entire data set; can be as sensitive to outliers as the mean.
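To make these trade-offs concrete, the following sketch (with hypothetical report values) compares the mean, median, and a trimmed mean on a round of reports containing a single outlier:

```python
import statistics

def trimmed_mean(values, trim_ratio=0.1):
    """Average after discarding the lowest and highest trim_ratio of values."""
    k = int(len(values) * trim_ratio)
    vals = sorted(values)
    trimmed = vals[k:len(vals) - k] if k else vals
    return sum(trimmed) / len(trimmed)

reports = [100.0, 100.2, 99.9, 100.1, 500.0]  # one outlier among five reports

print(statistics.mean(reports))    # ≈ 180.04: a single outlier skews the mean
print(statistics.median(reports))  # 100.1: the middle report, unaffected
print(trimmed_mean(reports, 0.2))  # ≈ 100.1: outlier trimmed before averaging
```

The outlier drags the mean far from the cluster of honest reports, while the median and trimmed mean stay near 100, illustrating the mean's breakdown point of zero.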
The weighted median enjoys the robustness of the median and, like the weighted average, the ability to account for different significance levels. Its breakdown point is 50% of the weight; below that, an adversary can only manipulate the result within the range of correctly reported values (as we prove later on). The weighted median allows us to incorporate the stake of the different fetchers.
Mathematically, let $w_i$ be the (positive) stake of fetcher $i$, and assume that $\sum_{i=1}^{n} w_i = 1$. Also, for simplicity, assume that the reports $x_t^1 \le x_t^2 \le \dots \le x_t^n$ are sorted. The weighted median is an element $x_t^k$ such that
$$\sum_{i=1}^{k-1} w_i \le \frac{1}{2} \quad \text{and} \quad \sum_{i=k+1}^{n} w_i \le \frac{1}{2}.$$
Therefore, our aggregate is $\hat x_t = x_t^k$ for such $k$.
To demonstrate the robustness of the weighted median, we present the following theorem. It proves that as long as the majority of the stake belongs to honest fetchers, the aggregate will always be between the honest reports; namely, an attacker with a minority weight (stake) cannot shift the aggregate too much.
Theorem: Let $H$ be the set of honest fetchers, with $\sum_{i \in H} w_i > \frac{1}{2}$, and let $M$ be the set of malicious fetchers, with $\sum_{i \in M} w_i < \frac{1}{2}$.
Then, the weighted median aggregate always satisfies
$$\min_{i \in H} x_t^i \le \hat x_t \le \max_{i \in H} x_t^i.$$
Recall that in the reasonable case we do not expect high variance among (honest) reports; thus, the interval $[\min_{i \in H} x_t^i, \max_{i \in H} x_t^i]$ will be small. This ensures that our aggregate is robust to manipulations.
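The weighted median and the theorem's guarantee can be sketched as follows (report values and integer stakes are illustrative):

```python
def weighted_median(reports, stakes):
    """Smallest report such that the stake at or below it covers half the total stake."""
    pairs = sorted(zip(reports, stakes))
    total = sum(stakes)
    acc = 0
    for value, stake in pairs:
        acc += stake
        if 2 * acc >= total:
            return value

# honest fetchers hold 60% of the stake and report near 100; the attacker
# holds 40% and reports an extreme value
reports = [99.9, 100.0, 100.1, 1_000_000.0]
stakes = [20, 20, 20, 40]
print(weighted_median(reports, stakes))  # 100.1: inside the honest range
```

Flipping the attacker's report from very high to very low only moves the aggregate to the other end of the honest range, never outside it.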
Recall that prices are susceptible to noise and volatility. Therefore, financial applications often average prices over time. Well-known methods include Moving Average, Exponential Smoothing, Time-weighted Average Price (TWAP), and Volume-weighted Average Price (VWAP).
Our current service does not implement such time averages. We allow our customers the flexibility of the computation at their end.
We now face a similar problem, but each quantity is given by a source (and not a fetcher). We have a set of sources $\{1, \dots, m\}$, where each source $j$ has a price $p_j$. Along with the different prices, we have an additional weight $u_j$ per source. Weights capture our confidence in that source, its volume, liquidity, etc. The weight-adjusted price is given by
$$P = \frac{\sum_{j=1}^{m} u_j \, p_j}{\sum_{j=1}^{m} u_j}.$$
There are several ways by which we can set the weights.
Volume-Weighted Average Price (VWAP):
Description: VWAP is calculated by taking the total dollar amount traded and dividing it by the total trading volume over the period. In our case, it involves weighting each source's rate by its volume, giving more influence to sources with higher trading volumes.
Advantages: Reflects more liquidity and is a common benchmark used by traders. It gives a fair reflection of market conditions over the given period.
Disadvantages: More susceptible to volume spikes, which can distort the average price.
Liquidity-Adjusted Weighting:
Description: Here, the rate from each source is weighted based on its liquidity. This method requires a clear definition and measurement of liquidity, which can include factors like bid-ask spread, market depth, and the speed of price recovery after a trade.
Advantages: Provides a more realistic view of the market by acknowledging that more liquid markets better reflect the true market price.
Disadvantages: Liquidity can be harder to measure accurately and may vary quickly, making it challenging to maintain an accurate aggregate price in real-time.
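The weight-adjusted price formula above can be sketched as follows (prices and volumes are hypothetical, with volume used as the weight, VWAP-style):

```python
def weight_adjusted_price(prices, weights):
    """Weighted average of per-source prices; weights can encode volume or liquidity."""
    return sum(p * w for p, w in zip(prices, weights)) / sum(weights)

# hypothetical sources with their prices, using 24h volume as the weight
prices = [100.0, 100.4, 99.8]
volumes = [500, 300, 200]
print(weight_adjusted_price(prices, volumes))  # ≈ 100.08
```

Swapping volumes for a liquidity score changes the weighting scheme without changing the formula.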
This document explores single-source price aggregation in blockchain oracle systems, covering both result improvement and security aspects of decentralization. It analyzes various aggregation methods, focusing on weighted median for its manipulation resistance and stake-based weighting capabilities, while also examining time-based averaging and multi-source aggregation approaches.
The document highlights breakdown points in aggregation methods, demonstrates weighted median's security when honest fetchers hold majority stake, and evaluates trade-offs between different aggregation strategies.
| Pair | Decimals | Deviation Threshold | Heartbeat | Notes |
| --- | --- | --- | --- | --- |
| ETH/USD | 8 | 0.5% | 24 hours | |
| SOV/USD | 8 | 0.5% | 24 hours | High risk: Liquidity is low across all markets. Consider carefully before integrating |
| STONE/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| USDCe/USD | 8 | 0.5% | 24 hours | |
| USDT/USD | 8 | 0.5% | 24 hours | |
| WBTC/USD | 8 | 0.5% | 24 hours | |
| tBTC/USD | 8 | 0.5% | 24 hours | |
| uniBTC/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| wstETH/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| Pair | Decimals | Deviation Threshold | Heartbeat | Notes |
| --- | --- | --- | --- | --- |
| ETH/USD | 8 | 0.5% | 24 hours | |
| SOV/USD | 8 | 0.5% | 24 hours | High risk: Liquidity is low across all markets. Consider carefully before integrating |
| STONE/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| USDCe/USD | 8 | 0.5% | 24 hours | |
| USDT/USD | 8 | 0.5% | 24 hours | |
| WBTC/USD | 8 | 0.5% | 24 hours | |
| tBTC/USD | 8 | 0.5% | 24 hours | |
| wstETH/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
At EO, we've implemented a comprehensive categorization system for our data feeds. This system is designed to inform users about the intended use cases of each feed and highlight potential market integrity risks associated with data quality. Our goal is to provide you with the information needed to make informed decisions when integrating EO feeds into your applications.
It's important to note that all feeds published in EO's documentation undergo rigorous monitoring and maintenance, adhering to the same high standards of quality. Each feed is subject to a thorough assessment process during implementation. The specific assessment criteria may vary depending on the type of feed being deployed and may evolve over time as our understanding of market integrity risks deepens.
We group our data feeds into the following categories, ranked from lowest to highest level of market integrity risk:
✅ Low Market Risk
🟡 Medium Market Risk
⭕ High Market Risk
⚙ Custom Feeds
This categorization serves as a guide to help you understand the relative risk profile of each feed. However, we encourage users to conduct their own due diligence and risk assessment when integrating any data feed into their smart contracts or applications.
By providing this transparent categorization, EO aims to empower developers and projects with the knowledge they need to build robust, risk-aware decentralized applications. Remember, the appropriate use of a feed depends on your specific use case and risk tolerance.
Key Risk Factors by Market Integrity Risk
| Risk Factor | ✅ Low | 🟡 Medium | ⭕ High |
| --- | --- | --- | --- |
| Market Events | Highly resilient to disruption | Ongoing market events (e.g., token migrations) | Significant market events (e.g., hacks, bridge failures, major exchange delistings) |
| Price Feed Stability | Use numerous data sources | Price spread between trading venues | Asset/project market deprecation |
| Trading Volume | Consistent price discovery due to high volumes across many markets | Low/inconsistent volume causing liquidity issues and price volatility | Extremely low trading volumes |
| Centralization | N/A | Concentrated trading on just a few exchanges | Concentrated trading on a single exchange |
| Data Inconsistency | N/A | N/A | High spread between data providers |
Custom Feeds are designed for specific purposes and may not be appropriate for general usage or align with your risk parameters. It's essential for users to examine the feed's characteristics to ensure they match their intended application.
Custom feeds fall into these categories:
Onchain single source feeds: Utilize data from one onchain source, with only one provider currently supporting the feed.
Onchain Proof of Reserve Feeds: Employ a large, diverse group of vetted node operators to obtain and confirm onchain reserve data.
Exchange Rate Feeds: Access exchange rates from external onchain contracts for token conversions. EO doesn't own or manage these contracts.
Total Value Locked Feeds: Assess the total value locked in particular protocols.
Custom Index Feeds: Calculate values based on multiple underlying assets using predetermined formulas.
Offchain Proof of Reserve Feeds: Verify offchain reserves through custodian attestations.
LP Token Feeds: Combine decentralized feeds with calculations to value liquidity pool tokens.
Wrapped Calculated Feeds: Specific feeds pegged 1:1 to underlying assets; they may deviate from the market price, since the reported value is derived through a calculation rather than observed directly from trading.
When integrating price data for an asset into your smart contract, ensure the asset maintains adequate market liquidity to prevent price manipulation. Low-liquidity assets can experience high volatility, potentially harming your application and users. Unscrupulous actors may exploit volatility or low trading periods to manipulate smart contract execution.
Some feeds source data from single exchanges rather than aggregated services. These are identified in the feed's documentation. Evaluate the specific exchange's liquidity and reliability.
Liquidity migrations, where tokens move between providers (e.g., DEX to CEX), can temporarily deplete the original pool's liquidity, increasing manipulation risk. If planning a migration, collaborate with stakeholders (liquidity providers, exchanges, blockchain oracle operators, data providers, users) to maintain accurate pricing throughout.
Low-liquidity assets may show price oscillations between points at regular intervals, especially when data providers show unusual price spreads. To manage this risk, continuously assess the asset's liquidity quality. Low-liquidity assets may also experience erratic price movements from incorrect trades.
Develop and test your contracts to manage price spikes and implement protective measures. For instance, create tests simulating various blockchain oracle responses.
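As a minimal sketch of such a protective measure (the threshold and function names are illustrative, not part of EO's API), a test can simulate blockchain oracle responses and assert that abnormal jumps are rejected:

```python
MAX_DEVIATION = 0.10  # treat a >10% move per update as suspect (illustrative threshold)

def accept_price(last_price, new_price):
    """Reject updates that jump more than MAX_DEVIATION from the last accepted price."""
    if last_price is None:
        return True  # first observation: nothing to compare against
    return abs(new_price - last_price) / last_price <= MAX_DEVIATION

# simulate a sequence of blockchain oracle responses, including a manipulated spike
assert accept_price(None, 100.0)       # first reading accepted
assert accept_price(100.0, 104.0)      # normal move accepted
assert not accept_price(100.0, 250.0)  # spike rejected: e.g., pause the market
```

A production integration would implement the same guard on-chain and wire the rejection path to a pause or circuit-breaker mechanism.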
Certain data providers rely on a single source, which may be unavoidable when only one source exists, either onchain or offchain, for specific data types. It's crucial to thoroughly evaluate these providers to ensure they deliver reliable, high-quality data for your smart contracts. Be aware that any errors or omissions in the provider's data could adversely affect your application and its users. Careful assessment of single-source providers is essential to mitigate potential risks associated with data inaccuracies or inconsistencies.
Price data quality can be affected by actions taken by crypto and blockchain project teams. These "crypto actions" are akin to corporate actions but specific to the crypto sphere. They include token renaming, swaps, redenominations, splits, reverse splits, network upgrades, and other migrations initiated by project teams or governing communities. Maintaining data quality depends on data sources implementing necessary adjustments for these actions. For instance, a token upgrade resulting in migration may require a new Data Feed to ensure accurate price reporting. Similarly, blockchain forks or network upgrades might necessitate new Data Feeds for data continuity and quality. Projects considering token migrations, forks, network upgrades, or other crypto actions should proactively engage relevant stakeholders to maintain accurate asset price reporting throughout the process.
Data Feed performance is dependent on the blockchain networks they operate on. During times of high network congestion or downtime, the frequency of EO Data Feeds may be affected. It's recommended that you design your applications to detect and appropriately respond to such chain performance or reliability issues. Implementing measures to handle these network fluctuations can help maintain the stability and accuracy of your data-dependent applications.
Assets with significant presence on decentralized exchanges (DEXs) face unique market structure risks. Market integrity may be compromised by flash loan attacks, volume shifts between exchanges, or temporary price manipulation by well-funded actors. DEX trades can also experience slippage due to liquidity migrations and trade size. The impact of high-slippage trades on market prices depends on the asset's trading patterns. Assets with multiple DEX pools, healthy volumes, and consistent trading across various time frames generally have lower risk of deviant trades affecting aggregated prices.
When evaluating an EO Data Feed for backed or bridged assets (e.g., WBTC), users should weigh the pros and cons of using a feed specifically for the wrapped asset versus one for the underlying asset.
Decisions should be made individually, considering:
Liquidity
Market depth
Trading volatility of the underlying asset compared to its derivative
Users must also assess the security mechanism maintaining the peg between the wrapped asset and its underlying counterpart. Regularly review these factors as asset dynamics evolve over time.
EO Data Feeds are designed to report market-wide prices of assets using aggregated prices from various exchanges. For backed or bridged assets, these feeds continue to report the underlying asset's price in addition to the wrapped token's price. This approach reduces manipulation risks associated with the typically lower liquidity of wrapped tokens.
However, users should be aware that extreme events, such as cross-chain bridge exploits or hacks, may cause significant price deviations between wrapped assets and their underlying counterparts. For instance, a bridge hack could lead to a collapse in demand for a particular wrapped asset.
To mitigate risks during such scenarios, users should implement safeguards in their applications. Circuit breakers, which can be created using EO Automation, are recommended to proactively pause functionality when unexpected scenarios are detected in data feeds.
Additionally, consider using EO Proof of Reserve for real-time monitoring of wrapped asset reserves. This enables protocols to ensure proper collateralization by comparing the wrapped token's supply against the Proof of Reserve feed.
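A minimal sketch of that comparison (names and values are illustrative; a real integration would read the wrapped token's supply and the Proof of Reserve feed on-chain):

```python
def is_fully_collateralized(wrapped_supply, reserve_reading):
    """True when the reported reserves cover the wrapped token's supply."""
    return reserve_reading >= wrapped_supply

assert is_fully_collateralized(1_000, 1_050)    # reserves exceed supply
assert not is_fully_collateralized(1_000, 900)  # undercollateralized: pause minting
```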
Exchange rate feeds differ fundamentally from standard market rate EO Price Feeds in their architecture and purpose.
Market rate feeds provide price updates based on aggregated prices from multiple sources, including centralized and decentralized exchanges. This approach offers a comprehensive view of an asset's market-wide price.
In contrast, exchange rate feeds are specific to particular protocols or ecosystems. They report internal redemption rates for assets within that ecosystem, sourcing data directly from designated smart contracts on a source chain and relaying it to a destination chain.
Exchange rate feeds are particularly useful for:
Pricing yield-bearing assets by combining the exchange rate with the underlying asset's market rate
Enhancing liquidity pool performance for yield-bearing assets by enabling programmatic adjustments to swap curves
It's crucial to note that both feed types have distinct risk profiles and mitigation strategies, which vary based on asset type and liquidity. Users are responsible for selecting the appropriate feed for their needs.
Purpose
This document provides a comprehensive explanation of EO's smart contract infrastructure design, which aims to create a modular, programmable blockchain oracle platform capable of supporting the development of new blockchain oracles. The design provides a programmable layer consisting of operator management, different data types, and aggregation methods.
Scope
This design document covers the architecture and implementation details of the smart contracts that will form the core of the EO chain contracts. It takes into account the offchain components that communicate with the EO chain.
Building new OVSs requires many pieces of infrastructure, specialized for managing operators, aggregating decentralized data and providing it to consumers.
The new smart contract infrastructure aims to achieve the following:
Modularity: Create a system where different parts can be easily added, removed, or updated without affecting the whole system.
enable updating data feeds and data sources
enable registering/deploying new OVSs
enable registering/deploying new feeds
enable registering new aggregators
Scalability: Design the system to handle more data sources, data feeds, and aggregation methods.
Customizable Blockchain Oracle Services (OVS): Make it easy for partners to create and deploy their own custom blockchain oracle services according to their specific needs.
Enhanced Flexibility: Ensure the new system can easily manage the connections between data feeds/sources, feeds/OVS, feeds/aggregators, etc.
The system is designed to enable the construction of new blockchain oracles, referred to as blockchain Oracle Validated Services (OVS). Each OVS operates with its own set of registered operators who participate in the data aggregation and consensus processes specific to that OVS. Operators can request to join an OVS, and their participation is approved by the OVS Admin, who manages operator registration, data sources, and feeds within the OVS. The operators’ stake or voting power, registered to the EO network, is crucial for consensus on the data provided by each OVS. Every OVS contains feeds—data pipelines with input from one or more sources and an aggregator smart contract that processes the inputs to generate a single output. This output is distributed through the EO distribution system to target chains. The system’s modularity allows new blockchain oracles to be easily built by deploying a new OVS, configuring custom feeds, and allowing operators to opt-in for data updates, creating a scalable, customizable solution for diverse blockchain oracle use cases.
This section will describe the core components of the system:
Data Feed Management Module: Handles the configuration and management of data feeds.
Operator Management Module: Manages operator registration, activation, and stake.
Aggregation Engines: Combines data from different operators to create a unified result.
OVS Module: Allows partners to build and customize their blockchain oracle services.
Diagram
As an OPERATORS_MANAGER:
activate operators and manage operators configuration
setStake()
Activate Operators
Add Operators to OVS
manage aliases
changeAlias()
assignAlias()
As an OVS_MANAGER:
register OVS
register aggregator
approve OVS
approve aggregator
As FEED_MANAGER:
add/remove/update data sources
track feed id and store basic feed info - Ovs and Aggregator
As an OVS Admin:
grant/revoke other roles within smart contract
deposit rewards
approve/decline operators requests to opt-in to OVS
add feeds to OVS (saved to aggregator)
As an operator:
register/deregister from EO AVS
register/deregister to OVS
join to OVS
declareAlias to each OVS
updateFeedReport
Anyone:
request register new Aggregator
request register new OVS
Operator - OVS
each operator can be registered to many OVS
each OVS can have many operators
OVS - Aggregator
each OVS can support many aggregators
each Aggregator can be used in several OVS as separate instance
OVS - Feed
each OVS can contain many feeds
feed is related to one OVS only
Feed - Source
feed can be fetched from several sources
sources can be used by several feeds
actors
OVS_MANAGER
operator
guest
storage
interfaces
Responsibilities: separate contract responsible for feeds tracking and sources management
Interactions: can contain most of the storage which is used by other components
storage
interface
Interface
Role
Contract
Responsibilities
OVS_MANAGER
Registry (OVS module)
register/deploy OVSs, register/approve aggregators
FEED_MANAGER
Config
sources should be managed in Registry globally
OVS_ADMIN (OVS Admin)
OVS
admin, owner of OVS instance contract, manages other roles through AccessControl
OVS_PAUSER (for now it is OVS_ADMIN)
OVS
pause OVS
OVS_OPERATOR (for now it is OVS_ADMIN)
OVS
Approve Operators, add new feeds/sources to the OVS.
struct OVS {
    uint256 totalStake;
    Feed[] feeds;
    bool isActive;
    address[] registeredOperators;
    address[] requestedOperators;
}
enum OVSOperatorStatus {
Unknown,
Requested,
Approved,
Declined
}
uint64 ovsAddresses[]
mapping(address ovsAddress => OVS) ovses;
mapping(address ovsAddress => mapping(address operator => OVSOperatorStatus)) ovsOperatorStatuses;
// ovsIds for each operator are stored in the operator struct

// as anyone:
function requestRegisterOVS(???) // not v1
function requestRegisterAggregator() // not v1
// as operator(alias):
function registerOperatorToOVS(address ovs, address alias)// alias can be declared on this step
// but how to do it in batch?
function registerOperatorToOVSes(address[] ovses, address[] aliases) // when operator is active
function updateFeedReport() // not needed if called directly on OVS
//as OVS
function updateOperatorAsApproved(address operator)
function updateOperatorAsDeclined(address operator)
//as OVS_MANAGER
function approveOVS() //->deploys clone of OVS
function registerOVS() //- >deploys clone of OVS / OR whitelisting
function approveAggregator() // do we need to track all aggregators and approve them???
function registerAggregator()

uint64 sourcesIds[]
mapping(uint64 sourceId => Source) sources;
function updateNextFeedId() external returns (uint256 feedId)
function updateNextSourceId() external returns (uint256 sourceId)
uint64 feedIds[]; // ranges for each OVS? 0-n 10000-n
mapping(uint64 feedId => Feed) feeds;
uint256 totalStake;
// other config ???
address[] registeredOperators; // check if not redundant,
// stored in OVS instance or in OVS module (part of Registry) ???

// as a Registry
function registerOperator(address operator, address alias)// alias can be declared on this step
// as OVS_OWNER
function addFeeds(Feed[] feeds)
// add to registeredOperators[] to Registry
// + change OVSOperatorStatus to Approved + call _recalculateStake()
function approveOperators(address[] operators)
// change OVSOperatorStatus to Declined
function declineOperators(address[] operators)
// updateOVS
// remove from registeredOperators[] (at Registry)
// + change OVSOperatorStatus to Unknown + _recalculateStake()
function deleteOperators(address[] operators)
function depositRewards()
function updateFeedReport() // directly to each OVS but
//not to Registry.OVSModule single entry point?
function changeAlias(address newOperatorAlias)
function aggregate(Feed[] feeds) // feed.data should be decoded according to the scheme defined in the Aggregator

flowchart LR
Aggregator-- "reads feeds config, thresholds" -->OVS
Aggregator-- reads cached votes and stake -->OVS
| Pair | Decimals | Deviation Threshold | Heartbeat | Notes |
| --- | --- | --- | --- | --- |
| BTC/USD | 8 | 0.5% | 24 hours | |
| ETH/USD | 8 | 0.5% | 24 hours | |
| LBTC/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| M-BTC/USD | 8 | 0.5% | 24 hours | |
| TBTC/USD | 8 | 0.5% | 24 hours | |
| USDC/USD | 8 | 0.5% | 24 hours | |
| USDT/USD | 8 | 0.5% | 24 hours | |
| WBTC/USD | 8 | 0.5% | 24 hours | |
| bfBTC/USD | 8 | 0.5% | 24 hours | |
| brBTC/USD | 8 | 0.5% | 24 hours | |
| enzoBTC/USD | 8 | 0.5% | 24 hours | |
| ezETH/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| hemiBTC/USD | 8 | 0.5% | 24 hours | |
| iBTC/USD | 8 | 0.5% | 24 hours | |
| oBTC/USD | 8 | 0.5% | 24 hours | |
| pumpBTC/USD | 8 | 0.5% | 24 hours | |
| rsETH/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| stBTC/USD | 8 | 0.5% | 24 hours | |
| uBTC/USD | 8 | 0.5% | 24 hours | |
| uniBTC/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| vUSD/USD | 8 | 0.5% | 24 hours | |
| Pair | Decimals | Deviation Threshold | Heartbeat | Notes |
| --- | --- | --- | --- | --- |
| BNB/USD | 8 | 0.5% | 24 hours | |
| BTC/USD | 8 | 0.5% | 24 hours | |
| BTCB/USD | 8 | 0.5% | 24 hours | |
| ETH/USD | 8 | 0.5% | 24 hours | |
| LBTC/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| SolvBTC/USD | 8 | 0.5% | 24 hours | |
| USDC/USD | 8 | 0.5% | 24 hours | |
| USDT/USD | 8 | 0.5% | 24 hours | |
| USDe/USD | 8 | 0.5% | 24 hours | |
| WBTC/USD | 8 | 0.5% | 24 hours | |
| bfBTC/USD | 8 | 0.5% | 24 hours | |
| sUSDE/USDE | 18 | 0.5% | 24 hours | |
| uniBTC/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| wBETH/USD | 8 | 0.5% | 24 hours | |
| xSolvBTC/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| Pair | Decimals | Deviation Threshold | Heartbeat | Notes |
| --- | --- | --- | --- | --- |
| BTC/USD | 8 | 0.5% | 24 hours | |
| BTCB/USD | 8 | 0.5% | 24 hours | |
| ETH/USD | 8 | 0.5% | 24 hours | |
| LBTC/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| SolvBTC/USD | 8 | 0.5% | 24 hours | |
| USDC/USD | 8 | 0.5% | 24 hours | |
| USDT/USD | 8 | 0.5% | 24 hours | |
| WBTC/USD | 8 | 0.5% | 24 hours | |
| uniBTC/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
Compatible with AggregatorV3Interface.
| Pair | Decimals | Deviation Threshold | Heartbeat | Notes |
| --- | --- | --- | --- | --- |
| BTC/USD | 8 | 0.5% | 24 hours | |
| ETH/USD | 8 | 0.5% | 24 hours | |
| S/USD | 8 | 0.5% | 24 hours | |
| USDC/USD | 8 | 0.5% | 24 hours | |
| USDT/USD | 8 | 0.5% | 24 hours | |
| wsrUSD/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| xBTC/BTC | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| xETH/ETH | 18 | 0.5% | 24 hours | Fundamental Exchange Rate |
| xUSD/USD* | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| xUSD/USDC* | 18 | 0.5% | 24 hours | Fundamental Exchange Rate |
| yUSD/USD | 8 | 0.5% | 24 hours | Fundamental Exchange Rate |
| yUSD/USDC | 18 | 0.5% | 24 hours | |
| scUSD/USD | 8 | -- | -- | |
| smsUSD/msUSD | 18 | -- | -- | |
| xETH/USD | 8 | -- | -- | |
| xBTC/USD | 8 | -- | -- | |
| ceresSmsUSD/msUSD | 18 | -- | -- | |
* The xUSD feed reflects the exchange rate published by Stream’s xUSD vault contract (https://etherscan.io/address/0xE2Fc85BfB48C4cF147921fBE110cf92Ef9f26F94), sourced from the roundPricePerShare function. This rate is based on Stream’s internal NAV reporting and is not backed by an independent proof of reserves.
Explorer: https://sonicscan.org/
Chainlist: https://chainlist.org/chain/146
Slashing refers to penalizing protocol participants who deviate from protocol rules by removing a portion of their staked assets. This mechanism is unique to PoS protocols, as it requires the blockchain to enforce such penalties.
On-chain subjective slashing refers to penalizing nodes for faults that cannot be attributed to validators based solely on on-chain evidence or protocol rules. In blockchain oracle networks, faults typically take the form of submitting inaccurate data in an attempt to manipulate the blockchain oracle.
A key challenge in implementing robust slashing is the risk of nodes getting slashed due to honest mistakes. This concern is amplified in the subjective case, since determining whether a fault occurred is debatable — beyond just determining if it was committed maliciously or honestly.
Therefore, a prerequisite to slashing is the ability to detect misreports reliably and accurately. We outline different considerations for designing such a mechanism, and then establish appropriate penalty policies to complete the detect-and-deter mechanism.
We propose a minimal, stylized mathematical model to analyze how the slashing mechanism should be designed. We abstract away some details for both simplicity and generality, making the analysis relevant to a general data blockchain oracle rather than a specific type. The concrete example of price reports is kept in mind and given special attention throughout the document.
More specifically, we aim to achieve the following:
Establish a common framework of terminology and concepts to enhance communication within the company
Derive basic principles and guidelines for an optimal solution.
Better articulate considerations, limitations, and inherent trade-offs, and suggest several options along different points on the trade-off curve.
Lastly, we note that implementation details are out of scope.
First, we outline the goals we want detection and penalties to achieve in a non-formal way. We deliberately avoid formalizing the desired properties at this point, instead stating general objectives to keep in mind.
Discourage misreports. Requires the ability to detect faults and penalize appropriately to deter initial misconduct.
Avoid penalizing honest mistakes. Necessitates the ability to differentiate between honest errors and malicious actions.
Avoid discouraging risk-taking. The penalty system should not discourage participants from reporting abrupt and sharp changes in data values.
Discourage uninformed voting. Examples of uninformed voting strategies include following the majority vote or consistently reporting the last aggregated result.
Prevent correlated attacks, as defined and described below.
These objectives establish the foundation for developing effective detection and penalty mechanisms. Let us now examine our model.
We introduce a model that is as general as possible, leaving the data domain, aggregation method, and metric unspecified. Here is the basic setup:
The protocol consists of $n$ data fetchers, submitting reports in rounds (for example, every block). Let $r_t^i$ be the report of player $i$ at round $t$, and assume the reports belong to the domain $D$. We use two special characters, $\bot$ and $\oslash$, to denote non-reports and out-of-domain reports, respectively. Namely, we denote $r_t^i = \bot$ if and only if player $i$ did not report at time $t$, and assume without loss of generality that $r_t^i = \oslash$ in case the report falls outside $D$. In each round, all reports are aggregated to a single value by an aggregation function $f$. The aggregated value is denoted by $\hat r_t = f(r_t^1, \dots, r_t^n)$.
After each round, we compute two functions:
Detection function $d: (D \cup \{\bot, \oslash\})^n \times \mathcal{H} \to \{0, 1\}^n$, where $0$ corresponds to an honest report, $1$ corresponds to a fraud, and $\mathcal{H}$ is the history, consisting of all past reports and past decisions.
Penalty function $p: \{0, 1\}^n \times \mathcal{H} \to \mathbb{R}_{\ge 0}^n$, where $p_i$ corresponds to the amount of stake to be slashed from player $i$.
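A minimal sketch of one round under this model, with an illustrative median aggregator, a distance-based detection function, and a proportional penalty function (all three choices are examples, not the protocol's actual rules):

```python
# Names mirror the model above: aggregation f, detection d, penalty p.
def aggregate(reports):
    """f: the plain median over submitted (non-None) reports."""
    vals = sorted(v for v in reports.values() if v is not None)
    return vals[len(vals) // 2]

def detect(reports, agg, tolerance=0.05):
    """d: flag non-reports and reports too far from the aggregate."""
    return {pid: v is None or abs(v - agg) / agg > tolerance
            for pid, v in reports.items()}

def penalties(flags, stakes, slash_ratio=0.01):
    """p: slash a fixed fraction of stake for every flagged player."""
    return {pid: stakes[pid] * slash_ratio if bad else 0.0
            for pid, bad in flags.items()}

# one round: "c" misreports, "d" does not report at all
reports = {"a": 100.0, "b": 100.3, "c": 140.0, "d": None}
agg = aggregate(reports)      # 100.3
flags = detect(reports, agg)  # "c" and "d" flagged
print(penalties(flags, {"a": 10.0, "b": 10.0, "c": 10.0, "d": 10.0}))
```

In the full model, both `detect` and `penalties` would also take the history into account.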
By "non-verifiable truth," we mean that the value validators report on cannot be objectively verified, either because it is inherently unknowable or because the protocol cannot access it.
We refer to various methods and approaches to identify fraud as filters. These filters have different properties:
Type of proof: What information does the filter require - can it be applied using only on-chain data? What strength of evidence does this filter provide?
Cost: This includes expenses such as gas consumption for on-chain computation.
Execution time
Precision and recall: Different filters are located at different spots on the precision-recall tradeoff curve and, in particular, have different tendencies for false positives and false negatives.
We now describe and analyze different types of filters. We classify them into three classes: Logical Conditions, Statistical Tools, and Filters Based on Human Judgment.
Defined conditions that can be automatically checked and enforced. These should be determined within the protocol setup phase and updated periodically. Examples are:
Logic & Physics: These methods involve studying the underlying function and identifying major deviations.
r_i^t ∉ D: the submission is outside of the domain.
d(r_i^t, v^t) > τ: the report is distant from the protocol result. Note this result is not known to the validator on submission.
d(r_i^t, v^{t-1}) > τ: the report is distant from the last protocol result (known to the validator upon submission).
d(r_i^t, r_i^{t-1}) > τ: the report is distant from the validator's own last report.
Robust against a malicious majority (coordinated attack).
Does not account for the lazy strategy of submitting the same response without gathering information on the actual value.
May discourage reporting on a sudden change in value.
Preliminary analysis of the specific data of interest - a comprehensive domain analysis should be conducted to determine and define:
Defining the domain D of valid reports and being able to determine whether a report falls within it (i.e., compute the predicate r ∈ D)
Providing a metric function d on the domain
Specifying reasonable changes of the value as a function of time, to determine the threshold τ.
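Under these definitions, the four logical conditions can be sketched as follows; the numeric domain, the relative threshold, and the bounds are all assumptions for illustration:

```python
# Sketch of the four logical conditions, for a numeric domain with metric
# d(x, y) = |x - y|. Domain bounds and the threshold tau are assumptions.
def in_domain(r, lo=0.0, hi=1e9):
    return lo <= r <= hi

def logical_flags(report, agg, last_agg, last_report, tau=0.05):
    """Return which logical conditions a report trips (True = suspicious)."""
    d = lambda x, y: abs(x - y)
    return {
        "out_of_domain": not in_domain(report),
        "far_from_result": d(report, agg) > tau * agg,            # v^t unknown at submit time
        "far_from_last_result": d(report, last_agg) > tau * last_agg,
        "far_from_own_last": d(report, last_report) > tau * last_report,
    }

flags = logical_flags(report=1.20, agg=1.00, last_agg=1.00, last_report=1.01)
```

A report of 1.20 against an aggregate of 1.00 trips all three distance checks under a 5% relative threshold, while remaining inside the domain.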
Social: Evaluating a report relative to other reports in the same round, most naturally relative to the aggregation result: d(r_i^t, v^t). We can also consider comparison to other functions of the round's reports. Does not defend against a malicious majority (coordinated attack).
More options for the value against which we compare r_i^t:
Compared to the other reports r_j^t, j ≠ i.
Compared to the validator's past reports r_i^s, s < t.
Compared to the aggregation result v^t. This is not the same as (1) because aggregation also takes into account validator stakes.
Compared to a weighted average Σ_j w_j · r_j^t, where the weights w_j are derived from validator stakes. This is a generalization of (3).
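Option (4) can be sketched as follows; the validator names and stake values are illustrative:

```python
# Sketch of comparison target (4): a stake-weighted average of the round's
# reports, generalizing the plain aggregation result. All values are made up.
def stake_weighted_average(reports, stakes):
    total = sum(stakes[i] for i in reports)
    return sum(stakes[i] * r for i, r in reports.items()) / total

reports = {"v1": 100.0, "v2": 102.0, "v3": 110.0}
stakes  = {"v1": 50.0,  "v2": 30.0,  "v3": 20.0}

ref = stake_weighted_average(reports, stakes)       # (5000 + 3060 + 2200) / 100
deviation = {i: abs(r - ref) for i, r in reports.items()}
```

Because "v1" holds half the stake, the reference value sits much closer to its report than an unweighted mean would.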
Advanced statistical methods such as anomaly detection, comparing data against known fraud indicators or patterns, and machine learning algorithms that learn from historical fraud data to predict or identify fraudulent behavior. In more detail, the filters in this class include:
Anomaly Detection: Using statistical models to identify unusual behavior that deviates from the norm, indicating potential fraud.
Pattern Recognition: Analyzes historical data to identify patterns associated with fraudulent activities, helping to predict and detect similar attempts in the future.
Reinforcement Learning: Employs algorithms that learn from data over time to improve the detection of fraudulent transactions automatically.
Rule-based Systems: Applies a set of predefined rules based on known fraud scenarios to detect fraud. These systems are often used in conjunction with other methods to enhance detection capabilities.
This method can be used to evaluate validators' long-range performance (discussed below) and trigger an alarm before an incident occurs.
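As a minimal example of the anomaly-detection filter, a z-score rule flags reports that sit far from the round mean; the threshold k and the sample values are assumptions:

```python
# Minimal anomaly-detection sketch: flag a validator whose report deviates
# from the round mean by more than k sample standard deviations.
from statistics import mean, stdev

def zscore_flags(reports, k=1.5):
    values = list(reports.values())
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return {i: False for i in reports}  # all identical: nothing to flag
    return {i: abs(r - mu) / sigma > k for i, r in reports.items()}

reports = {"v1": 100.0, "v2": 101.0, "v3": 99.0, "v4": 100.5, "v5": 160.0}
flags = zscore_flags(reports)  # only the outlier "v5" is flagged
```

Note that with a single outlier among n reports the z-score is bounded by roughly (n-1)/sqrt(n), so the threshold k must be tuned to the round size; this is one concrete instance of the precision-recall tradeoff discussed above.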
Human oversight and judgment serve as an additional verification layer. For example:
Committee Voting: A committee of validators or stakeholders can vote to resolve disputes and identify faults. Ideally, the committee is a large, external, and impartial jury committed to participating in informed voting.
Random Validator Selection: A randomly selected validator is used to validate suspicious reports. This method is more affordable and faster than a full committee vote. The set from which the validator is chosen can be distinct from the protocol validators set.
Whistleblowing Mechanism: Any validator can submit a bond and raise a challenge against another validator.
Each filter in this class presents a mini mechanism design challenge of its own, as participants' incentives must be properly aligned to ensure the filter's effectiveness. For example, voters should be incentivized to vote according to their true beliefs.
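The whistleblowing mechanism, for instance, needs exactly this kind of incentive alignment: the bond must deter frivolous challenges while the reward makes informed ones worthwhile. A hypothetical settlement rule, with illustrative amounts and reward share:

```python
# Hypothetical whistleblowing settlement: a challenger escrows a bond; if the
# challenge is upheld they earn a share of the slashed stake plus the bond
# back, otherwise the bond is forfeited. Amounts and shares are assumptions.
def settle_challenge(bond, slashed_stake, upheld, reward_share=0.5):
    """Return (challenger_reward, bond_returned)."""
    if upheld:
        return slashed_stake * reward_share, bond  # reward + bond refunded
    return 0.0, 0.0                                # bond forfeited

payout, returned = settle_challenge(bond=10.0, slashed_stake=100.0, upheld=True)
```

The expected value of a challenge is positive only when the challenger's belief in the fault exceeds bond / (bond + reward), which is what pushes challenges toward informed ones.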
Each validator receives a rating that reflects their overall protocol performance. This rating enables a reputation system that strengthens the slashing mechanism in two key ways:
First, it can be viewed as a statistical filter, serving as a tie-breaker in case of a dispute. For example, it supports more informed voting, thus enhancing a committee voting mechanism.
Second, it adds an additional penalizing dimension by allowing us to decrease a validator's rating instead of slashing their stake. This is especially helpful in cases of minor faults or faults that are not fully attributable or cannot be proven to be committed maliciously. That is, we can expand the model by introducing a reputation value R_i for each validator i, and the penalty function is updated to reflect the fact that a penalty can be a reduction in reputation.
Reputation can be based on and measured by the following criteria:
Participation Rate: Measure the frequency and consistency of the validator's participation in protocol activities.
Report Accuracy: Evaluate how closely the validator's reports align with the aggregated results or other reference values.
Prediction of Abrupt Changes: Assess the validator's ability to predict sudden and significant changes in data values.
Trend Prediction Accuracy: Gauge the accuracy of the validator's predictions regarding long-term trends. For example, let t_1 < t_2 be two points in time where v^{t_1} < v^{t_2}. We can check whether, during the interval [t_1, t_2], the validator's reports were of an ascending nature (there are various methods to measure this; the most naive one is counting the number of indices t for which r_i^t < r_i^{t+1}).
Conformity to Expected Voting Distribution: Determine how closely the validator's votes match the expected distribution pattern. For example, assign each validator a vector counting how often their report lands above, at, or below the aggregated result; for an honest validator we expect this vector to be roughly balanced. Additional statistical tests of greater complexity can be applied, but this demonstrates the core concept.
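One hypothetical way to fold such criteria into a single rating is a weighted sum of normalized per-criterion scores; the weights and input values below are illustrative assumptions:

```python
# Composite reputation sketch: each criterion is pre-normalized to [0, 1]
# and combined with illustrative weights. Weights/values are assumptions.
def reputation(participation, accuracy, abrupt_change_hits, trend_accuracy,
               weights=(0.3, 0.4, 0.1, 0.2)):
    """Combine per-criterion scores (each in [0, 1]) into one rating."""
    parts = (participation, accuracy, abrupt_change_hits, trend_accuracy)
    return sum(w * p for w, p in zip(weights, parts))

r = reputation(participation=0.95, accuracy=0.90,
               abrupt_change_hits=0.5, trend_accuracy=0.8)
```

Since the weights sum to 1, the rating stays in [0, 1], which makes it directly usable as the reputation value R_i (and as the effective-stake multiplier discussed below).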
A reputation system can be used to relax actual stake slashing. Alternatively, we can consider:
Decreasing the profile rating. For certain faults, actual slashing occurs only if the validator's rating falls below a specific threshold.
Decreasing effective stake. Define the effective stake of validator i as s_i^eff = R_i · s_i, where s_i is the staked amount and R_i ∈ [0, 1] is the reputation.
Through the use of effective stake, a rating decrease impacts rewards and future income from the protocol. We can decrease effective stake either directly (decreasing s_i) or indirectly by decreasing reputation.
Reducing rewards and future income from the protocol.
Revoking: prohibiting the validator from participating in the protocol.
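The effective-stake alternative above can be sketched as follows; the linear scaling by reputation and the pro-rata reward rule are assumptions:

```python
# Effective-stake sketch: rewards are computed on stake scaled by reputation,
# so a rating decrease lowers future income without an immediate slash.
# The linear scaling is an assumption; any monotone map would also work.
def effective_stake(stake, reputation):
    return stake * reputation  # reputation assumed in [0, 1]

def round_reward(stake, reputation, pool, total_effective):
    """Pro-rata share of the round's reward pool by effective stake."""
    return pool * effective_stake(stake, reputation) / total_effective

# A rating drop from 1.0 to 0.8 cuts this validator's share proportionally,
# acting as an indirect, recoverable penalty.
```

This gives the protocol a graduated penalty: minor or unattributable faults reduce R_i (and thus income), while proven fraud still slashes s_i directly.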
Denote by F_ℓ the filter applied at level ℓ, returning the probability of a mis-report, F_ℓ ∈ [0, 1]. Assume filters F_1, ..., F_k. In this section we discuss different methods for combining different filters into a robust mechanism.
Avoid expensive computation if possible.
Some filters serve as an alarm before the fact, and not only apply after the event has occurred.
Complementing and correlated filters. For example, an initial definition of correlation can be considering a pair of filters correlated if the probability that both alert is significantly higher than the product of their individual alert probabilities.
Complementing means both of them alerting is strong evidence.
Correlated means that if one of them is on, it is less surprising that the other one is also on.
Example of taking correlation into account: if two correlated filters disagree, then one of their inputs was (w.h.p.) manipulated and the aggregated value is not reliable. In this case, the report should be compared to an alternative aggregated value. In any case, in such a scenario the filter conditions should be altered.
Levels are ordered by cost, execution time, and "type of proof". The ideal proof is logical, and the last resort involves matching against off-chain information.
We suggest a chain-filter mechanism, modeled after the multilevel court structure. This involves applying a sequence of filters, starting with cost-effective, quick filters, and moving to more expensive ones if needed. At each step, we either make a definitive decision or advance to the next level. To prevent "honest slashing," we prioritize precision at each stage. If a situation is unclear, we turn to more robust, costly methods to classify a report as fraudulent.
Formally, at every step ℓ we compute a function F_ℓ(r_i^t, H) ∈ {0, 1, ⊥}. If the result is 0 or 1, the report is considered honest or fraudulent, respectively, and the detection process is over. F_ℓ = ⊥ means we could not reach a decision, and we continue to compute the next level. The level functions satisfy: F_ℓ ∈ {0, 1} implies the process terminates at level ℓ.
Meaning that once a definite decision is reached inside a level, we conclude whether the report is a fraud or not. Note:
We only convict nodes along the way.
This is a general design; different data may require different filters and ordering.
We can apply filters in a "surprise inspection" manner.
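The chain-filter flow can be sketched as follows; the three example filters and their ordering are assumptions chosen to mirror the cheap-to-expensive progression:

```python
# Chain-filter sketch: run filters from cheap to expensive; each returns
# 0 (honest), 1 (fraud), or None (undecided -> escalate to the next level).
def chain_decide(report, filters):
    """Apply filters in order of increasing cost; stop at the first verdict."""
    for f in filters:
        verdict = f(report)
        if verdict is not None:
            return verdict
    return 0  # nothing convicted: default to honest (precision first)

def domain_filter(r):
    return 1 if not r["in_domain"] else None       # cheap, logical proof

def statistical_filter(r):
    return 0 if r["zscore"] < 1.0 else None        # clearly normal: acquit early

def committee_filter(r):
    return 1 if r["committee_votes_fraud"] else 0  # costly, always decides

filters = [domain_filter, statistical_filter, committee_filter]
verdict = chain_decide({"in_domain": True, "zscore": 3.2,
                        "committee_votes_fraud": True}, filters)
```

A clearly normal report exits at the cheap statistical level; only ambiguous ones reach the expensive committee, which is the multilevel-court behavior described above.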
In addition to deciding where to position ourselves on the precision-recall curve, we need to outline the relationships between different levels. Considerations include the keys by which the levels are sorted - type of proof provided, cost, and execution time.
We assign each player i a number p_i ∈ [0, 1] which represents the probability that the player's report is a fraud. That is, "definitely a fraud" corresponds to p_i = 1 and "definitely not a fraud" corresponds to p_i = 0. This number could be the output of an AI algorithm, as mentioned above.
A natural rule for deciding who reported false information is a threshold rule: we decide on a threshold θ and determine that reports with p_i > θ are considered frauds. The threshold can be adjusted to fit the system's specific requirements.
Different layers contain multiple filters, each connected by an 'and' relation, while the relationship between layers can be either 'or' or 'veto'. A 'veto' condition is not necessarily positioned in the first layer; due to cost and execution time, it may be used as a last resort.
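A minimal sketch of these intra-layer 'and' and inter-layer 'or'/'veto' relations; the layers and predicates are illustrative:

```python
# Layer-combination sketch: filters inside a layer are AND-ed, layers are
# OR-ed, and an optional expensive "veto" can overturn a conviction.
# The layer structure and predicates are illustrative assumptions.
def layer_and(filters, report):
    return all(f(report) for f in filters)  # every filter in the layer must flag

def decide(report, layers, veto=None):
    flagged = any(layer_and(layer, report) for layer in layers)  # OR of layers
    if flagged and veto is not None and veto(report):
        return False  # costly veto (e.g. human review) overturns the flag
    return flagged

layer1 = [lambda r: r["far_from_aggregate"], lambda r: r["far_from_own_last"]]
layer2 = [lambda r: r["anomaly_score"] > 0.9]
report = {"far_from_aggregate": True, "far_from_own_last": False,
          "anomaly_score": 0.95}

fraud = decide(report, [layer1, layer2], veto=lambda r: False)
```

Here layer1 does not fully fire (its filters are AND-ed), but layer2 does, so the report is flagged; a veto that returned True would have cleared it as a last resort.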
This framework proposes a comprehensive approach to on-chain subjective slashing in blockchain oracle networks, addressing the challenge of penalizing inaccurate data submissions while protecting honest validators. Key components include:
Multiple filtering layers combining automated detection (statistical models, pattern recognition) with human judgment mechanisms
A reputation system that provides an additional dimension for penalties and serves as a statistical filter
A chain-filter mechanism that progresses from cost-effective to more expensive verification methods
Flexible threshold rules and inter-level relationships that can be tuned based on specific system requirements
The framework aims to balance precision and recall while maintaining economic feasibility and execution efficiency.
Compatible with AggregatorV3Interface.
| Pair | Decimals | Deviation Threshold | Heartbeat | Notes |
|---|---|---|---|---|
| BTC/USD | 8 | 1.0% | 24 hours | |
| ETH/USD | 8 | 1.0% | 24 hours | |
| EUR/USD | 8 | 1.0% | 24 hours | |
| LBTC/BTC | 18 | 1.0% | 24 hours | Fundamental exchange rate |
| LBTC/USD | 8 | 1.0% | 24 hours | Fundamental exchange rate |
| LINK/USD | 8 | 1.0% | 24 hours | |
| SOL/USD | 8 | 1.0% | 24 hours | |
| USDC/USD | 8 | 1.0% | 24 hours | |
| USDT/USD | 8 | 1.0% | 24 hours | |
| XDC/USD | 8 | 1.0% | 24 hours | |
| eBTC/BTC | 8 | 1.0% | 24 hours | Fundamental exchange rate |
| ezETH/USD | 8 | 1.0% | 24 hours | Fundamental exchange rate |
| sFRAX/FRAX | 18 | 1.0% | 24 hours | Fundamental exchange rate |
| sfrxETH/frxETH | 18 | 1.0% | 24 hours | Fundamental exchange rate |
| stETH/ETH | 18 | 1.0% | 24 hours | Fundamental exchange rate |
| weETH/USD | 8 | 1.0% | 24 hours | Fundamental exchange rate |
| wstETH/USD | 8 | 1.0% | 24 hours | Fundamental exchange rate |
| yUSD/USDC | 18 | 1.0% | 24 hours | |
| ynETH/ETH | 18 | 1.0% | 24 hours | Fundamental exchange rate |
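For consumers of these feeds, a row above reads as follows: AggregatorV3Interface answers are returned as integers scaled by the listed decimals, and a feed whose last update is older than its heartbeat should be treated as stale. A minimal sketch with made-up raw values:

```python
# How to interpret a feed row: `answer` is an integer scaled by 10**decimals,
# and a feed is stale once the heartbeat elapses without an update.
# The raw answer and timestamps below are made up for illustration.
def scale_answer(raw_answer: int, decimals: int) -> float:
    return raw_answer / 10**decimals

def is_stale(updated_at: int, now: int, heartbeat_s: int = 24 * 3600) -> bool:
    return now - updated_at > heartbeat_s

price = scale_answer(6_512_340_000_000, 8)  # an 8-decimal USD-quoted answer
stale = is_stale(updated_at=1_700_000_000, now=1_700_100_000)  # 100,000 s gap
```

Note the ratio pairs (e.g. stETH/ETH) use 18 decimals while USD-quoted pairs use 8, so the decimals column must be read per feed rather than assumed.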
Explorer: https://basescan.org
Chainlist: https://chainlist.org/chain/8453
Compatible with AggregatorV3Interface.
| Pair | Decimals | Deviation Threshold | Heartbeat | Notes |
|---|---|---|---|---|
| BTC/USD | 8 | 0.5% | 24 hours | |
| ETH/USD | 8 | 0.5% | 24 hours | |
| PLUME/USD | 8 | 0.5% | 24 hours | |
| USDC/USD | 8 | 0.5% | 24 hours | |
| USDT/USD | 8 | 0.5% | 24 hours | |
| nALPHA/USD | 8 | 0.5% | 24 hours | |
| nBASIS/USD | 8 | 0.5% | 24 hours | |
| nCREDIT/USD | 8 | 0.5% | 24 hours | |
| nELIXIR/USD | 8 | 0.5% | 24 hours | |
| nETF/USD | 8 | 0.5% | 24 hours | |
| nINSTO/USD | 8 | 0.5% | 24 hours | |
| nPAYFI/USD | 8 | 0.5% | 24 hours | |
| nTBILL/USD | 8 | 0.5% | 24 hours | |
| pETH/USD | 8 | 0.5% | 24 hours | |
| pUSD/USD | 8 | 0.5% | 24 hours | |
| sdeUSD/deUSD | 18 | 0.5% | 24 hours | |
| wOETH/OETH | 18 | 0.5% | 24 hours | |
| wsrUSD/USD | 8 | 0.5% | 24 hours | Fundamental exchange rate |
| wsuperOETHp/USD | 8 | 0.5% | 24 hours | |
| xUSD/USD* | 8 | 0.5% | 24 hours | Fundamental exchange rate |
| yUSD/USD | 8 | 0.5% | 24 hours | Fundamental exchange rate |
| yUSD/USDC | 18 | 0.5% | 24 hours | |
* The xUSD feed reflects the exchange rate published by Stream’s xUSD vault contract (https://etherscan.io/address/0xE2Fc85BfB48C4cF147921fBE110cf92Ef9f26F94), sourced from the roundPricePerShare function. This rate is based on Stream’s internal NAV reporting and is not backed by an independent proof of reserves.
Explorer: https://explorer.plume.org/
Chainlist: https://chainlist.org/chain/98866
Compatible with AggregatorV3Interface.
| Pair | Decimals | Deviation Threshold | Heartbeat | Notes |
|---|---|---|---|---|
| AVAX/USD | 8 | 1.0% | 24 hours | |
| BBTC/USD | 8 | 1.0% | 24 hours | |
| BBUSD/USD | 8 | 1.0% | 24 hours | |
| BNB/USD | 8 | 1.0% | 24 hours | |
| DAI/USD | 8 | 1.0% | 24 hours | |
| ETH/USD | 8 | 1.0% | 24 hours | |
| M-BTC/USD | 8 | 1.0% | 24 hours | |
| STONE/USD | 8 | 1.0% | 24 hours | Fundamental exchange rate |
| SolvBTC/USD | 8 | 1.0% | 24 hours | |
| SolvBTCb/USD | 8 | 1.0% | 24 hours | |
| SolvBTCm/USD | 8 | 1.0% | 24 hours | |
| USDC/USD | 8 | 1.0% | 24 hours | |
| USDS/USD | 8 | 1.0% | 24 hours | |
| USDT/USD | 8 | 1.0% | 24 hours | |
| WBTC/USD | 8 | 1.0% | 24 hours | |
| ezETH/USD | 8 | 1.0% | 24 hours | Fundamental exchange rate |
| uniETH/USD | 8 | 1.0% | 24 hours | Fundamental exchange rate |
| weETH/ETH | 18 | 1.0% | 24 hours | Fundamental exchange rate |
| weETH/USD | 8 | 1.0% | 24 hours | Fundamental exchange rate |
| wrsETH/USD | 8 | 1.0% | 24 hours | |
| wstETH/USD | 8 | 1.0% | 24 hours | Fundamental exchange rate |
| xUSD/USD* | 8 | 1.0% | 24 hours | Fundamental exchange rate |
* The xUSD feed reflects the exchange rate published by Stream’s xUSD vault contract (https://etherscan.io/address/0xE2Fc85BfB48C4cF147921fBE110cf92Ef9f26F94), sourced from the roundPricePerShare function. This rate is based on Stream’s internal NAV reporting and is not backed by an independent proof of reserves.
Explorer: https://lineascan.build
Chainlist: https://chainlist.org/chain/59144
Compatible with AggregatorV3Interface.
| Pair | Decimals | Deviation Threshold | Heartbeat | Notes |
|---|---|---|---|---|
| BB.sNECT/NECT | 18 | 0.5% | 24 hours | |
| BTC/USD | 8 | 0.5% | 24 hours | |
| ETH/USD | 8 | 0.5% | 24 hours | |
| HONEY/USD | 8 | 0.5% | 24 hours | |
| NECT/USD | 8 | 0.5% | 24 hours | |
| STONE/ETH | 18 | 0.5% | 24 hours | Fundamental exchange rate |
| STONE/USD | 8 | 0.5% | 24 hours | Fundamental exchange rate |
| SolvBTC/USD | 8 | 0.5% | 24 hours | |
| SolvBTCBBN/USD | 8 | 0.5% | 24 hours | Fundamental exchange rate |
| USDC/USD | 8 | 0.5% | 24 hours | |
| USDT/USD | 8 | 0.5% | 24 hours | |
| USDe/USD | 8 | 0.5% | 24 hours | |
| WBERA/USD | 8 | 0.5% | 24 hours | |
| eBTC/BTC | 8 | 0.5% | 24 hours | Fundamental exchange rate |
| iBERA/USD | 8 | 0.5% | 24 hours | |
| iBGT/USD | 8 | 0.5% | 24 hours | |
| pumpBTC/USD | 8 | 0.5% | 24 hours | |
| rsETH/USD | 8 | 0.5% | 24 hours | Fundamental exchange rate |
| rswETH/ETH | 18 | 0.5% | 24 hours | Fundamental exchange rate |
| sNECT/NECT | 18 | 0.5% | 24 hours | |
| sUSDe/USDe | 18 | 0.5% | 24 hours | Fundamental exchange rate |
| uniBTC/USD | 8 | 0.5% | 24 hours | Fundamental exchange rate |
| weETH/USD | 8 | 0.5% | 24 hours | Fundamental exchange rate |
| xBTC/BTC | 8 | 0.5% | 24 hours | Fundamental exchange rate |
| xETH/ETH | 18 | 0.5% | 24 hours | Fundamental exchange rate |
| xUSD/USD* | 8 | 0.5% | 24 hours | Fundamental exchange rate |
* The xUSD feed reflects the exchange rate published by Stream’s xUSD vault contract (https://etherscan.io/address/0xE2Fc85BfB48C4cF147921fBE110cf92Ef9f26F94), sourced from the roundPricePerShare function. This rate is based on Stream’s internal NAV reporting and is not backed by an independent proof of reserves.
Explorer: https://80094.routescan.io/
Chainlist: https://chainlist.org/chain/80094