Oracle Team Collateral Onboarding Methodology

Introduction

Now that the MIPs Framework has been ratified, a flood of collateral onboarding applications has started popping up. It’s great to see so much enthusiasm from partners who want to be added as collateral. The Maker Protocol needs a diverse portfolio of collateral to grow and reinforce the peg. As the Oracle Team, it’s my responsibility to guide the community through the collateral onboarding process with respect to Oracles by giving recommendations. At the same time, my duties extend much further, as documented in the Oracle Team Mandate.

Just to give an idea of what I’m talking about, here’s a non-exhaustive list of what my team is currently juggling, in no particular order.

Collateral Onboarding

  • Assist the community in onboarding new collateral types

Data Integrity

  • Integrate signed Coinbase data in the Oracles
  • Implement a robust and trustless way to query on-chain storage values along with Merkle proofs to validate the data
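
To make the Merkle-proof idea concrete, here is a minimal sketch of proof verification over a simple binary hash tree. This is purely illustrative and is not the Oracle client's implementation: a real on-chain storage proof would come from Ethereum's `eth_getProof` RPC and walk a Keccak-256 Merkle-Patricia trie, while this sketch uses SHA-256 and a flat binary tree to show the core check — hash the leaf, fold in each sibling, and compare against a trusted root.

```python
import hashlib

def h(data: bytes) -> bytes:
    """Hash for tree nodes (SHA-256 here; Ethereum itself uses Keccak-256)."""
    return hashlib.sha256(data).digest()

def verify_merkle_proof(leaf: bytes, proof: list, root: bytes) -> bool:
    """Walk from the leaf up to the root, hashing in each sibling.

    `proof` is a list of (sibling_hash, sibling_is_left) pairs, ordered
    from the leaf level upward. The data is trusted only if the final
    hash equals the independently known root.
    """
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

# Build a tiny 4-leaf tree by hand to demonstrate.
leaves = [b"a", b"b", b"c", b"d"]
l = [h(x) for x in leaves]
n01, n23 = h(l[0] + l[1]), h(l[2] + l[3])
root = h(n01 + n23)

# Proof that leaf "c" is in the tree: its sibling hashes with positions.
proof_for_c = [(l[3], False), (n01, True)]
assert verify_merkle_proof(b"c", proof_for_c, root)
```

The trustless property comes from the fact that the verifier only needs the root (e.g. a block's state root) plus logarithmically many sibling hashes, rather than trusting whoever served the storage value.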

Incentives

  • Come up with a proper incentive mechanism for relayers

Reliability

  • Create redundancy within every module of the Oracle client so that a bug in any single module can’t halt the Oracle protocol. Think geth vs. parity in 2016, when all the geth nodes crashed simultaneously and the Ethereum network would have halted were it not for the parity nodes holding up the network on their own.
  • In particular, I want to ensure we’re not reliant on Secure Scuttlebutt and instead have multiple Transport Layers for Feeds to gossip price data.

Resiliency

  • Create a transaction manager service with a more dynamic gas-pricing algorithm for congested network conditions, to stabilize Oracle latency even when the market is at its most volatile

Data Sensitivity

  • Implement more advanced stateful price-querying tooling that can support more sophisticated Data Models such as CCCAGG (a time-sensitive, volume-weighted average price).
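
To illustrate what "time-sensitive, volume-weighted" means here, a rough sketch of such a Data Model is below. This is not CryptoCompare's actual CCCAGG algorithm (which is more involved); the `half_life` parameter and exponential decay are illustrative assumptions showing the core idea — each trade's volume weight decays with its age, so stale quotes from illiquid venues pull the aggregate price less.

```python
from dataclasses import dataclass

@dataclass
class Trade:
    price: float
    volume: float
    age_seconds: float  # how old the trade is relative to "now"

def time_weighted_vwap(trades, half_life=300.0):
    """Volume-weighted average price where each trade's volume decays
    exponentially with age (half_life in seconds), so stale trades
    contribute less to the aggregate."""
    num = den = 0.0
    for t in trades:
        w = t.volume * 0.5 ** (t.age_seconds / half_life)
        num += t.price * w
        den += w
    return num / den if den else None

# A fresh trade at 100 and a 10-minute-old trade at 102, equal raw volume.
trades = [Trade(100.0, 5.0, 0.0), Trade(102.0, 5.0, 600.0)]
# Old trade's weight decays to 5 * 0.25 = 1.25, so the result is 100.4,
# closer to the fresh price than the plain VWAP of 101.
print(time_weighted_vwap(trades))
```

The "stateful" part of the bullet above is what makes this harder than it looks: the tooling has to retain a rolling window of trades per source between queries rather than sampling a single spot price.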

Decentralization

  • Find and assist new high quality partners to onboard as Light Feeds
  • Create a smart contract to fund Feed payments through the Maker Protocol in a decentralized manner.
  • Create a decentralized oracle freeze mechanism similar to the decentralized emergency shutdown module to enable MKR holders to intervene in the case of an Oracle attack.

Product

  • Find and assist Ethereum projects to get whitelisted Oracle access

Transparency

  • Implement an Oracle Feed Dashboard that shows all Oracle and Feed data in an easy to understand format.

Governance

  • Create a robust MIP framework around Oracle governance (formerly the Oracle Governance Framework defined in the Oracle Team Mandate).

Velocity

  • Find, hire, and onboard talented developers to my team to increase the quality and throughput of the Oracle Team’s work.

Yeah, that’s a lot. The context for most of these is missing, which I hope to remedy at a later date.

There needs to be a balance between time devoted to the collateral onboarding process and time devoted to ensuring the long term development of the Oracle protocol.
What collateral types are easy to create Oracles for? Which ones are difficult? Are there enough data sources to create a secure Oracle that is resistant to manipulation? Is there a key piece missing that is blocking us from onboarding this collateral type until some later point in time?

While I don’t presume to have all the answers, I’ve composed an initial methodology the Oracle Team is utilizing to prioritize collateral onboarding work versus other endeavors in the Oracle domain.


Collateral Onboarding Proposal Methodology

V1.1.0
Author: Niklas Kunkel
Contributors:
Date: 05/05/20

Disclaimer

The collateral onboarding recommendations published by the Oracle Team in accordance with MIP6 are independent and non-binding. In other words, a recommendation by the Oracle Team does not impede the collateral onboarding process. Ultimately it is the community’s decision whether to include a collateral in the Maker Protocol. Like other Domain Teams, the Oracle Team has no authority to unilaterally include or exclude a collateral; it merely provides guidance in relation to Oracles. This methodology is intended to evolve over time to adapt to the ever-changing circumstances affecting Maker governance.

The recommendations are based on the technical feasibility and technical complexity of creating a secure Oracle for the proposed collateral type, and don’t take any other factors into account.

Technical Complexity Evaluation

Determine how complex integrating Oracles for this collateral type would be and how long it would take to implement such a solution.

Data Sources:
Are there sufficient high quality data sources available to construct a reliable, resilient, and secure Oracle for the proposed collateral asset?

Can the current tooling support the types of data sources that are needed?

Data Models:
Can the current tooling support the type of data modeling that is needed?

What would it take to expand the functionality of the current tooling to support the necessary data sources and data modeling?

Missing:
Is a solution currently not possible because of a key missing component?

Externalities:
What dependencies (both technical and system), risks, costs, and latency are added to the Oracle Protocol as a function of these added features?

Deliverables:
Does adding these features interfere with ongoing or planned development of new tooling on the Oracle Team roadmap?

Does adding these features require work and/or coordination from other stakeholders such as the Smart Contracts Team(s)?

How long would it take to design, implement, test, and deploy such a solution?

Recommendation

This is where the Oracle Team delivers its recommendation to the community whether to accept or defer the collateral onboarding along with a short summary of the reasons for doing so. Note that the Collateral Onboarding Greenlight Vote takes place regardless of the Oracle Team’s recommendation. It is merely one data point among many that Maker Governance needs to factor into their decision.


Changelog

  • Fixed formulas
  • Added ranges to define low economic impact and high economic impact
  • Changed variable names in formulas
  • Added disclaimer
  • Removed economic impact as a factor for the recommendation
  • Removed Evaluation matrix
  • Formatted Technical Complexity Evaluation

In ballpark figures, what are we considering “Low economic impact” vs. “High economic impact”?

minCR? Should it not be avgCR?

Thanks for sharing this @NikKunkel, great to get a summary of what the ‘Green Team’ is working on. Looking forward to seeing greenlights from you on the MIP6 proposals in the coming weeks.

Fixed the formula; it should divide by the CR, not multiply. Now it should make more sense why the minCR correlates to the upper bound and the avgCR to the lower bound.
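
The thread doesn't reproduce the formula itself, but a sketch consistent with the discussion (economic impact = market cap × capture rate / CR — an assumed reconstruction, not a quote) shows why dividing by CR flips the bounds: a lower collateralization ratio lets more Dai be drawn against the same locked value, so the minimum CR yields the upper bound and the average CR the lower bound.

```python
def dai_generation_bounds(market_cap, mcu, min_cr, avg_cr):
    """Rough bounds on Dai that could be generated against a collateral.

    Assumed formula (reconstructed from the thread, not quoted from it):
        potential_dai = market_cap * MCU / CR
    Because CR is in the denominator, the minimum CR gives the upper
    bound and the average CR gives the lower bound.
    """
    locked_value = market_cap * mcu  # MCU: fraction of market cap locked
    return locked_value / avg_cr, locked_value / min_cr  # (lower, upper)

# e.g. $500M market cap, 10% utilization, CRs between 150% and 300%
lo, hi = dai_generation_bounds(500e6, 0.10, min_cr=1.5, avg_cr=3.0)
# lo ≈ 16.7M Dai at the average CR, hi ≈ 33.3M Dai at the minimum CR
```

Had the formula multiplied by CR instead, the bounds would have come out inverted, which is the inconsistency the fix addressed.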

That’s a good question. I think these are dynamic and will go up as the system scales. At the current supply of ~111M I suggest the following:

Low = [0, 2M] Dai
High = [2M, +] Dai

One more piece of quick feedback: “Capture rate” might not be the best terminology to use here because it conflicts with “Collateralization Ratio” when abbreviated.

It was kind of confusing to me when reading your formula there, as I would interchange the two while reading it to myself. Maybe call it market cap utilization (MCU) or make it market cap capture rate (MCCR). Just a suggestion.

Thanks for the input. I agree the overloading of the names is unnecessarily confusing. Additionally, captureRate is inappropriate because it’s not actually a rate. I like your idea of naming it Market Cap Utilization (MCU) because it makes the relationship to the Total Market Cap more evident.

Hey Nik, I had a question about the feed stipends. With the original oracle mandate, governance ratified the following:

MKR governors signal their intent to fund Feed Stipends through Stability Fees once the functionality for this has been added to the Maker Protocol.

I’m pretty sure this is technically possible using executive spells, but it sounds like there is a further technical solution planned?

I could see these stipends being paid through the MIPs process using a subproposal every month. Such a solution would not be ideal with the current governance cycle, though: due to bundling, if any monthly executive failed, oracles would not receive their stipends.

I note there is a mention of this in the ‘Decentralization’ section above, how high a priority is this?

Thanks for bringing that up. Currently the Foundation pays the monthly Feed Stipends (1000 DAI per month per Feed). Having that funding taken over by the protocol is, I think, an important final step in governance completing its takeover of ownership of the Oracles. That being said, it’s quite messy to implement through the current executive spells, nor would the timing be particularly prudent given the current interest rate climate.

I sketched out a POC a while back that would support a generalized, decentralized payment mechanism that governance could use, among other things, to distribute Feed Stipends. In the short term there are more pressing concerns on the roadmap, so it’s difficult to estimate when this will be ready.

Can we add other factors to this evaluation process? For example, a risk profile/volatility metric with respect to the average volatility of current collateral? I’m sure there are additional factors that should be considered as well. Also, the economic impact should probably be adjusted for the percent of the token that is staked, and therefore not part of the circulating market cap.

Can we add other factors to this evaluation process?

Of course!

For example risk profile/volatility metric with respect to the average volatility of current collateral?

I think those are all valid metrics to look at. But take note that this methodology just represents the Oracles piece of the equation. To me, the risk profile and volatility fall under the auspices of the Risk Team(s) rather than the Oracle Team.

Also the economic impact should probably be adjusted for percent of the token staked and therefore not a part of the circulating market cap.

Not all tokens are stakable. But even for the ones that are, why would we discount the amount staked? The stakers could remove their staked tokens and move them into the Maker Protocol, no? I have a feeling staking is going to evolve similarly to how hash power and mining evolved: a bunch of miners/stakers freely moving to whichever blockchain/token has the best returns.

Let me know what you think.

Ultimately we’ll need some sort of combined set of factors that we consider before accepting a collateral. We can split this into separate groups like the Oracle and Risk Teams, but this increases the bureaucratic load, and we also have to take all these different results and distill them down into a conclusion at the end.

It might make sense to bypass all of that and have a set of factors out of the gate that’s applicable to MakerDAO as a whole.

This is exactly what the Domain Greenlight process does. Each Domain Team (Risk, Smart Contracts, and Oracles) gives a recommendation for a given collateral onboarding application, then governance holds a vote to determine whether to proceed.

I’ve overhauled the onboarding methodology after receiving feedback in private from the community.

The first thing I want to clear up is that this methodology exists exclusively to generate recommendations. The Community Greenlight Polls for deciding whether to pursue the next steps in the collateral onboarding process take place regardless of the recommendations delivered by the Domain Teams. These recommendations are just one data point among many that the community should factor into their decision.

Second, I’ve removed the Economic Impact section. While I do think the community should take into account the potential to generate Dai when deciding whether to onboard a collateral type, it’s not a parameter that should determine whether a collateral type is added. It’s merely an indicator for prioritization. The evaluation matrix mapping technical complexity to economic impact did not properly represent this relationship.

I want to emphasize this is a live document. The community should adapt and update the methodology as we discover what works and what doesn’t.
