Other Presentations and Updates
MIPs Portal
47:50
- Hello everybody, my company is working on the creation of the MIPs portal. It’s challenging to find things on GitHub, so we were brought in to build this portal to show all the MIPs. I presented it on the forum, and here it is. This is a work in progress; it is the first iteration, but it’s completely functional. We list all the main MIPs with their status and relevant links to each MIP on GitHub. There is a search feature that lets you find any MIP you want. The search is quite general at the moment; we are still refining it, so its behavior could change in the future. The portal also has filtering, so you can filter by status, and if somebody is looking for specific statuses, they can filter by more than one. You can then navigate to the details of a given MIP, where we have a general index for navigation. The information is pulled directly from GitHub, so, of course, this is similar to what you’ll see on GitHub. The data is also analyzed: you can see here on the right that we are extracting information directly from the MIP. There is more work being done here, and we expect to extract much more information shortly. With this portal, it’ll be possible to surface much more information; the software will detect it and make it accessible from here. There will be links to the threads on the forum where MIP discussions are being held. All the relevant pull requests are shown here as well, and work is being done to link each pull request to its specific MIP. There is also work being done right now regarding the subproposals. As you can see, we are listing only the main proposals; we are working on designs for displaying the subproposals, which are coming soon, but they are not available right now. We are trying our best to follow all the guidelines in the design and creation of components. Also, each MIP page shows its sentence summary and paragraph summary. I’ve added a link to the repository; if anyone wants to contribute to the development, that would be great. In the repository, we detail the general architecture of the system. At the moment, we are indexing all this information on the backend. We have a webhook so that whenever any change happens in the repository, it calls the backend and updates the index, which is reflected directly here (see the sketch at the end of this section).
- Juan: Can we see the feedback button? It should be on the bottom right of the website. It appears to be hidden currently.
- Adrián Rivero: Please feel free to send us recommendations and requests. We are happy to hear from the community to help improve the portal.
- LongForWisdom: Great, thank you, Adrián. We’re looking forward to further development on that in the coming months.
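The webhook-driven indexing flow Adrián describes might look something like the minimal Python sketch below. All routes, secrets, and helper names here are hypothetical illustrations, not the portal’s actual code:

```python
# Hypothetical sketch: GitHub calls this endpoint whenever the MIPs
# repository changes, and the backend re-indexes the affected files.
import hashlib
import hmac

from flask import Flask, abort, request

app = Flask(__name__)
WEBHOOK_SECRET = b"change-me"  # shared secret configured on the GitHub side


def reindex_mip(path: str) -> None:
    """Hypothetical helper: parse the MIP file at `path` and update the index."""
    ...


@app.route("/github-webhook", methods=["POST"])
def github_webhook():
    # Verify the payload really came from GitHub (X-Hub-Signature-256 header).
    signature = request.headers.get("X-Hub-Signature-256", "")
    expected = "sha256=" + hmac.new(
        WEBHOOK_SECRET, request.data, hashlib.sha256
    ).hexdigest()
    if not hmac.compare_digest(signature, expected):
        abort(401)

    # Re-index only the markdown files touched by this push event.
    event = request.get_json()
    for commit in event.get("commits", []):
        for path in commit.get("added", []) + commit.get("modified", []):
            if path.endswith(".md"):
                reindex_mip(path)
    return "", 204
```

A design like this keeps GitHub as the source of truth: the portal’s index is just a cache that the webhook keeps fresh.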
Nik Kunkel
Oracle Sensitivity Changes
55:46
- It’s no surprise that gas prices have been really high. You’ve seen people complaining that gas prices are high, but it doesn’t just affect users; it affects us as a protocol as well. One key example is oracles. We are spending an obscene amount of money on gas to keep the oracles updated, in the range of $20,000 per day. I’ll let you do your own mental math on what that comes out to in a year; it’s a lot. And this is after the technical optimizations we made to cut gas usage relative to the previous version of the oracles. Something has got to give here. We’ve looked at a variety of things, such as zero-knowledge proofs. We’ve looked across the board, but the fact of the matter is that there is no good solution, only solutions that come with trade-offs. One of those trade-offs is updating the oracle less frequently. I’ve been talking with Primoz about this from the risk perspective, because there is more collateral risk from a liquidation point of view if you update less often: collateral may not get liquidated as quickly, and that’s more risk for the protocol.
- Juan has done a fantastic job. He compiled data showing the frequency of oracle updates at various oracle sensitivities. What is sensitivity? Right now, for major assets like ETH and wBTC, we update at a half-percent sensitivity. That means that every time the price you see in the market differs from the price the oracle is reporting by half a percent, the oracle is updated. For non-major assets, we update at about one percent.
- What Juan has done here is map out, asset by asset, how the frequency of oracle updates would change if we were to select a different sensitivity. The important thing is not to compare across assets, because these collateral types were onboarded at different times, so there may be a discrepancy between them; you can’t really draw comparisons between, say, the COMP token and the YFI token. What you can do for each asset is see how the number of updates changes as you adjust the sensitivity. ETH, for example, did about 2700 updates. If we were to raise the sensitivity to about three percent, we would get that down to 608 updates. That is the order-of-magnitude change we could expect in the number of transactions we’d have to do, and it is directly proportional to the decrease in cost. Right now, we’re spending about $20,000 a day. If we did something like this, we could get that down to around $5,000 a day, which would be a huge improvement.
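A minimal sketch of the sensitivity trigger and the cost arithmetic described above, in Python. The update counts and dollar figures come from the discussion; the function and variable names are illustrative assumptions:

```python
def should_update(market_price: float, oracle_price: float,
                  sensitivity: float) -> bool:
    """Trigger a medianizer update when the observed market price deviates
    from the last reported oracle price by at least `sensitivity`."""
    deviation = abs(market_price - oracle_price) / oracle_price
    return deviation >= sensitivity


# Figures from the call: ETH saw ~2700 updates at 0.5% sensitivity,
# which would drop to ~608 updates at roughly 3% sensitivity.
updates_at_half_pct = 2700
updates_at_three_pct = 608
daily_cost_usd = 20_000  # current spend across all feeds, per the call

# Gas spend is roughly proportional to the number of update transactions,
# so the savings scale with the reduction in updates.
projected_cost = daily_cost_usd * updates_at_three_pct / updates_at_half_pct
print(f"~${projected_cost:,.0f}/day")  # on the order of the ~$5,000/day cited
```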
Open Discussion
1:02:15
- Matthew Rabinowitz: Are you doing updates based on percentage change or based on time frequency?
- Nik: We do both. The trigger for the medianizer is either a time-based update or a sensitivity update. I believe the time interval is a little over four hours, but we rarely hit that; it’s almost always the sensitivity that gets hit first. That’s on the medianizer front. The Oracle Security Module gets updated at the top of every hour. That is a fixed variable, so no matter what, you’re doing 24 transactions a day on the Oracle Security Module. Then you have to add the number of transactions that you’re doing on the medianizer, and that’s for each asset.
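As a rough illustration of that transaction arithmetic (the hourly OSM cadence is from the call; the per-asset medianizer figure is a hypothetical example):

```python
OSM_UPDATES_PER_DAY = 24  # the OSM is poked at the top of every hour, fixed


def daily_oracle_transactions(medianizer_updates: int) -> int:
    """Total daily transactions for one asset's oracle: the fixed hourly
    OSM updates plus the sensitivity-triggered medianizer updates (the
    medianizer's ~4-hour time fallback is rarely the binding trigger)."""
    return OSM_UPDATES_PER_DAY + medianizer_updates


# E.g., an asset whose medianizer fires 90 times a day (hypothetical
# figure) would require 114 oracle transactions per day.
print(daily_oracle_transactions(90))
```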
- Matthew Rabinowitz: The reason I ask is that it falls into the trend we spoke about last week concerning risk. If the price is going up, it’s not a risk question; it’s only a risk when the price is going down.
- Nik: Yes, that’s a very astute observation. I’ve discussed this with the mandated actors as well as Primoz. Essentially, for assets that have larger liquidation ratios, above 150%, we would feel comfortable raising the sensitivity to either 3% or 4%. Looking at the ones we have here, that would be KNC, ZRX, MANA, LRC, BAL, BAT, and AAVE. Then there are the ones that are starred, which are YFI, UNI, and LINK. We are still unsure about these: we have more DAI minted against those assets, so the system has more exposure to them. There is essentially more risk, but I still think the cost savings are likely worth it. I’ll leave the decision on those three up to Primoz and the rest of the risk team.
- Primoz: The way this would affect vaults is that they would potentially be liquidated a bit later. As Nik said, we’re talking about a small percentage figure compared to the liquidation ratios that we’re using for these vaults, so I don’t see this changing the risk profile much from a market perspective. Some could even argue that this could be positive for vault users that want to unwind; they would actually have more time to unwind. There may be some other implications, such as integration risks, and I’ll let those working on integrations explain any downsides. But in regards to market risk, it’s not a huge problem, in my opinion.
- Will?: Nik and Primoz, can I make a suggestion on this one? Don’t base it only on the liquidation ratio. Some of those assets have their liquidation ratios set because of their volatility. Weight any increase in sensitivity by some volatility measure, a standard deviation or whatever, just to make sure that you’re still appropriately accounting for the level of downside risk.
- Primoz: That’s a good point. I think the percentages that we’re talking about are a bit small. It’s not a massive change in market risk profile. Then you have the auction, which is something random on its own. I personally don’t see a huge problem.
- Christopher Mooney: The point I wanted to make is that a lot of the focus has been on the cost and the additional risk this creates for Maker. However, this was brought up and is being suggested in the mandated actors’ group. We have customers of our oracles, and this affects them. Nik, were you planning on doing this across the board for everything, or limiting it to price feeds where we’re the sole consumer?
- Nik: Yes, right now, it’s going to be limited to the ones where we’re the sole consumer. Fortunately, that’s the majority of them. Most of our oracle demand is concentrated in ETH and wBTC, and those are the ones we’re not changing; they’re on a tighter sensitivity of half a percent and are the most expensive ones to run. I think that’s totally worth it, especially since the Maker protocol’s exposure to those assets is huge; you definitely want the oracles for those to be as accurate as possible. We also see that the people who frequently request our feeds want the oracle security module. They’re not using it as a gauge of a fair market price; they’re using it because they want to perform vault management. The only thing they care about is whether Maker will liquidate on the next price update in an hour or not. So from an OSM point of view, it doesn’t really matter if we change the sensitivity of the medianizer; the customer just cares about whether a liquidation is coming, yes or no. That is something we’re going to have to look at. We could say that if you want to use an oracle, our default is 3-4% sensitivity, and if you want a more sensitive one, then you pay for the gas. I just want to bring up that this problem is not exclusive to us. Chainlink has the same problem, but much, much worse. This is an ecosystem-wide problem: DeFi needs oracles, and DeFi cannot run without them. There are two paths from here. Either people pay up because they realize running these oracles is expensive; it’s a service they need to invest a significant amount of money into. Or people stop caring about the decentralization of oracles: if a decentralized oracle costs $100,000 a year, why would I pay that when I can run my own little bot sending price updates? That would be cheaper in gas, but it’s not decentralized. Our industry is full of people who are very ideologically driven and believe in decentralized finance because of the decentralized aspect. When we get an influx of new users, will those values be passed down to them, or is DeFi simply the latest casino? The danger is that valuing decentralization gets lost, and decentralization becomes a meme, a word without any real meaning, rather than a principle by which to design and use systems. That sounds a bit somber, but that’s kind of where we’re at. If you look at Binance Smart Chain, it’s an EVM chain in which many validators are all run by Binance. It’s not decentralized at all, but the growth has been insane. What do you make of that? On the one hand, you can say that Binance is a massive exchange with many users, and it’s effortless for them to funnel those users onto Binance Smart Chain. On the other hand, you could say that a lot of people who use Binance Smart Chain don’t care about decentralization. To each their own. From a business point of view, that’s something we need to take a look at: are our oracles simply a necessary expense for the Maker protocol, or is this a business we want to get into? We know there is a need in the industry. Can we serve that need?
- Somebody: Does ETH 2.0 change anything at all?
- Nik: Yes and no. ETH 2.0 will bring proof of stake, where you get a 3x improvement in scale, plus sharding. Presumably, people would be running on various shards. That being said, it’s not the pretty picture that everyone makes it out to be, because you need to be on the same shard if you want composability. That means that if Maker is on a shard, where does everyone else want to be? Does Uniswap want to be on the same shard as Maker? Probably, or maybe it’s the other way around. So you have this effect where everyone wants to live in Manhattan: the cost to do a transaction on the DeFi shard will still be expensive, like prime real estate. That being said, other things that generate a lot of activity right now won’t have to be on that shard, so we will see some reduction, but it is unclear to what degree and what the costs will look like. Nonetheless, I am sure that in a world where ETH 2.0 has launched, the prominent DeFi protocols will just be printing money. Even if the oracles are expensive, they can afford it. This I am 100% sure of.
- LongForWisdom: There were other questions in the sidebar that we could get to. Who wants to go?
- Nik: Someone asked about the UNI LP oracle audit. It’s currently underway. We have the initial feedback, but it’s still ongoing. The way this usually works is they give you an initial report. Then, we review the report, fix any issues, or argue that something they cited isn’t really an issue. After that, they go back and make a final report, which is what’s published. We’re still a good week out from getting the initial report.
- Nik: Tim also asks if we have any idea how much our customers would be willing to pay for the oracles. Right now, it’s all free, and to be honest, I think we have to keep it free; it’s the only way to gain market share. It doesn’t cost us that much more: we have to make these oracles for the Maker protocol either way, so expanding usage to customers doesn’t cost us anything extra, and putting a price on it wouldn’t make sense. Look at the big picture of how critical oracles will be in the future. Chainlink’s market cap is a ridiculous number; for the size of their business at the moment and its profitability, yes, it’s ridiculous. Still, if DeFi scales the way I believe it will, then oracles will be a huge business, and we want a piece of that pie. This is not the time to try to be cost-neutral on oracles. We’re fortunate to be in the position where the Maker protocol is making a ton of money, and we should be investing that money. I think this is an excellent bet.
- Frank Cruz: With regards to flash loans: if you lower the sensitivity to 3% or 4%, let’s say on AAVE, would that entice people, even if they realize the sensitivity is down?
- Nik: Not quite sure what you mean. The collateralization ratio is significantly higher than this 3 or 4% sensitivity change. In the worst case, the oracle is off by 4%, while the collateralization ratio is 175%. There’s no way to do a flash loan and run away with money, because you still have to be significantly overcollateralized.
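A quick worked check of that point, using the 175% liquidation ratio and the worst-case 4% oracle error cited above (a sketch of the arithmetic, not protocol code):

```python
liquidation_ratio = 1.75  # 175%, per the discussion
oracle_error = 0.04       # worst case: oracle overvalues collateral by 4%

# If the oracle overstates the collateral price by 4%, a vault that appears
# to sit exactly at the 175% liquidation ratio is really collateralized at:
effective_ratio = liquidation_ratio / (1 + oracle_error)
print(f"{effective_ratio:.2%}")  # ~168.27%, still far above 100%
```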
- LongForWisdom: We’re approaching the end of the meeting. Any last questions or comments before we wrap up?
- Nik: I see Aaron had another question concerning what our oracle customers’ payment structure is going to look like. That’s up for debate. My personal opinion is that now is not the time to worry about charging people. Instead, we should focus on getting as much market share as possible. This pie is going to keep growing, and you want to secure your position in it.
- Tim: Yes, but then we shouldn’t worry too much about the money we have to spend on the oracle, right?
- Nik: Yes. Now, if there are no customers that are using it, then let’s cut costs. Let’s say that a customer wants a sensitive UNI oracle. We can take a look at whether we want to take a loss on that one or make a deal that subsidizes the actual cost. Those kinds of things are still up for debate. If anyone in the community feels strongly about that, I will work on a proposal with them.
- Juan: Going to the other side of the spectrum, you could have the oracle update on every 0.1% move, and you would have a transaction in every block, which would be very expensive. However, the expense alone isn’t a reason not to do it; it would make things more optimal.
- David Utrobin: Is it true that if there are enough customers for a given oracle, everybody’s overall cost goes down, and we could charge less for oracles?
- Nik: Yes, that’s true. To do that, we need market share; we need a critical mass of users, which we do not yet have. This is my goal. I want my core unit to take the business component of oracles much more seriously. We need to scale this up. We have a real opportunity here: our oracles are just as good as Chainlink’s, and from a gas point of view, they’re ten times better. We need to leverage that and market it.
- Matthew Rabinowitz: Are you talking about expanding the scope of MakerDAO to include pricing crypto assets, as well as other assets, using those oracles?
- Nik: Yes, and I think that we are uniquely positioned. I believe in a future where many RWAs want to use Maker as a credit facility. If you’re going to be onboarded to Maker, you’ll need Maker oracles to price your assets. It’s essentially a moat business: suppose the protocol says we don’t take Chainlink oracles because we only accept Maker oracles. In that case, this will drive business to us, because people want to use us as a credit mechanism. There are a lot of ways we can get customers that Chainlink can’t poach from us; it’s intrinsic to using Maker in the first place. There are many ways to carve out a significant chunk of the market, which gives our oracle business a massive advantage. I’ll have my core unit proposal out in the next few months, and it’ll be extensive.
- LongForWisdom: Thank you, Nik. That was a very valuable discussion on what oracles could bring. We’ll wrap up here. Thank you, everyone, for coming.
Common Abbreviated Terms
MCD
: The Multi-Collateral Dai system
CR
: Collateralization Ratio
DC
: Debt Ceiling
ES
: Emergency Shutdown
GF
: Governance Facilitator
SF
: Stability Fee
DSR
: Dai Savings Rate
MIP
: Maker Improvement Proposal
OSM
: Oracle Security Module
LR
: Liquidation Ratio
RWA
: Real-World Asset
Credits
- Artem Gordon produced this summary.
- David Utrobin produced this summary.
- Denis Mitchell produced this summary.
- Jose Ferrari produced this summary.
- Everyone who spoke and presented on the call, listed in the headers.