Operational Data Core Unit [Incubation] - Survey Results Summary

During August and September, our team conducted a survey to assess the need for a core unit dedicated exclusively to providing and maintaining Maker-related data services. The main goal is to improve upon the existing APIs that provide operational/transactional data to dapps, dashboards, bots, etc.

This effort aimed to gather more information about the community’s current needs, mainly from the core units maintaining and building the next generation of tools for the DAO. Today we are sharing the results with the whole ecosystem because we believe our findings can benefit not only ourselves but also anyone who cares about the health of the applications and infrastructure that support this community and help it grow.

We’re firm believers in lean methodologies, so we conducted this research to ensure the service we provide to the DAO is the best one possible. The added value will benefit all the existing working groups and core units, as well as the hundreds that will come in the future. The data resulting from this survey is being analyzed in detail and distilled into a Value Proposition Canvas and an Empathy Map that will be part of our core unit proposal. These will help us shape our work into an excellent service for the community.

The survey consisted of 20 questions to understand the current data needs and how the MakerDAO stakeholders imagine these needs will change in the future. The survey also explores current data sources and how the existing solutions are handling these demands.

Seven existing and future Core Units and one independent team that provides services to the DAO replied to the survey. We may update this summary in the future as more teams collaborate by providing their input on the survey.

Key Findings

1. The need

From our analysis of this data, we conclude that the core units and teams creating and maintaining the main applications and infrastructure that give life to the DAO find the existing data solutions insufficient for their current and future needs.

To scale rapidly and reliably, we need open-source, decentralized solutions, always up-to-date with the latest protocol’s changes, highly reliable, easy to maintain, and easy to integrate with existing and new applications. Most importantly, these services need to ensure continuity, making this job a long-term commitment.

Below you can find some of the most relevant answers that lead us to this conclusion. You can also skip to the end and access the raw data directly.

2. Level of satisfaction with the current solution

  • Average: 5 out of 10
  • Min: 3 out of 10
  • Max: 8 out of 10

3. Most important data attributes

  1. Data correctness/exactness
  2. Up-to-date with latest protocol changes
  3. Open-source
  4. Full-range historical data
  5. Reliability/uptime

4. Most essential attributes of a data provider

  1. Long-term availability: knowledge that the data provider will be there for the period we plan to use it.
  2. A way out in case we can no longer use their service.
  3. Data export and interoperability.
  4. Straightforward onboarding and support.
  5. Ease of forking/deployment.
  6. Low maintenance cost.

5. Types of data consumed

  1. Governance: 6 out of 8 respondents
  2. Vaults: 5 out of 8 respondents
  3. Token info (MKR, DAI): 5 out of 8 respondents
  4. Oracles: 4 out of 8 respondents
  5. Liquidations: 3 out of 8 respondents
  6. Stats/analytics: 3 out of 8 respondents
  7. Gas consumption: 3 out of 8 respondents
  8. DSR: 3 out of 8 respondents
  9. Protocol parameters: 3 out of 8 respondents

6. What the data is being used for

  1. Dapp and dashboards: 6 out of 8 respondents
  2. Docs and reports: 3 out of 8 respondents
  3. Notifications: 3 out of 8 respondents
  4. Bots: 2 out of 8 respondents
  5. Analysis: 2 out of 8 respondents
  6. Monitoring: 2 out of 8 respondents
  7. Marketing campaigns: 1 out of 8 respondents

7. Current data services used

  1. Maker’s API (Vulcanize, Spock): 5 out of 8 respondents
  2. web3 / contract events: 2 out of 8 respondents
  3. Subgraphs: 1 out of 8 respondents
  4. Dune: 1 out of 8 respondents
  5. Flaps Auction API: 1 out of 8 respondents
  6. OasisDex API: 1 out of 8 respondents

8. Drawbacks from current data sources

  1. No long-term commitment
  2. Unreliability
  3. Centralization
  4. Missing data or errors in processing/storage
  5. Proprietary
  6. Self-hosted
  7. Security
  8. Low community awareness

9. Main requests for a new data source

  1. Open-source
  2. Decentralized
  3. Updated with recent changes to the protocol
  4. Free access
  5. Good documentation
  6. Low cost
  7. Comprehensive coverage of the Maker Protocol

The source data from the survey can be found and downloaded here: https://docs.google.com/spreadsheets/d/1Rnilq4bAcE3dkNt4Ati3oyi6JJGn1iyP8mQ9V2gqkuM/edit#gid=2052605494

Also, if you would like to contribute your experience and complete the survey, you can do it here: https://www.surveymonkey.com/r/WVF28VY

You can find us on the SES Discord or DM us on Telegram: @leolower


What tools are being looked at for decentralizing anything from a peer-to-peer DNS provider (maybe wishful thinking) to email services via something like an IPFS gateway? Which Core Unit will lead in finding/researching such tools? Or is this something that is still being discussed/at the planning stage?

Sorry for the late reply, I just saw the draft, so I guess I didn’t submit my original reply.

We will be using The Graph’s decentralized network for the new APIs. This means there will be many indexers/servers running our software and answering queries made by the other CUs’ applications. Also, all of our code will be open source, so anyone who needs to understand how our subgraphs work, or who wants to fork them and create their own versions, can easily do so.
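Since this is still at the proposal stage, the sketch below is purely illustrative: the entity names (`vaults`, `collateralType`, etc.) and the endpoint URL are assumptions, not the actual subgraph schema. It only shows the general shape of how a CU application might query such a subgraph over GraphQL:

```typescript
// Illustrative sketch of querying a hypothetical Maker subgraph on
// The Graph. Entity and field names are assumed, not the real schema.

interface GraphQLRequest {
  query: string;
  variables?: Record<string, unknown>;
}

// Build the POST body for a paginated vault query.
// `first`/`skip` is The Graph's standard pagination pattern.
function buildVaultQuery(first: number, skip = 0): GraphQLRequest {
  return {
    query: `
      query Vaults($first: Int!, $skip: Int!) {
        vaults(first: $first, skip: $skip, orderBy: collateral, orderDirection: desc) {
          id
          collateralType
          collateral
          debt
        }
      }`,
    variables: { first, skip },
  };
}

// Subgraph queries are plain HTTP POSTs; the endpoint is a placeholder.
async function fetchVaults(endpoint: string, first: number) {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildVaultQuery(first)),
  });
  const { data } = await res.json();
  return data.vaults;
}
```

Because the query is just JSON over HTTP, any dapp, dashboard, or bot can consume it without a special client library.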

Our core unit will focus only on data services, specifically operational/transactional data; that is, APIs that expose what is happening on Maker’s smart contracts.